The Journal of Writing Assessment provides a peer-reviewed forum for the publication of manuscripts from a variety of disciplines and perspectives that address topics in writing assessment. Submissions may investigate such assessment-related topics as grading and response, program assessment, historical perspectives on assessment, assessment theory, and educational measurement as well as other relevant topics. Articles are welcome from a variety of areas including K-12, college classes, large-scale assessment, and noneducational settings. We also welcome book reviews of recent publications related to writing assessment and annotated bibliographies of current issues in writing assessment.
Please refer to the submission guidelines on this page for author information and manuscript requirements.
Volume 15, Issue 1, 2022
This editor's column provides an overview of Tamara Tate and Mark Warschauer's "Access, Digital Writing, and Achievement," Mary Stewart's "Confronting the Ideologies of Assimilation and Neutrality in Writing Program Assessment through Antiracist Dynamic Criteria Mapping," and Analeigh Horton's "Two Sisters and a Heuristic for Listening to Multilingual, International Students' Directed Self-Placement Stories."
Two Sisters and a Heuristic for Listening to Multilingual, International Students’ Directed Self-Placement Stories
Directed self-placement (DSP) is considered useful in linguistically and culturally diverse writing programs, but questions of self-efficacy and institutional knowledge sustain hesitancy about using DSP with English as an additional language (EAL) writers. This interview study, grounded in sociocultural literacy theory, explores multilingual, international students' engagement with writing placement and courses, showcasing two quadrilingual, bicultural, international student sisters, Hemani and Kavya. Despite nearly identical linguistic, cultural, and educational backgrounds upon concurrently entering a writing program, they experienced DSP differently and enrolled in different sections: Hemani in mainstream and Kavya in EAL courses. Hemani shares DSP's positive impacts on her writing program trajectory, whereas Kavya's story uncovers lost opportunities and feelings of otherness. Findings affirm that multilingual, international student placement is complex and that DSP is highly contextual. This study highlights DSP's mission of building student agency as motivation for collecting primary data so marginalized students can explain DSP's effects on their identity and development. Responding to the need for empirical research on EAL writers' use of DSP, the analysis considers effects of placement and offers a heuristic for examining placement experiences across contexts.
Confronting the Ideologies of Assimilation and Neutrality in Writing Program Assessment through Antiracist Dynamic Criteria Mapping
This article contributes to conversations about antiracist writing program assessment, with particular attention to the evaluation of first-year writing samples. In an effort to confront the racist ideologies of assimilation and neutrality, I employed a modified version of dynamic criteria mapping (DCM) that involved surveying students, conducting instructor focus groups, and analyzing writing prompts. The triangulated results informed the development of an assessment tool that was used to examine 89 writing samples. The goal of this assessment was not to produce a set of standards that mirror community values but rather to describe what was happening in the writing program and then use that information to facilitate critical reflection on the ways in which classroom practices align with or depart from the programmatic goal of delivering socially just writing instruction. By sharing my own experiences, I hope to help other writing program administrators (WPAs) develop processes for enacting antiracist writing assessment in their own contexts. I also reflect on the ways my procedure did—and did not—achieve its antiracist goals.
Access, Digital Writing, and Achievement
Composing texts with keyboards is now essential for college and career success. This study examines writing done on Google Docs by students in Grades 4-11 in two school districts to understand the relationships among digital device access, digital writing time, and standardized English language arts assessment scores. Our data cover three academic years: 2014-15, 2015-16, and 2016-17. We describe the amount of time spent writing in this mode, how it changed across grade levels, and the relationship between Google Docs writing time and access to digital devices. Fixed-effects regression showed that the amount of time spent writing digitally increased significantly over this period. Males and English learners spent fewer minutes writing in Google Docs than females and fluent English speakers, while students of color tended to spend more time writing in this mode than White students. Device density (the number of school-provided digital devices per student) predicted the number of writing minutes in the first two years of our data, but not the third. This study adds foundational knowledge about the time students spent writing in this modality during a period in which these districts were rapidly adopting digital technology.