
Accuracy of Automated Written Expression Curriculum-Based Measurement Scoring

Authors :
Mercer, Sterett H.
Cannon, Joanna E.
Squires, Bonita
Guo, Yue
Pinco, Ella
Source :
Grantee Submission. 2021.
Publication Year :
2021

Abstract

We examined the extent to which automated written expression curriculum-based measurement (aWE-CBM) can be used to accurately computer-score student writing samples for screening and progress monitoring. Students (n = 174) with learning difficulties in Grades 1-12 who received 1:1 academic tutoring through a community-based organization completed narrative writing samples in the fall and spring across two academic years. The samples were evaluated using four automated and hand-calculated WE-CBM scoring metrics. Results indicated automated and hand-calculated scores were highly correlated at all four timepoints for counts of total words written (rs = 1.00), words spelled correctly (rs = 0.99-1.00), correct word sequences (CWS; rs = 0.96-0.97), and correct minus incorrect word sequences (CIWS; rs = 0.86-0.92). For CWS and CIWS, however, automated scores systematically overestimated hand-calculated scores, with an unacceptable amount of error for CIWS for some types of decisions. These findings provide preliminary evidence that aWE-CBM can be used to efficiently score narrative writing samples, potentially improving the feasibility of implementing multi-tiered systems of support in which the written expression skills of large numbers of students are screened and monitored. [This paper was published in "Canadian Journal of School Psychology" v36 p304-317 2021 (EJ1314974).]
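
To make the four WE-CBM metrics named in the abstract concrete, the sketch below shows a heavily simplified way they might be computed from a writing sample. This is not the aWE-CBM scoring engine evaluated in the study: real CWS/CIWS scoring also weighs grammar, capitalization, and punctuation, and a full spell-check dictionary would replace the small placeholder LEXICON set assumed here. It is offered only as an illustration of how the counts relate to one another (e.g., CIWS = CWS minus incorrect word sequences).

    # Illustrative, simplified computation of four WE-CBM metrics.
    # Assumptions: LEXICON stands in for a spell-check dictionary, and a
    # "correct word sequence" is reduced to an adjacent pair of correctly
    # spelled words (real scoring rules are stricter).

    LEXICON = {"the", "dog", "ran", "fast", "to", "park", "and", "played"}

    def score_sample(text: str) -> dict:
        words = text.lower().split()
        tww = len(words)                            # total words written
        spelled_ok = [w.strip(".,!?") in LEXICON for w in words]
        wsc = sum(spelled_ok)                       # words spelled correctly
        pairs = list(zip(spelled_ok, spelled_ok[1:]))
        cws = sum(1 for a, b in pairs if a and b)   # correct word sequences
        iws = len(pairs) - cws                      # incorrect word sequences
        ciws = cws - iws                            # correct minus incorrect
        return {"TWW": tww, "WSC": wsc, "CWS": cws, "CIWS": ciws}

    print(score_sample("The dog ran fst to the park and playd."))
    # {'TWW': 9, 'WSC': 7, 'CWS': 5, 'CIWS': 2}

In the study, scores like these were produced both automatically and by hand for each sample, and the two sets of scores were then correlated at each timepoint.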

Details

Language :
English
Database :
ERIC
Journal :
Grantee Submission
Publication Type :
Report
Accession number :
ED628087
Document Type :
Reports - Research
Full Text :
https://doi.org/10.1177/0829573520987753