Marking Essays on Screen: An Investigation into the Reliability of Marking Extended Subjective Texts
Source: British Journal of Educational Technology, Sep 2010, 41(5):814-826
Publication Year: 2010
Abstract
There is a growing body of research literature that considers how the mode of assessment, whether computer-based or paper-based, might affect candidates' performance. Far less research shifts the focus of attention to those making assessment judgements and considers issues of assessor consistency when dealing with extended textual answers in different modes. This research project explored whether the mode in which a set of extended essay texts was accessed and read systematically influenced the assessment judgements made about them. During the project, 12 experienced English literature assessors marked two matched samples of 90 essay exam scripts, one on screen and one on paper. A variety of statistical methods were used to compare the reliability of the essay marks awarded by the assessors across modes. Mode was found to have no systematic influence on marking reliability. The analyses also compared examiners' marks with a gold-standard mark for each essay and found no shift in the location of the standard of recognised attainment across modes.
Details
Language: English
ISSN: 0007-1013
Volume: 41
Issue: 5
Database: ERIC
Journal: British Journal of Educational Technology
Publication Type: Academic Journal
Accession Number: EJ894508
Document Type: Journal Articles; Reports - Research
Full Text: https://doi.org/10.1111/j.1467-8535.2009.00979.x