
The role of rapid guessing and test‐taking persistence in modelling test‐taking engagement.

Authors :
Nagy, Gabriel
Ulitzsch, Esther
Lindner, Marlit Annalena
Source :
Journal of Computer Assisted Learning; Jun2023, Vol. 39 Issue 3, p751-766, 16p
Publication Year :
2023

Abstract

Background: Item response times in computerized assessments are frequently used to identify rapid guessing behaviour as a manifestation of response disengagement. However, non‐rapid responses (i.e., those with longer response times) are not necessarily engaged, which means that response‐time‐based procedures can overlook disengaged responses. The identification of disengaged responses could therefore be improved by considering additional indicators of disengagement. We investigated the extent to which decreases in individuals' item solution probabilities over the course of a test reflect disengaged response behaviour.

Objectives: To disentangle different types of possibly disengaged responses and better understand non‐effortful test‐taking behaviour, we augmented response‐time‐based procedures for identifying rapid guessing with strategies for detecting disengaged responses on the basis of performance declines in non‐rapid responses.

Methods: We combined item response theory (IRT) models for rapid guessing and test‐taking persistence to examine the capability of response times and item positions to capture response disengagement. We used a computerized assessment in which science items were randomly distributed across positions for each student. This allowed us to estimate individual differences in test‐taking persistence (i.e., the duration for which the initial level of performance is maintained) while accounting for rapid responses.

Results and Conclusions: Response times did not fully explain disengagement; item responses reflected test‐taking persistence even when rapid responses were accounted for. This interpretation was supported by a strong correlation of test‐taking persistence with decreases in self‐reported test‐taking effort. Furthermore, our results suggest that IRT models for test‐taking persistence can effectively account for the undesirable impact of low test‐taking effort even when response times are unavailable.
Practitioner Notes:
- Assessments of proficiencies that attempt to quantify what individuals know and can do lead to biased results when individuals provide disengaged responses.
- Item response times are frequently used to identify rapid guessing behaviour as a manifestation of response disengagement, but response‐time‐based procedures can overlook disengaged responses.
- To disentangle different types of possibly disengaged responses and better understand non‐effortful test‐taking behaviour, we augmented response‐time‐based procedures for identifying rapid guessing with strategies for detecting disengaged responses on the basis of performance declines in non‐rapid responses.
- In a sample of fifth and sixth graders, we found that response times did not fully explain disengagement, as many students showed performance declines in non‐rapid item responses.
- Our results suggest that item response theory models for test‐taking persistence can effectively account for the undesirable impact of low test‐taking effort even when response times are unavailable. [ABSTRACT FROM AUTHOR]
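The two-step logic described in the Methods passage can be illustrated with a minimal, hypothetical sketch: flag rapid guesses with a response-time threshold, then estimate a performance-decline slope over item positions using only the non-rapid responses. The threshold value, the toy data, and the least-squares slope are illustrative assumptions, not the paper's actual IRT models, which jointly estimate rapid guessing and persistence.

```python
# Hypothetical sketch of the two-step idea: (1) flag rapid guesses via a
# response-time cutoff, (2) regress accuracy on item position among the
# remaining (non-rapid) responses. A negative slope suggests declining
# performance, i.e., lower test-taking persistence. This is a simplified
# stand-in for the paper's IRT models, not their actual method.

THRESHOLD = 3.0  # seconds; assumed rapid-guessing cutoff (hypothetical)

def flag_rapid(times, threshold=THRESHOLD):
    """Return True for responses faster than the threshold."""
    return [t < threshold for t in times]

def persistence_slope(positions, correct, rapid):
    """Least-squares slope of accuracy on item position,
    excluding responses flagged as rapid guesses."""
    xs = [p for p, r in zip(positions, rapid) if not r]
    ys = [c for c, r in zip(correct, rapid) if not r]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Toy data: 8 items; the student slows down and loses accuracy late in
# the test, with two rapid responses near the end.
positions = [1, 2, 3, 4, 5, 6, 7, 8]
times = [12.0, 10.5, 9.0, 8.0, 7.5, 2.1, 6.0, 1.5]
correct = [1, 1, 1, 1, 0, 0, 0, 0]

rapid = flag_rapid(times)
print(rapid)  # → items 6 and 8 flagged as rapid guesses
print(round(persistence_slope(positions, correct, rapid), 3))  # → -0.2
```

Even after removing the rapid responses, the slope is negative here, mirroring the paper's finding that performance declines in non-rapid responses carry disengagement information beyond what response times reveal.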

Details

Language :
English
ISSN :
0266-4909
Volume :
39
Issue :
3
Database :
Complementary Index
Journal :
Journal of Computer Assisted Learning
Publication Type :
Academic Journal
Accession number :
163886536
Full Text :
https://doi.org/10.1111/jcal.12719