Modeling Rater Effects and Complex Learning Progressions Using Item Response Models
- Source :
- ProQuest LLC, 2015. Ph.D. Dissertation, University of California, Berkeley.
- Publication Year :
- 2015
Abstract
- This dissertation comprises three papers that propose and apply psychometric models to deal with complexities and challenges in large-scale assessments, focusing on modeling rater effects and complex learning progressions. The three papers investigate extensions and applications of multilevel and multidimensional item response models, with a primary focus on (1) detecting rater effects in double-scored performance assessments, (2) monitoring human raters with an automated scoring engine, and (3) developing measurement models for complicated learning progressions.

The first paper applies and assesses the trifactor model for multiple-ratings data in double-scored performance assessments, in which two different raters give independent scores for the same responses (e.g., the GRE essay). The trifactor model incorporates a cross-classified structure (e.g., items and raters) in addition to the general dimension (e.g., examinees). The paper includes a simulation design that follows the GRE example to reflect the incompleteness and imbalance of real-world assessments. The effects of the missingness rate in the data, and of ignoring differences among the raters, are investigated using the simulations. The use of the trifactor model is illustrated with empirical data.

The second paper applies mixed-effects ordered probit models to examine the effectiveness and efficiency of using scores from automated scoring engines (AE), compared to scores from human experts (HE), to monitor and provide diagnostic feedback to human raters under training.
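The trifactor structure described above can be sketched with a small simulation. The dimensions, variances, logistic link, and dichotomous scoring below are illustrative assumptions for this sketch, not the dissertation's actual specification; it only shows the idea of a general examinee dimension crossed with item-specific and rater-specific dimensions, with double scoring producing missingness by design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the dissertation's design)
n_examinees, n_items, n_raters = 500, 2, 10

# Independent latent draws: a general examinee dimension plus
# item-specific and rater-specific dimensions -- the trifactor structure.
theta = rng.normal(0, 1, n_examinees)                  # general dimension
u_item = rng.normal(0, 0.5, (n_examinees, n_items))    # item-specific
u_rater = rng.normal(0, 0.5, (n_examinees, n_raters))  # rater-specific

difficulty = rng.normal(0, 1, n_items)
severity = rng.normal(0, 0.5, n_raters)

# Double scoring: each response is scored by exactly two randomly chosen
# raters, so most (examinee, item, rater) cells are missing by design.
scores = np.full((n_examinees, n_items, n_raters), np.nan)
for p in range(n_examinees):
    for i in range(n_items):
        for r in rng.choice(n_raters, size=2, replace=False):
            eta = (theta[p] + u_item[p, i] + u_rater[p, r]
                   - difficulty[i] - severity[r])
            prob = 1.0 / (1.0 + np.exp(-eta))  # logistic link, dichotomous score
            scores[p, i, r] = rng.random() < prob

# Each (examinee, item) cell should have exactly two observed ratings.
observed_per_cell = np.isfinite(scores).sum(axis=2)
```

Varying how many raters score each response, or dropping ratings at random, would let one mimic the missingness-rate conditions the simulation study investigates.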
Using real rater-training study data, three types of rater effects--the severity, accuracy, and centrality of each rater--are related to model parameters and compared for cases (a) when the AE score is treated as the true score and (b) when the HE score is treated as the true score.

The third paper proposes a structured constructs model based on change-point analysis to deal with complicated learning progressions, in which relations between levels across multiple constructs are assumed in advance. Based on the change-point analysis and on reparameterizations of the multidimensional Rasch model and the partial credit model, cut-score parameters and discontinuity parameters are incorporated to classify examinees into the levels of the learning progressions, and to model the hypothesized relations as an advantage, for examinees who have reached a given level in one construct, in reaching a level in another construct. Parameter recovery of the proposed model and the consequences of ignoring the hypothesized relations are assessed using simulations. The use of the proposed model is illustrated with empirical data and interpreted as contributing to validity evidence for the hypothesized relations. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by Telephone: 1-800-521-0600. Web page: http://www.proquest.com/en-US/products/dissertations/individuals.shtml.]
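The ordered probit link underlying the second paper's rater models can be illustrated in a few lines. The cutpoint values, the severity offset, and the variable names here are assumptions for this sketch; it only shows the standard cumulative probit mechanics, in which a rater's severity shifts the linear predictor and moves probability mass toward lower score categories.

```python
import numpy as np
from scipy.stats import norm

def ordered_probit_probs(eta, cutpoints):
    """Category probabilities under a cumulative (ordered) probit link.

    P(Y = k) = Phi(tau_k - eta) - Phi(tau_{k-1} - eta),
    with tau_0 = -inf and tau_K = +inf.
    """
    tau = np.concatenate(([-np.inf], cutpoints, [np.inf]))
    return np.diff(norm.cdf(tau - eta))

# Illustrative values (assumptions): four ordered score categories,
# a "true" location on the latent scale, and a severity offset that
# pulls the rater's expected score downward.
cutpoints = np.array([-1.0, 0.0, 1.0])
true_eta = 1.0
severity = 0.5

probs_true = ordered_probit_probs(true_eta, cutpoints)
probs_severe = ordered_probit_probs(true_eta - severity, cutpoints)
# Each vector sums to 1; the severe rater places more mass on the
# lowest category than the unshifted predictor does.
```

Comparing such category probabilities against AE-based versus HE-based "true" scores is, in spirit, how severity (a location shift) can be separated from centrality (a tendency toward middle categories, i.e., compressed cutpoints) and accuracy.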
Details
- Language :
- English
- Database :
- ERIC
- Journal :
- ProQuest LLC
- Publication Type :
- Dissertation/Thesis
- Accession Number :
- ED600504
- Document Type :
- Dissertations/Theses - Doctoral Dissertations