Using the Attribute Hierarchy Method to Identify and Interpret Cognitive Skills that Produce Group Differences
- Source :
- Journal of Educational Measurement. 45:65-89
- Publication Year :
- 2008
- Publisher :
- Wiley, 2008.
Abstract
- The purpose of this study is to describe how the attribute hierarchy method (AHM) can be used to evaluate differential group performance at the level of cognitive attributes. The AHM is a psychometric method for classifying examinees' test item responses into a set of attribute-mastery patterns associated with different components in a cognitive model of task performance. Attribute probabilities, computed with a neural network, can be estimated for each examinee on each attribute, providing specific information about the examinee's attribute-mastery level. These probabilities can also be compared across groups. We describe a four-step procedure for estimating and interpreting group differences using the AHM, and we provide an example using student response data from a sample of algebra items on the SAT to illustrate our pattern recognition approach to studying group differences.

  Assessment engineering (AE) is emerging as a new research area in educational and psychological measurement (Luecht, 2006a, 2006b). AE is an innovative approach to measurement in which engineering-like principles direct the design and analysis of assessments as well as the scoring and reporting of results. With this approach, an assessment begins with specific, empirically derived cognitive models (e.g., Leighton & Gierl, 2007). Next, assessment task templates are created, using established frameworks derived from the cognitive model, to produce test items. Finally, psychometric methods are applied to the examinee response data, typically in a confirmatory mode, to produce interpretable scores (Luecht, Gierl, Tan, & Huff, 2006).

  AE differs from more traditional approaches to test design and analysis in three fundamental ways. First, cognitive models, rather than content-based test specifications, guide task design and item development. While the categories in content blueprints can be included in the task templates, the assessment principles used to develop items are based on cognitive principles and thus provide more specific information for measuring problem-solving skills. Second, task templates are created to control and manipulate both the content and the cognitive attributes of the items. Item writers are required to use the templates during development, thereby producing items that adhere to strict quality controls and meet high psychometric standards. Third, psychometric models are employed in a confirmatory, rather than exploratory, manner to assess model-data fit relative to the intended underlying structure of the constructs or traits the test is designed to measure. The outcomes from these model-data fit analyses also provide developers with guidelines for
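The core classification step the abstract describes, mapping an observed item-response pattern to per-attribute mastery probabilities under an attribute hierarchy, can be illustrated with a minimal sketch. This is not the authors' implementation: the linear three-attribute hierarchy is a toy example, and a simple similarity weighting stands in for the neural network the AHM actually uses.

```python
# Hypothetical sketch of the AHM classification idea (illustrative only).
# Toy linear hierarchy: attribute A1 is prerequisite to A2, and A2 to A3,
# so only four attribute-mastery patterns are permissible. Each entry pairs
# a mastery pattern with the item-response pattern it is expected to produce.
EXPECTED = [
    ((0, 0, 0), (0, 0, 0)),  # no attributes mastered
    ((1, 0, 0), (1, 0, 0)),  # A1 only
    ((1, 1, 0), (1, 1, 0)),  # A1 and A2
    ((1, 1, 1), (1, 1, 1)),  # all three attributes
]

def attribute_probabilities(observed):
    """Weight each permissible mastery pattern by its similarity to the
    observed item responses, then average the mastery patterns under those
    weights to obtain a per-attribute mastery probability."""
    weights = []
    for _, expected in EXPECTED:
        matches = sum(o == e for o, e in zip(observed, expected))
        weights.append(matches / len(observed))
    total = sum(weights)
    n_attrs = len(EXPECTED[0][0])
    return [
        sum(w * pattern[k] for w, (pattern, _) in zip(weights, EXPECTED)) / total
        for k in range(n_attrs)
    ]
```

Averaging these examinee-level probabilities within each group would then give the group-level attribute comparison the abstract refers to.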
- Subjects :
- Cognitive model
Test design
Education
Test (assessment)
Task (project management)
Pattern recognition (psychology)
Developmental and Educational Psychology
Psychology (miscellaneous)
Artificial intelligence
Cognitive skill
Data mining
Attribute hierarchy method
Set (psychology)
Psychology
Applied Psychology
Natural language processing
Details
- ISSN :
- 1745-3984 and 0022-0655
- Volume :
- 45
- Database :
- OpenAIRE
- Journal :
- Journal of Educational Measurement
- Accession number :
- edsair.doi...........3d543bde25f7212fbbba18a25cf69571