
Using Subject Matter Experts To Assess Content Representation: An MDS Analysis.

Authors :
Sireci, Stephen G.
Geisinger, Kurt
Institution :
American Council on Education, Washington, DC. GED Testing Service.
Publication Year :
1993

Abstract

Various methods used to assess the content of a test are reviewed, and a new procedure designed to improve on these methods is presented. The two tests considered are a professional licensure examination, the auditing section of the Uniform Certified Public Accountant Examination, and an educational achievement test, a nationally standardized social studies achievement test. Previous methods have generally been either empirical, using factor analysis or multidimensional scaling (MDS) to analyze the inter-item correlation matrix derived from examinee responses, or subjective, using data provided by subject matter experts (SMEs) to determine whether items represent the content areas that the test purports to measure. A method had previously been proposed that uses MDS to discover dimensions obtained from the analysis of SME ratings of the similarity of the items on a test. This study expanded that method by using two groups of SMEs (15 for each test) to evaluate the content of the two tests studied. Results of the correlation and cluster analyses suggest that the content structure of a test can be evaluated adequately by analyzing item similarity data provided by SMEs. Results further suggest that the MDS procedure should be used to supplement analyses of item relevance data rather than replace them. Six figures and 18 tables present analysis findings. (Contains 23 references.) (SLD)
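The core of the procedure described above is an MDS analysis of a matrix of SME item-similarity (or dissimilarity) ratings, followed by inspection of the recovered dimensions. As a rough illustration only, the sketch below implements classical (Torgerson) MDS with numpy on a small hypothetical matrix of averaged SME dissimilarity ratings; the data, matrix size, and one-dimensional solution are invented for the example and do not come from the study.

```python
import numpy as np

def classical_mds(dissim, n_dims=2):
    """Classical (Torgerson) MDS: embed items so that pairwise
    Euclidean distances approximate the given dissimilarities."""
    d = np.asarray(dissim, dtype=float)
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)           # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_dims]  # keep the largest n_dims
    vals, vecs = vals[order], vecs[:, order]
    vals = np.clip(vals, 0.0, None)          # discard negative noise
    return vecs * np.sqrt(vals)              # item coordinates, n x n_dims

# Hypothetical averaged SME dissimilarity ratings for 4 items:
# items 1-2 and 3-4 are rated similar, the two pairs dissimilar.
ratings = np.array([
    [0.0, 1.0, 4.0, 5.0],
    [1.0, 0.0, 3.0, 4.0],
    [4.0, 3.0, 0.0, 1.0],
    [5.0, 4.0, 1.0, 0.0],
])
coords = classical_mds(ratings, n_dims=1)
```

In practice the recovered coordinates (here, one dimension separating the two item pairs) would be compared against the content areas the test blueprint claims, and a cluster analysis of the same matrix would serve as a cross-check, in the spirit of the study's supplementary use of MDS.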

Details

Language :
English
Database :
ERIC
Publication Type :
Report
Accession number :
ED363646
Document Type :
Reports - Evaluative; Speeches/Meeting Papers