7 results for "Jimmy de la Torre"
Search Results
2. Adjusting Person Fit Index for Skewness in Cognitive Diagnosis Modeling
- Author
- Jimmy de la Torre, K. Santos, and Matthias von Davier
- Subjects
- Computer science, Library and Information Sciences, Edgeworth series, Cornish–Fisher expansion, Normal distribution, Mathematics (miscellaneous), Skewness, Item response theory, Statistics, Null distribution, Psychology (miscellaneous), Statistics, Probability and Uncertainty, Type I and type II errors
- Abstract
Because the validity of diagnostic information generated by cognitive diagnosis models (CDMs) depends on the appropriateness of the estimated attribute profiles, it is imperative to ensure the accurate measurement of students’ test performance by conducting person fit (PF) evaluation to avoid flawed remediation measures. The standardized log-likelihood statistic l_z has been extended to the CDM framework; however, its null distribution is negatively skewed. To address this issue, this study applies different methods of adjusting the skewness of l_z that have been proposed in the item response theory context, namely the χ²-approximation, the Cornish-Fisher expansion, and the Edgeworth expansion, to bring its null distribution closer to the standard normal distribution. The skewness-corrected PF statistics are investigated by calculating their Type I error and detection rates in a simulation study. Fraction-subtraction data are also used to illustrate the application of these PF statistics.
- Published
- 2019
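The skewness corrections named in the abstract above are only listed, not defined, in this entry. As an illustration only, the Python sketch below applies a first-order Cornish-Fisher quantile adjustment to the lower-tail critical value of a standardized person-fit statistic; the function name and the example skewness value are hypothetical, not taken from the paper.

```python
# A minimal sketch, not the paper's implementation: shift the normal critical
# value of a standardized person-fit statistic according to the skewness of
# its null distribution (first-order Cornish-Fisher quantile correction).
from scipy.stats import norm

def cornish_fisher_critical_value(alpha: float, skewness: float) -> float:
    """Skewness-adjusted lower-tail critical value (first-order Cornish-Fisher)."""
    z = norm.ppf(alpha)                         # e.g. -1.645 for alpha = 0.05
    return z + (skewness / 6.0) * (z ** 2 - 1)  # quantile correction term

# A negatively skewed null distribution pushes the cutoff further into the
# lower tail, which keeps the empirical Type I error rate closer to alpha.
print(cornish_fisher_critical_value(0.05, skewness=-0.8))  # roughly -1.87
```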
3. A General Method of Empirical Q-matrix Validation
- Author
- Jimmy de la Torre and Chia Yi Chiu
- Subjects
- Basis (linear algebra), Process (engineering), Applied Mathematics, Contrast (statistics), Mathematical proof, Domain (software engineering), Component (UML), Econometrics, Fraction (mathematics), Data mining, General Psychology, Q-matrix, Mathematics
- Abstract
In contrast to unidimensional item response models that postulate a single underlying proficiency, cognitive diagnosis models (CDMs) posit multiple, discrete skills or attributes, thus allowing CDMs to provide a finer-grained assessment of examinees’ test performance. A common component of CDMs for specifying the attributes required for each item is the Q-matrix. Although construction of the Q-matrix is typically performed by domain experts, it nonetheless remains, to a large extent, a subjective process, and misspecifications in the Q-matrix, if left unchecked, can have important practical implications. To address this concern, this paper proposes a discrimination index that can be used with a wide class of CDMs subsumed by the generalized deterministic input, noisy “and” gate model to empirically validate the Q-matrix specifications by identifying and replacing misspecified entries in the Q-matrix. The rationale for using the index as the basis for the proposed validation method is provided in the form of mathematical proofs of several relevant lemmas and a theorem. The feasibility of the proposed method was examined using simulated data generated under various conditions. The proposed method is illustrated using fraction subtraction data.
- Published
- 2015
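The abstract above does not reproduce the definition of the discrimination index. As a rough illustration under stated assumptions, a candidate q-vector can be scored by the weighted variance of the model-implied item success probabilities across the latent classes it induces, with larger values indicating better discrimination; the sketch below is hypothetical and is not the paper's exact index.

```python
import numpy as np

def q_vector_discrimination(class_weights: np.ndarray,
                            success_probs: np.ndarray) -> float:
    """Weighted variance of an item's success probabilities across latent classes.

    class_weights: probability of each latent attribute class (sums to 1).
    success_probs: model-implied P(correct answer) in each class under a
                   candidate q-vector for the item.
    """
    mean_p = float(np.dot(class_weights, success_probs))
    return float(np.dot(class_weights, (success_probs - mean_p) ** 2))

# A q-vector that separates the classes sharply scores higher than one
# whose success probabilities are nearly flat across classes.
w = np.full(4, 0.25)
print(q_vector_discrimination(w, np.array([0.2, 0.2, 0.9, 0.9])))    # ~0.123
print(q_vector_discrimination(w, np.array([0.5, 0.55, 0.6, 0.65])))  # ~0.003
```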
4. The identification and validation process of proportional reasoning attributes: an application of a cognitive diagnosis modeling framework
- Author
- Jimmy de la Torre and Hartono Tjoe
- Subjects
- Process (engineering), General Mathematics, Proportional reasoning, Teaching method, Protocol analysis, Resolution (logic), Deliberation, Education, Identification (information), Mathematics education, Selection (linguistics), Psychology
- Abstract
In this paper, we discuss the process of identifying and validating students’ abilities to think proportionally. More specifically, we describe the methodology we used to identify these proportional reasoning attributes, beginning with the selection and review of relevant literature on proportional reasoning. We then continue with the deliberation and resolution of differing views among mathematics researchers, mathematics educators, and middle school mathematics teachers about what should be learned theoretically and what can be taught practically in everyday classroom settings. We also present the initial development of proportional reasoning items as part of the two-phase validation process of the previously identified attributes. In the first phase of the validation process, we detail our collaboration with middle school mathematics teachers in creating prototype items and verifying each item-attribute specification, taking into consideration the most common ways (among many different ways) in which middle school students would solve these prototype items. In the second phase, we elaborate on our think-aloud interview procedure used to gather evidence on whether students generally solved the prototype items in the ways they were expected to.
- Published
- 2013
5. Model Evaluation and Multiple Strategies in Cognitive Diagnosis: An Analysis of Fraction Subtraction Data
- Author
- Jimmy de la Torre and Jeffrey A. Douglas
- Subjects
- Applied Mathematics, Subtraction, Markov process, Markov chain Monte Carlo, Machine learning, Latent class model, Goodness of fit, Joint probability distribution, Item response theory, Statistics, Fraction (mathematics), Artificial intelligence, General Psychology, Mathematics
- Abstract
This paper studies three models for cognitive diagnosis, each illustrated with an application to fraction subtraction data. The objective of each of these models is to classify examinees according to their mastery of the skills assumed to be required for fraction subtraction. We consider the DINA model, the NIDA model, and a new model that extends the DINA model to allow for multiple strategies of problem solving. For each of these models, the joint distribution of the indicators of skill mastery is modeled using a single continuous higher-order latent trait, to explain the dependence among the mastery of distinct skills. This approach stems from viewing the skills as the specific states of knowledge required for exam performance, and viewing these skills as arising from a broadly defined latent trait resembling the θ of item response models. We discuss several techniques for comparing models and assessing goodness of fit. We then implement these methods using the fraction subtraction data with the aim of selecting the best of the three models for this application. We employ Markov chain Monte Carlo algorithms to fit the models, and we present simulation results to examine the performance of these algorithms.
- Published
- 2008
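The DINA model in the entry above is a conjunctive ("and" gate) model: an examinee who has mastered every attribute an item requires answers correctly with probability 1 − slip, while anyone else succeeds only with the guessing probability. A minimal sketch of that item response function follows; the variable names and example parameter values are illustrative, not the paper's.

```python
import numpy as np

def dina_correct_prob(alpha: np.ndarray, q: np.ndarray,
                      guess: float, slip: float) -> float:
    """P(correct response) under the DINA model for one examinee and one item.

    alpha: binary attribute-mastery vector of the examinee.
    q:     binary q-vector of the attributes the item requires.
    """
    # eta = 1 only if every required attribute is mastered ("and" gate)
    eta = int(np.all(alpha[q == 1] == 1))
    return (1.0 - slip) if eta else guess

q = np.array([1, 1, 0])
print(dina_correct_prob(np.array([1, 1, 0]), q, guess=0.2, slip=0.1))  # master -> 0.9
print(dina_correct_prob(np.array([1, 0, 1]), q, guess=0.2, slip=0.1))  # non-master -> 0.2
```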
6. Higher-order latent trait models for cognitive diagnosis
- Author
- Jeffrey A. Douglas and Jimmy de la Torre
- Subjects
- Markov chain, Computer science, Applied Mathematics, Markov process, Markov chain Monte Carlo, Machine learning, Latent class model, Joint probability distribution, Statistics, Item response theory, Artificial intelligence, Latent variable model, General Psychology, Q-matrix
- Abstract
Higher-order latent traits are proposed for specifying the joint distribution of binary attributes in models for cognitive diagnosis. This approach results in a parsimonious model for the joint distribution of a high-dimensional attribute vector that is natural in many situations when specific cognitive information is sought but a less informative item response model would be a reasonable alternative. This approach stems from viewing the attributes as the specific knowledge required for examination performance, and modeling these attributes as arising from a broadly defined latent trait resembling the θ of item response models. In this way a relatively simple model for the joint distribution of the attributes results, which is based on a plausible model for the relationship between general aptitude and specific knowledge. Markov chain Monte Carlo algorithms for parameter estimation are given for selected response distributions, and simulation results are presented to examine the performance of the algorithm as well as the sensitivity of classification to model misspecification. An analysis of fraction subtraction data is provided as an example.
- Published
- 2004
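The higher-order structure described above ties each binary attribute to a single, broadly defined trait. A common formulation, used here only as an assumption for illustration, is a logistic link with attribute-specific intercepts and slopes, with attributes conditionally independent given the trait; the parameter values in the sketch are hypothetical.

```python
import numpy as np

def higher_order_mastery_probs(theta: float,
                               intercepts: np.ndarray,
                               slopes: np.ndarray) -> np.ndarray:
    """P(mastering each attribute | general trait theta) under a logistic
    higher-order link; attributes are conditionally independent given theta."""
    return 1.0 / (1.0 + np.exp(-(intercepts + slopes * theta)))

# Hypothetical parameters: a higher theta raises every mastery probability.
b = np.array([-0.5, 0.0, 0.5])   # attribute intercepts (assumed)
a = np.array([1.0, 1.2, 0.8])    # attribute slopes (assumed)
print(higher_order_mastery_probs(-1.0, b, a))
print(higher_order_mastery_probs(+1.0, b, a))
```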
7. Erratum to: The Generalized DINA Model Framework
- Author
- Jimmy de la Torre
- Subjects
- Computer science, Applied Mathematics, General Psychology
- Published
- 2011