Project Activities
Standard dimensionality assessment techniques for item response theory (IRT) models are often applicable only to unidimensional models. This project developed statistical procedures for conducting dimensionality analysis for multidimensional IRT models. The goals of the project were to (1) develop a set of statistical procedures, including the creation of new discrepancy measures, for dimensionality analysis at the test, subtest, and item levels of analysis; (2) study and evaluate the performance of these procedures; and (3) create and freely distribute software for conducting such analyses in the R software environment.
Structured Abstract
Research design and methods
The development of new discrepancy measures built on past work by using the model-based covariance (MBC) for item pairs as a building block to construct a generalized dimensionality discrepancy measure (GDDM). Further, both MBC and GDDM were standardized to produce more interpretable discrepancy measures for assessing dimensionality, and MBC, GDDM, and their standardized versions were modified to accommodate missing data. The project sought to develop a comprehensive strategy for dimensionality assessment that combines results at the test, subtest, and item-pair levels of analysis. Using these discrepancy measures to support inferences requires a framework for evaluating the values of the statistics; posterior predictive model-checking (PPMC), a flexible Bayesian approach to model-checking that can be employed in a wide variety of settings, served as the statistical model-checking framework for this work. A simulation study examined the proposed dimensionality analysis procedures in contexts with dichotomous and polytomous data, the possibility of guessing (warranted for multiple-choice formats), and missing data. The study also provided evidence regarding the utility of PPMC with the discrepancy measures and helped prepare for an examination of the procedures using data from the National Assessment of Educational Progress Science Assessment.
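The core computations described above can be sketched briefly. The sketch below is illustrative only, not the project's distributed R software: it assumes the MBC for an item pair is the average cross-product of residuals after conditioning on model-implied expected responses, that the GDDM is the mean absolute MBC over unique item pairs, and that a PPMC posterior predictive p-value is the proportion of posterior draws for which the replicated discrepancy meets or exceeds the realized discrepancy. All function names are hypothetical.

```python
import numpy as np

def model_based_covariance(x, p):
    """Illustrative model-based covariance (MBC) for all item pairs.

    x : (N, J) matrix of scored item responses.
    p : (N, J) matrix of model-implied expected responses (e.g., response
        probabilities for dichotomous items), typically computed from one
        posterior draw of the person and item parameters.

    Returns a (J, J) matrix whose (j, k) entry is the average cross-product
    of residuals for items j and k after conditioning on the model.
    """
    resid = x - p                       # residuals given the fitted model
    return resid.T @ resid / x.shape[0]

def gddm(x, p):
    """Illustrative generalized dimensionality discrepancy measure (GDDM):
    the mean absolute MBC over the unique item pairs (j < k)."""
    mbc = model_based_covariance(x, p)
    upper = np.triu_indices(mbc.shape[1], k=1)   # unique item pairs
    return np.abs(mbc[upper]).mean()

def ppp_value(realized, replicated):
    """Illustrative posterior predictive p-value: the proportion of posterior
    draws for which the discrepancy computed on replicated data equals or
    exceeds the discrepancy computed on the observed data. Values near 0 or 1
    flag model-data misfit."""
    return np.mean(np.asarray(replicated) >= np.asarray(realized))
```

In a full PPMC analysis, `gddm` would be evaluated once per posterior draw on both the observed and the replicated response matrices, and `ppp_value` would summarize the resulting pairs of discrepancy values.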
Products and publications
ERIC Citations: Available citations for this award can be found in ERIC.
Journal article, monograph, or newsletter
Levy, R., Xu, Y., Yel, N., and Svetina, D. (2015). A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models. Journal of Educational Measurement, 52(2), 144-158.
Svetina, D., and Levy, R. (2014). A Framework for Dimensionality Assessment for Multidimensional Item Response Models. Educational Assessment, 19(1), 35-57.
Svetina, D., and Levy, R. (2012). An Overview of Software for Conducting Dimensionality Assessment in Multidimensional Models. Applied Psychological Measurement, 36(8), 659-669.