Influence of Attribute Relationships on Parameter Estimation in DINA

Abstract:

The DINA model, favored by many researchers, assumes that the attributes are mutually independent and combine in a conjunctive, non-compensatory way; in real applications, however, this assumption may not hold. This study examines how relationships among the attributes affect parameter estimation in the DINA model. Simulation results show that a hierarchical relationship among attributes has a large impact on the accuracy of DINA parameter estimation, and that this accuracy is mainly affected by the attribute hierarchy and the number of subjects. When hierarchical relationships exist among the attributes, using the DINA model as the cognitive diagnosis model can compromise the validity of the diagnostic test.
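For context, the conjunctive, non-compensatory assumption referred to above is expressed in the standard DINA item response function (Junker & Sijtsma, 2001; reference [2] below). For examinee i with attribute profile α_i and item j with Q-matrix row q_j, the model sets η_ij = ∏_k α_ik^{q_jk}, so η_ij = 1 only when every attribute the item requires is mastered, and P(X_ij = 1 | α_i) = (1 - s_j)^{η_ij} · g_j^{1 - η_ij}, where s_j and g_j are the slip and guessing parameters whose estimation accuracy is studied here.

The simulation design summarized in the abstract can be illustrated with a short sketch. The code below is an illustrative reconstruction rather than the authors' implementation: the numbers of attributes, items, and subjects, the slip and guessing values, and the linear hierarchy A1 → A2 → A3 are assumed values chosen only to show how a hierarchy restricts the attribute profiles that the independence assumption of the DINA model expects.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) design values, not taken from the paper.
K, J, N = 3, 20, 1000      # attributes, items, subjects
slip, guess = 0.1, 0.1     # generating slip/guess parameters, equal across items

# Random Q-matrix: each item requires a non-empty subset of the K attributes.
Q = rng.integers(0, 2, size=(J, K))
Q[Q.sum(axis=1) == 0, 0] = 1          # every item must require at least one attribute

def draw_profiles(hierarchical: bool) -> np.ndarray:
    """Draw N attribute profiles.

    Independent case: each attribute is mastered with probability 0.5.
    Hierarchical case (assumed linear hierarchy A1 -> A2 -> A3): an attribute
    can be mastered only if the attribute preceding it is also mastered.
    """
    alpha = rng.integers(0, 2, size=(N, K))
    if hierarchical:
        for k in range(1, K):
            alpha[:, k] *= alpha[:, k - 1]
    return alpha

def simulate_dina(alpha: np.ndarray) -> np.ndarray:
    """Generate item responses under the DINA model.

    eta[i, j] = 1 only if subject i masters every attribute item j requires
    (the conjunctive, non-compensatory rule); the response is then correct
    with probability 1 - slip, otherwise with probability guess.
    """
    eta = np.all(alpha[:, None, :] >= Q[None, :, :], axis=2).astype(int)
    p_correct = (1 - slip) * eta + guess * (1 - eta)
    return (rng.random((N, J)) < p_correct).astype(int)

X_indep = simulate_dina(draw_profiles(hierarchical=False))
X_hier = simulate_dina(draw_profiles(hierarchical=True))
```

Fitting a standard DINA estimator (for example, the din() function in the R package CDM, or any EM-based routine) to X_hier and to X_indep, and comparing the recovered slip and guessing values with the generating value of 0.1, reproduces the kind of comparison the abstract summarizes.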

Info:

Pages: 491-494

Online since: June 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] De La Torre, J. An empirically based method of Q-matrix validation for the DINA model: development and applications. Journal of Educational Measurement, 2008, 45(4), 343-362. DOI: 10.1111/j.1745-3984.2008.00069.x

[2] Junker, B. W., & Sijtsma, K. Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 2001, 25(3), 258-272. DOI: 10.1177/01466210122032064

[3] Leighton, J. P., Gierl, M. J., & Hunka, S. M. The attribute hierarchy method for cognitive assessment: a variation on Tatsuoka's rule-space approach. Journal of Educational Measurement, 2004, 41(3), 205-237. DOI: 10.1111/j.1745-3984.2004.tb01163.x

[4] Rupp, A. A., & Templin, J. L. The effects of Q-matrix misspecification on parameter estimates and classification accuracy in the DINA model. Educational and Psychological Measurement, 2008, 68(1), 78-96. DOI: 10.1177/0013164407301545

[5] Roussos, L. A., Templin, J. L., & Henson, R. A. Skills diagnosis using IRT-based latent class models. Journal of Educational Measurement, 2007, 44(4), 293-311. DOI: 10.1111/j.1745-3984.2007.00040.x

[6] Tatsuoka, K. K. Rule space: an approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 1983, 20(4), 345-354. DOI: 10.1111/j.1745-3984.1983.tb00212.x