
7 - Cognitively Based Statistical Methods – Technical Illustrations

Published online by Cambridge University Press: 05 June 2012

Jacqueline P. Leighton, University of Alberta
Mark J. Gierl, University of Alberta

Summary

Research focused on the application of cognitive principles to assessment practices is thriving in educational measurement, particularly in the area of cognitively based statistical methods. Since the publication of key articles, chapters, and books two decades ago (e.g., Frederiksen, Glaser, Lesgold, & Shafto, 1990; Nichols, 1994; Nichols, Chipman, & Brennan, 1995; Ronning, Glover, Conoley, & Witt, 1990; Snow & Lohman, 1989), there has been clear recognition that inferences about examinees' knowledge and skills require detailed information on the organization, representation, and production of attributes from a cognitive model. The forms these models may take, their foundations in the learning sciences, and their applicability to the design and development of large-scale educational assessments are topics in this book. The desire to integrate cognition with assessment has also spawned a range of ambitious research activities designed to identify and evaluate examinees' knowledge and skills using new statistical methods. Many recent examples document these focused research efforts. For example, an American Educational Research Association (AERA) special interest group, Cognition and Assessment, was formed in 2007 to provide a platform for scholars presenting cutting-edge research that combines cognitive psychology, cognitive science, educational psychology, educational assessment, and statistics to solve complex assessment problems through a multidisciplinary approach. A special issue of the Journal of Educational Measurement devoted to cognitively based statistical methods was published in 2007.
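As a concrete illustration of how attributes from a cognitive model feed a statistical method, the sketch below shows the ideal-response step of a DINA-style model (cf. de la Torre & Douglas, 2004), one family of models cited in this chapter: a Q-matrix records which attributes each item requires, and an examinee's ideal response to an item is correct only when every required attribute has been mastered. The Q-matrix, mastery profile, and slip/guess values are hypothetical numbers chosen purely for illustration and are not taken from the chapter.

import numpy as np

# Hypothetical Q-matrix (4 items x 3 attributes); Q[j, k] = 1 means item j
# requires attribute k. Illustrative values only.
Q = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1],
              [1, 1, 1]])

# Hypothetical attribute-mastery profile for one examinee.
alpha = np.array([1, 1, 0])

# DINA ideal response: 1 only if every attribute the item requires is mastered.
eta = np.all(alpha >= Q, axis=1).astype(int)     # -> [1, 1, 0, 0]

# Response probabilities once slip and guess parameters enter the model.
slip, guess = 0.1, 0.2                           # illustrative values
p_correct = np.where(eta == 1, 1 - slip, guess)  # -> [0.9, 0.9, 0.2, 0.2]

Estimating slip, guess, and mastery parameters from real response data is the substance of the calibration software surveyed in the references below; the ideal-response step shown here is only the link between a cognitive model's attributes and the resulting statistical scoring.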

Type: Chapter
Information: The Learning Sciences in Educational Assessment: The Role of Cognitive Models, pp. 234–264
Publisher: Cambridge University Press
Print publication year: 2011

References

Almond, R. & Shute, A. (2009, April). Calibration of Bayesian network-based diagnostic assessment. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Bolt, D., Chen, H., DiBello, L., Hartz, S., Henson, R., Roussos, L., Stout, W., & Templin, J. (2009, April). Cognitive diagnostic psychometric modelling: Fitting the fusion model with the Arpeggio system software. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Cui, Y., Gierl, M.J., & Leighton, J.P. (2009, April). Estimating the attribute hierarchy method with Mathematica. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Cui, Y. & Leighton, J.P. (2009). The hierarchy consistency index: Evaluating person fit for cognitive diagnostic assessment. Journal of Educational Measurement, 46, 429–449.
de la Torre, J. (2008, July). The generalized DINA model. Paper presented at the annual International Meeting of the Psychometric Society (IMPS), Durham, NH.
de la Torre, J. (2009, April). Estimation code for the G-DINA model. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
de la Torre, J. & Douglas, J.A. (2004). Higher-order latent trait models for cognitive diagnosis. Psychometrika, 69, 333–353.
Frederiksen, N., Glaser, R.L., Lesgold, A.M., & Shafto, M.G. (1990). Diagnostic monitoring of skills and knowledge acquisition. Hillsdale, NJ: Erlbaum.
Fu, J. & Li, Y. (2007, April). Cognitively diagnostic psychometric models: An integrated review. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.
Gierl, M.J., Cui, Y., & Hunka, S. (2008a). Using connectionist models to evaluate examinees' response patterns on tests. Journal of Modern Applied Statistical Methods, 7, 234–245.
Gierl, M.J., Cui, Y., & Zhou, J. (2009). Reliability of attribute-based scoring in cognitive diagnostic assessment. Journal of Educational Measurement, 46, 293–313.
Gierl, M.J., Wang, C., & Zhou, J. (2008b). Using the attribute hierarchy method to make diagnostic inferences about examinees' cognitive skills in algebra on the SAT©. Journal of Technology, Learning, and Assessment, 6 (6). Retrieved [date] from http://www.jtla.org.
Gierl, M.J., Alves, C., & Taylor-Majeau, R. (2010). Using the Attribute Hierarchy Method to make diagnostic inferences about examinees' skills in mathematics: An operational implementation of cognitive diagnostic assessment. International Journal of Testing, 10, 318–341.
Hartz, S. (2002). A Bayesian framework for the unified model for assessing cognitive abilities: Blending theory with practicality. Unpublished doctoral dissertation, University of Illinois at Urbana-Champaign, Urbana-Champaign, IL.
Henson, R.A., Templin, J.L., & Willse, J.T. (2008). Defining a family of cognitive diagnosis models using log-linear models with latent variables. Psychometrika, 74, 191–210.
Jang, E.E. (2009). Cognitive diagnostic assessment of L2 reading comprehension ability: Validity arguments for fusion model application to LanguEdge assessment. Language Testing, 26, 31–73.
Junker, B.W. & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25, 258–272.
Lee, Y.W. & Sawaki, Y. (2009). Cognitive diagnosis approaches to language assessment: An overview. Language Assessment Quarterly, 6, 172–189.
Leighton, J.P. (2009). Where's the psychology? A commentary on “Unique Characteristics of Diagnostic Classification Models: A Comprehensive Review of the Current State-of-the-Art.” Measurement: Interdisciplinary Research and Perspectives, 6 (4), 272–275.
Leighton, J.P. & Gierl, M.J. (2007a). Defining and evaluating models of cognition used in educational measurement to make inferences about examinees' thinking processes. Educational Measurement: Issues and Practice, 26, 3–16.
Leighton, J.P. & Gierl, M.J. (Eds.). (2007b). Cognitive diagnostic assessment for education: Theory and applications. Cambridge, UK: Cambridge University Press.
Leighton, J.P., Gierl, M.J., & Hunka, S. (2004). The attribute hierarchy method for cognitive assessment: A variation on Tatsuoka's rule-space approach. Journal of Educational Measurement, 41, 205–236.
Maris, E. (1995). Psychometric latent response models. Psychometrika, 60, 523–547.
McClelland, J.L. (1998). Connectionist models and Bayesian inference. In Oaksford, M. & Chater, N. (Eds.), Rational models of cognition (pp. 21–53). Oxford: Oxford University Press.
Mislevy, R.J. & Levy, R. (2007). Bayesian psychometric modeling from an evidence-centered design perspective. In Rao, C. R. & Sinharay, S. (Eds.), Handbook of statistics, Vol. 26 (pp. 839–865). Amsterdam, The Netherlands: Elsevier.
Nichols, P.D. (1994). A framework for developing cognitively diagnostic assessments. Review of Educational Research, 64, 575–603.
Nichols, P.D., Chipman, S.F., & Brennan, R.L. (1995). Cognitively diagnostic assessment. Hillsdale, NJ: Erlbaum.
Norsys Software (2007). Netica manual. Retrieved from http://www.norsys.com.
Pearl, J. (1988). Probabilistic reasoning in intelligent systems: Networks of plausible inference. San Mateo, CA: Kaufmann.
Ronning, R., Glover, J., Conoley, J.C., & Witt, J. (1990). The influence of cognitive psychology on testing and measurement: The Buros-Nebraska symposium on measurement and testing (Vol. 3). Hillsdale, NJ: Erlbaum.
Roussos, L., DiBello, L.V., Stout, W., Hartz, S., Henson, R.A., & Templin, J.H. (2007). The fusion model skills diagnosis system. In Leighton, J. P. & Gierl, M. J. (Eds.), Cognitive diagnostic assessment for education: Theory and applications (pp. 275–318). Cambridge, UK: Cambridge University Press.
Rumelhart, D.E., Hinton, G.E., & Williams, R.J. (1986a). Learning representations by back-propagating errors. Nature, 323, 533–536.
Rumelhart, D.E., Hinton, G.E., & Williams, R.J. (1986b). Parallel distributed processing (Vol. 1). Cambridge, MA: MIT Press.
Rupp, A.A. (2009, April). Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Rupp, A.A. & Templin, J. (2008). Unique characteristics of diagnostic classification models: A comprehensive review of the current state-of-the-art. Measurement: Interdisciplinary Research and Perspectives, 6, 219–262.
Rupp, A.A., Templin, J.L., & Henson, R.A. (2010). Diagnostic measurement: Theory, methods, and applications. New York: The Guilford Press.
Snow, R.E. & Lohman, D.F. (1989). Implications of cognitive psychology for educational measurement. In Linn, R. L. (Ed.), Educational measurement (3rd ed., pp. 263–331). New York: American Council on Education, Macmillan.
Tatsuoka, K.K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345–354.
Tatsuoka, K.K. (1995). Architecture of knowledge structures and cognitive diagnosis: A statistical pattern recognition and classification approach. In Nichols, P.D., Chipman, S.F., & Brennan, R.L. (Eds.), Cognitively diagnostic assessment (pp. 327–359). Hillsdale, NJ: Erlbaum.
Tatsuoka, K.K. & Tatsuoka, C. (2009, April). The rule space methodology: The Q matrix theory, rule space classification space and POSET model. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Templin, J.L. (2006). CDM user's guide. Unpublished manuscript.
Templin, J.L. & Henson, R.A. (2006). Measurement of psychological disorders using cognitive diagnosis models. Psychological Methods, 11, 287–305.
Templin, J.L., Henson, R.A., Douglas, J., & Hoffman, L. (2009, April). Estimating log-linear diagnostic classification models with Mplus and SAS. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
Thissen, D. & Steinberg, L. (1986). A taxonomy of item response models. Psychometrika, 51, 567–577.
von Davier, M. (2005). A general diagnostic model applied to language testing data (ETS Research Report No. RR-05-16). Princeton, NJ: Educational Testing Service.
von Davier, M. (2007). Hierarchical general diagnostic models (ETS Research Report No. RR-07-19). Princeton, NJ: Educational Testing Service.
von Davier, M. & Xu, X. (2009, April). Estimating latent structure models (including diagnostic classification models) with mdltm: A software for multidimensional discrete latent traits models. In A. Rupp (Chair), Software for calibrating diagnostic classification models: An overview of the current state-of-the-art. Symposium conducted at the meeting of the American Educational Research Association, San Diego, CA.
West, P., Rutstein, D.W., Mislevy, R.J., Liu, J., Levy, R., DiCerbo, K.E., Crawford, A., Choi, Y., & Behrens, J.T. (2009, June). A Bayes net approach to modeling learning progressions and task performances. Paper presented at the Learning Progression in Science Conference, Iowa City, IA.
Yan, D., Mislevy, R.J., & Almond, R.G. (2003). Design and analysis in a cognitive assessment (ETS Research Report No. RR-03-32). Princeton, NJ: Educational Testing Service.
