
6 - Verbal Reports as Data for Cognitive Diagnostic Assessment

Published online by Cambridge University Press: 23 November 2009

Jacqueline P. Leighton
Affiliation: Associate Professor of Educational Psychology, Centre for Research in Applied Measurement and Evaluation, University of Alberta

Mark J. Gierl
Affiliation: Professor of Educational Psychology and Director of the Centre for Research in Applied Measurement and Evaluation, University of Alberta

Summary

The term cognitive diagnostic assessment (CDA) is used in this chapter to refer to a specific type of student evaluation. Unlike classroom-based tests designed by teachers or large-scale assessments designed by test developers to measure how much an examinee knows about a subject domain, CDAs are designed to measure the specific knowledge structures (e.g., the distributive rule in mathematics) and processing skills (e.g., applying the distributive rule in appropriate mathematical contexts) an examinee has acquired. The results of a CDA should answer questions such as the following: Does the examinee know the content material well? Does the examinee have any misconceptions? Does the examinee show strengths in some knowledge and skills but not others? The objective of CDAs, then, is to inform stakeholders about examinees' learning by pinpointing where an examinee might have specific problem-solving weaknesses that could lead to difficulties in learning. To serve this objective, CDAs are normally informed by empirical investigations of how examinees understand, conceptualize, reason, and solve problems in content domains (Frederiksen, Glaser, Lesgold, & Shafto, 1990; Nichols, 1994; Nichols, Chipman, & Brennan, 1995).
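
To make the contrast concrete, a minimal illustration of ours (not an example worked in the chapter itself): the distributive rule as a knowledge structure can be stated as the identity

a(b + c) = ab + ac,

while the corresponding processing skill is knowing when and how to deploy that identity, for instance expanding 3(x + 2) into 3x + 6 while simplifying an algebraic expression.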

In this chapter, we focus on two methods for making sense of empirical investigations of how examinees understand, conceptualize, reason, and solve problems in content domains. By way of introduction, we first briefly discuss the importance of CDAs for providing information about examinees' strengths and weaknesses, including the ways in which CDAs differ from traditional classroom-based tests and large-scale tests.

Type: Chapter
In: Cognitive Diagnostic Assessment for Education: Theory and Applications, pp. 146–172
Publisher: Cambridge University Press
Print publication year: 2007


References

Anderson, J.R. (1990). Cognitive psychology and its implications. New York: W.H. Freeman.
Baddeley, A.D. (1986). Working memory. Oxford, UK: Oxford University Press.
Begg, I., & Harris, G. (1982). On the interpretation of syllogisms. Journal of Verbal Learning and Verbal Behavior, 21, 595–620.
Byrne, R.M. (1989). Suppressing valid inferences with conditionals. Cognition, 31, 61–83.
Chi, M.T.H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. Journal of the Learning Sciences, 6, 271–315.
Chi, M.T.H., de Leeuw, N., Chiu, M.H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18, 439–477.
Chi, M.T.H., & VanLehn, K.A. (1991). The content of physics self-explanations. Journal of the Learning Sciences, 1, 69–105.
Dawson, M.R.W. (1998). Understanding cognitive science. Malden, MA: Blackwell.
Desimone, L.M., & LeFloch, K.C. (2004). Are we asking the right questions? Using cognitive interviews to improve surveys in education research. Educational Evaluation and Policy Analysis, 26, 1–22.
Embretson, S., & Gorin, J. (2001). Improving construct validity with cognitive psychology principles. Journal of Educational Measurement, 38, 343–368.
Ericsson, K.A., & Simon, H.A. (1980). Verbal reports as data. Psychological Review, 87, 215–251.
Ericsson, K.A., & Simon, H.A. (1993). Protocol analysis. Cambridge, MA: MIT Press.
Evans, J.St.B.T., Handley, S.J., Harper, C.N.J., & Johnson-Laird, P.N. (1999). Reasoning about necessity and possibility: A test of the mental model theory of deduction. Journal of Experimental Psychology: Learning, Memory, and Cognition, 25, 1495–1513.
Frederiksen, N., Glaser, R., Lesgold, A., & Shafto, M.G. (Eds.). (1990). Diagnostic monitoring of skill and knowledge acquisition. Hillsdale, NJ: Lawrence Erlbaum Associates.
Galotti, K.M., Baron, J., & Sabini, J.P. (1986). Individual differences in syllogistic reasoning: Deduction rules or mental models? Journal of Experimental Psychology: General, 115, 16–25.
Gierl, M.J., Leighton, J.P., & Hunka, S. (2000). Exploring the logic of Tatsuoka's rule-space model for test development and analysis. Educational Measurement: Issues and Practice, 19, 34–44.
Girotto, V. (2004). Task understanding. In Leighton, J.P. & Sternberg, R.J. (Eds.), The nature of reasoning (pp. 103–128). New York: Cambridge University Press.
Hamilton, L.S., Nussbaum, E.M., & Snow, R.E. (1997). Interview procedures for validating science assessments. Applied Measurement in Education, 10, 181–200.
Johnson-Laird, P.N. (1983). Mental models: Towards a cognitive science of language, inference, and consciousness. Cambridge, MA: Harvard University Press.
Johnson-Laird, P.N. (2004). Mental models and reasoning. In Leighton, J.P. & Sternberg, R.J. (Eds.), The nature of reasoning (pp. 169–204). Cambridge, UK: Cambridge University Press.
Johnson-Laird, P.N., & Bara, B.G. (1984). Syllogistic inference. Cognition, 16, 1–61.
Katz, I.R., Bennett, R.E., & Berger, A.E. (2000). Effects of response format on difficulty of SAT-Mathematics items: It's not the strategy. Journal of Educational Measurement, 37, 39–57.
Leighton, J.P. (2004). Avoiding misconceptions, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, Winter, 1–10.
Leighton, J.P. (2005). Teaching and assessing deductive reasoning skills. Journal of Experimental Education, 74, 109–136.
Leighton, J.P., & Gierl, M.J. (in press). Defining and evaluating models of cognition used in educational measurement to make inferences about examinees' thinking processes. Educational Measurement: Issues and Practice.
Leighton, J.P., Gierl, M.J., & Hunka, S. (2004). The attribute hierarchy model: An approach for integrating cognitive theory with assessment practice. Journal of Educational Measurement, 41, 205–236.
Leighton, J.P., & Gokiert, R. (2005a, April). The cognitive effects of test item features: Identifying construct irrelevant variance and informing item generation. Paper presented at the annual meeting of the National Council on Measurement in Education, Montreal.
Leighton, J.P., & Gokiert, R. (2005b, April). Investigating test items designed to measure higher-order reasoning using think-aloud methods. Paper presented at the annual meeting of the American Educational Research Association (AERA), Montreal.
Lohman, D.F. (2000). Complex information processing and intelligence. In Sternberg, R.J. (Ed.), Handbook of intelligence (pp. 285–340). New York: Cambridge University Press.
Lukin, L.E., Bandalos, D.L., Eckhout, T.J., & Mickelson, K. (2004). Facilitating the development of assessment literacy. Educational Measurement: Issues and Practice, 23, 26–32.
Millman, J., & Greene, J. (1989). The specification and development of tests of achievement and ability. In Linn, R.L. (Ed.), Educational measurement (3rd ed., pp. 335–366). New York: American Council on Education/Macmillan.
Mislevy, R.J. (1996). Test theory reconceived. Journal of Educational Measurement, 33, 379–416.
Newell, A., & Simon, H.A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
Nichols, P. (1994). A framework for developing cognitively diagnostic assessments. Review of Educational Research, 64, 575–603.
Nichols, P.D., Chipman, S.F., & Brennan, R.L. (Eds.). (1995). Cognitively diagnostic assessment. Hillsdale, NJ: Erlbaum.
Nisbett, R., & Wilson, T.D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84, 231–259.
Norris, S.P. (1990). Effect of eliciting verbal reports of thinking on critical thinking test performance. Journal of Educational Measurement, 27, 41–58.
Payne, J.W., Braunstein, M.L., & Carroll, J.S. (1978). Exploring predecisional behavior: An alternative approach to decision research. Organizational Behavior and Human Performance, 22, 17–44.
Pellegrino, J.W., Baxter, G.P., & Glaser, R. (1999). Addressing the “Two Disciplines” problem: Linking theories of cognition and learning with assessment and instructional practice. Review of Research in Education, 24, 307–353.
Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ: Erlbaum.
Roberts, M.J. (2004). Heuristics and reasoning I: Making deduction simple. In Leighton, J.P. & Sternberg, R.J. (Eds.), The nature of reasoning (pp. 234–272). New York: Cambridge University Press.
Russo, J.E., Johnson, E.J., & Stephens, D.L. (1989). The validity of verbal protocols. Memory & Cognition, 17, 759–769.
Snow, R.E., & Lohman, D.F. (1989). Implications of cognitive psychology for educational measurement. In Linn, R.L. (Ed.), Educational measurement (3rd ed., pp. 263–331). New York: American Council on Education/Macmillan.
Sternberg, R.J. (1990). Metaphors of mind: Conceptions of the nature of intelligence. Cambridge, UK: Cambridge University Press.
Tatsuoka, K.K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345–354.
Tatsuoka, K.K. (1990). Toward an integration of item-response theory and cognitive error diagnosis. In Frederiksen, N., Glaser, R., Lesgold, A., & Shafto, M. (Eds.), Diagnostic monitoring of skill and knowledge acquisition (pp. 453–488). Hillsdale, NJ: Erlbaum.
Taylor, K.L., & Dionne, J.-P. (2000). Accessing problem-solving strategy knowledge: The complementary use of concurrent verbal protocols and retrospective debriefing. Journal of Educational Psychology, 92, 413–425.
Willis, G.B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.
Wilson, T.D. (1994). The proper protocol: Validity and completeness of verbal reports. Psychological Science, 5, 249–252.
