
15 - The Influence of Learning Research on the Design and Use of Assessment

Published online by Cambridge University Press:  04 August 2010

K. Anders Ericsson
Affiliation:
Florida State University

Summary

In the education and training worlds, the term "assessment" is on everyone's agenda – from classroom instruction to local schools, to on-the-job training, to international comparisons of education at both the high school and college levels. Although the term has technical variations in meaning, the connotations of assessment shift dramatically according to the education and training settings in which it is used. In clinical psychology, assessment usually denotes the use of specific procedures to examine and interpret patient status, as well as the results of those procedures. In the military and business sectors, assessment applies to a broader range of activities and purposes, including the evaluation of a program's effectiveness, an individual's productivity, and even estimates of the future status of an entire area, as in "technology assessment." In this discussion, I will use the term assessment and its functional synonym, "test," to refer both to (1) the procedures designed and used, and (2) the criteria and judgments made, which together estimate the proficiency of an individual or group with respect to a domain of desired behaviors, achievements, or performances. Assessment procedures within this interpretation may vary with respect to their purposes, design, surface features, methods of analysis, and reporting approaches. They also may explicitly favor uniform or adaptive performance. Assessment can also serve as a dependent variable in efforts such as research studies or program evaluations.

Type: Chapter
Information: Development of Professional Expertise: Toward Measurement of Expert Performance and Design of Optimal Learning Environments, pp. 333–355
Publisher: Cambridge University Press
Print publication year: 2009


References

,American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.Google Scholar
Anderson, J. R. (1983). The architecture of cognition. Cambridge, MA: Harvard University Press.Google Scholar
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford, UK: Oxford University Press.Google Scholar
Bailey, A. L., & Butler, F. A. (2004). Ethical considerations in the assessment of the language and content knowledge of English language learners K-12. Language Assessment Quarterly, 1(2&3), 177–193.CrossRef
Baker, E. L. (2005). Aligning curriculum, standards, and assessments: Fulfilling the promise of school reform. In Dwyer, C. A. (Ed.), Measurement and research in the accountability era (pp. 315–335). Mahwah, NJ: Lawrence Erlbaum Associates.Google Scholar
Baker, E. L. (2007). Model-based assessments to support learning and accountability: The evolution of CRESST's research on multiple-purpose measures. Educational Assessment (Special Issue), 12(3&4), 179–194.
Baker, E. L., Abedi, J., Linn, R. L., & Niemi, D. (1996, March/April). Dimensionality and generalizability of domain-independent performance assessments. Journal of Educational Research, 8(4), 197–205.CrossRef
Bassok, M., & Holyoak, K. J. (1989). Transfer of domain-specific problem solving procedures. Journal of Experimental Psychology: Learning, Memory, and Cognition, 16, 522–533.
Berka, C., Chung, G. K. W. K., Nagashima, S. O., Musacchia, A., Davis, G., Johnson, R., et al. (2008, March). Using interactive neuro-educational technology to increase the pace and efficiency of rifle marksmanship training. Paper presented at the annual meeting of the American Educational Research Association, New York, NY.
Bjork, R. A., & Richardson-Klavhen, A. (1989). On the puzzling relationship between environment context and human memory. In Izawa, C. (Ed.), Current issues in cognitive processes: The Tulane Flowerree Symposium on Cognition. Hillsdale, NJ: Lawrence Erlbaum Associates.Google Scholar
Bloom, B. S. (Ed.). (with Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R.). (1956). Taxonomy of educational objectives: The classification of education goals. Handbook 1: Cognitive domain. New York: David McKay.Google Scholar
Bloom, B. S. (Ed.). (1964). Stability and change in human characteristics. New York: John Wiley.
Bransford, J. D., & Johnson, M. K. (1973). Consideration of some problems of comprehension. In Chase, W. (Ed.), Visual information processing (pp. 383–438). New York: Academic Press.CrossRefGoogle Scholar
Bruner, J. S. (1977). The process of education. Cambridge, MA: Harvard University Press.Google Scholar
Cannon-Bowers, J. A., & Salas, E. (1997). A framework for developing team performance measures in training. In Brannick, M. T., Salas, E., & Prince, C. (Eds.), Team performance assessment and measurement: Theory, methods, and applications (pp. 45–62). Mahwah, NJ: Lawrence Erlbaum Associates.Google Scholar
Cannon-Bowers, J., & Salas, E. (1998). Making decisions under stress: Implications for individual and team training. Washington, DC: American Psychological Association.CrossRefGoogle Scholar
Chi, M. T. H., Glaser, R., & Farr, M. (Eds.). (1988). The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum Associates.
Chung, G. K. W. K., Dionne, G. B., & Elmore, J. J. (2006). Diagnosis and prescription design: Rifle marksmanship skills (Deliverable to the Office of Naval Research). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).Google Scholar
Chung, G. K. W. K., Baker, E. L., Niemi, D., Delacruz, G. C., & O'Neil, H. F. (2007). Knowledge mapping (CRESST white paper). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing.Google Scholar
Chung, G. K. W. K., Delacruz, G. C., Dionne, G. B., & Bewley, W. L. (2003). Linking assessment and instruction using ontologies. Proceedings of the I/ITSEC, 25, 1811–1822.
Chung, G. K. W. K., Niemi, D., & Bewley, W. L. (2003, April). Assessment applications of ontologies. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Cronbach, L. J., & Suppes, P. (Eds.). (1969). Research for tomorrow's schools: Disciplined inquiry for education. Stanford, CA/New York: National Academy of Education, Committee on Educational Research/ Macmillan.
Delacruz, G. C. (in preparation). Moving closer to the target: Investigating the impact of an evaluation tool on the training process. Unpublished doctoral dissertation, University of California, Los Angeles.
Delacruz, G. C., Chung, G. K. W. K., Heritage, M., Vendlinski, T., Bailey, A., & Kim, J-O. (2007, April). Validating knowledge elicitation techniques: Examining the relation between measures of content knowledge and knowledge of teaching algebra. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL.
Ericsson, K. A. (Ed). (1996). The road to excellence: The acquisition of expert performance in the arts and sciences, sports, and games. Hillsdale, NJ: Lawrence Erlbaum Associates.
Ericsson, K. A. (Ed). (2003). The search for general abilities and basic capacities: Theoretical implications from the modifiability and complexity of mechanisms mediating expert performance. In Sternberg, R. J. & Grigorenko, E. L. (Eds.), Perspectives on the psychology of abilities, competencies, and expertise (pp. 93–125). Cambridge, UK: Cambridge University Press.CrossRef
Gagné, R. M. (1985). The conditions of learning (4th ed.). New York: Holt, Rinehart & Winston.Google Scholar
Gagné, R. M., Briggs, L. J., & Wagner, W. W. (1992). Principles of instructional design (4th ed.). Fort Worth, TX: Harcourt Brace Jovanovich.Google Scholar
Glaser, R. (1977). Adaptive education: Individual diversity and learning. New York: Holt, Rinehart and Winston.Google Scholar
Harris, C. W. (Ed.). (1963). Problems in measuring change. Madison: The University of Wisconsin Press.
Hively, W., Patterson, H. L., & Page, S. H. (1968). A “universe-defined” system of arithmetic achievement tests. Journal of Educational Measurement, 5, 275–290.CrossRef
Larkin, J. H., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and novice performance in solving physics problems. Science, 208, 1335–1342.CrossRef
Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher, 20(8), 15–21.CrossRef
Mayer, R. E. (2003). Learning and instruction. Upper Saddle River, NJ: Merrill Prentice-Hall.Google Scholar
Merrill, M. D. (2000). Knowledge objects and mental models. In Wiley, D. A. (Ed.), The instructional use of learning objects [on-line version]. Retrieved August 13, 2007, from, http://reusability.org/read/chapters/merrill.doc.
Messick, S. (1989). Validity. In Linn, R. L. (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: Macmillan.Google Scholar
,National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Committee on the Foundations of Assessment. Pelligrino, J., Chudowsky, N., & Glaser, R. (Eds.). Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.Google Scholar
Niemi, D., Vallone, J., & Vendlinski, T. (2006). The power of big ideas in mathematics education: Development and pilot testing of POWERSOURCE assessments (CSE Rep. No. 697). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).Google Scholar
,No Child Left Behind Act of 2001, Pub. L. No. 107–110, § 115 Stat. 1425 (2002).
O'Neil, H. F., & Chuang, S.-H. (2005). Self-regulation strategies. In O'Neil, H. F. (Ed.), What works in distance learning. Guidelines (pp. 111–121). Greenwich, CT: Information Age Publishing.Google Scholar
Spiro, R. J., Feltovich, P. J., Jackson, J. J., & Coulson, R. L. (1991). Cognitive flexibility, constructivism, and hypertext: Random access instruction for advanced knowledge acquisition in ill-structured domains. Educational Technology, 31(5), 24–33.
Sweller, J. (1999). Instructional design in technical areas. Camberwell, Australia: ACER Press.Google Scholar
Thorndike, E. L. (1904). An introduction to the theory of mental and social measurement. New York: Science Press.CrossRefGoogle Scholar
