
10 - Issues of Structure and Issues of Scale in Assessment from a Situative/Sociocultural Perspective

Published online by Cambridge University Press:  05 June 2012

Robert J. Mislevy, Professor of Measurement and Statistics, University of Maryland, College Park
Pamela A. Moss, University of Michigan, Ann Arbor
Diana C. Pullin, Boston College, Massachusetts
James Paul Gee, University of Wisconsin, Madison
Edward H. Haertel, Stanford University, California
Lauren Jones Young, The Spencer Foundation, Chicago

Summary

INTRODUCTION

A situative/sociocultural (S/SC) perspective “views knowledge as distributed among people and their environments, including the objects, artifacts, tools, books, and communities of which they are a part. Analyses of activity in this perspective focus on processes of interaction of individuals with other people and with physical and technological systems” (Greeno, Collins, and Resnick 1997). Accordingly, “a situated view of assessment emphasizes questions about the quality of student participation in activities of inquiry and sense making, and considers assessment practices as integral components of the general systems of activity in which they occur” (p. 37). Research on school learning from the S/SC perspective “incorporates explanatory concepts that have proved useful in fields such as ethnography and sociocultural psychology to study collaborative work, …mutual understanding in conversation, and other characteristics of interaction that are relevant to the functional success of the participants' activities” (p. 7). In such analyses, attention focuses on the patterns of interaction that occur in detailed, particular situations; the analyses yield “thick” descriptions of the activities and often produce voluminous data. Studies at this level of detail are essential for understanding the conditions and interactions through which students learn; that is, the “opportunities to learn” that particular circumstances afford particular students in light of their particular personal and educational histories of experience.

Yet no practical assessment at the level of the classroom, let alone a school or a program, can demand scores of hours of video per student, all analyzed by a team of graduate students, each producing a multipage idiographic report.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2008


References

American Council on the Teaching of Foreign Languages. 1989. ACTFL proficiency guidelines. Yonkers: Author.
Bachman, L. F. 2005. Building and supporting a case for test use. Language Assessment Quarterly 2: 1–34.
Bachman, L. F., and A. S. Palmer. 1996. Language testing in practice. Oxford: Oxford University Press.
Bormuth, J. R. 1970. On the theory of achievement test items. Chicago: University of Chicago Press.
Chi, M. T. H., R. Glaser, and M. Farr, eds. 1988. The nature of expertise. Mahwah, N.J.: Erlbaum.
Collins, A., J. S. Brown, and S. E. Newman. 1989. Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. In Knowing, learning, and instruction: Essays in honor of Robert Glaser, edited by L. B. Resnick, 453–94. Hillsdale, N.J.: Lawrence Erlbaum Associates.
Cronbach, L. J., G. C. Gleser, H. Nanda, and N. Rajaratnam. 1972. The dependability of behavioral measurements: Theory of generalizability for scores and profiles. New York: Wiley.
Douglas, D. 2000. Assessing language for specific purposes. Cambridge: Cambridge University Press.
Enright, M. K., W. Grabe, K. Koda, P. Mosenthal, P. Mulcahy, and M. Schedl. 2000. TOEFL 2000 reading framework: A working paper (TOEFL Monograph Series MS-17). Princeton: Educational Testing Service.
Ericsson, K. A. 1996. The acquisition of expert performance: An introduction to some of the issues. In The road to excellence: The acquisition of expert performance in the arts and sciences, sports, and games, edited by K. A. Ericsson. Mahwah, N.J.: Lawrence Erlbaum Associates.
Gasser, H. 1955. How to draw and paint. New York: Dell.
Gitomer, D. H., L. S. Steinberg, and R. J. Mislevy. 1995. Diagnostic assessment of trouble-shooting skill in an intelligent tutoring system. In Cognitively diagnostic assessment, edited by P. Nichols, S. Chipman, and R. Brennan, 73–101. Hillsdale, N.J.: Erlbaum.
Greeno, J. G. 1983. Conceptual entities. In Mental models, edited by D. Gentner and A. L. Stevens. Hillsdale, N.J.: Lawrence Erlbaum Associates.
Greeno, J. G., A. M. Collins, and L. B. Resnick. 1997. Cognition and learning. In Handbook of educational psychology, edited by D. Berliner and R. Calfee, 15–47. New York: Simon and Schuster Macmillan.
Greeno, J. G., P. D. Pearson, and A. H. Schoenfeld. 1997. Implications for the National Assessment of Educational Progress of research on learning and cognition. In Assessment in transition: Monitoring the nation's educational progress, background studies, edited by R. Linn, R. Glaser, and G. Bohrnstedt, 151–215. Stanford: The National Academy of Education.
Holland, P. W., and H. Wainer. 1993. Differential item functioning. Hillsdale, N.J.: Erlbaum.
Kadane, J. B., and D. A. Schum. 1996. A probabilistic analysis of the Sacco and Vanzetti evidence. New York: Wiley.
Lave, J. 1988. Cognition in practice. New York: Cambridge University Press.
Linn, R. L. 1993. Linking results of distinct assessments. Applied Measurement in Education 6: 83–102.
Messick, S. 1989. Validity. In Educational measurement, 3rd ed., edited by R. L. Linn, 13–103. New York: American Council on Education/Macmillan.
Messick, S. 1994. The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher 23: 13–23.
Mislevy, R. J. 1994. Evidence and inference in educational assessment. Psychometrika 59: 439–83.
Mislevy, R. J. 2003. Substance and structure in assessment arguments. Law, Probability, and Risk 2: 237–58.
Mislevy, R. J., and D. H. Gitomer. 1996. The role of probability-based inference in an intelligent tutoring system. User-Modeling and User-Adapted Interaction 5: 253–82.
Mislevy, R. J., L. Steinberg, and R. Almond. 2003. On the structure of educational assessment. Measurement: Interdisciplinary Research and Perspectives 1: 3–62.
Mitchell, R. 1992. Testing for learning: How new approaches to evaluation can improve American schools. New York: The Free Press.
Myford, C. M., and R. J. Mislevy. 1996. Monitoring and improving a portfolio assessment system. CSE Technical Report 402. Los Angeles: National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
Newell, A., and H. A. Simon. 1972. Human problem solving. Englewood Cliffs, N.J.: Prentice-Hall.
Resnick, L. B. 1997. Student performance portfolios. In Psychology and educational practice, edited by H. J. Walberg and G. D. Haertel, 158–75. Berkeley: McCutchan.
Riconscente, M., R. J. Mislevy, and L. Hamel. 2005. An introduction to PADI task templates. PADI Technical Report #3. Menlo Park, Calif.: SRI International.
Salthouse, T. A. 1991. Expertise as the circumvention of human processing limitations. In Toward a general theory of expertise, edited by K. A. Ericsson and J. Smith, 286–300. Cambridge: Cambridge University Press.
Schum, D. A. 1994. The evidential foundations of probabilistic reasoning. New York: Wiley.
Schutz, A., and P. A. Moss. 2004. Reasonable decisions in portfolio assessment: Evaluating complex evidence of teaching. Education Policy Analysis Archives 12. http://epaa.asu.edu/epaa/v12n33/.
Shafer, G. 1976. A mathematical theory of evidence. Princeton: Princeton University Press.
Steinberg, L. S., and D. H. Gitomer. 1996. Intelligent tutoring and assessment built on an understanding of a technical problem-solving task. Instructional Science 24: 223–58.
Stewart, J., and R. Hafner. 1994. Research on problem solving: Genetics. In Handbook of research on science teaching and learning, edited by D. Gabel, 284–300. New York: Macmillan.
Toulmin, S. E. 1958. The uses of argument. Cambridge: Cambridge University Press.
Wolf, D., J. Bixby, J. Glenn, and H. Gardner. 1991. To use their minds well: Investigating new forms of student assessment. In Review of Research in Education, vol. 17, edited by G. Grant, 31–74. Washington, D.C.: American Educational Research Association.
