
Why Assessment Centers Do Not Work the Way They Are Supposed To

Published online by Cambridge University Press:  07 January 2015


Charles E. Lance
Affiliation:
The University of Georgia

Abstract

Assessment centers (ACs) are often designed with the intent of measuring a number of dimensions as they are assessed in various exercises, but after 25 years of research, it is now clear that AC ratings that are completed at the end of each exercise (commonly known as postexercise dimension ratings) substantially reflect the effects of the exercises in which they were completed and not the dimensions they were designed to reflect. This is the crux of the long-standing “construct validity problem” for AC ratings. I review the existing research on AC construct validity and conclude that (a) contrary to previous notions, AC candidate behavior is inherently cross-situationally (i.e., cross-exercise) specific, not cross-situationally consistent as was once thought, (b) assessors rather accurately assess candidate behavior, and (c) these facts should be recognized in the redesign of ACs toward task- or role-based ACs and away from traditional dimension-based ACs.


Type
Focal Article
Copyright
Copyright © Society for Industrial and Organizational Psychology 2008


