
Validation Is Like Motor Oil: Synthetic Is Better

Published online by Cambridge University Press: 07 January 2015

Jeff W. Johnson* (Personnel Decisions Research Institutes)
Piers Steel (University of Calgary)
Charles A. Scherbaum (Baruch College)
Calvin C. Hoffman (Los Angeles County Sheriff's Department and Alliant University)
P. Richard Jeanneret (Valtera Corporation)
Jeff Foster (Hogan Assessment Systems)

Abstract

Although synthetic validation has long been suggested as a practical and defensible approach to establishing validity evidence, synthetic validation techniques are infrequently used and not well understood by the practitioners and researchers they could most benefit. Therefore, we describe the assumptions, origins, and methods for establishing validity evidence of the two primary types of synthetic validation techniques: (a) job component validity and (b) job requirements matrix. We then present the case for synthetic validation as the best approach for many situations and address the potential limitations of synthetic validation. We conclude by proposing the development of a comprehensive database to build prediction equations for use in synthetic validation of jobs across the U.S. economy and reviewing potential obstacles to the creation of such a database. We maintain that synthetic validation is a practically useful methodology that has great potential to advance the science and practice of industrial and organizational psychology.
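The job component validity approach summarized above combines predictor–component validity evidence established in prior research according to the importance of each job component identified through job analysis. As a rough illustrative sketch only, not the authors' procedure as developed in the full article, the following Python example computes a simple importance-weighted synthetic validity estimate; all component names, weights, and validity values are hypothetical.

    # Minimal illustrative sketch of a job-component-style synthetic validity
    # estimate. Names, weights, and validity values below are hypothetical.

    # Importance weights for job components from a (hypothetical) job analysis,
    # normalized to sum to 1.
    component_weights = {
        "data_analysis": 0.5,
        "customer_interaction": 0.3,
        "equipment_operation": 0.2,
    }

    # Assumed predictor-by-component validity estimates (e.g., transported from
    # prior research or meta-analysis); values are invented for illustration.
    component_validities = {
        "data_analysis": {"cognitive_ability": 0.45, "conscientiousness": 0.20},
        "customer_interaction": {"cognitive_ability": 0.25, "conscientiousness": 0.30},
        "equipment_operation": {"cognitive_ability": 0.35, "conscientiousness": 0.15},
    }

    def synthetic_validity(weights, validities, predictor):
        """Importance-weighted average of component-level validities for one predictor."""
        return sum(w * validities[c][predictor] for c, w in weights.items())

    for predictor in ("cognitive_ability", "conscientiousness"):
        est = synthetic_validity(component_weights, component_validities, predictor)
        print(f"{predictor}: synthetic validity estimate = {est:.3f}")

In practice, synthetic validation methods also address issues such as component intercorrelations, criterion weighting, and corrections for artifacts, which this toy example ignores.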

Type: Focal Article
Copyright © Society for Industrial and Organizational Psychology 2010


Footnotes

* Personnel Decisions Research Institutes, 650 3rd Avenue S., Suite 1350, Minneapolis, MN 55402
** Haskayne School of Business, University of Calgary
*** Department of Psychology, Baruch College
**** Los Angeles County Sheriff's Department and Alliant University
***** Valtera Corporation
****** Hogan Assessment Systems
