
On the Mystery (or Myth) of Challenging Principles and Methods of Validity Generalization (VG) Based on Fragmentary Knowledge and Improper or Outdated Practices of VG

Published online by Cambridge University Press: 30 August 2017

In-Sue Oh*
Affiliation: Department of Human Resource Management, Fox School of Business, Temple University
Philip L. Roth*
Affiliation: Department of Management, College of Business, Clemson University
*Correspondence concerning this article should be addressed to In-Sue Oh, Department of Human Resource Management, Fox School of Business, Temple University, 1801 Liacouras Walk, Philadelphia, PA 19122-6083. E-mail: insue.oh@temple.edu
Philip L. Roth, Department of Management, College of Business, Clemson University, Clemson, SC 29634-1305. E-mail: ROTHP@clemson.edu

Extract

In their focal article, Tett, Hundley, and Christiansen (2017) stated in multiple places that if there are good reasons to expect moderating effects, the application of an overall validity generalization (VG) analysis (meta-analysis) is “moot,” “irrelevant,” “minimally useful,” and “a misrepresentation of the data.” They supported this position with multiple examples, most notably a hypothetical example concerning the relationship between agreeableness and job performance. Below, we discuss four noteworthy problems with these statements (and with similar statements elsewhere in Tett et al.’s article) and their underlying assumptions, along with alternative perspectives.
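To make concrete what an “overall VG analysis” computes, the following minimal Python sketch illustrates a “bare-bones” meta-analysis of the kind described in Hunter and Schmidt (2004): a sample-size-weighted mean validity and a decomposition of between-study variance into sampling error versus residual (potentially moderator-driven) variance. The function name and the study data are hypothetical, and a full VG analysis would additionally correct for artifacts such as measurement error and range restriction.

```python
from math import sqrt

def bare_bones_vg(rs, ns):
    """Bare-bones Hunter-Schmidt meta-analysis (no artifact corrections).

    rs: observed validity coefficients from k local studies
    ns: matching sample sizes
    """
    k = len(rs)
    total_n = sum(ns)
    # Sample-size-weighted mean observed correlation.
    r_bar = sum(n * r for r, n in zip(rs, ns)) / total_n
    # Sample-size-weighted observed variance across studies.
    var_obs = sum(n * (r - r_bar) ** 2 for r, n in zip(rs, ns)) / total_n
    # Expected sampling-error variance, using the mean sample size.
    n_bar = total_n / k
    var_err = (1 - r_bar ** 2) ** 2 / (n_bar - 1)
    # Residual variance: heterogeneity that sampling error cannot explain.
    var_res = max(0.0, var_obs - var_err)
    # Ratios above 100% can occur; they indicate that sampling error alone
    # more than accounts for the observed between-study variance.
    pct_err = var_err / var_obs if var_obs > 0 else 1.0
    return r_bar, sqrt(var_res), pct_err

# Hypothetical validities from five local studies of the same predictor.
r_bar, sd_res, pct_err = bare_bones_vg(
    rs=[0.18, 0.25, 0.31, 0.22, 0.12], ns=[68, 120, 95, 210, 55])
print(f"mean r = {r_bar:.3f}, residual SD = {sd_res:.3f}, "
      f"{pct_err:.0%} of observed variance due to sampling error")
```

Under Hunter and Schmidt’s well-known “75% rule,” when sampling error accounts for at least 75% of the observed variance across studies, the remaining heterogeneity is attributed to uncorrected artifacts and validity is said to generalize; a large residual SD, by contrast, signals substantively meaningful moderators.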

Type
Commentaries
Copyright
Copyright © Society for Industrial and Organizational Psychology 2017 


Footnotes

We thank Chris Berry, Ernest O'Boyle, and Frank Schmidt for their helpful comments on an earlier version of this commentary.

References

Barrick, M. R., & Mount, M. K. (1991). The Big Five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1–26.
Cortina, J. M. (2003). Apples and oranges (and pears, oh my!): The search for moderators in meta-analysis. Organizational Research Methods, 6, 415–439.
Cortina, J. M., Aguinis, H., & DeShon, R. P. (2017). Twilight of dawn or of evening? A century of research methods in the Journal of Applied Psychology. Journal of Applied Psychology, 102, 274–290.
Field, A. P. (2001). Meta-analysis of correlation coefficients: A Monte Carlo comparison of fixed- and random-effects methods. Psychological Methods, 6, 161–180.
Gaugler, B. B., Rosenthal, D. B., Thornton, G. C., & Bentson, C. (1987). Meta-analysis of assessment center validity. Journal of Applied Psychology, 72(3), 493–511.
Huffcutt, A. I., & Arthur, W. (1994). Hunter and Hunter (1984) revisited: Interview validity for entry-level jobs. Journal of Applied Psychology, 79, 184–190.
Hunter, J. E., & Hunter, R. F. (1984). Validity and utility of alternative predictors of job performance. Psychological Bulletin, 96(1), 72–98.
Hunter, J. E., & Schmidt, F. L. (1990). Methods of meta-analysis: Correcting error and bias in research findings (1st ed.). Thousand Oaks, CA: Sage.
Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings (2nd ed.). Thousand Oaks, CA: Sage.
James, L. R., & McIntyre, H. H. (2010). Situational specificity and validity generalization. In J. L. Farr & N. T. Tippins (Eds.), Handbook of employee selection (pp. 909–920). New York, NY: Routledge.
Muchinsky, P. M., & Raines, J. M. (2013). The overgeneralized validity of validity generalization. Journal of Organizational Behavior, 34, 1057–1060.
Pearlman, K., Schmidt, F. L., & Hunter, J. E. (1980). Validity generalization results for tests used to predict job proficiency and training success in clerical occupations. Journal of Applied Psychology, 65, 373–406.
Roth, P. L., Bevier, C. A., Switzer, F. S., & Schippmann, J. S. (1996). Meta-analyzing the relationship between grades and job performance. Journal of Applied Psychology, 81, 548–556.
Roth, P. L., Bobko, P., & McFarland, L. A. (2005). A meta-analysis of work sample test validity: Updating and integrating some classic literature. Personnel Psychology, 58(4), 1009–1037.
Salgado, J. F., Anderson, N., Moscoso, S., Bertua, C., & De Fruyt, F. (2003). International validity generalization of GMA and cognitive abilities: A European Community meta-analysis. Personnel Psychology, 56, 573–605.
Schmidt, F. L. (2008). Meta-analysis: A constantly evolving research integration tool. Organizational Research Methods, 11, 96–113.
Schmidt, F. L., & Hunter, J. E. (1977). Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, 62, 529–540.
Schmidt, F. L., & Hunter, J. E. (2015). Methods of meta-analysis: Correcting error and bias in research findings (3rd ed.). Thousand Oaks, CA: Sage.
Schmidt, F. L., Oh, I.-S., & Hayes, T. (2009). Fixed versus random effects models in meta-analysis: Model properties and an empirical comparison of differences in results. British Journal of Mathematical and Statistical Psychology, 62, 97–128.
Schulze, R. (2004). Meta-analysis: A comparison of approaches. Cambridge, MA: Hogrefe & Huber.
Tett, R. P., Hundley, N. A., & Christiansen, N. D. (2017). Meta-analysis and the myth of generalizability. Industrial and Organizational Psychology: Perspectives on Science and Practice, 10(3), 421–456.