
An Inconvenient Truth: Arbitrary Distinctions Between Organizational, Mechanical Turk, and Other Convenience Samples

Published online by Cambridge University Press: 26 March 2015

Richard N. Landers*
Department of Psychology, Old Dominion University

Tara S. Behrend
Department of Organizational Sciences, The George Washington University

*Correspondence concerning this article should be addressed to Richard N. Landers, 250 Mills Godwin Building, Department of Psychology, Old Dominion University, Norfolk, VA 23529; rnlanders@odu.edu

Abstract

Sampling strategy has critical implications for the validity of a researcher's conclusions. Despite this, sampling is frequently neglected in research methods textbooks, during the research design process, and in the reporting of our journals. The lack of guidance on this issue often leads reviewers and journal editors to rely on simple rules of thumb, myth, and tradition for judgments about sampling, which promotes the unnecessary and counterproductive characterization of sampling strategies as universally “good” or “bad.” Such oversimplification, especially by journal editors and reviewers, slows the progress of the social sciences by considering legitimate data sources to be categorically unacceptable. Instead, we argue that sampling is better understood in methodological terms of range restriction and omitted variables bias. This considered approach has far-reaching implications because in industrial–organizational (I-O) psychology, as in most social sciences, virtually all of the samples are convenience samples. Organizational samples are not gold standard research sources; instead, they are merely a specific type of convenience sample with their own positive and negative implications for validity. This fact does not condemn the science of I-O psychology but does highlight the need for more careful consideration of how and when a finding may generalize based on the particular mix of validity-related affordances provided by each sample source that might be used to investigate a particular research question. We call for researchers to explore such considerations cautiously and explicitly both in the publication and in the review of research.
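As an illustrative aside (an addition of ours, not material from the article): the two threats the abstract names, range restriction and omitted variables bias, are easy to demonstrate by simulation. The sketch below is a minimal illustration with arbitrary, assumed parameters and variable names. It shows, first, how sampling only the upper half of a predictor's distribution attenuates an observed correlation and, second, how leaving a common cause out of a regression produces a spurious slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# --- Direct range restriction ---
# x and y correlate at rho = 0.5 in the full population.
rho = 0.5
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# A sample that only admits the top half on x, e.g., job incumbents
# who were hired partly on the basis of x.
kept = x > np.median(x)

print(f"population r(x, y): {np.corrcoef(x, y)[0, 1]:.3f}")              # ~0.50
print(f"restricted r(x, y): {np.corrcoef(x[kept], y[kept])[0, 1]:.3f}")  # ~0.33

# --- Omitted variables bias ---
# z causes both x2 and y2; x2 has no true effect on y2 at all.
z = rng.standard_normal(n)
x2 = 0.7 * z + rng.standard_normal(n)
y2 = 0.8 * z + rng.standard_normal(n)

# Regressing y2 on x2 alone yields a spurious nonzero slope.
slope_omitted = np.polyfit(x2, y2, 1)[0]

# Including z recovers the true (zero) effect of x2.
X = np.column_stack([x2, z, np.ones(n)])
slope_full = np.linalg.lstsq(X, y2, rcond=None)[0][0]

print(f"slope on x2, z omitted:  {slope_omitted:.3f}")  # ~0.38, biased
print(f"slope on x2, z included: {slope_full:.3f}")     # ~0.00, correct
```

The same mechanics apply whether the sample comes from an organization, a student pool, or Mechanical Turk; what differs across sources is which variables are restricted or omitted, which is the abstract's central point.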

Type: Focal Article
Copyright © Society for Industrial and Organizational Psychology 2015

