
Holding the door open for the practitioner community

Published online by Cambridge University Press: 27 January 2023

Jessica J. Sim*
Department of Psychology, Elmhurst University
*Corresponding author. Email: jessica.sim@elmhurst.edu


Type: Commentaries

© The Author(s), 2023. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology

Whereas Guzzo et al. (2022) describe the perils and potential of open science for practitioner–researchers and practice-oriented research in industrial–organizational (I-O) psychology, this commentary will focus on a related—but often neglected—voice in the conversation on open science: practitioners who are consumers of the research. Although there are benefits to open science for practitioners, the one-sided adoption of open science practices may unwittingly exacerbate the gap between scientists (who develop and test theories) and practitioners (who solve problems in the professional world).

The impact of open science on practitioners

The goal of I-O psychology is evidence-based practice; however, I-O consultants and human resource (HR) professionals often question the usefulness of scholarly research (Rynes et al., 2018). Recent debates on the “replication crisis” in psychology have cast further doubt on the credibility of researchers. Growing concerns about research misconduct and questionable research practices (QRPs; e.g., p-hacking, HARKing) have galvanized the open science movement, or “psychology’s renaissance” (Nelson et al., 2018).

The open science movement offers potential benefits to practitioners. Although the adoption of open science practices varies across academic disciplines and even within the subfields of psychology, these practices typically include preregistration, open data, and open materials (Banks et al., 2019). From the perspective of consumers of research, preregistration of design and analysis plans can reduce unintentional QRPs (Kupferschmidt, 2018), thereby increasing the trustworthiness of scientific knowledge. Open datasets that are publicly available for download allow practitioners to verify, question, and build on the results. For instance, consultants can use open data to establish “proof of concept” to get buy-in from stakeholders or to pilot extensions of the work (especially with big data; Guzzo et al., 2022). Open study materials provided at no cost further facilitate direct and conceptual replications, which enhances the internal and external validity of research findings and gives practitioners more confidence in the generalizability of evidence-based recommendations. Open materials can also provide professionals with validated measures to use with organizational samples for benchmarking. Taken together, the adoption of open science practices holds promise for evidence-based management.

Nevertheless, the open science movement is relatively young, and there is ongoing debate about best practices (Banks et al., 2019). Within psychology, many have only a cursory knowledge of the issues. As a result, there is uncertainty about how to educate students on the replication crisis and how to train the next generation of researchers to implement open science practices (Chopik et al., 2018). Of concern, conversations about open science occur primarily among academic communities and within scholarly circles. As such, the primary agents of culture change are authors, journal editors and reviewers, and funding agencies. The academic and scientific communities have yet to bring the diverse voices of HR professionals, I-O consultants, and policy makers into the discussion.

Widening the scientist–practitioner gap

The one-sided discourse on open science is problematic because there is already a disconnect between the knowledge that researchers produce and the knowledge that practitioners implement. Historically, the reasons for the research-practice gap include financial burden (e.g., articles behind paywalls), concerns about the relevance of topics, time constraints, and technical and stylistic complexity (Banks et al., 2016). Very few HR professionals and consultants read research-oriented journals (Rynes et al., 2002). Instead, they draw on other sources, such as professional peers and mainstream media outlets, for information. These sources can misrepresent the replication crisis in psychology and intensify distrust of research.

In the focal article, Guzzo et al. (2022) discuss the difficulties that practitioner–researchers face in the publication process. Managers and HR professionals who are consumers of research face different challenges. While greater transparency increases the rigor of research (e.g., studies that are better planned, fewer false positives), it also increases the complexity of knowledge transfer. The implementation of open science practices increases the length and difficulty of journal articles (e.g., detailed supplemental materials are more common). Greater openness to “messy” findings (Aguinis et al., 2020) and to studies that “don’t work” (e.g., null effects may be more likely to be published under registered reports and results-blind review) may make it difficult to distill what is relevant and useful. Although openness to imperfect findings may reduce the number of effects left in file drawers and be a boon to the study of real-world problems in organizational research, open science initiatives are useful only to the extent that readers have the background to vet the quality of the research. Practitioners who lack the time and training to read dense journal articles and digest methodological details may be more “lost in translation” than ever before (Shapiro et al., 2007). So, while the open science movement has made great strides in increasing access (e.g., availability of articles and data), it still lags in inclusion (e.g., participation is limited to those with expertise in research methods).

Bridging the divide

The point of this commentary is not to reject or cast doubt on the value of open science initiatives. The key tenets of rigor, transparency, and reproducibility are critical for quality science. However, adopting open science initiatives without considering the consumers of research can magnify the research-practice gap. The field should hold the door open for all stakeholders, including the practitioners who are essential to the application of evidence-based I-O psychology. To narrow the research-practice gap in open science, I propose three approaches that draw on Aguinis et al.’s (2020) stages of bridge building: knowledge production, knowledge transfer, and training.

Knowledge production

Besides co-creating knowledge with academics, practitioners and practitioner–researchers should have greater representation on editorial boards and in the peer review process. This can address Guzzo et al.’s (2022) concerns that certain open science practices exclude research conducted in organizational settings. As the field updates its knowledge production process, practitioners involved from the ground up can have a voice in the culture shift and set the norms for the next generation of research (e.g., advocating for flexibility in disclosure and sharing, championing the use of big data, promoting both confirmatory and clearly delineated exploratory research).

Knowledge transfer

A second approach is to renew efforts to translate research for practitioners. To this end, researchers have conducted meta-analyses and systematic reviews, published in practitioner and bridge journals, and leveraged online platforms to deliver research findings that are applicable and useful to practitioners (e.g., IO At Work: https://www.ioatwork.com/; WorkLife with Adam Grant: https://www.ted.com/podcasts/worklife). Beyond summaries of research, research evidence can be aggregated and evaluated based on open science frameworks. For example, the What Works Clearinghouse (WWC; https://ies.ed.gov/ncee/wwc/FWW) is an initiative of the US Department of Education’s Institute of Education Sciences. WWC staff with expertise in education and research methodology identify, review, and consolidate studies of educational interventions. A similar clearinghouse could be created for I-O psychology research, where practice guides and intervention reports are available by topic and, importantly, audited by experts in the field (e.g., for reproducibility of results, replicability of claims across contexts, incidence of QRPs).

Training

A third approach involves training on basic open science practices (e.g., the massive open online course on “Transparent and Open Social Science Research”; https://www.bitss.org/education/mooc-parent-page/). Existing open science training materials (e.g., Chopik et al., 2018) could be tailored to I-O practitioners to promote understanding of open science practices. Training could take the form of practitioner-friendly webinars, white papers, and conference offerings. To reach a broader audience, white papers could leverage the existing collaborations between the Society for Industrial and Organizational Psychology (SIOP) and the Society for Human Resource Management (SHRM), and webinars could be offered for continuing education or professional development credits for SHRM-certified practitioners. Accessible training materials can empower managers and HR professionals to engage with, implement, and improve their evidence-based practice (e.g., by familiarizing practitioners with online repositories and Open Science badges). This would be in line with practitioner interest in SIOP training resources, tools, and tutorials (Solberg & Porr, 2019).

Conclusion

Open science practices can lead to a more positive and productive research culture, but the discourse on open science should not exclude practitioners who apply the research to workplace issues. I-O consultants, HR professionals, and policy makers should be a part of the broader discussion of the future of I-O psychological science; only then can we build a community with shared values related to open science.

Footnotes

* The author is grateful for the insights provided by Carrie Hewitt and Courtney Green.

References

Aguinis, H., Banks, G. C., Rogelberg, S. G., & Cascio, W. F. (2020). Actionable recommendations for narrowing the science-practice gap in open science. Organizational Behavior and Human Decision Processes, 158, 27–35. https://doi.org/10.1016/j.obhdp.2020.02.007
Banks, G. C., Pollack, J. M., Bochantin, J. E., Kirkman, B. L., Whelpley, C. E., & O’Boyle, E. H. (2016). Management’s science-practice gap: A grand challenge for all stakeholders. Academy of Management Journal, 59(6), 2205–2231. https://doi.org/10.5465/amj.2015.0728
Banks, G. C., Field, J. G., Oswald, F. L., O’Boyle, E. H., Landis, R. S., Rupp, D. E., & Rogelberg, S. G. (2019). Answers to 18 questions about open science practices. Journal of Business and Psychology, 34, 257–270. https://doi.org/10.1007/s10869-018-9547-8
Chopik, W. J., Bremner, R. H., Defever, A. M., & Keller, V. N. (2018). How (and whether) to teach undergraduates about the replication crisis in psychological science. Teaching of Psychology, 45(2), 158–163. https://doi.org/10.1177/0098628318762900
Guzzo, R. A., Schneider, B., & Nalbantian, H. R. (2022). Open science, closed doors: The perils and potential of open science for research in practice. Industrial and Organizational Psychology: Perspectives on Science and Practice, 15, 495–515.
Kupferschmidt, K. (2018). More and more scientists are preregistering their studies. Should you? https://www.science.org/content/article/more-and-more-scientists-are-preregistering-their-studies-should-you
Nelson, L. D., Simmons, J., & Simonsohn, U. (2018). Psychology’s renaissance. Annual Review of Psychology, 69, 511–534. https://doi.org/10.1146/annurev-psych-122216-011836
Rynes, S. L., Colbert, A. E., & Brown, K. G. (2002). HR professionals’ beliefs about effective human resource practices: Correspondence between research and practice. Human Resource Management, 41(2), 149–174. https://doi.org/10.1002/hrm.10029
Rynes, S. L., Colbert, A. E., & O’Boyle, E. H. (2018). When the “best available evidence” doesn’t win: How doubts about science and scientists threaten the future of evidence-based management. Journal of Management, 44(8), 2995–3010. https://doi.org/10.1177/0149206318796934
Shapiro, D. L., Kirkman, B. L., & Courtney, H. G. (2007). Perceived causes and solutions of the translation problem in management research. Academy of Management Journal, 50(2), 249–266. https://doi.org/10.5465/amj.2007.24634433
Solberg, E., & Porr, B. (2019). What do practitioners want? Practitioner survey results revealed! https://www.siop.org/Research-Publications/Items-of-Interest/ArticleID/3179/ArtMID/19366