
Designing for dissemination among public health and clinical practitioners in the USA

Published online by Cambridge University Press:  14 December 2023

Thembekile Shato*
Affiliation:
Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA; Department of Surgery (Division of Public Health Sciences), Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
Maura M. Kepper
Affiliation:
Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
Gabriella M. McLoughlin
Affiliation:
College of Public Health, Temple University, Philadelphia, PA, USA; Implementation Science Center for Cancer Control, Brown School and School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
Rachel G. Tabak
Affiliation:
Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA
Russell E. Glasgow
Affiliation:
Department of Family Medicine and ACCORDS Research Center, University of Colorado Anschutz Medical Campus, Aurora, CO, USA
Ross C. Brownson
Affiliation:
Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO, USA; Department of Surgery (Division of Public Health Sciences), Washington University School of Medicine, Washington University in St. Louis, St. Louis, MO, USA
*Corresponding author: T. Shato, PhD; Email: shato@wustl.edu

Abstract

Introduction:

The slow adoption of evidence-based interventions reflects gaps in effective dissemination of research evidence. Existing studies examining designing for dissemination (D4D), a process that ensures interventions and implementation strategies consider adopters’ contexts, have focused primarily on researchers, with limited perspectives of practitioners. To address these gaps, this study examined D4D practice among public health and clinical practitioners in the USA.

Methods:

We conducted a cross-sectional study among public health and primary care practitioners in April to June 2022 (analyzed in July 2022 to December 2022). Both groups were recruited through national-level rosters. The survey was informed by previous D4D studies and pretested using cognitive interviewing.

Results:

Among 577 respondents, 45% were public health and 55% primary care practitioners, with an overall survey response rate of 5.5%. The most commonly ranked sources of research evidence were email announcements for public health practitioners (43.7%) and reading academic journals for clinical practitioners (37.9%). Practitioners used research findings to promote health equity (67%) and evaluate programs/services (66%). A higher proportion of clinical compared to public health practitioners strongly agreed/agreed that within their work setting they had adequate financial resources (36% vs. 23%, p < 0.001) and adequate staffing (36% vs. 24%, p = 0.001) to implement research findings. Only 20% of all practitioners reported having a designated individual or team responsible for finding and disseminating research evidence.

Conclusions:

Addressing both individual and modifiable barriers, including organizational capacity to access and use research evidence, may better align the efforts of researchers with priorities and resources of practitioners.

Type: Research Article
Creative Commons: This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (CC BY) (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright: © The Author(s), 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

Although there has been significant investment in health-related research and the development of interventions, translation into policy and routine practice remains slow [Reference Green, Ottoson, Garcia and Hiatt1,Reference Proctor, Ramsey, Saldana, Maddox, Chambers and Brownson2]. For example, in national studies among US public health departments, an estimated 58%–64% of programs and policies were reported as evidence-based [Reference Brownson, Fielding and Green3]. In another study among public health practitioners in health departments in two US states, an estimated three-quarters (75%) of programs were reported as evidence-based [Reference Jacobs, Clayton and Dove6]. Ebell and colleagues also found that 51% of clinical recommendations for primary care practice were based on patient-oriented evidence from original research, with only 18% based on high-quality evidence [Reference Ebell, Sokol, Lee, Simons and Early7]. These studies reflect persistent gaps and barriers and suggest that evidence-based interventions (EBIs) are not being disseminated effectively [Reference Green, Ottoson, Garcia and Hiatt1,Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8–Reference Green, Ottoson, García, Hiatt and Roditis10].

The lingering research-to-practice gap is attributed to interacting barriers at multiple levels, shaped by political, economic, cultural, scientific, and organizational contexts [Reference Glasgow and Emmons11]. These barriers include the limited relevance of research findings to practice, research findings not being packaged for ease of implementation, limited capacity and resources to disseminate or apply research, lack of organizational and structural supports to enhance access to and adoption of research, and lack of funding [Reference Glasgow and Emmons11–Reference Tabak, Stamatakis, Jacobs and Brownson13].

Dissemination, an active and intentional process of spreading EBIs to target audiences via determined channels using planned strategies [Reference Rabin, Viglione and Brownson14], is a critical step for the effective adoption and implementation of these EBIs [Reference Baumann, Hooley and Kryzer15]. The process of dissemination is influenced by multiple factors related to characteristics of the individual, innovation, organization, and environment [Reference Dobbins, Ciliska, Cockerill, Barnsley and DiCenso16]. However, previous studies have shown that dissemination is too often passive and poorly aligned between those producing research evidence (often researchers) and those applying it (often practitioners) [Reference Brownson, Eyler, Harris, Moore and Tabak17–Reference Brownson, Fielding and Green19], contributing to low uptake of EBIs [Reference Bero, Grilli, Grimshaw, Harvey, Oxman and Thomson20]. Dissemination approaches used by researchers typically include publication in journals and presentations at conferences. Although such practices are important and effective for reaching other researchers, they do not align well with the needs and communication preferences of practitioners, who are the target adopters and implementers of research evidence [Reference Brownson, Eyler, Harris, Moore and Tabak17]. Designing for dissemination (D4D) seeks to address this disconnect by better aligning how researchers produce and communicate research evidence (push) with how practitioners or policymakers receive and use research evidence (pull), and with the structural supports needed for evidence-based practice (capacity) [Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. D4D is a process to ensure that the products of research are designed and developed to match the contextual characteristics (i.e., needs, assets, and resources) of the target audiences, including practitioners, and their settings [Reference Rabin, Viglione and Brownson14,Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson, Colditz and Proctor21].

Previous studies have examined the practice of D4D primarily among researchers. A study among researchers in academic and national research institutions in the USA found that 73% spent less than 10% of their time on dissemination, 53% had a person or team in their unit dedicated to dissemination, and only a third (34%) involved stakeholders in the process [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8]. A more recent study among dissemination and implementation researchers in the USA and Canada found that overall engagement in dissemination-related activities and stakeholder involvement in research were more common [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. However, the dissemination-related activities (e.g., face-to-face meetings) identified as most impactful to practice or policy were used by only 40% of respondents, while activities such as journal publications, conference presentations, and reports to funders were used by the majority (>70%) of respondents [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. Organizational structures and supports, including dissemination being expected by funding agencies and previous work in a practice or policy setting, were identified as significant and among the most important determinants of dissemination efforts by public health researchers to non-research audiences [Reference Tabak, Stamatakis, Jacobs and Brownson13].

Prior D4D studies have focused mostly on researchers (the push side), with few examining the perspectives of practitioners (the pull side). A qualitative study exploring the use of research evidence among public health officials in the USA found that most respondents used research to support grant writing, and that their primary sources of research evidence were professional organizations and government agencies rather than research journals [Reference Narain, Zimmerman and Richards23]. In the same study, respondents also indicated a desire to participate in the planning phase of research projects and recommended simplifying and tailoring research evidence for diverse target audiences to enhance its usefulness [Reference Narain, Zimmerman and Richards23]. Previous studies in Canada found that public health decision-makers in public health departments and community organizations preferred executive summaries of research evidence [Reference Dobbins, Jack, Thomas and Kothari24,Reference Dobbins, Rosenbaum, Plews, Law and Fysh25]. Additionally, organizational characteristics, including the perceived organizational value placed on the use of research evidence and ongoing training, were shown to influence the use of research evidence [Reference Dobbins, Cockerill, Barnsley and Ciliska26]. The divergence in perspectives between researchers and practitioners is also reflected in training programs that have focused primarily on building capacity for implementation among researchers [Reference Brownson, Fielding and Green3,Reference Brownson, Fielding and Green19], and Kwan and colleagues stressed that dissemination strategies have emphasized the push side (researchers’ perspective) far more than the pull side (practitioners’ perspective), warranting greater practitioner engagement in dissemination [Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. Dissemination practices may also have changed since the COVID-19 pandemic, making it important to understand practitioners’ current perspectives and preferences to expand our understanding of D4D.

By ensuring that research and interventions are designed and developed in ways that match the priorities and needs of adopters and implementers, the D4D approach has the potential to improve the translation of evidence into practice [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Kwan, Brownson, Glasgow, Morrato and Luke18]. D4D provides an avenue to identify all key stakeholders and collaboratively develop dissemination and implementation approaches that reflect the experiences of adopters, implementers, and beneficiaries of research evidence, a critical step toward achieving health equity [Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson, Kumanyika, Kreuter and Haire-Joshu27]. Thus, this study aims to examine and describe the practice and patterns of D4D among practitioners in the USA. Information from this study will guide the co-design of dissemination products (e.g., research findings and interventions) and strategies that engage relevant stakeholders to maximize the reach and adoption of research evidence and EBIs.

Methods

Study Design and Participants

This cross-sectional study was conducted across the USA among public health and clinical practitioners in spring 2022. Public health practitioners were defined as those working in local and state health departments. Clinical practitioners were primary care physicians working in the following specialties: pediatrics, family medicine, internal medicine, obstetrics and gynecology, and emergency medicine.

Survey Development and Measures

Survey development was informed by three theoretical frameworks (Diffusion of Innovations, the Knowledge to Action [K2A] Framework, and Reach, Effectiveness, Adoption, Implementation, and Maintenance [RE-AIM]) as well as previous D4D studies [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22,Reference Brownson, Allen and Jacob28–Reference Evenson, Brownson, Satinsky, Eyler and Kohl30]. Diffusion of Innovations helps to explain the spread of new or innovative ideas (e.g., research evidence) and the characteristics of adopters, proposing that adoption of an innovation occurs in several stages, beginning with awareness of the innovation and ending with its continued use [Reference Dearing31,Reference Rogers, Singhal and Quinlan32]. The K2A framework highlights the key elements and outcomes of knowledge (e.g., research evidence) utilization in practice, including the need to identify and understand the multi-level factors (barriers and facilitators) that influence knowledge use [Reference Graham, Logan and Harrison33,Reference Wilson, Brady and Lesesne34]. RE-AIM focuses on implementation outcomes and dimensions that together determine the public health impact of a program or policy (e.g., reach and implementation) [Reference Glasgow, Vogt and Boles35,Reference Glasgow, Harden and Gaglio36]. The linkage between survey content and these theoretical frameworks is summarized in Supplementary material 1. The survey was pretested through cognitive interviewing using the think-aloud technique [Reference Willis37,Reference Beatty and Willis38] among 10 practitioners: public health practitioners (n = 5) and clinical practitioners (n = 5). The practitioners who pretested the survey were recruited through professional networks within the Prevention Research Center at Washington University in St Louis. Responses from the interviews informed revision of the survey, including re-wording and the addition or deletion of survey questions to enhance its relevance and readability. The final survey had 24 questions (see Supplementary material 2).

The survey assessed individual (practitioner) and organizational factors. First, awareness and knowledge of research evidence (informed by Diffusion of Innovations [Reference Dearing31,Reference Rogers, Singhal and Quinlan32]) included items assessing sources of information about research evidence and important characteristics of presenting research evidence, as well as whether an individual or team was designated within the organization or clinic to find and report research evidence. Second, adoption and implementation of research evidence (informed by K2A [Reference Graham, Logan and Harrison33,Reference Wilson, Brady and Lesesne34] and RE-AIM [Reference Glasgow, Vogt and Boles35,Reference Glasgow, Harden and Gaglio36]) included items assessing the frequency of use of research evidence and the barriers to and facilitators of its use. Third, engagement in research included items addressing the ways in which respondents had been involved in research within the past 2 years, including the impact of the COVID-19 pandemic.

Data Collection

Practitioners were recruited online through local- and national-level rosters: Missouri local health departments for local public health practitioners, the National Association of Chronic Disease Directors (NACDD) for state public health practitioners, and the American Medical Association (AMA) for clinical practitioners. Surveys were self-administered online through Qualtrics (Qualtrics, 2020). An initial email invitation and three email reminders, each with a unique link to the survey, were sent to a random sample of respondents from each list. Data were collected from April to June 2022. Upon completion of the survey, participants received a $50 gift card. This study was approved by the Washington University in St Louis Institutional Review Board (IRB No. 202112167).

Analysis

Descriptive analyses summarized the data using frequencies (percentages) for categorical variables and means (standard deviations) for continuous variables. Differences between public health and clinical practitioners were assessed in bivariate analyses using chi-squared tests. All analyses were conducted in SAS v9.4.
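As an illustration of this analytic approach, the minimal sketch below shows a frequency summary and a chi-squared test comparing the two practitioner groups on a categorical item. It is written in Python for convenience (the study's analyses were run in SAS v9.4), and the file and column names are hypothetical placeholders, not the study's actual variables.

```python
# Illustrative sketch only; the study's analyses were conducted in SAS v9.4.
# File and column names ("survey_responses.csv", "practitioner_type",
# "adequate_staffing") are hypothetical placeholders.
import pandas as pd
from scipy.stats import chi2_contingency

# One row per respondent.
df = pd.read_csv("survey_responses.csv")

# Descriptive summary: frequency and percentage for a categorical item.
counts = df["adequate_staffing"].value_counts()
summary = pd.DataFrame({"n": counts, "%": (100 * counts / counts.sum()).round(1)})
print(summary)

# Bivariate subgroup comparison: public health vs. clinical practitioners.
table = pd.crosstab(df["practitioner_type"], df["adequate_staffing"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```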

Results

After deleting 44 invalid responses (e.g., duplicates), 623 respondents remained. The final analytic sample (n = 577) excluded 45 respondents who considered themselves only researchers and 1 respondent who indicated they had retired. The overall response rate for the survey was 5.5% [9.1% (n = 41/451) among local public health practitioners, 22.5% (n = 262/1,162) among state public health practitioners, and 3.3% (n = 320/9,648) among clinical practitioners]. Among the 577 respondents, 55% were clinical practitioners and 45% were public health practitioners (Table 1). State public health practitioners (n = 222, 85%) comprised the majority of public health practitioners. The highest proportion of public health practitioners (81%) worked in state health departments, and the highest proportion of clinical practitioners (64.8%) worked in outpatient health facilities. Most clinical practitioners (95.7%) had a doctoral degree, while most public health practitioners (58%) had a master’s degree. About one-fifth (19%) of respondents considered themselves both practitioners and researchers.

Overall, practitioners most commonly ranked national government agencies (40%), followed by professional associations (27%) and researchers (21%), as their most trusted sources of information for research findings (Table 2), and most commonly got information about research findings from email announcements (31%), reading academic journals (27%), and professional conferences (10%). For public health practitioners, the most trusted sources of information were national government agencies (55%) followed by researchers (21%), and they most often got information about research findings from email announcements (44%) followed by government reports (14%) and reading academic journals (13%). For clinical practitioners, the most trusted sources of information were professional associations (37%) and national government agencies (28%), and their most commonly ranked source of research findings was reading academic journals (38%) followed by email announcements (20%) and professional conferences (13%).
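As a quick arithmetic check using only the numbers reported above, the overall response rate follows from combining the three recruitment strata:

$$\text{overall response rate} = \frac{41 + 262 + 320}{451 + 1{,}162 + 9{,}648} = \frac{623}{11{,}261} \approx 5.5\%$$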

Table 1. Characteristics of survey respondents (n = 577)

1 Ministry of Health.

2 Private practice, locum, VA, Telemedicine, Corporation.

3 Trade/technical/vocational education beyond high school.

4 Includes Nutrition/Dietetics, Dentistry, Public Administration/Social Welfare, Education, History, Law, Economics.

5 Virgin Islands, Micronesia, Palau.

Table 2. Information sources for research findings (n = 577)

1 Includes internet search; news media; UpToDate website.

2 Includes academic journals; PubMed; UpToDate website.

3 Top-ranked sources of information, listed from most common to least common.

Overall, the majority of practitioners reported that when presenting research findings, it is very or extremely important that the information be relevant to the patients or populations served (92%), provide practical advice about implementation (89%), tell a story of how patients or populations served are affected by an issue (69%), provide data on cost-effectiveness (57%), and be delivered by someone known and respected (50%) (Table 3). Compared to clinical practitioners, a significantly higher proportion of public health practitioners indicated that it was extremely or very important for research findings to be relevant to the populations served (95% vs. 88%, p = 0.005), to present practical advice about implementation (95% vs. 84%, p = 0.001), and to tell a story about how an issue affects the populations served (78% vs. 62%, p < 0.001).

Table 3. Important characteristics of presenting research findings

1 Bolded p-value significant at p < 0.05, based on tests of differences between clinical and public health practitioners.

There were significant differences in the uses of research findings between public health and clinical practitioners (Table 4). Public health practitioners were more likely than clinical practitioners to report using research findings every time or almost every time to promote health equity (80% vs. 56%, p < 0.001), evaluate programs/policies/services (83% vs. 52%, p < 0.001), address the spread of inaccurate information (64% vs. 57%, p < 0.001), modify existing programs/services (75% vs. 44%, p < 0.001), develop new programs/services (82% vs. 38%, p < 0.001), discontinue an existing program/service (45% vs. 35%, p < 0.001), and write a grant application (71% vs. 13%, p < 0.001). Lack of time to find research was ranked as the most common barrier by both public health (44%) and clinical practitioners (37%), followed by lack of relevance of research to work needs for public health practitioners (17%) and lack of a brief summary of research findings for clinical practitioners (15%). For both public health and clinical practitioners, easy access to a summary of research findings (30% and 36%, respectively), easy access to research findings or data sources (30% and 26%), and leaders or direct supervisors placing a high priority on research (23% and 13%) were the most important facilitators of using research findings.

Table 4. Uses of research findings (n = 577)

1 Bolded p-value significant at p < 0.05, based on tests of differences between clinical and public health practitioners.

Table 5 presents organizational factors related to the use of research findings. A significantly lower proportion of public health practitioners compared to clinical practitioners strongly agreed or agreed that they had adequate staffing to implement research findings in their work (24% vs. 36%, p = 0.007) and adequate financial resources to implement research findings (23% vs. 36%, p < 0.001). The majority of practitioners (83%) placed a priority on promoting health equity in their work and indicated that it was extremely or very important for their organization/clinic to use research findings. Overall, 20% of respondents had a designated individual or team responsible for finding and disseminating research findings.

Table 5. Organizational setting and supports in using research findings (n = 577)

1 Bolded p-value significant at p < 0.05, based on tests of differences between clinical and public health practitioners.

The majority of practitioners surveyed (57%) indicated that their research involvement (e.g., serving on an advisory committee or as a research participant, disseminating research findings) had stayed the same since COVID-19. Overall, about a third of practitioners reported being involved in collecting data, interpreting data, and disseminating findings through personal or professional networks. In the past 2 years, a significantly higher proportion of public health practitioners than clinical practitioners had been involved in collecting data (51% vs. 24%, p < 0.001), interpreting data (45% vs. 21%, p < 0.001), and disseminating findings through personal or professional networks (46% vs. 20%, p < 0.001). A significantly higher proportion of clinical practitioners compared to public health practitioners had not been involved in research in the past 2 years (45% vs. 20%, p < 0.001).

Discussion

This study addresses a critical gap in D4D by examining the perspectives of practitioners, who are the adopters and implementers of research. The most common sources of research findings were academic journals for clinical practitioners and email announcements for public health practitioners. Email announcements could include newsletters, reports, or web links with information about research findings. The majority of practitioners in the survey, with significantly higher proportions among public health practitioners, frequently used research findings to promote health equity, address the spread of inaccurate information, and develop, modify, and evaluate programs or services. The most common barrier to using research was a lack of time, while easy access to research evidence was the most common facilitator. Only about a third of practitioners had adequate staffing and financial resources to find and implement research in their work.

Consistent with previous findings in both public health practice and health care [Reference Glasgow and Emmons11,Reference Olswang and Prelock12,Reference Narain, Zimmerman and Richards23,Reference Sullivan, Wayne, Patey and Nasr39–Reference Dodson, Baker and Brownson41], practitioners in our study reported time constraints as the most common barrier to finding and using research in practice. For example, in a survey among state-level public health practitioners, respondents commonly cited lack of time as a barrier to evidence-based decision-making in practice [Reference Dodson, Baker and Brownson41]. In qualitative studies among healthcare providers, including pediatric surgeons and allied health clinicians, the already demanding day-to-day workload imposed time constraints not only on patient care but also on prioritizing and using research evidence in practice [Reference Sullivan, Wayne, Patey and Nasr39,Reference Harding, Porter, Horne-Thompson, Donley and Taylor40]. As outlined in a review of factors influencing research translation to practice, many practice settings face competing priorities, tasks, and demands, which may exacerbate the challenge of finding and integrating research [Reference Glasgow and Emmons11]. Narain and colleagues note that within a practice or policy setting, there is a need to identify and act quickly on feasible solutions, which is not always appreciated or accounted for in the process of research production and dissemination [Reference Narain, Zimmerman and Richards23].

The biggest facilitator for using research evidence among practitioners in our study was easy access to research and a summary of research findings. These data are similar to findings from a qualitative study among public health officials who favored summaries and systematic reviews as a way of consuming research evidence [Reference Narain, Zimmerman and Richards23,Reference Dobbins, Cockerill, Barnsley and Ciliska26]. This underscores the need for strategies that better align with practitioners’ preferences and simplify the access, retrieval, and integration of research evidence which may, in turn, help to overcome time constraints within practice settings.

We found major differences in research involvement, a key aspect of D4D, between practitioners in our study and researchers in previous studies. Practitioners were involved mostly in the later stages of the research process: collecting data, interpreting data, and disseminating findings through personal or professional networks. In contrast, previous D4D studies among researchers reported engagement activities concentrated toward the beginning of the research process. In two D4D studies among researchers in the USA and Canada, the most common methods of involvement included developing research advisory committees (66%–72%), engaging persons with diverse experiences, perspectives, and roles in research proposal development and implementation to enhance the relevance of research to practice settings (62%) and to stakeholders (59%), and participation on the research team (63%) [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8,Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. This suggests that there is still a need to identify ways to intentionally engage practitioners from conceptualization and throughout the research process. Such engagement is critical to enhancing the relevance and translation of research to practice. Although half of practitioners indicated that their involvement in research did not change, the discrepancy in our findings may also reflect the challenges of bringing stakeholders together for research during the COVID-19 pandemic.

The second difference related to dissemination activities. We found that less than a third of all practitioners, with a much lower proportion among public health practitioners, most often got research information from reading academic journals. Yet this is the most common approach researchers use to disseminate research evidence [Reference Brownson42]. Knoepke and colleagues found that dissemination and implementation researchers most frequently disseminated their work by publishing in academic journals (88%), delivering conference presentations (86%), and reporting to funders (74%) [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. In the same study, the dissemination-related activities rated as most impactful to practice were used less frequently [Reference Knoepke, Ingle, Matlock, Brownson and Glasgow22]. The third contrast between our findings and those of other studies related to organizational supports. Only 19.8% of practitioners reported having a designated individual or team for finding and disseminating research evidence. This is much lower than reported in a previous study in which over half (53%) of researchers had a person or team within their unit dedicated to dissemination [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8]. This may reflect differences in the capacity and structural supports available and accessible to practitioners compared with researchers, particularly researchers working in institutions with a significant focus on research, as reported in the earlier survey of public health researchers [Reference Brownson, Jacobs, Tabak, Hoehner and Stamatakis8].

Ours is one of several recent studies highlighting the persistent push–pull disconnect between researchers and practitioners [Reference Kwan, Brownson, Glasgow, Morrato and Luke18,Reference Brownson42], which has implications for the translation and integration of research evidence into practice or policy [Reference Narain, Zimmerman and Richards23]. To improve the translation of research into practice, there is an urgent need to better align the approaches, preferences, and priorities of researchers and practitioners in the design, dissemination, and implementation of research. This requires designing research and interventions to take into account the needs and contextual characteristics of practitioners and of the practice settings in which research findings are intended to have impact [Reference Narain, Zimmerman and Richards23]. Bridging the gap and improving the alignment between the research and practice worlds will also require creating and sustaining an enabling environment for effective research engagement as well as for the dissemination, integration, and implementation of scientific evidence. Building capacity and organizational support structures will be essential, as proposed by the push–pull–capacity model [Reference Brownson42]. This model asserts that “for science to affect practice there must be a combination of the rationale for the science (push), a demand for the science by practitioners (pull), and the delivery ability of the public health and healthcare systems (the capacity)” [Reference Brownson42]. However, based on our results and previous studies [Reference Jacobs, Clayton and Dove6,Reference Brownson, Fielding and Green19], gaps remain. We found that only about a third of all practitioners had adequate staffing and financial resources to implement research findings in their work. Similar organizational factors, including expectations by funding agencies and previous work in a practice or policy setting, were shown to be significant determinants of dissemination efforts among researchers [Reference Tabak, Stamatakis, Jacobs and Brownson13]. This suggests that capacity-building strategies, such as training, and strategies that build support structures at the organizational level, such as staffing and funding, in both practice and research settings may contribute to research translation. In building capacity, it will be critical to enhance equity by tailoring strategies to the specific needs of each setting, in consideration of contextual and social determinants and existing health disparities [Reference Brownson, Kumanyika, Kreuter and Haire-Joshu27].

Our findings should be considered in light of a few limitations. We used self-reported survey data, which may be subject to recall or response bias. Depending on their roles within their organizations, clinics, or hospitals, respondents may not have had complete information for survey questions focused on organizational settings. Given the low response rate (common in surveys of practitioners [Reference Robins, Leider and Schaffer43,Reference Cook, Wittich, Daniels, West, Harris and Beebe44]), our findings may be subject to nonresponse bias: those who responded may differ from those who did not respond in their perspectives on D4D. The low response rate also limits the generalizability of the findings. We surveyed primary care physicians, whose responses may not reflect the experiences of other healthcare professionals. Given that this study was the first to assess how clinical and public health practitioners in the USA learn about research evidence, future research is needed to examine D4D practice among other practitioner types within the healthcare professions as well as among policymakers. Additionally, to gain a more comprehensive perspective on D4D and strategies to bridge the research–practice gap, it would have been helpful to have in-depth mixed-methods or qualitative data to supplement our results. Despite these limitations, this is one of the few studies to examine how practitioners access and integrate research evidence, addressing a pertinent gap in our understanding of D4D. In addition, we were able to capture diverse experiences by surveying practitioners in both public health and clinical settings.

Conclusion

This study described how practitioners in the USA receive and use research evidence, which is important for researchers undertaking D4D to reach this audience. We found differences in dissemination activities, research engagement, and organizational supports between practitioners in our study and researchers in previous D4D studies. This provides important insight into where the persistent disconnect between the two worlds exists and offers an opportunity to identify points of intervention. The current study and existing literature suggest the need to identify and develop strategies and tools for more effective D4D, tailored to the needs of those adopting and implementing research evidence.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/cts.2023.695.

Acknowledgments

The authors thank the local and state public health practitioners as well as the clinical practitioners for their contribution and participation in this study. The authors also acknowledge administrative support for this study from Linda Dix and Mary Adams at the Prevention Research Center in St Louis, Brown School at Washington University in St Louis.

Funding statement

This study was supported by the National Cancer Institute of the National Institutes of Health [grant numbers 3P30CA091842-19S4, P50CA244688, and P50CA244431], Prevention Research Center through the Centers for Disease Control and Prevention, and Barnes Jewish Health Foundation. The content of this study is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Competing interests

The authors have no conflicts of interest to declare.

References

1. Green, LW, Ottoson, JM, Garcia, C, Hiatt, RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30(1):151–174.
2. Proctor, E, Ramsey, AT, Saldana, L, Maddox, TM, Chambers, DA, Brownson, RC. FAST: a framework to assess speed of translation of health innovations to practice and policy. Glob Implement Res Appl. 2022;2(2):107–119.
3. Brownson, RC, Fielding, JE, Green, LW. Building capacity for evidence-based public health: reconciling the pulls of practice and the push of research. Annu Rev Public Health. 2018;39(1):27–53.
4. Dreisinger, M, Leet, TL, Baker, EA, Gillespie, KN, Haas, B, Brownson, RC. Improving the public health workforce: evaluation of a training course to enhance evidence-based decision making. J Public Health Manag Pract. 2008;14(2):138–143.
5. Gibbert, WS, Keating, SM, Jacobs, JA, et al. Training the workforce in evidence-based public health: an evaluation of impact among US and international practitioners. Prev Chronic Dis. 2013;10:E148.
6. Jacobs, JA, Clayton, PF, Dove, C, et al. A survey tool for measuring evidence-based decision making capacity in public health agencies. BMC Health Serv Res. 2012;12(1):1–9.
7. Ebell, MH, Sokol, R, Lee, A, Simons, C, Early, J. How good is the evidence to support primary care practice? BMJ Evid Based Med. 2017;22(3):88–92.
8. Brownson, RC, Jacobs, JA, Tabak, RG, Hoehner, CM, Stamatakis, KA. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am J Public Health. 2013;103(9):1693–1699.
9. Glasgow, RE, Marcus, AC, Bull, SS, Wilson, KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(S5):1239–1250.
10. Green, LW, Ottoson, JM, García, C, Hiatt, RA, Roditis, ML. Diffusion theory and knowledge dissemination, utilization and integration. Front Public Health. 2014;3(1):3.
11. Glasgow, RE, Emmons, KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28(1):413–433.
12. Olswang, LB, Prelock, PA. Bridging the gap between research and practice: implementation science. J Speech Lang Hear Res. 2015;58(6):S1818–S1826.
13. Tabak, RG, Stamatakis, KA, Jacobs, JA, Brownson, RC. What predicts dissemination efforts among public health researchers in the United States? Public Health Rep. 2014;129(4):361–368.
14. Rabin, BA, Viglione, C, Brownson, RC. Terminology for dissemination and implementation research. In: Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. Oxford: Oxford University Press, 2023:27–66.
15. Baumann, AA, Hooley, C, Kryzer, E, et al. A scoping review of frameworks in empirical studies and a review of dissemination frameworks. Implement Sci. 2022;17(1):53.
16. Dobbins, M, Ciliska, D, Cockerill, R, Barnsley, J, DiCenso, A. A framework for the dissemination and utilization of research for health-care policy and practice. Online J Knowl Synth Nurs. 2002;9(1):149–160.
17. Brownson, RC, Eyler, AA, Harris, JK, Moore, JB, Tabak, RG. Getting the word out: new approaches for disseminating public health science. J Public Health Manag Pract. 2018;24(2):102–111.
18. Kwan, BM, Brownson, RC, Glasgow, RE, Morrato, EH, Luke, DA. Designing for dissemination and sustainability to promote equitable impacts on health. Annu Rev Public Health. 2022;43(1):331–353.
19. Brownson, RC, Fielding, JE, Green, LW. Building capacity for evidence-based public health: reconciling the pulls of practice and the push of research. Annu Rev Public Health. 2018;39(1):27–53.
20. Bero, LA, Grilli, R, Grimshaw, JM, Harvey, E, Oxman, AD, Thomson, MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317(7156):465–468.
21. Brownson, RC, Colditz, GA, Proctor, EK. Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. Oxford: Oxford University Press; 2023.
22. Knoepke, CE, Ingle, MP, Matlock, DD, Brownson, RC, Glasgow, RE. Dissemination and stakeholder engagement practices among dissemination & implementation scientists: results from an online survey. PLoS One. 2019;14(11):e0216971.
23. Narain, KDC, Zimmerman, FJ, Richards, J, et al. Evidentiary needs of US public health departments with a mission to advance equity and health: a qualitative analysis. BMJ Open. 2018;8(9):e022033.
24. Dobbins, M, Jack, S, Thomas, H, Kothari, A. Public health decision-makers’ informational needs and preferences for receiving research evidence. Worldviews Evid Based Nurs. 2007;4(3):156–163.
25. Dobbins, M, Rosenbaum, P, Plews, N, Law, M, Fysh, A. Information transfer: what do decision makers want and need from researchers? Implement Sci. 2007;2(1):1–12.
26. Dobbins, M, Cockerill, R, Barnsley, J, Ciliska, D. Factors of the innovation, organization, environment, and individual that predict the influence five systematic reviews had on public health decisions. Int J Technol Assess Health Care. 2001;17(4):467–478.
27. Brownson, RC, Kumanyika, SK, Kreuter, MW, Haire-Joshu, D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):1–16.
28. Brownson, RC, Allen, P, Jacob, RR, et al. Controlling chronic diseases through evidence-based decision making: a group-randomized trial. Prev Chronic Dis. 2017;14:E121.
29. Purtle, J, Lê-Scherban, F, Shattuck, P, Proctor, EK, Brownson, RC. An audience research study to disseminate evidence about comprehensive state mental health parity legislation to US state policymakers: protocol. Implement Sci. 2017;12(1):1–13.
30. Evenson, KR, Brownson, RC, Satinsky, SB, Eyler, AA, Kohl III, HW. The US national physical activity plan: dissemination and use by public health practitioners. Am J Prev Med. 2013;44(5):431–438.
31. Dearing, JW. Improving the state of health programming by using diffusion theory. J Health Commun. 2004;9(S1):21–36.
32. Rogers, EM, Singhal, A, Quinlan, MM. Diffusion of innovations. In: Stacks DW, Salwen MB, eds. An Integrated Approach to Communication Theory and Research. 2nd ed. New York, NY: Routledge, 2014:432–448.
33. Graham, ID, Logan, J, Harrison, MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13–24.
34. Wilson, KM, Brady, TJ, Lesesne, C, NCCDPHP Work Group on Translation. An organizing framework for translation in public health: the knowledge to action framework. Prev Chronic Dis. 2011;8(2):A46.
35. Glasgow, RE, Vogt, TM, Boles, SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–1327.
36. Glasgow, RE, Harden, SM, Gaglio, B, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64.
37. Willis, GB. Cognitive Interviewing: A Tool for Improving Questionnaire Design. California: Sage Publications; 2004.
38. Beatty, PC, Willis, GB. Research synthesis: the practice of cognitive interviewing. Public Opin Q. 2007;71(2):287–311.
39. Sullivan, KJ, Wayne, C, Patey, AM, Nasr, A. Barriers and facilitators to the implementation of evidence-based practice by pediatric surgeons. J Pediatr Surg. 2017;52(10):1666–1673.
40. Harding, KE, Porter, J, Horne-Thompson, A, Donley, E, Taylor, NF. Not enough time or a low priority? Barriers to evidence-based practice for allied health clinicians. J Contin Educ Health Prof. 2014;34(4):224–231.
41. Dodson, EA, Baker, EA, Brownson, RC. Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions. J Public Health Manag Pract. 2010;16(6):E9–E15.
42. Brownson, RC. Bridging research and practice to implement strategic public health science. Am J Public Health. 2021;111(8):1389–1391.
43. Robins, M, Leider, JP, Schaffer, K, et al. PH WINS 2021 methodology report. J Public Health Manag Pract. 2023;29(1):S35–S44.
44. Cook, DA, Wittich, CM, Daniels, WL, West, CP, Harris, AM, Beebe, TJ. Incentive and reminder strategies to improve response rate for internet-based physician surveys: a randomized experiment. J Med Internet Res. 2016;18(9):e244.