
Strengthening the clinical research workforce through a competency-based orientation program: Process outcomes and lessons learned across three academic health institutions

Published online by Cambridge University Press: 13 September 2021

Leslie A. Musshafen*, Office of the Associate Vice Chancellor for Research, University of Mississippi Medical Center, Jackson, MS, USA

Jennifer M. Poger, Department of Pediatrics, Penn State College of Medicine, Hershey, PA, USA

William R. Simmons, Center for Clinical and Translational Science, Mayo Clinic, Rochester, MN, USA

Alicia M. Hoke, Department of Medicine, Penn State College of Medicine, Hershey, PA, USA

Laura N. Hanson, Center for Clinical and Translational Science, Mayo Clinic, Rochester, MN, USA

Whitney W. Bondurant, Clinical Trials Office, University of Mississippi Medical Center, Jackson, MS, USA

Jody R. McCullough, Department of Pediatrics, Penn State College of Medicine, Hershey, PA, USA

Jennifer L. Kraschnewski, Department of Pediatrics and Department of Medicine, Penn State College of Medicine, Hershey, PA, USA

*Address for correspondence: L. A. Musshafen, MBA, CRA, CPRA, Executive Director-Research, University of Mississippi Medical Center, 2500 N State Street, Jackson, MS 39216, USA. Email: lmusshafen@umc.edu

Abstract

Clinical research coordinators are increasingly tasked with a multitude of complex study activities critical to scientific rigor and participant safety, though more than half report not receiving appropriate training. To determine the reproducibility of an established clinical research workforce orientation program, collaborative partners across Clinical and Translational Science Award institutions seeded core principles and structure from Mayo Clinic’s Clinical Research Orientation program within Penn State University and the University of Mississippi Medical Center from 2019 to 2021. Training concepts were established and tied to those domains deemed critical by the Joint Task Force for Clinical Trial Competency for the conduct of clinical research at the highest levels of safety and quality possible. Significant knowledge and confidence gains and high overall program satisfaction were reported across participants and partner sites, despite programs being required to pivot from traditional, in-person formats to entirely virtual platforms as a result of the COVID-19 pandemic. The successful standardization and translation of foundational clinical research training has important efficiency and efficacy implications for research enterprises across the USA.

Type
Special Communications
Creative Commons
CC BY-NC-SA
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike licence (https://creativecommons.org/licenses/by-nc-sa/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the same Creative Commons licence is included and the original work is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use.
Copyright
© The Author(s), 2021. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

Well-trained research support staff are a key pillar of research infrastructure in promoting the highest quality translational and clinical science. Though overall responsibility for the conduct of a study lies with the principal investigator, clinical research coordinators (CRCs) are increasingly delegated investigator responsibilities and complex study activities [1–3]. While core activities traditionally centered on the recruitment, consenting, and care coordination of study participants, the CRC role has evolved to routinely include submission and maintenance of regulatory documents, preparation and management of study budgets, collection and processing of specimens, and serving as liaison among healthcare providers, other study personnel, study participants, regulatory bodies, and sponsors alike [1].

Given the vital contributions of CRCs, the Clinical and Translational Science Award (CTSA) Research Coordinator Taskforce was created with a focus on enhancing CRC support and training [1]. The Taskforce surveyed all active CTSAs in 2008 and found that less than half (45%) of responding CRCs reported receiving appropriate training on all of the tasks their position required [1]. Further, Inspectional Observation Summaries by the US Food & Drug Administration in 2017 found that the most frequently cited audit deficiency was the failure to establish, maintain, and follow standard operating procedures [4]. A lack of proper training and standardized processes to carry out research best practices, coupled with the expanded responsibilities of CRCs, creates barriers to conducting efficacious and ethically sound research [5].

As such, the Taskforce recommended that institutions conduct gap analyses of their training programs to determine areas of weakness in CRC training, including core competencies and career development [1]. Follow-up surveys continue to expose gaps in training despite evidence that minimal training and experience are correlated with lower self-reported competency among CRCs to carry out the myriad research responsibilities with which they are tasked [6]. This further increases the risk of research staff turnover and burnout [7].

Model: Mayo Clinic Clinical Research Orientation Program

Identifying and learning from innovators of successful and sustainable CRC training programs is an initial step toward best practice implementation at other institutions. Mayo Clinic was one of the first 12 institutions to receive a CTSA from the National Institutes of Health. The resulting Mayo Clinic Center for Clinical and Translational Science (CCaTS) has created, sustained, and adapted educational and operational resources to train and support the coordination of clinical research studies since 2015. By leveraging institutional resources as well as extramural funding, CCaTS has created a centralized research infrastructure that avoids costly duplication and inefficient silos.

CCaTS established the Mayo Clinic Clinical Research Orientation program (MCCRO) to provide new research staff with the foundational knowledge needed to safely and efficiently conduct clinical trials of the highest quality, as well as to provide continuing professional development opportunities for faculty, trainees, and other research team members throughout their careers [8]. The MCCRO program serves as the standard onboarding program for all clinical research support staff at the Mayo Clinic, focusing on key concepts and processes throughout all stages of the clinical research lifecycle, including study development (investigator-initiated studies) or assessment (externally initiated studies), startup, conduct, closure, and results dissemination [8]. Subsets of tasks at each stage are detailed according to research regulations and internal processes at the Mayo Clinic (Fig. 1).

Fig. 1. Visual overview of clinical research processes at the Mayo Clinic.

Mayo Clinic Clinical Research Orientation Format

Informed by the Morrison, Ross, and Kemp model of instructional design [9], the MCCRO program relies on a blended learning/flipped classroom approach to deliver material and promote knowledge retention through completion of 20 h of online training modules, 20 h of in-person classroom instruction led by subject-matter experts (SMEs), and 40 h of hands-on, mentor-guided work in the participant’s assigned unit (Supplemental Fig. 1). Upon completion of the program, participants are expected to exhibit competency to: recognize the basic principles of human subjects protection and the significance of their role in Good Clinical Practice; identify key activities involved in coordinating a research protocol; and build a list of professional contacts and resources to support the coordination of a clinical study. Participant satisfaction surveys of program offerings, an exam required of all CRCs at the end of their first year of employment, and input from program mentors are used as program evaluation checkpoints and feedback loops. Nearly 1300 professionals have completed the MCCRO program to date with overwhelmingly positive feedback and improved operational efficiencies, making it a well-suited training program model for other CTSAs across the country.

Existing Clinical Research Training Landscapes at Penn State University and the University of Mississippi Medical Center

Penn State University College of Medicine (PSU) and the University of Mississippi Medical Center (UMMC) were both separately grappling with developing sustainable clinical research training programs in 2018. PSU conducted a needs assessment to gain a better understanding of the education, experience, research interests, and training needs of those on their campus. Roughly 80% of the 130 respondents who self-identified as PSU research support personnel expressed interest in training related to study development/management. The majority (75%) reported not receiving a formal research orientation upon hire, highlighting a stark gap between needs and resources (email communication, June 2021).

At UMMC, the institution’s clinical research portfolio was rapidly expanding with the construction of a dedicated Clinical Research and Trials Unit within the main hospital. Historically, individual departments had been charged with training clinical research support professionals in their area, resulting in a disjointed and incongruous training schema across the institution. Gaps in training were magnified in areas with less robust research infrastructure. The need for a standardized, institution-wide training program as a critical component to the institution’s clinical research infrastructure and continued success became apparent to institutional research leaders.

Objectives

Representatives from each site recognized an opportunity for collaboration and set out in 2019 to determine the reproducibility of the MCCRO program through implementation at PSU and UMMC. Each institution offered an opportunity to explore programmatic adaptations based on size, research portfolio, and unique training needs. Implementation of the MCCRO program at the partner sites, PSU and UMMC, and the subsequent evaluation of each are reported here.

Methods

Establishing a Collaborative Plan

With funding through a CTSA administrative supplement (3UL1TR002014-03S2) and support from senior leadership across sites, a core team was defined to include experienced personnel from central research offices at each organization (Table 1). A kickoff visit to Mayo Clinic and participation in subsequent virtual orientation sessions allowed core team members from PSU and UMMC the opportunity to engage in the MCCRO program first-hand and meet with program staff. Existing program structures, training content, schedules, assessments, and related templates were shared among partners.

Table 1. Funded full-time equivalents (FTEs) dedicated to implementation and management of the standardized MCCRO program at each site May 2019–January 2021

MCCRO, Mayo Clinic Clinical Research Orientation; PSU, Penn State University; UMMC, University of Mississippi Medical Center.

PSU and UMMC then focused efforts on identifying both commonalities between the sites, as well as unique program adaptations that would be needed at their respective institutions. Subsequent core team visits to PSU and UMMC were organized to gain a better understanding of each site’s organizational culture, structure, and workforce. Monthly virtual meetings were held with all partner sites to review implementation efforts and troubleshoot challenges.

Implementation of a Standardized Orientation Program

Common program components implemented at partner sites

The Core Competency Framework (CCF) established by the Joint Task Force for Clinical Trial Competency asserts that clinical research professionals should exhibit competency in up to eight domains to safely and ethically carry out their roles [10]. All partner sites agreed at the outset of the collaboration that the CCF would serve as the foundation to which training objectives and participant knowledge assessments would be tied. Orientation sessions at both partner sites were developed to address topics related to study startup (i.e., activation, setup), study conduct (i.e., recruitment, fiscal management, regulatory oversight, reporting), and study closure (i.e., data management and dissemination), as guided by the CCF [10]. In doing so, training was harmonized across locations.

Pre-/posttest (PPT) assessments were developed at partner sites based on their respective orientation curricula to evaluate participants’ knowledge of key clinical research competencies both before and after orientation. CCF competencies across nearly all domains were assessed, though the specific CCF sublevels tied to assessment questions varied by site. A total of six CCF competency sublevels across three domains − Ethical and Participant Safety Considerations, Clinical Study Operations (Good Clinical Practices), and Data Management and Informatics − were addressed across all sites to allow for comparisons of knowledge change across institutions. PSU revised its PPT assessment over the duration of this collaboration as a function of curriculum changes and participant feedback; however, cross-site competencies remained the same. Composite variables were created for questions that addressed the same competency. McNemar’s test was employed to compare individual knowledge scores from pretest to posttest. The Wilcoxon signed-rank test was used to compare median percentage-correct scores at pretest with those at posttest, and also to evaluate differences between sites.
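As an illustration of these paired comparisons, the sketch below uses Python with scipy and statsmodels; the column names and data are hypothetical stand-ins, not the study datasets or the authors’ analysis code.

```python
# A minimal sketch of the paired knowledge analyses, assuming hypothetical
# column names and toy data (not the study datasets).
import pandas as pd
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

# One row per participant: paired pre/post correctness on a composite item
# (1 = correct, 0 = incorrect) and percentage-correct totals.
df = pd.DataFrame({
    "pre_correct":  [0, 0, 1, 0, 1, 0, 1, 0],
    "post_correct": [1, 1, 1, 0, 1, 1, 1, 1],
    "pre_pct":  [40, 55, 60, 45, 70, 50, 65, 35],
    "post_pct": [75, 80, 85, 60, 90, 70, 85, 65],
})

# McNemar's test on the 2x2 table of paired pre/post item correctness.
table = pd.crosstab(df["pre_correct"], df["post_correct"])
print(mcnemar(table, exact=True))

# Wilcoxon signed-rank test on paired percentage-correct scores.
print(wilcoxon(df["pre_pct"], df["post_pct"]))
```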

Both prior to and immediately following orientation, participants were asked to use a four-point Likert scale to report their level of confidence in performing tasks related to study development, participant consent, study management, and results dissemination. An ordinal logistic regression with a generalized estimating equation model was employed to evaluate repeated measures across time points. Odds ratios were calculated to compare confidence levels from preorientation to postorientation. Comparisons between median preorientation and postorientation confidence levels were drawn using the Wilcoxon signed-rank test.
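Similarly, a rough sketch of the repeated-measures confidence analysis is shown below, assuming simulated long-format data and hypothetical variable names (participant, post_period, confidence); statsmodels’ OrdinalGEE is one way to fit a proportional-odds model that clusters the two time points within each participant, with exponentiated coefficients read as odds ratios.

```python
# A rough sketch of the repeated-measures confidence analysis, using
# simulated data and assumed variable names (not the study datasets).
import numpy as np
import pandas as pd
from scipy.stats import wilcoxon
from statsmodels.genmod.cov_struct import GlobalOddsRatio
from statsmodels.genmod.generalized_estimating_equations import OrdinalGEE

rng = np.random.default_rng(0)
n = 60
pre = rng.integers(1, 4, n)                        # preorientation confidence (four-point Likert)
post = np.clip(pre + rng.integers(0, 2, n), 1, 4)  # postorientation tends to be higher

# Long format: two rows per participant, one per time point.
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n), 2),
    "post_period": np.tile([0, 1], n),             # 0 = preorientation, 1 = postorientation
    "confidence": np.column_stack([pre, post]).ravel(),
})

# Proportional-odds GEE, clustering the repeated measures within participant.
model = OrdinalGEE(
    df["confidence"],
    df[["post_period"]],
    groups=df["participant"],
    cov_struct=GlobalOddsRatio("ordinal"),
)
result = model.fit()
print(np.exp(result.params))   # the post_period entry is the pre-to-post odds ratio

# Wilcoxon signed-rank test comparing median confidence pre vs. post.
print(wilcoxon(pre, post))
```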

Sites also assessed participant satisfaction with the content presented and presenter effectiveness as measures of program efficacy using a Likert scale. Space for qualitative feedback was included within evaluations to allow participants an opportunity to report experiences or provide suggestions not otherwise captured in quantitative survey items. Participants at PSU completed daily session evaluations, while those at Mayo Clinic and UMMC completed evaluations at the conclusion of the program.

Finally, with support from research leadership at partner sites, site-specific program eligibility and waiver requirements were outlined. Partner sites collaborated with their respective Human Resources office and central research service areas to identify eligible program participants. Faculty and staff instructors were identified based on subject-matter expertise. PSU and UMMC agreed that offering quarterly orientation sessions would allow adequate accommodation for all eligible staff.

Adaptation and implementation at Penn State University

To develop a plan for orientation content and structure, PSU utilized results from a research support staff assessment, two employee focus groups, and exploratory meetings with over 20 content experts. The PSU program was developed to provide foundational support for investigator-initiated research and to complement the industry-sponsored research topics offered in the MCCRO curriculum. Translational research topics in behavioral and community-based spheres were added as focus areas.

Adaptation and implementation at the University of Mississippi Medical Center

Similarly, UMMC enhanced its existing training curriculum by incorporating feedback from previous trainees and adaptations required for full institutional dissemination. Program coordinators consulted with the UMMC Clinical Research Professionals Group and central research support areas to determine personnel needs and develop site-specific content. Analogous to the MCCRO curriculum, UMMC’s program focused on clinical trial competencies.

Results

Program Structures Across Collaborating Sites

The resulting program structures at each implementing site launched in October 2019 and varied based on site-specific needs, as well as SME and participant preferences. The PSU orientation was positioned under the umbrella of the Staffing, Mentorship, and Research Training Program within the Penn State Clinical and Translational Science Institute. Twenty hours of synchronous instruction were offered over 5 days (Supplemental Figs. 2 and 4), with daily postsession assignments and activities.

The resulting orientation program at UMMC was coined CREW (Clinical Research Education for the Workforce) and housed within the Office of Clinical Trials. CREW originally included 20 h of foundational online instruction, along with 5 half-day in-person sessions. After the first cohort expressed challenges stepping away from assigned job areas for several consecutive days, however, the schedule was altered to limit in-person sessions to two full days (Supplemental Figs. 3 and 5). The 20 h of online instruction remained unchanged.

Transitions Necessitated by the COVID-19 Pandemic Across Sites

As the COVID-19 pandemic spread across the USA in 2020, all sites pivoted to move sessions and participant interactions entirely online with minimal programming disruptions. Mayo Clinic transitioned from a blended in-person/virtual orientation to a completely virtual offering using Blackboard Collaborate and Zoom. PSU delivered its program synchronously via the Zoom platform, maximizing interactivity through built-in chat, breakout room, and annotation features. UMMC transitioned the in-person component of its training to the Cisco Webex platform and utilized games, polls, and quizzes to maximize engagement.

Program Reach Across Collaborating Sites

PSU received support to launch the orientation from the Vice Dean for Research and Graduate Studies. Because the orientation was a new institutional offering without demonstrated efficacy, managers were permitted to opt their employee(s) in or out. Invitations were sent to managers and employees whose job titles fell into the defined PSU Research job family. Program coordinators specifically aimed to identify eligible employees within 3 to 4 months of their start date; however, participants with varying lengths of service regularly requested and attended orientation. In all, 71 participants completed the PSU program over five sessions offered from October 2019 to January 2021.

At the request of the program developers, the UMMC Associate Vice Chancellor for Research messaged all clinical department Chairs to introduce the program and affirm its alignment with strategic priorities. Posts to an internal listserv and email messages to identified eligible staff were then used to broadly announce the program and register participants. Any interested student, staff, or faculty member on campus was invited to attend, though all new employees functioning as a CRC or those new to a CRC role at the institution were required to complete the program within the first 3 months of their hire date. Existing employees functioning as a CRC, regardless of title, were required to complete the program within 1 year. Existing employees could request an exemption with (1) their supervisor’s approval, and (2) either Certified Clinical Research Professional or Certified Clinical Research Coordinator credentials, or a passing score (≥80%) on the CREW competency exam. From October 2019 to January 2021, a total of 115 participants completed the UMMC program over five orientation sessions. Only 14 exemptions were requested and approved during that time.

Knowledge Changes From Pretest to Posttest

Total knowledge scores are reported in Table 2. PSU cohorts 1−3 and 4−5 are reported separately due to the changes made to the assessment between these cohorts. Sites found significant increases in knowledge scores from pretest to posttest (for PSU, cohorts 4−5 only). When combining data across sites for the six shared competencies (n = 605), overall knowledge scores also significantly increased from pretest to posttest. No significant differences in cross-competency knowledge scores were identified between sites.

Table 2. Participant knowledge changes from pretest to posttest across all sites October 2019–January 2021

PSU, Penn State University; SD, standard deviation; UMMC, University of Mississippi Medical Center.

* Does not include three participants who did not complete all pre-/posttest assessments.

** Does not include four participants who did not complete all pre-/posttest assessments.

*** Includes all Mayo Clinic and UMMC cohorts, PSU cohorts 1–5.

Confidence Changes From Pretest to Posttest

PSU implemented the participant confidence assessment as part of their PPT assessment from the outset of the program’s implementation; UMMC adopted the confidence assessment beginning with cohort 3. A significant increase in participant confidence to perform all assessed clinical research activities was observed at both PSU and UMMC, with participants having significantly higher odds of increased confidence postorientation as compared to preorientation (Table 3). A significant increase in the median confidence level of participants from pretest to posttest was also observed at both sites.

Table 3. Participant confidence changes from pretest to posttest October 2019–January 2021

CI, confidence interval; IRB, Institutional Review Board; OR, odds ratio; PSU, Penn State University; UMMC, University of Mississippi Medical Center.

* Question evaluated at UMMC only.

** Question evaluated at PSU only.

Program Participant Satisfaction and Qualitative Feedback

Attendees at all sites reported high overall program satisfaction and presenter effectiveness scores. Mayo Clinic and UMMC utilized a five-point scale to gauge satisfaction: “poor” (1), “fair” (2), “satisfactory” (3), “good” (4), and “excellent” (5). Mean program satisfaction scores were 4.09 and 4.31, respectively, with average presenter effectiveness scores of 4.34 and 4.36. PSU used a four-point scale: “not at all satisfied” (1), “not very satisfied” (2), “satisfied” (3), and “very satisfied” (4). The mean program satisfaction score at PSU was 3.77, with a mean presenter effectiveness score of 3.55.

Qualitative feedback from participants was overwhelmingly positive. Several participants expressed an appreciation for learning how their duties fit within the research lifecycle, and for the opportunity to share experiences and solutions. Even experienced research team members at PSU commented that the orientation “tied everything together” with regard to the research process, and provided “tips and strategies previously unknown.” A UMMC participant expressed appreciation for learning best practices and being able to immediately implement changes based on the program. A Mayo Clinic participant called the program “an eye-opening experience to learn about the key components to a clinical trial.” Participants across sites found the networking and career modeling aspects of the program particularly beneficial.

Discussion

This effort successfully confirmed the reproducibility of a clinical research workforce training program across CTSA organizations and reinforced evidence that collaboration supports best practices in standardizing research training and development. An institutional approach to onboarding study staff provides new hires with dedicated support, alleviates burden on study teams, and helps ensure research of the highest safety and ethical standards. Lack of a centralized training program may further exacerbate staffing challenges and turnover often experienced in these positions.

The COVID-19 pandemic accentuated the value of an effective and efficient clinical research training program. Academic institutions have experienced turnover in research support positions related to individuals leaving the workforce, temporary interruptions in hiring practices as a result of financial stabilization responses, and competing industry recruitment efforts. As universities rebuild their workforce, the benefit of an established orientation program for new professionals is evident.

Results from our programs indicate participants experienced knowledge and confidence gains associated with performing the varied responsibilities of clinical research professionals. Participants were highly satisfied with their respective programs and no significant knowledge differences were noted between sites. We believe the homogeneity of results achieved across programs points to both the benefit and the critical nature of successful collaborations. The value of our synergistic partnership extended beyond positive participant outcomes to also include valuable byproducts from sharing best practices and lessons learned throughout. Each site contributed and benefited from the partnership in different, yet tangible ways.

Considerations for Implementation at Other Organizations

As other institutions consider opportunities to implement new or improve existing clinical research workforce offerings, we propose seven key steps:

  1. Assess your landscape (needs of research staff, existing training, SMEs, resources)

  2. Identify intended program audience

  3. Secure institutional buy-in and support (including funding, as needed)

  4. Establish the core framework for orientation (integration within a learning management system, as needed)

  5. Establish assessment metrics

  6. Develop supplemental/ongoing trainings

  7. Market the program

The importance of dedicated resources to support both the initial development and the ongoing maintenance of such programs cannot be overstated. Personnel effort is necessary to develop and maintain program content, the infrastructure to deliver it, and the evaluative methods to measure program results. A mix of talents is suggested, to include individuals with clinical research subject-matter expertise as well as those with experience in curriculum-building, public speaking, and the selected content delivery platform(s). Inclusion of skilled marketing professionals from the outset is also advised to assist with establishing a program “brand” identity and consistent program communications.

Limitations

The described standardized program was implemented at unique research organizations, with aspects tailored to account for these distinctions. Results reported are therefore not generalizable to all research organizations or study staff. Further, most program participants continued to perform their position duties while attending orientation sessions. Knowledge and confidence gains may therefore not be directly attributable to the program itself but may have resulted in part from experience gained over the assessment period. This concern is tempered, however, by the fact that many program participants were seasoned research staff. Self-reported confidence also comes with its own set of limitations, including under- or over-reporting, and is therefore a less reliable indicator of program efficacy [11].

Future Research and Program Opportunities

Expansion of the program to other CTSA sites, and evaluation of the efficiency and effectiveness of doing so, could provide a step toward standardized training and competencies for clinical research study staff. Utilization of a collaborative learning space such as the Development, Implementation, and Assessment of Novel Training in Domain-based Competencies (“DIAMOND”) portal to share onboarding processes/documents, competency assessments, standardized job descriptions, and training materials could also serve as a resource toward the goals of improved training and increased efficiencies [12–14]. Given each institution’s investment in such programs and strong collaborative relationships, Mayo Clinic, PSU, and UMMC are well positioned to support these future endeavors.

Supplementary material

For supplementary material accompanying this paper visit https://doi.org/10.1017/cts.2021.852

Acknowledgements

We thank Jenny Weis of the Mayo Clinic for bringing our teams together and supporting this collaborative project, Janet Keniston and Kyle Bennett of the University of Mississippi Medical Center (UMMC) for their efforts in coordinating the UMMC CREW program and data collection, Erik B. Lehman of the Penn State College of Medicine for his statistical expertise and support services, and Veronica Flock of the Mayo Clinic for feedback throughout the manuscript writing process. Penn State University and University of Mississippi Medical Center data were collected and managed using REDCap (Research Electronic Data Capture) hosted at Penn State Health Milton S. Hershey Medical Center, Penn State College of Medicine, and the University of Mississippi Medical Center.

Disclosures

The project described was supported by the National Center for Advancing Translational Sciences of the National Institutes of Health under administrative supplement award number 3UL1TR002014-03S1. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

1. Speicher LA, Fromel G, Avery S, et al. The critical need for academic health centers to assess the training, support, and career development requirements of clinical research coordinators: recommendations from the Clinical and Translational Science Award Research Coordinator Taskforce. Clinical and Translational Science 2012; 5(6): 470–475. DOI 10.1111/j.1752-8062.2012.00423.x.
2. Getz K, Campo R, Kaitin K. Variability in protocol design complexity by phase and therapeutic area. Drug Information Journal 2011; 45(4): 413–420.
3. Califf R, Filerman G, Murray R, Rosenblatt M. Appendix D: discussion paper. The clinical trials enterprise in the United States: a call for disruptive innovation. In: Institute of Medicine, eds. Envisioning a Transformed Clinical Trials Enterprise in the United States: Establishing an Agenda for 2020. Washington, DC: National Academies Press; 2012.
4. U.S. Food & Drug Administration. FY 2017 Inspectional Observation Summaries. Silver Spring, MD; 2018.
5. Calvin-Naylor NA, Jones CT, Wartak MM, et al. Education and training of clinical and translational study investigators and research coordinators: a competency-based approach. Journal of Clinical and Translational Science 2017; 1(1): 16–25. DOI 10.1017/cts.2016.2.
6. Rojewski JW, Choi I, Hill JR, et al. Perceived professional competence of clinical research coordinators. Journal of Clinical and Translational Science 2020; 5(1): e76. DOI 10.1017/cts.2020.558.
7. Gwede CK, Johnson DJ, Roberts C, Cantor AB. Burnout in clinical research coordinators in the United States. Oncology Nursing Forum 2005; 32(6): 1123–1130. DOI 10.1188/05.onf.1123-1130.
8. Simmons WR, Hanson LN. Designing, Operationalizing and Maintaining a Comprehensive Assessment-Driven Clinical Research Orientation. Webinar for the Association of Clinical Research Professionals, 2019.
9. Morrison GR, Ross SM, Kalman HK, Kemp JE. Designing Effective Instruction. 7th ed. Somerset, NJ: John Wiley; 2011.
10. Sonstein S, Seltzer J, Li R, Jones CT, Silva H, Daemen E. Moving from compliance to competency: a harmonized core competency framework for the clinical research professional. Clinical Researcher 2014; 10(6): 1–11.
11. Dunning D, Heath C, Suls JM. Flawed self-assessment: implications for health, education, and the workplace. Psychological Science in the Public Interest 2004; 5(3): 69–106. DOI 10.1111/j.1529-1006.2004.00018.x.
12. DIAMOND. Training and Assessment Digital Network. (https://diamondportal.org/)
13. Hornung C. Competency index for clinical research professionals (CIRCP) - assessment with scoring guide, 2018. (https://diamondportal.org/assessments/10)
14. Hornung CA, Jones CT, Calvin-Naylor NA, et al. Competency indices to assess the knowledge, skills and abilities of clinical research professionals. International Journal of Clinical Trials 2018; 5(1): 46–53.