
Reimagining a summer research program during COVID: Strategies for enhancing research workforce diversity

Published online by Cambridge University Press:  28 February 2022

Brenda L. Eakin*
Affiliation:
Michigan Institute for Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA
Phillip A. Ianni
Affiliation:
Michigan Institute for Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA
Christy Byks-Jazayeri
Affiliation:
Michigan Institute for Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA
Vicki L. Ellingrod
Affiliation:
Michigan Institute for Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA College of Pharmacy, University of Michigan, Ann Arbor, MI, USA
Susan J. Woolford
Affiliation:
Michigan Institute for Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA Department of Pediatrics, University of Michigan, Ann Arbor, MI, USA
*
Address for correspondence: B. L. Eakin, MS, University of Michigan, 2800 Plymouth Road, Building 400, Ann Arbor, Michigan 48109-2800, USA. Email: beakin@med.umich.edu

Abstract

Well-designed, accessible short-term research training programs are needed to recruit and retain underrepresented persons into clinical and translational research training programs and diversify the workforce. The Michigan Institute for Clinical and Health Research developed a summer research program, training over 270 students in 15 years. In response to the 2020 COVID-19 pandemic, we pivoted swiftly from an in-person format to a fully remote format. We describe this process, focusing on factors of diversity, equity, and inclusion including enabling student participation in remote research activities. We collected data about students’ learning experiences since the program’s inception; therefore, we could evaluate the impact of remote vs. in-person formats. We examined data from five cohorts: three in-person (2017–2019; n = 57) and two remote (2020–2021; n = 45). While there was some concern about the value of participating in a remote format, overall students in both formats viewed the program favorably, with students in the remote cohorts rating some aspects of the program significantly more favorably. In addition, more students who identified as Black or African American participated in the remote format than in the in-person format. We describe lessons learned from this unprecedented challenge and future program directions.

Type
Special Communications
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

In recent years, there has been an increasing need to diversify the clinical and translational science workforce. In 2019, the National Institutes of Health released updated guidelines encouraging institutions to enhance the diversity of their workforce, stating that “scientists and trainees from diverse backgrounds and life experiences bring different perspectives, creativity, and individual enterprise to address complex scientific problems” [1]. To achieve greater diversity in the clinical and translational research (CTR) workforce, efforts have been made to develop training programs that are intended to recruit individuals from underrepresented populations [2–7]. However, the proportion of underrepresented minority researchers in the CTR workforce continues to be smaller than that of the US population at large [8]. Therefore, it is critically important to establish substantive multidisciplinary research training to recruit and retain underrepresented minority individuals, as well as those from disadvantaged backgrounds, into CTR programs.

In 2007, the Michigan Institute for Clinical and Health Research (MICHR) at the University of Michigan developed a CTSA-sponsored immersive, 3-month summer research program to provide graduate students from lower-resourced institutions with a structured CTR experience. In 2012, the program began intentionally recruiting from groups traditionally underrepresented in translational research. Studies of other summer research programs have demonstrated that this type of short, immersive experience has many benefits, including an increase in students’ self-reported interest in research and research skills [9–12], an increased publication rate [13], and an increased likelihood of pursuing a research career [7,10,14].

From 2007 to 2019, MICHR’s summer research program was implemented as an in-person, mentored research experience. In 2020, however, as at many other CTSA hubs, the COVID-19 pandemic necessitated an unanticipated and swift change in program format. Although previous work has described the impact the COVID-19 pandemic has had on CTR trainees and scholars [15,16], few studies have evaluated the effects of pandemic-related changes to the curriculum of a CTR training program [17]. In this paper, we describe the process of quickly converting an established, in-person program to a comprehensive remote format, focusing on factors of diversity, equity, and inclusion. We compare the impact of these changes by examining pre- and post-COVID student programmatic experiences and describe lessons learned from this unique challenge.

Methods

The Summer Research Program

The MICHR summer research program was developed in 2007 to provide students in master’s and health professions degree programs with a participatory experience in translational research. In 2012, the program was restructured to concentrate on health disparities research. The program attracts students from across the US and Puerto Rico who have diverse research interests and backgrounds. Students accepted into the program participate in activities focused on interdisciplinary and collaborative work in translational and health disparities research.

This short-term program is offered from June through August each year. As part of their application, students provide a brief description of their research interests, which assists the program manager in matching them with a faculty member at the University of Michigan who shares their research interests. Faculty mentors in all years of the program were given the opportunity to interview prospective students before agreeing to work with them.

Students work approximately 35 hours a week with their assigned faculty mentor on an ongoing research project and participate for approximately five hours a week in a structured curriculum that explores substantive methodological and career topics related to health disparities and translational research. The curriculum is aligned with seven of the CTSA core thematic areas [18]: 1) CTR questions, 2) study design, 3) regulatory support and knowledge, 4) responsible conduct of research, 5) translational teamwork, 6) leadership, and 7) community engagement (see Table 1). The structured learning components (e.g., activity-based seminars, journal club, group projects, and community-based site visits) give students hands-on experience in a range of health disparities and translational research activities and provide consistent opportunities for interdisciplinary learning. The program also includes required training in the protection of human subjects and the responsible conduct of research, along with multiple opportunities to learn and practice scientific communication skills. At the beginning of the program, students complete a short, 5-station research-focused objective structured clinical exam (R-OSCE) to assess their health disparities and translational research knowledge and skills. This instrument is described in depth elsewhere [19]. Students are encouraged to discuss their exam results with their mentors and use them when developing their individualized development plans for the program.

Table 1. Program curriculum

Transition to a Remote Format

Prior to 2020, students participated in all research and curriculum activities in-person at the University of Michigan Ann Arbor campus. However, due to restrictions related to the COVID-19 pandemic, the program was reformatted in 2020 to be fully remote, including participation in remote research activities. Students in the 2021 cohort also participated remotely.

Since the inception of the MICHR summer research program, learning activities have been mapped to CTSA-defined competencies. To adapt the program to a remote format, all curriculum activities were reviewed by the program managers and the faculty director to determine whether 1) program competencies continued to align with the new remote program structure, 2) programmatic activities could be structured to fit a synchronous, remote learning format that accommodated multiple time zones, and 3) adequate resources were available to provide equitable access for all participants and to maintain the experiential and group learning structure deemed critical to the success of the summer program, including participation in remote research activities. Activities that met these criteria were retained and, if needed, redesigned for remote learning. Activities that did not contribute to mastery of defined translational or health disparities research skills, or that could not be presented in a synchronous, virtual format, were replaced with learning activities that better met the program goals.
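The three-part review described above amounts to a simple triage over the activity catalog. The sketch below is purely illustrative — the activity names, record fields, and `review` helper are assumptions for demonstration, not the program’s actual records:

```python
from dataclasses import dataclass

# Hypothetical record for one curriculum activity; field names mirror
# the three review criteria described in the text.
@dataclass
class Activity:
    name: str
    aligns_with_competencies: bool  # 1) maps to CTSA-defined competencies
    fits_synchronous_remote: bool   # 2) workable live online, across time zones
    equitable_access: bool          # 3) resources allow fair access for all

def review(activities):
    """Partition activities into those retained for remote delivery
    and those to be redesigned or replaced."""
    retained = [a for a in activities
                if a.aligns_with_competencies
                and a.fits_synchronous_remote
                and a.equitable_access]
    replaced = [a for a in activities if a not in retained]
    return retained, replaced

# Made-up catalog entries for illustration.
catalog = [
    Activity("Journal club", True, True, True),
    Activity("Community site visit", True, False, True),
    Activity("Mock IRB review", True, True, True),
]
kept, swapped = review(catalog)
print([a.name for a in kept])     # activities carried into the remote format
print([a.name for a in swapped])  # activities needing a replacement
```

The value of phrasing the review this way is that each criterion is recorded explicitly, so the retain/replace decision for every activity is auditable.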

In all, 12 of the 15 activities (80%) from the in-person curriculum were retained in the remote format. Three activities were not included because the sponsoring departments ceased offering them in 2020. During program planning, we recognized that students’ access to online research tools and learning platforms would vary. To ensure that all students had fair and equitable access to the program, we contacted them before the program started and asked about their level of internet access, their access to a computer with a camera and microphone, what time zone they were in, and their familiarity with learning management systems. Adjustments were made to the curriculum activities to promote access for all students. For example, when we discovered connectivity issues, such as temporary loss of internet access for some students during Hurricane Isaias in 2020, we modified assignments to accommodate students’ needs.

Synchronous online learning sessions on diversity, equity, and inclusion from across the University of Michigan were made available to students as optional curriculum activities. In addition, regular remote office hours with the program managers, unstructured chat sessions with a peer mentor, and a variety of virtual social activities (ice breakers, movie nights, group games) were added to foster connections and strengthen peer relationships. To reduce potential online participation fatigue and enhance contact with students, activities that were previously presented in 2–3 hour in-person seminars were adjusted to multiple 1-hour segments.

Principles of instructional design were used to ensure that core learning objectives and the program’s critical hands-on learning structure were maintained [20]. Pre-recorded video lectures were not used. Instead, an emphasis was placed on developing synchronous participatory activities, including group projects, student-led journal clubs, discussion groups, and role-play activities. Program activities used features of online learning to enhance engagement. For example, screen sharing for participants, polls and surveys, and closed captioning were enabled for all activities. Eight of 11 activities (73%) used annotation, virtual whiteboards, or shared documents for collaborative projects; 5 of 11 activities (45%) used breakout rooms to promote teamwork. Presentations for journal clubs and the program’s capstone were conducted live and synchronously, with participants able to raise hands to ask questions or add comments and questions to a monitored chat. Students were encouraged, but not required, to have their cameras on during learning activities.

At the beginning of this transition, few faculty members at our institution had experience with remote learning. To familiarize our faculty presenters and research mentors with the virtual platform, we held multiple, brief training sessions that provided tips on best practices for remote teaching and learning. We also provided personal technical assistance during the synchronous remote seminars.

Data Collection

Since the program’s inception, we have developed and implemented a robust evaluation plan. Because of this, we could evaluate the impact of the remote format in comparison with the in-person format. We chose to include data from three in-person cohorts (2017–2019) and two remote cohorts (2020 and 2021). To assess the success of the new remote program structure, students were asked to complete surveys once a month during the 3-month program (three surveys total) to ensure their needs were identified and responded to in an appropriate and timely manner. This was a necessary departure from the single end-program survey structure that past in-person cohorts of students completed. Students were asked to answer eight questions about their experiences in the program. Six questions matched those from past programmatic surveys, including questions related to the relevance of the didactic sessions to students’ research activities, whether and how the didactic sessions improved understanding of the topic presented, how often students met with their summer research teams, and students’ experiences working with their faculty mentors. Two new questions were included to assess the frequency and mode of communication between the students and their mentors as well as their experience with the technology used for the remote program sessions. We also administered a survey to faculty mentors at the end of the program in the 2020 and 2021 cohorts.

We recognize that student evaluation surveys do not always provide a full picture to inform program improvement; therefore, we held focus group sessions with students in the 2020 and 2021 cohorts to supplement the survey data [21]. These structured focus groups replaced previous years’ more informal face-to-face feedback sessions, in which students were asked to reflect on the strengths and weaknesses of various aspects of the program. The focus groups took place remotely during the last week of the program. Three themes guided the discussions: 1) how participation in the MICHR summer research program affected students’ short- and long-term plans for a research career, 2) how students and mentors worked together to advance research, and 3) which aspects of the program had the greatest positive impact on students’ understanding of translational and health disparities research. Students were asked 10 questions related to these overarching themes. With students’ permission, two note takers recorded the discussion and comments, and the transcripts were coded for themes by two trained coders.
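The paper does not report an inter-coder agreement statistic, but when two coders independently assign theme labels to transcript excerpts, agreement is commonly checked with Cohen’s kappa. A minimal sketch, using made-up theme labels (not the study’s actual codes):

```python
from collections import Counter

def cohen_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders' categorical labels."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed proportion of excerpts where both coders agree.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance from each coder's label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical theme labels for ten transcript excerpts.
coder_1 = ["mentorship", "career", "remote", "career", "mentorship",
           "remote", "career", "mentorship", "remote", "career"]
coder_2 = ["mentorship", "career", "remote", "career", "remote",
           "remote", "career", "mentorship", "remote", "mentorship"]
print(f"kappa = {cohen_kappa(coder_1, coder_2):.2f}")
```

Values near 1 indicate near-perfect agreement; values near 0 indicate agreement no better than chance, which is a signal to refine the codebook before finalizing themes.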

Data Analysis

To assess changes in student experience between the in-person and remote formats, scores from feedback surveys of three prior cohort years were compared to survey scores from the remote program cohorts. This analysis was conducted with a series of 5 × 1 one-way ANOVAs, with simple planned contrasts using SPSS 26 software [22]. This project was reviewed and deemed exempt by the University of Michigan Institutional Review Board.
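The authors ran this analysis in SPSS. As an open-source illustration of the same design — a one-way ANOVA across five cohort years with a simple planned contrast of remote vs. in-person cohorts — the following sketch uses SciPy with made-up ratings (the study’s actual data are not reproduced here):

```python
import numpy as np
from scipy import stats

# Illustrative, made-up ratings (1-5 scale) for one survey item;
# five students per cohort. Not the study's actual data.
cohorts = {
    "2017": [4.2, 4.5, 4.0, 4.8, 4.1],   # in-person
    "2018": [4.0, 4.3, 4.6, 4.2, 4.4],   # in-person
    "2019": [4.1, 4.4, 4.2, 4.5, 4.3],   # in-person
    "2020": [4.8, 4.9, 4.6, 5.0, 4.7],   # remote
    "2021": [4.9, 4.7, 4.8, 5.0, 4.6],   # remote
}
groups = [np.asarray(g) for g in cohorts.values()]

# Omnibus 5 x 1 one-way ANOVA across cohort years.
f_stat, p_omnibus = stats.f_oneway(*groups)

# Simple planned contrast: mean of the two remote cohorts vs.
# mean of the three in-person cohorts (weights sum to zero).
weights = np.array([-1/3, -1/3, -1/3, 1/2, 1/2])
means = np.array([g.mean() for g in groups])
ns = np.array([len(g) for g in groups])

# Pooled within-group variance (MS within) from the ANOVA.
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_within = ns.sum() - len(groups)
ms_within = ss_within / df_within

# Contrast estimate, standard error, and two-sided t test.
psi = weights @ means
se = np.sqrt(ms_within * np.sum(weights ** 2 / ns))
t_stat = psi / se
p_contrast = 2 * stats.t.sf(abs(t_stat), df_within)

print(f"omnibus F = {f_stat:.2f} (p = {p_omnibus:.4f})")
print(f"remote vs. in-person contrast: t({df_within}) = {t_stat:.2f}, p = {p_contrast:.4f}")
```

A planned contrast is used rather than post-hoc pairwise tests because the remote vs. in-person comparison was specified in advance, which preserves statistical power.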

Results

Students from all five cohort years (n = 102) participated in translational and health disparities research projects during their time in the program. Seven students in the 2020 cohort and four in the 2021 cohort participated in research projects directly related to COVID-19. As seen in Table 2, the program included diverse cohorts of students from a variety of disciplines, most commonly health-related fields such as public health, medicine, pharmacy, nursing, dentistry, and biology, but also informatics, kinesiology, and biomedical engineering. Notably, significantly more students who met the NIH criteria for disadvantaged status and more Black/African American students participated in the program when it was offered remotely.

Table 2. Participant demographics

** For this program, an individual was considered disadvantaged if they met two or more criteria of the NIH definition, including, but not limited to, homelessness, having been in the foster care system, eligibility for the Federal Free and Reduced Lunch program, and eligibility for or receipt of a Pell Grant [23].

Survey Results

Mean scores for each item and cohort year for the feedback survey are shown in Table 3 (each item rated on a scale of 1 – strongly disagree to 5 – strongly agree). Overall, students in both the in-person and remote cohorts rated their experience highly, with no question receiving a score below 4.0. However, there were notable differences between the remote and in-person formats. Planned comparisons showed that students in the remote cohorts gave significantly higher ratings for three items: 1) the program encouraged student interest in pursuing a clinical, translational, or health disparities research career, 2) students learned information that would be useful in their future career, and 3) students would recommend the program to others. Additionally, most faculty mentors in the remote cohorts indicated that they were satisfied or very satisfied with the virtual platforms as a form of communication, though many preferred an in-person program to a virtual one.

Table 3. Mean feedback survey scores for each cohort year

Focus Group Results

Focus group sessions were held at the end of the program with students in the 2020 (n = 18) and the 2021 cohorts (n = 19). Five overarching themes emerged: 1) interest in the program, 2) impact of the program on students’ plans to continue with a career in research, 3) the effects of mentorship on students’ research experiences, 4) attitudes toward the program, and 5) the remote aspects of the program.

Students cited several reasons for their interest in the program including gaining research experience before applying to graduate programs, interest in learning more about health disparities and translational research, and the opportunity to explore translational research as a future career (“I wanted to have research experience before starting my PhD. [That’s] hard to find for master’s students.”). When asked how participating in the program affected their plans for continuing in research, students said it had a positive impact on their long- and short-term career goals as well as their view of research as a career path (“I didn't think I would do research in my second year of med school…I plan to continue working with my mentor…which wasn’t my original plan.”). For some students, experience in the summer research program confirmed their choice of career, while others said the program exposed them to new career possibilities and was a catalyst to changing their career paths (“I realized I wanted to focus on clinical research as a career, not just to supplement my applications to medical school and residency.”). When discussing their mentor/mentee relationships, students said their mentors exposed them to new career possibilities, helped them establish professional social networks and career goals, and provided support and guidance (“[It was] interesting to talk with my mentor and [get her] thoughts about choosing a specialty. It was nice to see a different perspective.” “[My mentor] helped me find resources…and connected me with people who can help me with my career.”). In fact, several students said they planned to continue working with their mentor after the program ended, which was not their original intent. Lastly, students described the opportunity to develop research communication skills, the variety of learning experiences, and developing relationships with mentors as positive aspects of the program. 
Students also suggested potential program improvements and made recommendations for students considering applying to the program in the future.

We asked students to share what they liked and did not like about the remote format. Several said they missed having social interactions with their peers and expressed a desire to be more connected (“It would be nice to have a few more sessions like the social hours where students in the program could talk to each other. I’m sure it was much different when the program was in person but having that kind of touchpoint and being able to interact with each other more rather than being in a presentation together.”). Students also expressed a need for more bonding time (“I really enjoyed the opportunities that we did have to see what everyone else was working on and get to hear about everyone else’s research. I wish we would have had a little bit more time for collaboration like that… I think [being online] put some constraints on that”). This feedback suggests that while students liked the program, the online format constrained their ability to form a sense of community with students in their cohort and with their research mentors.

Discussion

Like other education programs in the era of COVID, moving our summer program online came with challenges. However, it also brought the opportunity to recruit and train a more diverse group of students. The remote format gave students greater access to training and resources they might not otherwise have had, eliminating barriers to participation such as the cost of relocating to our university for the duration of the program and providing increased access to a rich network of faculty mentors [24]. In addition, we were able to reevaluate our curriculum to ensure it provided up-to-date training in clinical and translational science [18] and remained current with trends in remote learning in medical education [25].

Because the program already included a robust evaluation plan, the evaluations we have conducted since 2017 enabled us to directly compare student perceptions of the remote and in-person formats. Overall, students in both the in-person and remote cohorts viewed the program favorably. Compared with the in-person cohorts, students in the remote cohorts were significantly more likely to indicate that the program increased their interest in pursuing a career in health disparities research, that they learned new information that would be useful in their future career, and that they would recommend the program to others. We also saw a significant increase in the diversity of the students in our program, both in terms of race and disadvantaged status. The ability to participate remotely helped students see our program as a viable and attainable training opportunity, and we anticipate that the positive impact of the program will feed into the diversity of the CTR workforce.

Lessons Learned

Several important lessons were learned from the process of moving a highly successful in-person program to an equally successful remote program.

  1. Mapping curriculum learning activities to established competencies was an important tool in the design of our program, whether the learning was conducted in-person or remotely. This strategy allowed us to design activities that met defined learning objectives and program goals [20,26]. Not all program learning components needed to be moved to an alternate format to maintain programmatic goals. When determining which activities to move to alternate formats, we selected those that matched defined program competencies and could be measured through a structured program evaluation plan.

  2. In our program, web access, time zone differences, and other technical issues posed barriers to participation for some students. Research has shown that these types of issues may pose disproportionate barriers for underrepresented students [24,27]. When developing program activities, we sought ways to promote equity and minimize barriers. These included encouraging, but not requiring, camera use during synchronous remote activities; providing both written and verbal instructions for learning activities; and delivering program content in multiple ways (including asynchronously) to maximize access and inclusion [20,27].

  3. In reviewing our curriculum, we realized that learning activities that worked well in two- or three-hour classroom seminars were too long for a remote format. To promote inclusion and engagement for all program participants, we divided curriculum activities into smaller segments. Taking frequent breaks to answer questions or clarify activities also encouraged inclusion and engagement [21].

  4. When creating our curriculum, whether in-person or remote, we incorporated varied teaching strategies and learning technologies to encourage an inclusive learning environment and meet students’ needs [20,28]. These included synchronous lectures with closed captioning, learning supplements in written and audio formats, synchronous online platforms for instructor-led and group learning activities, Google Forms for collaborative writing, and institutional learning management systems to post program assignments, provide instructor feedback, and host electronic portfolios, all of which promote accessibility and inclusion [28].

Conclusion

The redesign of the MICHR summer research program was unanticipated, but it allowed us to meet students where they are and to examine how equity and inclusion factor into and affect our program. As we plan future years, we are thinking more critically about what would be most equitable for all students who may be interested in our program but whose experiences or circumstances may deter or prevent them from participating in an in-person or residential program at the University of Michigan. Students’ comments about longing to form community among their cohort or to meet with their research mentor and team in person echoed our own impression that a remote connection cannot replace an in-person experience in all cases. In the summer of 2022 and beyond, we will explore potential hybrid approaches, with some students participating in in-person research and others in remote research, and all students participating together in remote learning sessions and activities. We will also examine the long-term impact of the virtual program on students’ publications and career trajectories.

Acknowledgments

This project was supported by grants from the National Center for Advancing Translational Sciences (NCATS): UL1TR002240 and TL1TR002241.

Disclosures

The authors have no conflicts of interest to report.

References

1. National Institutes of Health. Notice of NIH’s Interest in Diversity, 2019. (https://grants.nih.gov/grants/guide/notice-files/NOT-OD-20-031.html)
2. Estape ES, Quarshie A, Segarra B, et al. Promoting diversity in the clinical and translational research workforce. Journal of the National Medical Association 2018; 110(6): 598–605. DOI 10.1016/j.jnma.2018.03.010.
3. Rubio DM, Hamm ME, Mayowski CA, et al. Developing a training program to diversify the biomedical research workforce. Academic Medicine 2019; 94(8): 1115–1121. DOI 10.1097/ACM.0000000000002654.
4. Odedina FT, Reams RR, Kaninjing E, et al. Increasing the representation of minority students in the biomedical workforce: the ReTOOL program. Journal of Cancer Education 2019; 34(3): 577–583. DOI 10.1007/s13187-018-1344-6.
5. Salto LM, Riggs ML, Delgado De Leon D, Casiano CA, De Leon M. Underrepresented minority high school and college students report STEM-pipeline sustaining gains after participating in the Loma Linda University Summer Health Disparities Research Program. PLoS One 2014; 9(9): e108497. DOI 10.1371/journal.pone.0108497.
6. Kuo AA, Sharif MZ, Prelip ML, et al. Training the next generation of Latino health researchers: a multilevel, transdisciplinary, community-engaged approach. Health Promotion Practice 2017; 18(4): 497–504. DOI 10.1177/1524839916665091.
7. Smalley KB, Warren JC. Disparities Elimination Summer Research Experience (DESRE): an intensive summer research training program to promote diversity in the biomedical research workforce. Ethnicity & Disease 2020; 30(1): 47–54. DOI 10.18865/ed.30.1.47.
8. Heggeness ML, Evans L, Pohlhaus JR, Mills SL. Measuring diversity of the National Institutes of Health-funded workforce. Academic Medicine 2016; 91(8): 1164–1172. DOI 10.1097/ACM.0000000000001209.
9. Cain L, Kramer G, Ferguson M. The Medical Student Summer Research Program at the University of Texas Medical Branch at Galveston: building research foundations. Medical Education Online 2019; 24(1): 1581523. DOI 10.1080/10872981.2019.1581523.
10. Chou AF, Hammon D, Akins DR. Impact and outcomes of the Oklahoma IDeA Network of Biomedical Research Excellence summer undergraduate research program. Journal of Microbiology & Biology Education 2019; 20(3). DOI 10.1128/jmbe.v20i3.1815.
11. McLean NA, Fraser M, Primus NA, Joseph MA. Introducing students of color to health sciences research: an evaluation of the health disparities summer internship program. Journal of Community Health 2018; 43(5): 1002–1010. DOI 10.1007/s10900-018-0505-1.
12. Salerno JP, Gonzalez-Guarda R, Hooshmand M. Increasing the pipeline and diversity of doctorally prepared nurses: description and preliminary evaluation of a health disparities summer research program. Public Health Nursing 2017; 34(5): 493–499. DOI 10.1111/phn.12341.
13. Brandl K, Adler D, Kelly C, Taylor P, Best BM. Effect of a dedicated pharmacy student summer research program on publication rate. American Journal of Pharmaceutical Education 2017; 81(3): 48. DOI 10.5688/ajpe81348.
14. Johnson KS, Thomas KL, Pinheiro SO, Svetkey LP. Design and evaluation of an interdisciplinary health disparities research curriculum. Journal of the National Medical Association 2018; 110(4): 305–313. DOI 10.1016/j.jnma.2017.06.005.
15. Fattah L, Peter I, Sigel K, Gabrilove JL. Tales from New York City, the pandemic epicenter: a case study of COVID-19 impact on clinical and translational research training at the Icahn School of Medicine at Mount Sinai. Journal of Clinical and Translational Science 2021; 5(1): e58. DOI 10.1017/cts.2020.560.
16. McCormack WT, Bredella MA, Ingbar DH, et al. Immediate impact of the COVID-19 pandemic on CTSA TL1 and KL2 training and career development. Journal of Clinical and Translational Science 2020; 4(6): 556–561. DOI 10.1017/cts.2020.504.
17. Afghani B. COVID-19 pandemic: a catalyst for transformation of a summer online research program. Medical Education Online 2021; 26(1): 1886029. DOI 10.1080/10872981.2021.1886029.
18. Clinical and Translational Science Awards. Core Competencies in Clinical and Translational Research. (https://clic-ctsa.org/sites/default/files/CTSA_Core_Competencies_final_2011.pdf)
19. Ianni PA, Eakin BL, Samuels EM, Champagne E, Ellingrod VL. The Research Objective Structured Clinical Exam (R-OSCE): an innovative tool to assess clinical and translational research competencies. MedEdPublish 2021; 10(1): 143.
20. Collins B, Day R, Hamilton JV, Legris K, Mawdsley HP, Walsh T. 12 tips for pivoting to teaching in a virtual environment. MedEdPublish 2020; 9. DOI 10.15694/mep.2020.000170.1.
21. Reyna J. Twelve tips for COVID-19 friendly learning design in medical education. MedEdPublish 2020; 9: 103. DOI 10.15694/mep.2020.000103.1.
22. IBM SPSS Statistics for Windows, Version 26.0, 2019.
23. National Institutes of Health. Disadvantaged Background. (https://era.nih.gov/commons/disadvantaged_def.htm)
24. Kimble-Hill AC, Rivera-Figueroa A, Chan BC, et al. Insights gained into marginalized students’ access challenges during the COVID-19 academic response. Journal of Chemical Education 2020; 97(9): 3391–3395.
25. Almarzooq ZI, Lopes M, Kochar A. Virtual learning during the COVID-19 pandemic: a disruptive technology in graduate medical education. Journal of the American College of Cardiology 2020; 75(20): 2635–2638. DOI 10.1016/j.jacc.2020.04.015.
26. Hall AK, Nousiainen MT, Campisi P, et al. Training disrupted: practical tips for supporting competency-based medical education during the COVID-19 pandemic. Medical Teacher 2020; 42(7): 756–761. DOI 10.1080/0142159X.2020.1766669.
27. Castelli FR, Sarvary MA. Why students do not turn on their video cameras during online classes and an equitable and inclusive plan to encourage them to do so. Ecology and Evolution 2021; 11(8): 3565–3576. DOI 10.1002/ece3.7123.
28. Fermin-Gonzalez M. Research on virtual education, inclusion, and diversity: a systematic review of scientific publications. International Review of Research in Open and Distributed Learning 2019; 20(5): 146–167.