
What does team science look like across the CTSA consortium? A qualitative analysis of the Great CTSA Team Science Contest submissions

Published online by Cambridge University Press:  12 July 2021

Clara M. Pelfrey*
Affiliation:
Clinical and Translational Science Collaborative (CTSC), Center for Clinical Investigation, Case Western Reserve University, Cleveland, OH, USA
Ann S. Goldman
Affiliation:
George Washington University Milken Institute School of Public Health, Department of Epidemiology, Clinical and Translational Science Institute at Children's National (CTSI-CN), Washington, DC, USA
Deborah J. DiazGranados
Affiliation:
School of Medicine, Department of Medicine, Wright Center for Clinical and Translational Research, Virginia Commonwealth University, Richmond, VA, USA
*
Address for correspondence: C.M. Pelfrey, PhD, Associate Professor, School of Medicine, Clinical and Translational Science Collaborative, Center for Clinical Investigation, Case Western Reserve University, Biological Research Bldg., Room 109C, 10900 Euclid Ave., Cleveland, OH 44106, USA. Email: Clara.pelfrey@case.edu

Abstract

Introduction:

The Great CTSA Team Science Contest (GTSC) was developed to discover how Clinical and Translational Science Award (CTSA) hubs promote and support team science [1]. The purpose of this study was to conduct a secondary qualitative analysis of the GTSC submissions to better understand the diversity of team science initiatives across the CTSA consortium.

Methods:

Secondary qualitative analysis of the GTSC data addressed the following research questions, which defined the top-level coding: (1) What CTSA component sponsored it? (2) Who was the team doing the work? (3) Who were the intended beneficiaries? (4) What was the intended outcome? (5) What strategies did they use? (6) What translational science (TS) stage was addressed? (7) How do they align with the NCATS team science strategic goals? (8) How do the CTSA’s team science efforts align with the National Research Council (NRC) recommendations for enhancing the effectiveness of team science?

Results:

The GTSC received 170 submissions from 45 unique CTSA hubs. Qualitative analysis revealed a great variety of team science strategies for virtually all team science stakeholders. In addition to strategies to promote team science, results show successful examples that focus on outcomes and illustrate ways of measuring success.

Conclusions:

The GTSC shows that the CTSA consortium is involved in an extremely diverse array of team science activities, which align well with both the NRC recommendations for enhancing the effectiveness of team science and the NCATS strategic goals for team science. Future research should evaluate the efficacy of team science strategies.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2021. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

The team science strategies used across the Clinical and Translational Science Award (CTSA) consortium are extremely diverse, but they have not been fully characterized or evaluated for effectiveness. Many CTSA hubs are engaged in promoting team science; however, there is no single definition of what constitutes “team science.” The increasing complexity of Clinical and Translational Science (CTS) problems has prompted large investments in team science initiatives [2–9]. The National Center for Advancing Translational Sciences (NCATS), which supports the CTSA program, has made team science one of its strategic goals, with four objectives: (1) Engage patients, community members, and nonprofit organizations in the translational process; (2) Share resources through collaborative research with other NIH institutes and centers; (3) Form innovative collaborations with multidisciplinary scientists at other institutions; and (4) Develop new collaborative structures with the private sector [10].

The complexities of science along with rapid technological advances have created the necessity for scientific collaborations between researchers with complementary expertise. The demand for collaborations has outpaced the development of institutional support, policies, funding opportunities, and team science culture. These gaps gave rise to the Science of Team Science (SciTS), a field of study designed to address the value of team science and develop strategies for facilitating, engaging in, leading, and supporting scientific teams [11]. The SciTS literature contains five key subjects: the value of team science, formation of teams, how team composition affects team performance, qualities of effective teams, and how institutions affect teams [12]. The Great CTSA Team Science Contest (GTSC) data address all these major subjects while providing valuable examples of successful team science across the CTSA consortium.

The GTSC was developed in the NCATS Workgroup on Institutional Readiness for Team Science to collect stories describing how CTSA hubs promote and support team science [1]. The contest invited CTSA hubs to submit important, novel, or impactful ways that they advance team science (see Appendix 1 for the GTSC Instructions and Qualtrics Survey). Here, we conducted a secondary qualitative analysis of the GTSC dataset guided by a set of research questions: (1) What CTSA component sponsored the team science activity? (2) Who was the team conducting the team science work? (3) Who were the intended beneficiaries? (4) What was the intended outcome? (5) What team science strategy did they use? (6) What translational science (TS) stage was addressed? (7) How do they align with the NCATS team science strategic goals? (8) How do the CTSA’s team science efforts align with the National Research Council (NRC) recommendations for enhancing the effectiveness of team science? Thus, the purpose of this study was to conduct a qualitative analysis of how hubs were engaging in team science and to demonstrate the great breadth of consortium team science activities. In addition, this analysis sought to classify the submissions into meaningful categories as a resource for hubs interested in expanding team science initiatives. This study aims to advance our understanding of the diversity of CTSA team science initiatives, including the intended recipients, strategies used, intended outcomes, and metrics for measuring success. Since the GTSC asked for strategies for enhancing team science, the results are discussed in the context of how well they address the NRC recommendations for enhancing the effectiveness of team science [13].

Methods

GTSC Contest

This paper presents a secondary analysis of an existing data source: the GTSC dataset. The research questions addressed here were developed after completion of the contest to mine the contest results for meaningful information about team science. Consequently, several submissions did not provide sufficient detail to answer the research questions and were classified as “Not explicitly specified.” The GTSC was developed in 2018 by the NCATS Workgroup on Institutional Readiness for Team Science to collect stories describing the ways hubs were promoting and supporting team science across the CTSA consortium. The original contest purpose, rules, and winners are listed on the CLIC website [1]. The contest was designed as a low-burden method to elicit information about the numerous team science strategies within the consortium. A more detailed description of the contest features, the criteria for judging, the results, and the lessons learned can be found in the companion paper [14]. The full-text GTSC submissions are reproduced with submission ID numbers and CTSA hub names in Appendix 2 [15].

Coding Structure and Analysis

The major research questions were used to organize the top-level coding categories, and inductive processes were used to develop the subcategories, such as the types of team science interventions or the beneficiaries of an intervention. Subcategories were verified with additional coding examples, and similar subcategories were combined (e.g. trainees and students). The goal was to develop a meaningful framework for understanding what strategies are being used to advance team science across the CTSA consortium. The qualitative data analysis coding structure with definitions is listed in Appendix 3. All 170 submissions were coded independently by the three authors. Where there was disagreement, the final code was assigned when two of the three coders agreed. Many submissions could be coded to more than one subcategory, which more accurately represents the overlapping, multidisciplinary nature of team science. When this occurred, the primary team science strategy was used to categorize the submission.
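As a minimal illustration of the two-of-three agreement rule described above (a sketch only; the submission IDs, code labels, and function below are hypothetical and not part of the authors' workflow), the consolidation step could be expressed as follows:

```python
from collections import Counter

def consolidate(codes_by_coder):
    """Apply the two-of-three majority rule to one submission.

    codes_by_coder: the top-level codes assigned independently by the
    three coders, e.g. ["Community", "Community", "Training"].
    Returns the majority code, or None when all three coders disagree
    (such cases would be resolved by discussion).
    """
    code, count = Counter(codes_by_coder).most_common(1)[0]
    return code if count >= 2 else None

# Hypothetical examples of three coders' top-level codes per submission
submissions = {
    "GTSC-001": ["Training", "Training", "New teams"],
    "GTSC-002": ["Community", "Funding", "Training"],
}

for sub_id, codes in submissions.items():
    print(sub_id, "->", consolidate(codes))
# GTSC-001 -> Training  (two of three coders agreed)
# GTSC-002 -> None      (no majority; resolve by discussion)
```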

Research Questions

The research questions were developed using two main criteria: (1) identifying what CTSA hubs would find useful in terms of methods/strategies, intended beneficiaries, intended outcomes, and metrics for measuring effectiveness; and (2) answering the question of how well team science across the CTSA consortium, as represented in the GTSC dataset, addresses the NRC recommendations for enhancing the effectiveness of team science and the NCATS strategic goals.

This analysis addressed the following research questions: (1) What CTSA component sponsored the team science activity? (2) Who was the team conducting the team science work? (3) Who were the intended beneficiaries? (4) What was the intended outcome? (5) What team science strategy did they use? (6) What TS stage was addressed? (7) How do the findings align with the NCATS team science strategic goals? (8) How do the CTSA’s team science efforts align with the NRC recommendations for enhancing the effectiveness of team science? The analysis also examined both the short-term outcomes for participants (e.g. researchers, trainees, students, community members) and the mid- and long-term outcomes used to assess team science success.

Results

Q1. What CTSA Component Sponsored the Team Science Activity? and Q2. Who Was the Team Conducting the Team Science Work?

The first two research questions focused on who conducted the work. First, each submission was coded by labeling which CTSA component conducted the work, using the components from the 2018 CTSA Funding Opportunity Announcement [16] (Table 1). Second, each submission was coded to identify the team that conducted the work (Table 2). For the first research question, it was predicted that the Community and Collaboration core would be the primary sponsor of the work submitted via the GTSC. This was true for 43 submissions (Table 1). Some submissions highlighted community engagement studios or team science training led by the Community and Collaboration core. In contrast, the Network Capacity core was explicitly mentioned only once (e.g. the Trial Innovation Network (TIN)). The second research question proved more difficult to code, as the team conducting the work was not always explicitly stated. The category with the most team science submissions was multidisciplinary teams. Several submissions highlighted collaborations between the persons evaluating the team science work and the CTSA team science experts. While 58 of the submissions were coded as being conducted by multidisciplinary teams, 71 submissions did not clearly specify who conducted the work (Table 2). These results show that the CTSA infrastructure is now an integral part of many institutions, and team science is therefore not conducted by any single group or core.

Table 1. Which Clinical and Translational Science Award (CTSA) component or core sponsored the team science intervention?

* The 2018 Funding Opportunity Announcement PAR-18-464 for the CTSA program includes the following components: Network Capacity, Hub Research Capacity, Community/Collaboration, Translational Endeavors, Research Methods, Informatics [16].

Table 2. Who was the team conducting the team science work?

CTSA, Clinical and Translational Science Award.

Q3. Who Were the Intended Beneficiaries?

The intended beneficiaries for the submissions included the following groups (case numbers in parentheses): team scientists (119), the community (21), patients (6), and students (24) (Table 3). Submissions benefitting team scientists included creating/fostering multidisciplinary teams, team science training programs, establishing funding mechanisms, research on team science, training research staff, community-based projects, promotion and tenure (P&T), and multi-CTSA hub initiatives. Community beneficiaries were targeted by interventions such as joint university–community research projects and improving clinical and public healthcare systems. Patients directly benefitted when they were included in the design of research studies that focused on clinical trials and improving patient services in healthcare settings. Undergraduate and graduate students included medical students and master's and PhD students in public health and the health professions, as well as engineering and law. Postgraduate students included KL2 and TL1 scholars who participated in training courses and exercises to improve team function.

Table 3. Who is the intended beneficiary of the intervention?

CTSA, Clinical and Translational Science Award.

Q4. What Was the Intended Outcome?

The intended outcome of the GTSC submissions fell into four major subcategories, as shown in Table 4: A. Team science skills and process; B. Getting the community involved in research; C. Other team science activities not otherwise specified; D. Examples of multidisciplinary collaborative teams (not shown).

Table 4. What was the intended outcome of the team science submission?

* Examples of specific team science stories did not have a specific strategy with an intended outcome, and thus are not shown in Tables 4 and 5.

A. Team science skills and process

The subcategory of team science skills and process was very broad. The primary goals or intended outcomes included the following areas (number of submissions in parentheses): 1. Training in team science competencies (24); 2. Training in communication skills (9); 3. Providing pilot funding or specific awards to promote existing team science (4); 4. Training undergraduate, graduate, and postdoctoral students in team science (7); 5. Developing a team charter or engaging in team building (5); 6. Leadership or mentorship training (5); and 7. Training in technology ventures or entrepreneurship (3). Each area is described below with specific examples.

Training in team science competencies covered 24 evidence-based competencies involving the 7 dimensions of leadership, communication styles and adjusting for different styles, and workshops on leading multidisciplinary teams. For some hubs, the intended outcome was better team science leadership, whereas others aimed to develop every aspect of researchers’ teams. Some innovative programs were designed to promote team science leadership skills in underrepresented postdoctoral fellows. Importantly, team science competencies were not just for leaders; some programs focused on training the research staff to help the entire team function more efficiently.

Training in communication was prominently represented and took many forms, such as workshops designed to promote smoother team functioning or end-of-meeting debriefs to clarify the meeting outcomes and next steps. A more complex workshop involved faculty and community members together focusing on public understanding of science and dissemination of results. Communication training included community members, researchers, trainees/students, and clinicians.

Providing pilot funding and special awards for promising teams is a frequently used strategy to promote team science. Sometimes a new funding mechanism incentivized the formation of a multidisciplinary team of researchers or community members with common interests. To promote community research on real-world problems, some pilot programs created community member review committees that provided input into pilot funding decisions. To help move entrepreneurs’ ideas into the real world, experts in dissemination and implementation collaborated on business/marketing plans to advance those ideas as commercial ventures.

Training undergraduate, graduate, and postdoctoral students improves the team science pipeline. Several student training programs taught team science competencies, methods, team building, entrepreneurial skills, real-world observation, and analysis of teams. Student training often encompassed longer-term sessions, such as a week-long residential boot camp or a semester-long course. Several student-focused trainings placed students on existing collaborative teams, where they engaged in real-world problem-solving.

Team building and developing team charters are important aspects of the team science process that spell out many aspects of team function, including direction, roles, and how team members should interact. Team charters were regularly mentioned by participants as one of the most useful outcomes of training. Other team-building outcomes included improved distance collaboration and building multidisciplinary teams following facilitated brainstorming sessions. Some team-building programs provided a facilitated residential event to create research proposals addressing a grand challenge.

Leadership or mentorship training was one component of multifaceted programs that also addressed innovation, adaptation, project management, and communication. Training varied from 2-hour workshops to 12-month programs and several leadership programs are actively evaluating their outcomes.

Training in technology ventures or entrepreneurship focused on developing technological solutions for specific unmet healthcare needs. Other programs raised awareness of entrepreneurial activity and the potential to commercialize ideas, reinforcing the practice of team science for health science innovation.

B. Getting the community involved in research

A key component of getting the community involved in TS was engaging community members in determining the specific needs of the community. There were two subcategories of team science submissions that involved the community: (1) Partnering with community members by making them part of a community advisory board or stakeholder panel, or by having them provide direct advice to researchers on the conception, design, and execution of studies in community settings (26); (2) Providing targeted funding opportunities that address community health needs and incentivize researchers to include community members on the team (3).

C. Other team science activities

Fostering new multidisciplinary collaborations, an outcome in 33 submissions, included bringing together researchers from different disciplines to brainstorm solutions to a specific challenge. Strategies for creating collaborations included: (1) Hosting one-time events focused on a particular health problem to foster collaboration between multidisciplinary researchers; (2) Matchmaking among researchers from different disciplines using social network analysis, speed networking, or, for example, pairing physicians who needed medical devices with design engineers; (3) Creating ongoing formal research networks; (4) Providing unique grant mechanisms to foster multidisciplinary research and incentivize new collaborations; (5) Connecting CTSA structures, such as the TIN, with community organizations; (6) Creating formal coursework to train researchers how to create and maintain multidisciplinary collaborations; (7) Creating programs to train researchers to be entrepreneurs.

Multidisciplinary development of new informatics or biostatistical tools was a unique type of submission (6). These were collaborations between bioinformatics researchers, computational biologists, or biostatisticians who were developing devices or apps to solve specific problems. They included: (1) Medical apps to manage chronic conditions; (2) Informatics solutions for data discovery, such as finding collaborators or biomarkers, or developing algorithms; (3) Developing repositories to link data to patients’ electronic health records; (4) Medical device development or improvement; (5) Using videogames to train healthcare students; (6) Programs to train informatics students or emergency healthcare professionals to work in multidisciplinary teams.

Workforce development with research personnel included training health equity researchers, health system leaders, and community health workers in programs that integrate transdisciplinary research, community activation, education, and policy translation (6). Other programs trained research coordinators or managers. Collaborative educational programs enhanced the skills of healthcare students to interact with a community partner on a healthcare challenge.

Expanding capacity to conduct TS consisted of developing new systems or relationships to increase TS (10). Examples include new partnerships with scientific journals, developing new MS programs in data science across multiple departments using a shared governance model, and connecting regional health systems to the TIN to facilitate community-based trials.

Changing promotion and tenure (P&T) policies for team science was addressed in only three submissions. One story helped chairs understand what it takes for researchers to achieve P&T as team scientists. Another submission involved extensive studies of the faculty, chairs, leadership, and the institutional climate to understand the attitudes toward, and barriers to, changing team science P&T policies. Finally, one hub held a symposium on ways to advance team science.

D. Examples of multidisciplinary team science

There were many submissions that described a specific successful example of multidisciplinary collaborative teams conducting disease-specific or wellness interventions. These covered wellness/prevention, chronic disease, cardiovascular disease, mental/behavioral health, addiction, genetics, infectious disease, reducing health disparities, cancer, and other areas. These submissions generally did not specify an intended outcome and are not described further. If an example described a team science intervention with an intended outcome, it was moved to the relevant intended outcome category.

Evaluation of team science strategies can reveal more effective methods for producing productive teams. Table 5 lists the evaluation metrics divided into short-, mid-, and long-term outcomes of the team science interventions. Short-term outcomes measured immediate changes in knowledge, attitudes, or beliefs (e.g. higher confidence, improved satisfaction). Midterm outcomes documented changes in behavior (e.g. writing publications and grants, applying for patents). Long-term impacts showed changes in conditions (e.g. better patient health, expansion of a program beyond the hub, increased efficiency). Among efforts to encourage team approaches, some leadership assessment programs developed evidence-based competencies or behavioral exemplars, others developed specific evaluative criteria, and some created programs to educate health system leaders as well as community health workers. Using evaluation of factors such as target customers, demand, and competition, one group developed a private sector tool that accurately predicts the commercial viability of ventures. A few programs randomized participants to control or experimental groups for the intervention; however, no GTSC submissions reported teams’ outcomes over time.

Table 5. Evaluation metrics: short-, mid-, and long-term outcomes from team science interventions

** (Table 5) Submissions may have more than one evaluation outcome, so cases may be listed more than once.

Q5. What Team Science Method Did They Use?

There were two types of submissions: (1) Team science research, interventions, or resources (133) and (2) Unique examples of successful teams (37). The first category revealed several types of team science interventions including team science training programs (61), the development of new teams (43), community collaboration with teams (20), and other team science methods that did not fit into the previous categories (9). Table 6 shows a breakdown of the methods used.

Table 6. What team science method did they use?

* These are all specific examples of team science. If they included a methodological strategy for enhancing team science, then they are categorized according to that strategy in the section “Team science research, intervention or resource,” and do not appear under examples.

Team science research, intervention, or resource covered team science training and research activities, including leadership training for K-scholars and trainees, symposia to discuss best practices, and collaborative research. To foster new teams, submitters organized symposia and workshops around specific topics of interest to bring researchers together to develop ideas for new proposals. Funding opportunities developed for multidisciplinary teams took the form of awards honoring specific scientists and competitive award offerings.

Unique examples of team science were stories of a specific successful research team. They were coded as an “example” if they did not describe using a strategy to promote team science.

Q6. What Translational Science Stage Was Addressed?

The submissions were categorized by assigning them to a specific TS phase to determine whether most of the team science addressed research occurring early in the TS spectrum or later, in applying discoveries to public health [17]. A breakdown of contest submissions by TS phase is shown in Fig. 1. All the TS phases were represented in the submissions. The largest phase of TS was public health, policy, and prevention (40). Within that category, public health-related submissions made up 33 cases, followed by prevention studies (12) and policy studies (3). There was significant overlap, so many submissions were a combination of categories (e.g. public health and prevention). The clinical trial phase of the TS spectrum (13) involved multidisciplinary teams studying populations across the lifespan; several examples described trials in the clinic as well as community-based trials. Development of extensive research networks was critical to the feasibility of many of the trials, as was vital assistance from recruitment specialists. Eighteen cases did not clearly fit into the NCATS-defined phases, consisting of activities such as developing a training program with shared governance or developing informatics tools to promote collaboration, enhance education, or provide valuable data repositories. Over half of the GTSC submissions did not fall explicitly on the TS spectrum but instead addressed the science of team science (SciTS, 97 cases, not shown).

Fig. 1. Almost half of the Great CTSA Team Science Contest (GTSC) submissions involved in research are in public health, policy, and prevention. This figure shows the number and percent of GTSC submissions by translational science phase (the 97 submissions about the Science of Team Science [SciTS] are not included). The number of submissions is shown in white; the percent of submissions is shown in black next to each label. Some submissions involved in research fell into more than one TS phase: 24 submissions spanned more than one TS phase and 5 spanned three different TS phases.
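Because a submission spanning several TS phases counts toward each phase it touches, the phase counts in Fig. 1 are multi-label tallies. A brief, hypothetical sketch of such a tally follows (the submission IDs, phase labels, and percentage convention are assumptions, not the authors' data or code):

```python
from collections import Counter

# Hypothetical multi-label TS phase assignments; a submission that spans
# several phases contributes one count to each phase it touches.
phase_labels = {
    "GTSC-010": ["Public health", "Prevention"],
    "GTSC-011": ["Clinical trial"],
    "GTSC-012": ["Public health", "Policy", "Prevention"],
    "GTSC-013": ["Clinical trial", "Public health"],
}

counts = Counter(label for labels in phase_labels.values() for label in labels)
n_submissions = len(phase_labels)  # assume percentages are per submission, not per label

for phase, n in counts.most_common():
    print(f"{phase}: {n} ({100 * n / n_submissions:.0f}% of submissions)")
```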

Q7. How Are Hubs Using Team Science to Address the NCATS Strategic Objectives?

Supporting the CTSA program, NCATS has made team science a strategic goal with four objectives: (1) Engage patients, community members, and nonprofit organizations in the translational process; (2) Share resources through collaborative research with other NIH Institutes and Centers; (3) Form innovative collaborations with multidisciplinary scientists at other institutions; (4) Develop new collaborative structures with the private sector. Table 7 shows how the GTSC submissions address the four NCATS team science objectives. Multidisciplinary team science strategies that directly involved patients included funding multidisciplinary research to develop translational diagnostic, treatment, and disease self-management approaches. Other initiatives linked primary care service providers and researchers for clinical trials and improved healthcare. Team science strategies involving communities included a variety of research and implementation projects: researchers and community members joined in initiatives to reduce health disparities, high blood pressure, and opioid abuse; improve oral health among migrants; and combat cancer and rare genetic disorders. Submissions described partnerships that addressed healthy eating and obesity. In communities and state governments, academics and community members joined in programs to reduce risk factors for infant mortality, identify a rare genetic variant in newborns, promote positive parenting, and screen for and identify victims of child abuse. Other collaborative partnerships focused on health promotion and wellness programs for older adults, including those in low-income housing.

Table 7. How are hubs using team science to address the National Center for Advancing Translational Sciences (NCATS) strategic objectives?

* Some submissions fulfill multiple NCATS strategic objectives for team science.

We examined the data to understand how they aligned with the second NCATS objective, that is, sharing resources and expertise across the federal government through collaborative research with other NIH Institutes and Centers. Six cases demonstrated collaboration with other federal institutes and centers, such as the National Heart, Lung, and Blood Institute (NHLBI), the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), the National Endowment for the Arts (NEA), and the National Library of Medicine (NLM). These were multidisciplinary collaborations conducting clinical trials for heart disease and testing a solution for sepsis and acute respiratory distress syndrome. The cases also included the creation of a multicenter community-based pediatric trial, work integrating biomedical datasets to accelerate translational research, a mechanism for investigators to share data, and training to improve perspective-taking skills in trainees.

Regarding the third NCATS objective, multi-institutional collaborations, several multidisciplinary submissions involved multistate and multi-CTSA hub consortia formed around pressing health problems such as the opioid crisis, obesity, and biomarkers for autism. Several consortia formed to transform clinical research through efforts such as improving the e-consent process, expanding research into primary care practices, streamlining study start-up, and helping regional healthcare systems use the TIN. Other multi-hub efforts sought to pose a grand challenge and develop teams over a week of intensive training, or to network KL2 and TL1 trainees from across multiple hubs.

Relating to the final NCATS team science objective, developing collaborative structures with the private sector, most of the submissions that involved the private sector concerned helping programs scale inventions into commercial health products or public health interventions. Other private sector collaborations involved recruiting industry members to join partnerships around major health challenges (e.g. opioid crisis, cancer) or to provide input into potential start-up companies.

Discussion

The value of this qualitative analysis of the GTSC dataset is the examination of the breadth of team science across the consortium. What the individual submissions lacked in detail, they made up for in the breadth of interventions, the diversity of beneficiaries, and the broad multidisciplinary involvement.

Two kinds of submissions dominated the GTSC: strategies for promoting team science, many of which lacked evidence of efficacy, and examples of team science, which often lacked clear team science strategies but provided successful outcomes. The outcomes included short-term evaluation metrics such as increased leadership self-efficacy, greater confidence in conducting clinical research, and greater understanding of the value of multidisciplinary research. These led to midterm behavior changes by teams such as increased efficiency, writing grants, publications, and patent applications, and developing novel apps/digital tools. Finally, several long-term outcomes were attributed to the team science strategies, including improving health access for underserved populations, expanding capacity for special populations in clinical trials, policy changes, and expanding programs to other hubs.

Team science requires not only that teams collaborate effectively but also that they integrate knowledge from diverse perspectives. Successfully involving the community in research is one transdisciplinary challenge in team science. Partnering effectively with community members involves engaging in research that is both meaningful to the community and focused on community priorities. Many GTSC submissions described such collaborations, including community interventions and improvements in public health. Several submissions focused on joint university–community research programs to improve public health and clinical/mental health. Other team science initiatives provided training to team scientists and community members, involving them in creating funding mechanisms for TS, reviewing project proposals, and providing venues and activities for scientists and community members to meet and develop project ideas.

Integrating knowledge from individuals with different perspectives is another vital aspect of successful transdisciplinary team science, which involves fostering knowledge sharing and optimizing the use of appropriate collaboration technologies [13]. As technology, informatics, and medicine become more advanced, more instances of knowledge integration between fields are occurring, such as pairing engineers, app developers, and medical professionals to address unfulfilled needs in medicine. Few submissions rigorously measured team science effectiveness; however, some did address the need for data access to promote knowledge integration and the need for digital tools to obtain data on the value of team science.

There is ample evidence that the CTSA consortium is addressing the team science objectives set forth by NCATS [10]. Objective two, regarding sharing resources among NIH institutes and centers, is not extensively addressed. Although some GTSC submissions mention other institutes and agencies (e.g. NHLBI, FDA, NIAMS), the contest did not ask for this information specifically [10]. The other three NCATS team science objectives are well addressed by the GTSC submissions.

The GTSC dataset provides evidence that the CTSA consortium is addressing the team science recommendations made by the National Research Council (NRC) in 2015 [13]. Here, we address the recommendations according to their intended audience. For leaders of science teams, the NRC recommends using analytic tools to guide team composition and evaluating both professional and science leadership development for science teams [13]. Several submissions used analytic tools to guide team composition. Eight submissions were able to include evaluation results of team activities. Two submissions directly addressed the NRC’s call for translating leadership evidence by developing and assessing leadership indicators. Clear metrics for evaluating team science do exist, and CTSA hubs engaged in promoting team science would be wise to carefully evaluate what works to enhance team science [13, 18–20].

Concerning geographically dispersed teams, the NRC recommends providing activities to develop shared vocabularies and work routines, train team members to use proven collaboration technologies, and develop criteria to give credit for team-based work [13]. Many of the submissions involved training in team-based competencies and developing common languages for collaboration. Many hubs developed new informatics tools to improve collaboration across geographic distances.

Regarding public and private funders, the NRC recommends reducing the barriers to team science by encouraging new collaborative models, providing informational resources, and removing disincentives to participate in team science, while requiring collaboration plans that describe how knowledge integration between disciplines will be promoted. Finally, the NRC proposes studying the effectiveness of team science and facilitating access to research and personnel data [13]. Many GTSC submissions described novel methods to promote collaboration between disparate groups of researchers. Only three submissions addressed overcoming academic disincentives to engage in team science by changing P&T policies to allow team scientists to achieve parity with individual investigators. Some submissions addressed developing written plans for the collaboration. Team charters were regularly mentioned as one of the most useful collaboration tools.

Pertaining to researchers, the NRC recommends improved methods to match participants with project needs to enhance team effectiveness, professional and leadership development for team science, and collaboration with universities to develop new principles and remove barriers that discourage team science. Several submissions promoted matchmaking among scientists from different disciplines around research needs in basic science, healthcare, and public health. Likewise, many of the submissions involved professional development covering team science competencies. The final recommendation, to evaluate how criteria for allocating credit for team science are working, was not addressed by the GTSC submissions. This may be because, although most CTSA hubs have some policies expressing the value of team science, there is significant variability across the consortium, and within institutions, in how the policies are carried out [21]. Most hubs have not had sufficient time or resources to evaluate the effects of team science P&T policies.

In connection with universities, other scientific organizations, and the scientific community, the NRC recommends fostering positive team processes to enhance effectiveness, creating and evaluating leadership development opportunities, and removing barriers that discourage young faculty interested in team science from joining teams. The GTSC received many examples of positive team processes, from training in team competencies to fostering good communication and strong leadership. The final barrier preventing systemic adoption of team science may involve diverging from the current funding system, which rewards investigator-initiated grants, to provide additional incentives for team-based science (e.g. P&T policies for team science). In summary, both the NCATS team science objectives and most of the NRC recommendations for team science are being addressed in the GTSC dataset. There is one notable exception: evaluating how criteria for allocating credit for team science are working was not addressed. This area warrants further research.

Strengths, Limitations, and Future Directions

Strengths of this study include the robust CTSA consortium participation of 45/64 (70%) hubs and the large and extremely diverse dataset of submissions to the GTSC, which enabled the demonstration of the great heterogeneity of consortium interventions to enhance team science. The contest surfaced more team science efforts than other methods, such as expert opinion, focus groups, or surveys, likely would have. The resulting dataset and qualitative analysis provide hubs with a large reservoir of information, so they can focus on the team science strategies and outcomes that interest them. These strategies show the great breadth of innovation, particularly in interventions that involve the community or industry in team science. Besides many traditional team science trainings, several submissions encompassed both an intervention, such as a scientific retreat, and a successful example of team science. Although the GTSC did not request examples of innovative and successful team science, several were submitted that demonstrated involvement of researchers who are not typically associated with team science.

Subdividing submissions according to the 2018 NCATS FOA categories assumed that a particular CTSA component was the driver of the team science story. This was not the case in many submissions, which either did not specify what component was doing the work or involved many collaborating CTSA components and/or institutional promotion of team science. This is a positive finding, which suggests that team science has permeated so many aspects of the CTSA infrastructure that it has become intrinsic to the fabric of a CTSA hub and is no longer an activity performed by a single core.

Regarding limitations, this study is a secondary analysis of an existing dataset. The GTSC dataset is not a random sample, nor is it representative of all team science, and not all cases in the dataset were equally informative. The GTSC was designed to be an easy, low-effort contest so that more hubs would respond to the survey. Considering potential survey fatigue, the GTSC designers had to choose between receiving fewer, more detailed entries and receiving more responses that were very brief. They opted for the latter to demonstrate the breadth of team science. Additionally, some entrants submitted a specific example of a successful team rather than strategies for encouraging better team science, suggesting perhaps a lack of understanding about the purpose of the GTSC. Thus, some lack of clarity in describing interventions was due to the 150-word limit, which meant the submissions lacked significant detail on the process, the key stakeholders, or the evaluation outcomes. An additional limitation is the difficulty of teasing out distinct categories given the complexity of the causal pathways required to move from early TS training, to measurable TS improvements, to more productive research and publications, to health-related benefits.

Defining team science outcome metrics poses many challenges, as do institutional policies that can thwart well-meaning efforts, including P&T criteria, indirect cost allocation policies, team science incentives, and the availability of collaborative organizational structures. It is more feasible to evaluate knowledge, attitudes, and beliefs about team science functioning; it is much more challenging to evaluate downstream evidence that teams are functioning better or having real impacts. Only a few submissions described evaluating their team science strategies to encourage team approaches, reduce barriers to team science, or improve team functioning. None of the submissions examined how credit for team contributions was allocated, which poses a significant challenge that deserves more attention in future studies. Only one submission described using control and experimental groups, emphasizing how difficult such approaches can be. Although some of the team science examples did not provide team science strategies, they did provide more distal team science outcomes. These included changes in health policy, expanding the team science program beyond the hub, better health, screening, and access for underserved populations, and increased efficiency.

In conclusion, analysis of the GTSC dataset shows that the CTSA consortium is involved in an extremely diverse array of team science activities, which address most of the team science recommendations made in 2015 by the NRC as well as the NCATS team science objectives. The strategies in this paper provide a resource for CTSA hubs seeking to expand their team science endeavors. To use this information, hubs would match their team science goal with our research questions. For example, if a hub wants to expand team science to include the community, three of the research questions address the community: as beneficiaries (Q3 and Table 3), getting the community involved in research (Q4 and Tables 4 and 5), and fostering community collaborations with teams (Q5 and Table 6). Once hubs locate relevant examples, they can cross-reference them with the list of contestants in Appendix 2 and contact those hubs to learn from their experience in expanding team science. Future research should focus on evaluating the efficacy of team science strategies.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/cts.2021.812.

Acknowledgments

Research reported in this publication was supported by the National Center for Advancing Translational Sciences of the National Institutes of Health under Award Numbers UL1TR002548 (CWRU), UL1TR0012649 (VCU), UL1TR001876, and KL2TR001877 (GWU). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Disclosures

The authors have no conflicts of interest to declare.

References

Adler, NE, Stewart, J. Using team science to address health disparities: MacArthur network as case example. Annals of the New York Academy of Sciences 2010; 1186: 252–260.
Croyle, RT. The National Cancer Institute’s transdisciplinary centers initiatives and the need for building a science of team science. American Journal of Preventive Medicine 2008; 35(2 Suppl): S90–S93.
National Academy of Sciences, National Academy of Engineering, and Institute of Medicine. Facilitating Interdisciplinary Research. Washington, DC: The National Academies Press, 2005. doi: 10.17226/11153.
Wuchty, S, Jones, BF, Uzzi, B. The increasing dominance of teams in production of knowledge. Science 2007; 316(5827): 1036–1039.
National Center for Research Resources (NCRR). Clinical and translational science awards [Internet], 2010 [cited Jan 24, 2021]. (http://www.ctsaweb.org/)
Esparza, J, Yamada, T. The discovery value of “Big Science.” Journal of Experimental Medicine 2007; 204(4): 701–704.
National Research Council. Collaborations of Consequence: NAKFI’s 15 Years Igniting Innovation at the Intersections of Disciplines. Washington, DC: The National Academies Press, 2018.
Bennett, LM, Gadlin, H, Marchand, C. Collaboration and Team Science: A Field Guide [Internet], 2018, updated March 21, 2019 [cited Jan 24, 2021]. (https://www.cancer.gov/about-nci/organization/crs/research-initiatives/team-science-field-guide)
National Center for Advancing Translational Sciences. Strategic Goal 2: Advance Translational Team Science by Fostering Innovative Partnerships and Collaborations with a Strategic Array of Stakeholders [Internet], 2017, updated December 28, 2017 [cited Dec 2020]. (https://ncats.nih.gov/strategicplan/goal2)
Stokols, D, Hall, KL, Taylor, BK, Moser, RP. The science of team science: overview of the field and introduction to the supplement. American Journal of Preventive Medicine 2008; 35(2 Suppl): S77–S89.
Hall, KL, Vogel, AL, Huang, GC, et al. The science of team science: a review of the empirical evidence and research gaps on collaboration in science. American Psychologist 2018; 73(4): 532–548.
National Research Council. Enhancing the Effectiveness of Team Science. Cooke, NJ, Hilton, ML, eds. Washington, DC: The National Academies Press, 2015.
DiazGranados, D, Pelfrey, CM, Goldman, A, et al. The Great CTSA Team Science Contest: an evaluation of team science activities across the CTSA consortia and lessons learned. Journal of Clinical and Translational Science (manuscript forthcoming).
The Great CTSA Team Science Contest – Submissions. Center for Leading Innovation & Collaboration (CLIC) [Internet], 2018 [cited Jan 29, 2021]. (https://clic-ctsa.org/sites/default/files/2020-05/Great_team_science_sub.pdf)
Funding Opportunity Announcement (FOA) Number PAR-18-464. National Center for Advancing Translational Sciences [Internet], December 7, 2017 [cited Dec 30, 2020]. (https://grants.nih.gov/grants/guide/pa-files/PAR-18-464.html)
Translational Science Spectrum. National Center for Advancing Translational Sciences (NCATS) [Internet], 2020, updated March 19, 2020 [cited Jan 24, 2021]. (https://ncats.nih.gov/translation/spectrum)
Mâsse, LC, Moser, RP, Stokols, D, et al. Measuring collaboration and transdisciplinary integration in team science. American Journal of Preventive Medicine 2008; 35(2 Suppl): S151–S160.
Falk-Krzesinski, HJ, Börner, K, Contractor, N, et al. Advancing the science of team science. Clinical and Translational Science 2010; 3(5): 263–266.
Belcher, BM, Rasmussen, KE, Kemshaw, MR, Zornes, DA. Defining and assessing research quality in a transdisciplinary context. Research Evaluation 2015; 25(1): 1–17.
McHale, SM, Ranwala, DD, DiazGranados, D, Bagshaw, D, Schienke, E, Blank, AE. Promotion and tenure policies for team science at colleges/schools of medicine. Journal of Clinical and Translational Science 2019; 3(5): 245–252.