
Developing relevant assessments of community-engaged research partnerships: A community-based participatory approach to evaluating clinical and health research study teams

Published online by Cambridge University Press:  11 May 2023

Elias Samuels*
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
Donald Vereen
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
Patricia Piechowski
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
Athena McKay
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
E. Hill De Loney
Health Awareness Center, Flint, MI, USA Community Based Organization Partners, Flint, MI, USA
Sarah Bailey
Community Based Organization Partners, Flint, MI, USA Bridges into the Future, Flint, MI, USA All Faiths Health Alliance, USA
Luther Evans
Community Based Organization Partners, Flint, MI, USA Anders Associates Flint, MI, USA
Bettina Campbell
Community Based Organization Partners, Flint, MI, USA
Yvonne Lewis
Community Based Organization Partners, Flint, MI, USA Healthy Flint Research Coordinating Center Flint, MI, USA National Center for African American Health Consciousness, Flint, MI, USA
Ella Greene-Moton
Community Based Organization Partners, Flint, MI, USA
Kent Key
Community Based Organization Partners, Flint, MI, USA Michigan State University. College of Human Medicine, East Lansing, MI, USA
DeWaun Robinson
Community Based Organization Partners, Flint, MI, USA Artistic Visions Flint, MI, USA
Arlene Sparks
Community Based Organization Partners, Flint, MI, USA
Ellen Champagne
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
Susan Woolford
University of Michigan, Michigan Institute for Clinical & Health Research, Ann Arbor, MI, USA
Corresponding author: E. Samuels; Email:



In 2017, the Michigan Institute for Clinical and Health Research (MICHR) and community partners in Flint, Michigan collaborated to launch a research funding program and evaluate the dynamics of those research partnerships receiving funding. While validated assessments for community-engaged research (CEnR) partnerships were available, the study team found none sufficiently relevant to conducting CEnR in the context of the work. MICHR faculty and staff along with community partners living and working in Flint used a community-based participatory research (CBPR) approach to develop and administer a locally relevant assessment of CEnR partnerships that were active in Flint in 2019 and 2021.


Surveys were administered each year to over a dozen partnerships funded by MICHR to evaluate how community and academic partners assessed the dynamics and impact of their study teams over time.


The results suggest that partners believed that their partnerships were engaging and highly impactful. Although many substantive differences between community and academic partners’ perceptions over time were identified, the most notable regarded the financial management of the partnerships.


This work contributes to the field of translational science by evaluating how the financial management of community-engaged health research partnerships in the locally relevant context of Flint can be associated with these teams’ scientific productivity and impact, with national implications for CEnR. It also presents evaluation methods that can be used by clinical and translational research centers striving to implement and measure their use of CBPR approaches.

Research Article
This is an Open Access article, distributed under the terms of the Creative Commons Attribution (CC BY) licence, which permits unrestricted re-use, distribution, and reproduction, provided the original article is properly cited.
© The Author(s), 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science


The National Center for Advancing Clinical and Translational Science at the National Institutes of Health funds a consortium of Clinical and Translational Science Award (CTSA) centers located in over 60 universities and research institutions nationwide. A key goal of the CTSA Consortium is to accelerate the process of translating scientific discoveries into improvement in human health through community engagement (CE) [1,Reference Minkler and Wallerstein2]. The COVID-19 pandemic’s disproportionate impact on the health and healthcare of minority communities across the country deepened the investment in CE being made by research centers in the CTSA Consortium, including associated funding to study the long-term effects of the COVID-19 pandemic [Reference Marsh, Kappelman and Kost3,Reference Price-Haywood, Burton, Fort and Seoane4].

This paper presents an evaluation of health research partnerships supported by a CTSA hub, the Michigan Institute for Clinical and Health Research (MICHR). A community-based participatory research (CBPR) approach was used to evaluate community-engaged research (CEnR) partnerships working across the spectrum of CE [Reference Key, Furr-Holden and Lewis5,Reference Michener, Scutchfield and Aguilar-Gaxiola6]. This study contributes to translational science by demonstrating how CEnR partnerships can be evaluated by health research centers using CBPR approaches. The results also contribute to a burgeoning line of research demonstrating that community-engaged health research partnerships can have a national impact on community health and healthcare [Reference Vitale, Newton and Abraido-Lanza7,Reference Westfall, Ingram and Navarro8]. We achieve this by focusing on locally relevant approaches to evaluation that can inform the assessment of CEnR on a broader scale. While there is no single way that CEnR partnerships “work,” this paper provides an example of how CTSA-supported CEnR partnerships geographically centered in Flint were evaluated, which may help inform the evaluation of other teams. We have attempted to provide sufficient detail for clinical and translational scientists to compare their own and their partners’ circumstances and goals to those presented here, and thus judge the relevance of this paper to their own research and scientific goals.

Many CTSA hubs have worked to develop comprehensive assessment models to use in the evaluation of CE programs and services [Reference Vitale, Newton and Abraido-Lanza7]. This study builds on the long-term accomplishments of existing health research partnerships in Flint, Michigan. The assessment used for this study was developed by combining and recategorizing existing CEnR partnership evaluation tools using a concept mapping process [Reference Kane and Trochim9]. Specifically, this study integrates new quantitative measures of financial management with existing measures of CEnR partnerships that have been validated through empirical research.

At the beginning of the CTSA funding in 2007, MICHR began collaborating with community partners in organizations in Flint, Detroit, and Ypsilanti, MI. Those from Flint were mainly members of the Community Based Organization Partners (CBOP), a nonprofit organization representing over 40 multisector and faith-based community organizations. This collaboration included citizen scientists with years of CBPR experience in Flint, dating back to their participation in the W. K. Kellogg Foundation’s Community Based Public Health Initiative in the early 1990s [Reference Schmitz, Johnson, Himmelman and Wunderlich10].

The authors of this study include MICHR faculty, staff, and CBOP board members and other community partners, who formed a workgroup with the aim of guiding the implementation of health research initiatives involving communities in Flint. The authors met bimonthly starting in 2016 and all the community partners were compensated $25/hour for their time for the duration of the project. This group adopted a CBPR approach to develop and enhance MICHR’s support of health research on the long-standing health disparities in Flint, particularly those revealed and exacerbated by the ongoing Flint water crisis.

Community-Academic Partnerships Evaluated by This Study

The community-academic teams evaluated in this study were selected by virtue of having received funding via a mechanism for CEnR partnerships developed by MICHR in collaboration with CBOP. This mechanism, named Building Capacity for Research and Action (BCRA), was launched in 2017. BCRA awards funded multi- and transdisciplinary scientific teams that proposed to engage community partners throughout the entire research process. All funded projects were required to focus on community-identified health priorities and to use health measures to evaluate the impact of their project. A scientific committee of community and academic reviewers reviewed the proposals and recommended awardees to MICHR. MICHR made final funding decisions based on the review committee’s recommendations.

The funded teams, which required a community partner from Flint, had a wide range of prior experience, with some building on decades of shared work while others included new collaborators working together for the first time. An innovative element of this mechanism is that non-U-M academics were encouraged to apply for funding. However, most applications came from teams where the academic partner was from the U-M.

The awards were distributed in three rounds during the study period. A total of 16 BCRA project proposals were accepted for funding between 2018 and 2022. Each funded team attended orientation meetings with MICHR specialists in research regulation, administration, finance, CE and program evaluation.

The BCRA awards ranged from $5,000 to $10,000, each offered for a 12-month project period with one possible 6-month no-cost extension; in total, MICHR awarded $150,996 in BCRA funding. Overall, the funded projects encumbered 34% of their budgets for community partner effort. Among the 12 projects that had completed their work as of August 2022, a total of $62,663 was allocated to community partner expenses, a sum representing 54% of the total proposed budget and 56% of the total amount spent.

Two additional awards for health research projects in Flint are also included in this study. One participating partnership was conducting research on the use of Reiki for the treatment of substance abuse. This project was supported with $5,000 in MICHR pilot grant funding, 87% of which was encumbered for community partners; 89% of the total budget spent ultimately went to community partners. The other partnership included in this study received $25,000 in MICHR discretionary funding to engage community members in discussions about the Flint water crisis and their trust in health research.


This study is an exploratory evaluation of CEnR partnerships using existing and new survey measures that were categorized into domains through a collaborative concept mapping process [Reference Kane and Trochim9,Reference Creswell and Clark12]. Following best practice [Reference Key13], this evaluation was reviewed and approved by the CBOP’s Community Ethics Review Board (CERB) in Flint, which provided a letter of endorsement for the project. This letter of endorsement was included in the proposal which was reviewed and exempted from oversight by the University of Michigan Medical School’s Institutional Review Board (HUM00156451).

Developing an Assessment of Partnership Dynamics

The assessment administered to BCRA-funded partnerships was developed using existing assessments (e.g., Wallerstein, 2011 and Israel, 2008) as well as novel questions about financial management which were created by the authors. First, validated assessments of CEnR partnerships published in peer-reviewed journals were collected and reviewed for relevance. Over 120 distinct questions derived from published assessments of CEnR were identified [1,Reference Key, Furr-Holden and Lewis5,Reference Michener, Scutchfield and Aguilar-Gaxiola6,Reference Oetzel, Zhou and Duran14–Reference Wallerstein and Duran29].

The concept mapping process used for this study involved six steps [Reference Kane and Trochim9]. These included 1) preparing all existing measures for review, detailing their measurement scale, citing the author, and associating each with the domain utilized by the author, 2) developing new questions to address any resultant gaps among the questions, 3) rating the relevance and appropriateness of all questions for use in this study, 4) grouping all of the questions by cross-cutting key domains of community-engagement developed by the study team, 5) analyzing the relationships between the questions, domains, and the teams’ ratings, and 6) developing a comprehensive assessment for use in pilot testing.

The use of established conceptual domains of CEnR partnerships enabled the study team to categorize and compare the existing assessment measures. Novel questions about financial management were also developed to complement the few existing questions grouped into that domain. The mapping process resulted in a comprehensive set of survey questions grouped into six broad domains of community-engaged partnerships: (1) Partnership Background & Sustainability, (2) Communication, (3) CE, (4) Decision Making & Trust, (5) Team Finances, and (6) Impact.

The Partnership Background and Sustainability Domain includes existing measures of the composition, diversity, and structure of health research partnerships, including the community-identified health priorities of focus and the degree of CE on the spectrum of CEnR [Reference Key, Furr-Holden and Lewis5,Reference Michener, Scutchfield and Aguilar-Gaxiola6]. This domain also includes questions regarding partners’ support for the partnership over time and the sustainability of the partnership [Reference Oetzel, Zhou and Duran14]. The CE Domain includes questions about the shared credit for, and ownership of, research achievements [Reference Schulz, Israel and Lantz15], and assessments of the Principles of CE [Reference Oetzel, Zhou and Duran14–Reference Israel, Schulz, Parker, Minkler and Wallerstein16].

The Communications Domain includes measures derived from several existing assessments. These include measures of how community and academic partners interacted during meetings, conversed in respectful and productive ways, and agreed with their teams’ mission, priorities, and work strategies [Reference Oetzel, Zhou and Duran14,Reference Schulz, Israel and Lantz15,Reference Braun, Nguyen and Tanjasiri17,18]. The Decision-Making and Trust Domain includes questions regarding partners’ support of the decisions made by their team, their inclusion in those decisions, their comfort with the process, and whether they felt pressured to go along with decisions of the group [Reference Oetzel, Zhou and Duran14,Reference Khodyakov, Stockdale and Jones19]. A measure of trust is also included which characterizes the type of trust that existed within the partnership using an ordinal scale [Reference Minkler and Wallerstein2,Reference Lucero, Wallerstein and Duran20].

The Team Finances Domain includes existing questions regarding the felt needs of community partners, their access to the benefits of health research partnerships, and the equity with which those benefits were distributed [Reference Oetzel, Zhou and Duran14,Reference Schulz, Israel and Lantz15,Reference Khodyakov, Stockdale and Jones19]. Novel questions about the allocation of financial resources were also developed by the study team. The Impact Domain includes questions regarding the effect of the partnership on the health of local communities, the quality of health care provided to them and the degree to which the partnership advanced community-identified health priorities. These measures also include questions measuring how the partnership benefited the partners overall, and individually, including through subsequent awards and recognition [Reference Oetzel, Zhou and Duran14,Reference Schulz, Israel and Lantz15,Reference Khodyakov, Stockdale and Jones19].

Each measure used in the concept mapping process was reviewed in two focus group sessions of CEnR partnerships using a semi-structured protocol. Participants completed the assessment in advance and then discussed the relevance of each question to their past research experience. The discussions were recorded, and the transcripts were analyzed to identify participants’ recommendations for changes to the survey. As shown in Table 1, after these revisions were made, the resultant survey questionnaire was reduced from 137 questions to 67.

Table 1. The number of survey questions by domain and study phase

The resultant survey was sent to every member of each participating study team in 2019 and in 2021. Participants received survey invitations and forms which were personalized with their names as well as the name of the partnership which they were being asked to evaluate. Following the initial survey invitation two reminders were sent 1 week apart to all nonresponders.

Two-tailed t-tests assuming unequal variances (Welch’s t-tests) were conducted to identify statistically significant differences, both between responses in 2019 and 2021 and between community and academic partners within each year. ANOVAs were conducted to look for differences in categorical measures. More rigorous longitudinal analyses of differences among people over time could not be conducted due to the small number of academic and community partners who responded to both iterations of the survey.
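The comparisons described above can be sketched as a two-tailed Welch's t-test between community and academic partners' ratings. This is an illustrative example only, not the authors' analysis code; the ratings and group labels below are hypothetical.

```python
# Illustrative sketch of a two-tailed t-test assuming unequal variances
# (Welch's t-test), as used in this study to compare community and
# academic partners' survey ratings. Data here are hypothetical.
from scipy import stats

community = [3.5, 3.0, 4.0, 3.5, 3.0, 4.5, 3.5]  # hypothetical 1-5 ratings
academic = [4.5, 4.0, 5.0, 4.5, 4.0, 4.5, 4.0]   # hypothetical 1-5 ratings

# equal_var=False selects Welch's test, which does not assume the two
# groups have the same variance or sample size.
t_stat, p_value = stats.ttest_ind(community, academic, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Statistically significant difference between groups")
```

Welch's variant is a common choice when group sizes differ, as they did here between community and academic respondents.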

For formative evaluation, the differences in the perceptions of community and academic partners were quantified; each team received their data, and anonymized reports of the comparative results were provided to all participants in 2021. Further feedback about the financial management practices of the funded CEnR partnerships was subsequently collected through an anonymous online survey using open-ended questions. Invitations to complete the survey were sent to members of all funded teams in 2022.


In 2019, a total of 14 funded study teams received invitations, and at least one member of each team responded. Eight teams (57%) returned responses from at least one academic partner and at least one community partner. A total of 56 individuals received survey invitations in 2019 of which 30 responded, representing a 54% response rate. In 2021, a total of 16 funded study teams received survey invitations, to which at least one member of all but two teams responded, and nine teams (56%) returned responses from at least one academic partner and at least one community partner. A total of 62 individuals received survey invitations in 2021 of which 31 responded, representing a 50% response rate.
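The response rates above follow directly from the invitation and response counts reported in the text; a brief arithmetic check, using only figures from the paragraph:

```python
# Response-rate check using the counts reported in the text:
# 2019: 30 of 56 invitees responded; 2021: 31 of 62 invitees responded.
invited_2019, responded_2019 = 56, 30
invited_2021, responded_2021 = 62, 31

rate_2019 = responded_2019 / invited_2019  # about 0.536, i.e., 54%
rate_2021 = responded_2021 / invited_2021  # exactly 0.5, i.e., 50%
print(f"2019: {rate_2019:.0%}, 2021: {rate_2021:.0%}")
```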

Nineteen individuals responded to both surveys, collectively representing 13 of the study teams participating. However, the small number of consistent respondents across years prevents robust longitudinal analyses of statistical difference from being conducted at this time. Eleven individuals (about 18%) responded to the anonymous survey about financial management that was administered in 2022. These survey response rates range from sufficient to high, as specified by Daikeler and colleagues (2020), particularly considering the mitigating factors associated with distributing email invitations to online surveys [Reference Daikeler, Bošnjak and Lozar Manfreda30].

These respondents’ survey data were analyzed to identify statistically significant differences between academic and community partners in 2019 and 2021 and differences between respondents by year. Substantive differences were found in respondents’ perceptions of their teams’ 1) Background & Sustainability, 2) CE, 3) Communication, 4) Decision-making & Trust, 5) Team Finances, and 6) Impact. The following sections and tables present each of these sets of results in turn.

Notably, of the 195 t-tests performed for this study comparing community and academic partners’ perceptions of the functioning of their partnerships, only six statistically significant differences were found. Five of these differences related to Team Finances or the Sustainability of the partnership; one addressed a question assessing the quality of communication within partnerships.

Partnership Background

When asked to describe their partnership from a list of options, 77% of respondents to the 2019 survey described their collaboration as a CEnR partnership in which, “the community provides input, or consults, about critical aspects of the research process, such as research questions, project design, project objectives, data analysis, dissemination, translation of findings,” with the remainder (23%) indicating their research was also “placed somewhere in the local vicinity of the community where people are engaging within the context of the physical spaces of the community receiving the service” [Reference Key, Furr-Holden and Lewis5]. The partnerships represented by the respondents focused on clinical and health issues including aging populations, hunger, urban populations, men’s health, women’s health, mental health, poverty, youth and adolescent populations, social isolation, sexual and gender minorities, racism, and substance abuse.

A little under half (45%) of the survey respondents in 2021 were paid members of their partnerships, and a greater proportion of respondents (54%) to the 2019 survey were paid members. Half of the academic partners responding in 2019 reported that they had been in their partnership for more than 2 years, compared with only 19% of community partners, who largely reported being in their partnership for less than 2 years. Only 15% of the respondents to the 2019 survey indicated their partnership had been initiated by an academic partner with the rest reporting their partnerships were initiated by the community partner (27%), or jointly by community and academic partners (58%).

As shown in Table 2, although partners indicated that they were committed to sustaining their partnerships with no or low funding, their responses to the statement became less positive between 2019 and 2021 (µ = 4.2 in 2019, N = 17; µ = 6.3 in 2021, N = 19). They also remained satisfied with their team’s attention to financial sustainability and evaluation of mutually beneficial funding opportunities. Although a statistically significant difference was found between the academic and community partners responding to this question in 2021, their rates of commitment remained high on both iterations of the survey (see Table 2). A statistically significant difference was also found between the respondents to the 2019 and 2021 surveys: respondents in 2019 expressed a stronger commitment to sustaining their partnerships after their funding ended than respondents in 2021 (µ = 4.3 in 2019, N = 26; µ = 3.6 in 2021, N = 31, t (48) = −3.2, p = .003).

Table 2. Partnership sustainability*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

Principles of Engagement

Respondents were asked to indicate the extent to which their teams used eight well-established principles of CE [1]. Although the partners reported that many of the principles were not used as actively in 2021 as in 2019, their responses indicated that the teams mostly followed each of the principles (Table 3). The measure showing the greatest change in magnitude concerned partners’ belief that their partnership shares knowledge and findings with all members and involves them all equitably in the dissemination process, which rose from an average of 4.25 in 2019 (N = 26) to 4.35 in 2021 (N = 29). In addition, in both years participants reported their partnership consistently facilitates equity in all phases of its research, balances research and social benefits, and fits local community cultures and norms.

Table 3. Principles of engagement*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

Respondents in both years reported that their partnership consistently practiced co-learning, although this question returned the greatest decline of all eight principles of CE measured between 2019 and 2021. The participants further indicated that their partnerships build on community resources and strengths. They also view CEnR as a long-term commitment. However, both measures also declined over the 2 years.

Importantly, in both years, the respondents confirmed that their partnerships typically followed all eight principles of CE measured in this study. These consistently positive perceptions of CE are to be expected with the respondents representing a spectrum of CEnR partnerships. However, the trends among these measures of engagement are mixed (Table 3).


In contrast to their perceptions of CE, respondents reported holding less positive perceptions of their team’s communication on all but one of the associated measures in the survey. Although the opinions held by the responding partners did not become negative throughout, their ratings of their partnership’s communication declined on almost every measure from 2019 to 2021 (Table 4). The greatest declines were reported in teams’ ability to work together to resolve disagreements and in their ability to reach consensus on the strategies used to pursue priorities. Respondents’ positive opinions about the occurrence of constructive arguments on their team slightly declined during this time as well, particularly regarding the occurrence of disrespectful remarks made during team meetings.

Table 4. Communication*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

Importantly, a statistically significant difference was found between the respondents in 2019 and 2021 regarding their agreement on the strategies the partnership should use in pursuing priorities. In 2021, the respondents were less likely to indicate that there was agreement among their team about the strategies that should be used, compared to 2019 (µ = 4.6 in 2019, N = 25; µ = 4.1 in 2021, N = 28, t (48) = −2.1, p = .040).

Decision-Making and Trust

The partners’ perceptions of the decision-making process in their partnership remained consistently positive or became slightly less positive over time (Table 5). In both 2019 and 2021, respondents reported often feeling comfortable with the way decisions were being made and supportive of the decisions themselves. Throughout this time, they rarely if ever reported feeling pressured to go along with decisions with which they did not agree or being left out of decision-making. However, they did feel they had been left out of decisions more so in 2021 than they reported in 2019.

Table 5. Decision-making*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

In both 2019 and in 2021, the respondents rated the level of trust that they had at the start of the partnership, at the current moment, and the level of trust that they anticipated achieving in the future. For each of these questions, the scale ranged from 1 to 7, with definitions of each corresponding level of trust provided. In both years, they tended to report having a consistent level of trust in their team (µ = 6.4 in 2019, N = 24; µ = 6.3 in 2021, N = 29). The type of trust respondents had through this period was defined as “proxy trust,” a state in which all members of this partnership are trusted, even if only by proxy due to another team member being viewed as trustworthy.

Respondents’ assessment of the level of trust they had at the beginning of their partnership increased over time (µ = 4.3 in 2019, N = 24; µ = 5.0 in 2021, N = 29) while the level of trust they anticipated achieving in the future somewhat declined (µ = 6.8 in 2019, N = 24; µ = 6.5 in 2021, N = 29). Most importantly, respondents consistently anticipated that their partnership would reach new levels of “critical reflective trust,” in which mistakes and other issues resulting from differences, including differences in culture and power, could be talked about and resolved in good ways.

Team Finances

Respondents felt less positively about their study teams’ use of financial resources in 2021 than in 2019. But as shown in Table 6, they consistently reported their partnership was making fair decisions about how its resources were used, and that these resources were distributed in a fair and equitable manner. While they also agreed that they had adequate knowledge of their research budget during this period, they wanted to have more input into their teams’ allocation of resources in 2021 than in 2019. However, these respondents also reported their teams made good use of key resources and time during this period. While their perceptions of how well the partnership used its financial resources and time both declined slightly during this period, their opinion of their teams’ use of in-kind resources improved.

Table 6. Team finances*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

All survey participants were also asked to rate three aspects of their engagement with study team financial management on a sliding 10-point scale. These survey questions were created by the authors as no such validated measure was found to be available. Respondents reported having less opportunity to be involved in writing their team’s research budgets in 2021 compared to 2019 (µ = 8.5 in 2019, N = 24; µ = 7.6 in 2021, N = 28. 0 = “no opportunity,” 10 = “lots of opportunity”). They reported that the team’s budget was less equitably distributed in 2021 than they had reported it was in 2019 (µ = 8.3 in 2019, N = 25; µ = 8.1 in 2021, N = 30. 0 = “not equitable,” 10 = “very equitable distributions”). Finally, the respondents reported that they had been provided with all the resources they needed to accomplish their work on the study to a lesser extent in 2021 compared to 2019 (µ = 8.5 in 2019, N = 26; µ = 8.2 in 2021, N = 31. 0 = “no resources,” 10 = “all of the resources needed”).

Statistically significant differences were found in respondents’ opinions of their team’s finances in 2019 compared to 2021 and between community partners and academic partners’ responses in 2021. Both community and academic partners were less likely to report that their financial resources had been distributed fairly and equitably in 2021 than they had been in 2019 (µ = 4.5 in 2019, N = 25; µ = 4.1 in 2021, N = 28, t (48) = −2.1, p = .040). By 2021, community partners were less likely than academic partners to agree that their teams’ distribution of resources was fair and equitable (Community partners µ = 3.6, N = 15; Academic partners µ = 4.3, N = 15, t (25) = 2.1, p = .035) or to report having adequate knowledge of their research budget (Community partners µ = 3.5, N = 15; Academic partners µ = 4.5, N = 15, t (20) = 2.6, p = .017).


The respondents consistently confirmed that their partnerships were impactful on a range of measures used to assess the contributions made by the partnership to community health as well as the work of all the partners involved (Table 7). They consistently indicated that their partnerships impacted the overall health of the community and its overall environment. They similarly reported that their partnerships resulted in sustained collaborations between agencies and that they improved the access, delivery, and quality of health services in the community. However, they reported that their partnerships had a lower impact on policy change and on the acquisition of additional financial support or recognition from policymakers and officials.

Table 7. Impact*

*Minor changes to the wording of some cited questions were made, as described in the methods section concerning the concept mapping process.

These respondents also increasingly reported that their partnerships benefited the work and abilities of their teams. They indicated that their partnerships had enhanced their own and their partners’ reputations and that the expertise of their partners was increasingly utilized between 2019 and 2021. However, they also reported that the utilization of their personal expertise declined during this period. Although they reported that their partnerships had a small to moderate impact on their personal ability to acquire additional financial support for their research, the respondents felt that their partners had a greater ability to acquire further financial support in 2019 than in 2021.

Partnership Financial Management Practices

The results of the 2019 and 2021 surveys were aggregated by the authors and sent to each member of the funded partnerships for the purposes of formative evaluation. Each member of the funded teams was then invited to provide anonymous feedback about the financial management practices of their partnership through open-ended questions. The anonymous survey yielded dozens of recommendations about financial management practices from community and academic partners.

When asked to share effective approaches to financial management used by their own teams, eight individuals emphasized the value of transparency in discussions about financial resources. One individual noted that their team members had sought “100% transparency with [the] budget and other resource allocation,” and another reported that their team “maintained transparent communication through all phases of the project.” A few respondents also emphasized the importance of consistently allocating fair compensation for community partners, including for the time taken to participate in team meetings.

Other best practices in financial management recommended by the respondents included the direct involvement of community members in all financial planning meetings and processes. Considered overall, these recommendations ranged from ensuring that community members are present in discussions about resource allocation to the use of participatory budgeting and “community led” budget formation. These results were used by the authors to identify potential best practices for CEnR teams.


The results of this study demonstrate how the use of a CBPR approach enabled the evaluation of CEnR partnerships supported by MICHR. New and existing measures regarding the management of study team finances were used to evaluate these partnerships, and significant differences were found in participants’ responses to some key questions. The funded teams used the results for the purpose of formative evaluation and, in response, provided practical, evidence-based recommendations about the financial management of CEnR partnerships. These results also demonstrate how partnerships participating in this study believed they were making a positive impact on community health and health care.

The emphasis that this paper places on Flint-focused partnerships is not a limitation of this work. While the CEnR partnerships and survey questions used for this study are clearly locally relevant, the approach and findings have broader implications for the field of CEnR, especially in the context of CTSAs. This work makes two contributions to the field of translational science. First, the results demonstrate the feasibility of using a CBPR approach to evaluate the implementation and impact of health research funding mechanisms for CEnR partnerships [Reference Minkler and Wallerstein2]. The collaborative process used to develop, pilot, and apply the assessments is also a best practice that CTSA hubs can use to evaluate CEnR partnerships with CBPR methods [1].

Second, the results of this study suggest that the use of more participatory budgeting techniques [Reference Ganuza and Baiocchi31] merits study as a potential best practice for CEnR partnerships. Further research is needed to validate the measures of the Team Finances domain used in this study before they can be used to assess the dynamics of CEnR partnerships. Future studies could validate these measures by testing key hypotheses about their predictive validity, including the following:

H1: Partners’ positive valuations of Team Finances affect the sustainability of their partnerships.

H2: Partners’ positive valuations of Team Finances affect the cultivation of trust within their partnerships.

H3: Partners’ positive valuations of Team Finances contribute to partners’ capacity to conduct clinical and health research.

H4: Partners’ qualitative experiences of the financial management practices used by CEnR study teams will subsequently be reflected in their ratings on quantitative measures of Team Finances.


There are four primary limitations of this work. First, because some of the questions about Team Finances were novel, there is no existing record of their validity. Second, the small sample size of this study precluded the longitudinal analyses required to understand why individuals’ perceptions changed over time. Third, no qualitative or mixed-methods evaluation was conducted that could have revealed specific facilitators of, or challenges to, the financial management of these partnerships. The quantitative approach taken in this work focuses narrowly on measures of financial management for CEnR partnerships that can be validated but cannot easily illuminate the specific situations these partnerships encountered in the conduct of their work. Finally, this study was implemented amid nationwide social justice protests following the death of George Floyd, the global rise of the COVID-19 pandemic, and an ongoing water crisis, all of which profoundly affected the work of the Flint-focused BCRA-funded partnerships. Secondary limitations of this study include low response rates and the possibility that participants may have misinterpreted the survey directions by responding on behalf of their team members instead of themselves as individuals.


CTSA hubs can accelerate the process of translating scientific discoveries into improvements in health through CE [1,Reference Minkler and Wallerstein2]. These hubs can have a clear impact on the health and healthcare of communities by supporting studies across the full spectrum of community-engaged clinical and health research [Reference Marsh, Kappelman and Kost3,Reference Price-Haywood, Burton, Fort and Seoane4]. The evaluation of these partnerships is essential to the advancement of translational science [Reference Vitale, Newton and Abraido-Lanza7,Reference Westfall, Ingram and Navarro8].

The authors chose to use a CBPR approach because it enables the exploration of the dynamics of partnerships that are most relevant to the communities participating in the research [Reference Wallerstein, Duran, Minkler and Wallerstein32,Reference Wallerstein and Duran33]. The long-standing need for empirical research about the work of community-engaged health research partnerships is well-recognized [1,Reference Minkler and Wallerstein2]. Clinical and translational scientists should build on research on the financial management practices of CEnR partnerships [1,Reference Ganuza and Baiocchi31,Reference Kim, Cheney and Black34Reference Bartocci, Grossi, Mauro and Ebdon37]. CTSA hubs interested in utilizing a CBPR approach to evaluating CEnR partnerships must carefully consider how their local community and institutional contexts compare to those presented here before using CBPR approaches in similar ways for the partnerships they fund and support.

Studies show that the use of participatory budgeting practices can affect the allocation of public finances, and further that the use of participatory budgeting in health programs can impact measures of community and patient well-being [Reference Marsh, Kappelman and Kost3Reference Touchton and Wampler40]. More research is needed to better understand the relationships linking the financial management of CEnR partnerships and their impact on community health. This need may be particularly felt in communities like Flint, Michigan, but it may also be relevant to other communities that are facing ongoing and compounding public health crises [Reference Carrera, Key and Bailey41].


Dr George Mashour, Dr Julie Lumeng, and Dr Erica E. Marsh provided invaluable support and guidance for this work. This work was supported by the National Center for Advancing Translational Sciences, National Institutes of Health, through award numbers UL1TR002240 and UM1TR004404.


The authors have no competing interests to declare.


1. Clinical and Translational Science Awards Consortium Community Engagement Key Function Committee Task Force on the Principles of Community Engagement. Principles of Community Engagement. 2nd ed. Published June 2011. Accessed January 26, 2017.
2. Minkler M, Wallerstein N, eds. Community-Based Participatory Research for Health: From Process to Outcomes. Hoboken, NJ: John Wiley & Sons; 2011.
3. Marsh EE, Kappelman MD, Kost RG, et al. Community engagement during COVID: a field report from seven CTSAs. J Clin Transl Sci. 2021;5:e104, 1–7. doi: 10.1017/cts.2021.785.
4. Price-Haywood EG, Burton J, Fort D, Seoane L. Hospitalization and mortality among black patients and white patients with Covid-19. N Engl J Med. 2020;382(26):2534–2543.
5. Key KD, Furr-Holden D, Lewis EY, et al. The continuum of community engagement in research: a roadmap for understanding and assessing progress. Prog Community Health Partnersh. 2019;13(4):427–434.
6. Michener L, Scutchfield FD, Aguilar-Gaxiola S, et al. Clinical and translational science awards and community engagement: now is the time to mainstream prevention into the nation’s health research agenda. Am J Prev Med. 2009;37(5):464–467.
7. Vitale K, Newton GL, Abraido-Lanza AF, et al. Community engagement in academic health centers: a model for capturing and advancing our successes. J Community Engagem Scholarsh. 2018;10(1):81–90.
8. Westfall JM, Ingram B, Navarro D, et al. Engaging communities in education and research: PBRNs, AHEC, and CTSA. Clin Transl Sci. 2012;5(3):250–258.
9. Kane M, Trochim W. Concept Mapping for Planning and Evaluation. Thousand Oaks, CA: Sage Publications; 2007.
10. Schmitz CC, Johnson CM, Himmelman AT, Wunderlich M. Cluster evaluation of the community-based public health initiative: 1996 annual report and final summary. Published Oct 1996. Accessed September 26, 2016.
11. Community Based Public Health Caucus of the American Public Health Association. American Public Health Association Communities website. Accessed September 26, 2022.
12. Creswell JW, Clark VP. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications; 2007.
13. Key KD. Expanding ethics review processes to include community-level protections: a case study from Flint, Michigan. AMA J Ethics. 2017;19(10):989–998.
14. Oetzel J, Zhou C, Duran B, et al. Establishing the psychometric properties of constructs in a community-based participatory research conceptual model. Am J Health Promot. 2015;29(5):188–202.
15. Schulz AJ, Israel BA, Lantz P. Instrument for evaluating dimensions of group dynamics within community-based participatory research partnerships. Eval Program Plann. 2003;26(3):249–262.
16. Israel BA, Schulz AJ, Parker EA, et al. Critical issues in developing and following CBPR principles. In: Minkler M, Wallerstein N, eds. Community-Based Participatory Research for Health: From Process to Outcomes. 2nd ed. San Francisco, CA: Jossey-Bass; 2008:47–66.
17. Braun KL, Nguyen TT, Tanjasiri SP, et al. Operationalization of community-based participatory research principles: assessment of the National Cancer Institute’s community network programs. Am J Public Health. 2012;102(6):1195–1203.
18. Allies Against Asthma, University of Michigan Center for Managing Chronic Disease. Allies Against Asthma Evaluation Instrument website. Published 2003. Accessed January 26, 2019.
19. Khodyakov D, Stockdale S, Jones F, et al. An exploration of the effect of community engagement in research on perceived outcomes of partnered mental health services projects. Soc Ment Health. 2011;1(3):185–199.
20. Lucero J, Wallerstein N, Duran B, et al. Development of a mixed methods investigation of process and outcomes of community-based participatory research. J Mixed Methods Res. 2018;12(1):1–20.
21. Becker A, Israel BA, Gustat J, Reyes A, Allen A. Strategies and techniques for effective group process in CBPR partnerships. In: Israel B, Eng E, Schulz A, Parker E, eds. Methods for Community-Based Participatory Research for Health. San Francisco, CA: Jossey-Bass; 2013:69–96.
22. Cheezum R, Coombe C, Israel B. Building community capacity to advocate for policy change: an outcome evaluation of the neighborhoods working in partnership project in Detroit. J Community Pract. 2013;21(3):228–247.
23. Figueroa ME, Kincaid DL, Rani M, Lewis G. Communication for social change: an integrated model for measuring the process and its outcomes. Communication for Social Change Working Paper Series No. 1. The Rockefeller Foundation; 2002. Accessed January 26, 2019.
24. Israel BA, Coombe C, Cheezum R, et al. Community-based participatory research: a capacity-building approach for policy advocacy aimed at eliminating health disparities. Am J Public Health. 2010;100(11):2094–2102.
25. Israel BA, Cummings KM, Dignan MB, et al. Evaluation of health education programs: current assessment and future directions. Health Educ Q. 1995;22(2):364–389.
26. Khodyakov D, Stockdale S, Jones A, Mango J, Jones F, Lizaola E. On measuring community participation in research. Health Educ Behav. 2013;40(3):346–354.
27. Oetzel J, Duran B, Sussman A, et al. Evaluation of CBPR partnerships and outcomes: lessons and tools from the research for improved health study. In: Wallerstein N, Duran B, Oetzel J, Minkler M, eds. Community-Based Participatory Research for Health: Advancing Social and Health Equity. 3rd ed. Hoboken, NJ: John Wiley & Sons; 2018:237–249.
28. Oetzel J, Wallerstein N, Duran B, et al. Impact of participatory health research: a test of the community-based participatory research conceptual model. Biomed Res Int. 2018;2018:1–12. doi: 10.1155/2018/7281405.
29. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. Am J Public Health. 2010;100(S1):40–46.
30. Daikeler J, Bošnjak M, Lozar Manfreda K. Web versus other survey modes: an updated and extended meta-analysis comparing response rates. J Surv Stat Methodol. 2020;8(3):513–539. doi: 10.1093/jssam/smz008.
31. Ganuza E, Baiocchi G. The power of ambiguity: how participatory budgeting travels the globe. J Deliberative Democr. 2012;8(2):1–14.
32. Wallerstein N, Duran B. The theoretical, historical, and practice roots of CBPR. In: Minkler M, Wallerstein N, eds. Community-Based Participatory Research for Health: From Process to Outcomes. San Francisco, CA: Jossey-Bass; 2008:25–46.
33. Wallerstein N, Duran B. Using community-based participatory research to address health disparities. Health Promot Pract. 2006;7(3):312–323.
34. Kim MM, Cheney A, Black A, et al. Trust in community-engaged research partnerships: a methodological overview of designing a multisite Clinical and Translational Science Awards (CTSA) initiative. Eval Health Prof. 2020;43(3):180–192.
35. Freeman E, Seifer SD, Stupak M, Martinez LS. Community engagement in the CTSA program: stakeholder responses from a national Delphi process. Clin Transl Sci. 2014;7(3):191–195.
36. Carter-Edwards L, Grewe ME, Fair AM, et al. Recognizing cross-institutional fiscal and administrative barriers and facilitators to conducting community-engaged clinical and translational research. Acad Med. 2021;96(4):558–567.
37. Bartocci L, Grossi G, Mauro SG, Ebdon C. The journey of participatory budgeting: a systematic literature review and future research directions. Int Rev Adm Sci. 2022:1–18. doi: 10.1177/00208523221078938.
38. Shybalkina I, Bifulco R. Does participatory budgeting change the share of public funding to low income neighborhoods? Public Budg Finance. 2019;39(1):45–66.
39. Campbell M, Escobar O, Fenton C, Craig P. The impact of participatory budgeting on health and wellbeing: a scoping review of evaluations. BMC Public Health. 2018;18(1):1–11.
40. Touchton M, Wampler B. Public engagement for public health: participatory budgeting, targeted social programmes, and infant mortality in Brazil. Dev Pract. 2020;30(5):681–686.
41. Carrera JS, Key K, Bailey S, et al. Community science as a pathway for resilience in response to a public health crisis in Flint, Michigan. Soc Sci. 2019;8(3):1–25.