Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
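The threshold logic described above can be illustrated with a toy Monte Carlo probabilistic sensitivity analysis. Everything in this sketch is hypothetical: the cost structure, parameter names, and sampling ranges are illustrative stand-ins, not the study's Markov model or its published inputs.

```python
import random

def data_costs(n_patients, elems_per_patient, registry_frac,
               secs_per_field, coordinator_rate_per_hr, registry_setup_cost):
    """Illustrative data-handling costs per trial design (all inputs hypothetical).

    Standard trial: every data element is manually abstracted.
    Registry trial: a fraction of elements comes from the registry for a
    fixed setup/linkage cost; the remainder is still abstracted by hand.
    """
    def abstraction_cost(n_fields):
        # coordinator time (hours) x hourly rate
        return n_fields * secs_per_field / 3600.0 * coordinator_rate_per_hr

    total_fields = n_patients * elems_per_patient
    standard = abstraction_cost(total_fields)
    registry = registry_setup_cost + abstraction_cost(total_fields * (1.0 - registry_frac))
    return standard, registry

def psa(n_sims=10_000, seed=1):
    """Probabilistic sensitivity analysis: sample parameters from broad
    (hypothetical) ranges and report how often the registry design is cheaper."""
    rng = random.Random(seed)
    registry_cheaper = 0
    for _ in range(n_sims):
        standard, registry = data_costs(
            n_patients=rng.randint(100, 5000),
            elems_per_patient=rng.randint(20, 200),
            registry_frac=rng.uniform(0.3, 0.9),
            secs_per_field=rng.uniform(3.0, 60.0),
            coordinator_rate_per_hr=rng.uniform(25.0, 60.0),
            registry_setup_cost=rng.uniform(5_000.0, 50_000.0),
        )
        if registry < standard:
            registry_cheaper += 1
    return registry_cheaper / n_sims
```

One-way and two-way sensitivity analyses follow the same pattern, holding all but one or two parameters fixed while sweeping the rest over their ranges.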
Neurocognitive and functional neuroimaging studies point to frontal lobe abnormalities in schizophrenia. Molecular and behavioural genetic studies suggest that the frontal lobe is under significant genetic influence. We carried out structural magnetic resonance imaging (MRI) of the frontal lobe in monozygotic (MZ) twins concordant or discordant for schizophrenia and healthy MZ control twins.
The sample comprised 21 concordant pairs, 17 discordant affected and 18 discordant unaffected twins from 19 discordant pairs, and 27 control pairs. Groups were matched on sociodemographic variables. Patient groups (concordant, discordant affected) did not differ on clinical variables. Volumes of superior, middle, inferior and orbital frontal gyri were calculated using the Cavalieri principle on the basis of manual tracing of anatomic boundaries. Group differences were investigated covarying for whole-brain volume, gender and age.
Results for superior frontal gyrus showed that twins with schizophrenia (i.e. concordant twins and discordant affected twins) had reduced volume compared to twins without schizophrenia (i.e. discordant unaffected and control twins), indicating an effect of illness. For middle and orbital frontal gyrus, concordant (but not discordant affected) twins differed from non-schizophrenic twins. There were no group differences in inferior frontal gyrus volume.
These findings suggest that volume reductions in the superior frontal gyrus are associated with a diagnosis of schizophrenia (in the presence or absence of a co-twin with schizophrenia). On the other hand, volume reductions in middle and orbital frontal gyri are seen only in concordant pairs, perhaps reflecting the increased genetic vulnerability in this group.
Schizophrenia is a devastating mental disorder with diverse symptom dimensions, including delusions, hallucinations, affective symptoms and alterations in cognition. Declarative memory deficits are among the most important factors leading to poor functional outcomes in this disorder. Recently it has been proposed that sleep disturbances in patients with schizophrenia might contribute to these memory impairments (Manoach et al. 2009, Ferrarelli et al. 2010, Lu and Göder 2012). In young healthy subjects, declarative memory consolidation was shown to be enhanced by inducing slow oscillation-like potential fields during sleep (Marshall et al. 2006). In the present study, slow oscillatory transcranial direct current stimulation (so-tDCS) was applied to 14 patients with schizophrenia on stable medication with a mean age of 33 years. The main effects of so-tDCS in comparison to sham stimulation were an enhancement in declarative memory retention and an increase in mood after sleep. In conclusion, so-tDCS offers an interesting approach for studying the relationship of sleep and memory in psychiatric disorders and could possibly improve disturbed memory processing in patients with schizophrenia.
Traumatic brain injuries (TBI) may lead to persistent depression symptoms. We conducted several pilot studies to examine the efficacy of mindfulness-based interventions to deal with this issue; all showed strong effect sizes. The logical next step was to conduct a randomized controlled trial (RCT).
We sought to determine the efficacy of mindfulness-based cognitive therapy for people with depression symptoms post-TBI (MBCT-TBI).
Using a multi-site RCT design, participants (mean age = 47) were randomized to intervention or control arms. Treatment participants received a group-based, 10-week intervention; control participants waited. Outcome measures, administered pre- and post-intervention, and after three months, included: Beck Depression Inventory-II (BDI-II), Patient Health Questionnaire-9 (PHQ-9), and Symptom Checklist-90-Revised (SCL-90-R). The Philadelphia Mindfulness Scale (PHLMS) captured present moment awareness and acceptance.
BDI-II scores decreased from 25.47 to 18.84 in the treatment group, while they remained relatively stable in the control group (27.13 to 25.00, respectively; p = .029). We did not find statistically significant differences on the PHQ-9 and SCL-90-R post-treatment. However, after three months, all scores were statistically significantly lower than at baseline (ps < .01). Increases in mindfulness were associated with decreases in BDI-II scores (r = -.401, p = .025).
MBCT-TBI may alleviate depression symptoms up to three months post-intervention. Greater mindfulness may have contributed to the reduction in depression symptoms although the association does not confirm causality. More work is required to replicate these findings, identify subgroups that may better respond to the intervention, and refine the intervention to maximize its effectiveness.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
To evaluate the association between novel pre- and post-operative biomarker levels and 30-day unplanned readmission or mortality after paediatric congenital heart surgery.
Children aged 18 years or younger undergoing congenital heart surgery (n = 162) at Johns Hopkins Hospital from 2010 to 2014 were enrolled in the prospective cohort. Collected novel pre- and post-operative biomarkers included soluble suppression of tumorigenicity 2, galectin-3, N-terminal prohormone of brain natriuretic peptide, and glial fibrillary acidic protein. A model based on clinical variables from the Society of Thoracic Surgery database was developed and evaluated against two augmented models.
Unplanned readmission or mortality within 30 days of cardiac surgery occurred among 21 (13%) children. The clinical model augmented with pre-operative biomarkers demonstrated a statistically significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.754 (95% confidence interval: 0.65–0.86) compared to 0.617 (95% confidence interval: 0.47–0.76; p-value: 0.012). The clinical model augmented with pre- and post-operative biomarkers also demonstrated a significant improvement over the clinical model alone, with an area under the receiver-operating characteristic curve of 0.802 (95% confidence interval: 0.72–0.89; p-value: 0.003).
Novel biomarkers add significant predictive value when assessing the likelihood of unplanned readmission or mortality after paediatric congenital heart surgery. Further exploration of the utility of these novel biomarkers during the pre- or post-operative period to identify early risk of mortality or readmission will aid in determining the clinical utility and application of these biomarkers into routine risk assessment.
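The discrimination metric used in model comparisons like the one above, the area under the receiver-operating characteristic curve, has a simple rank-based interpretation: it is the probability that a randomly chosen case receives a higher model score than a randomly chosen control. A minimal sketch with made-up scores (not the study's data):

```python
def auc(case_scores, control_scores):
    """Empirical AUC: fraction of case-control pairs in which the case
    outscores the control, counting ties as half a win (Mann-Whitney form)."""
    wins = 0.0
    for case in case_scores:
        for ctrl in control_scores:
            if case > ctrl:
                wins += 1.0
            elif case == ctrl:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical predicted risks: perfectly separated scores give AUC = 1.0,
# indistinguishable scores give AUC = 0.5 (no better than chance).
print(auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.3]))  # → 1.0
print(auc([0.5], [0.5]))                       # → 0.5
```

An AUC of 0.754 therefore means the augmented model ranks a true readmission/mortality case above a non-case about three times out of four.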
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in doing so in the rumen. However, all in vivo comparisons to date have compared RC with other forages, typically with lower levels of PPO, which introduces confounding factors when attributing the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) in RC silage to PPO. This study compared two RC silages that had contrasting PPO activities when ensiled (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow in six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance in six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. In the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system, with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than PRG and no difference between RC+ and RC−, even when reported on an N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, but with no difference between RC+ and RC−. The N balance trial showed greater retention of N on RC+ than RC−; however, this response is likely related to the difference in N intake rather than any PPO-driven protection.
The lack of difference between RC silages, despite contrasting levels of PPO, may reflect a similar level of protein-bound-phenol complexing determined in each RC silage. Previously this complexing has been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
The effect of transportation and lairage on the faecal shedding and post-slaughter contamination of carcasses with Escherichia coli O157 and O26 in young calves (4–7-day-old) was assessed in a cohort study at a regional calf-processing plant in the North Island of New Zealand, following 60 calves as cohorts from six dairy farms to slaughter. Multiple samples from each animal at pre-slaughter (recto-anal mucosal swab) and carcass at post-slaughter (sponge swab) were collected and screened using real-time PCR and culture isolation methods for the presence of E. coli O157 and O26 (Shiga toxin-producing E. coli (STEC) and non-STEC). Genotype analysis of E. coli O157 and O26 isolates provided little evidence of faecal–oral transmission of infection between calves during transportation and lairage. Increased cross-contamination of hides and carcasses with E. coli O157 and O26 between co-transported calves was confirmed at pre-hide removal and post-evisceration stages but not at pre-boning (at the end of dressing prior to chilling), indicating that good hygiene practices and application of an approved intervention effectively controlled carcass contamination. This study was the first of its kind to assess the impact of transportation and lairage on the faecal carriage and post-harvest contamination of carcasses with E. coli O157 and O26 in very young calves.
Callery pear (Pyrus calleryana Decne.) was introduced to North America as an ornamental tree in the early 1900s. Due to widespread planting, P. calleryana has become common throughout the eastern United States and has invaded natural areas, especially disturbed areas. Prescribed fire is a common management technique in prairie ecosystems to mimic natural disturbances. We tested the effectiveness of prescribed fire as a control technique for P. calleryana in a managed prairie system. Fire top-killed all established P. calleryana individuals. However, these individuals responded to fire with 3 to 4 epicormic sprouts each. Similar sprouting behavior occurred in 2-yr-old seedlings. Exposed seeds, fruits, and 1-yr-old seedlings were killed by fire. Established P. calleryana were single-stemmed individuals before exposure to fire. After the prescribed fire, they all were multistemmed, which increased the potential flower-bearing stems within the prairie. We conclude that fire alone is not a suitable technique for managing P. calleryana invasion. Cut and herbicide application methods are labor-intensive. However, combining cut and spray methods with prescribed fire may be effective. Fire removes standing grass and forb biomass, leaving exposed P. calleryana stems, which would make locating individuals and directly applying herbicides easier.
Several studies have suggested that maternal lifestyle during pregnancy may influence long-term health of offspring by altering the offspring epigenome. Whether maternal leisure-time physical activity (LTPA) during pregnancy might have this effect is unknown. The purpose of this study was to determine the relationship between maternal LTPA during pregnancy and offspring DNA methylation. Participants were recruited from the Archive for Research on Child Health study. At enrollment, participants’ demographic information and self-reported LTPA during pregnancy were determined. High active participants (averaged 637.5 min per week of LTPA; n=14) were matched by age and race to low active participants (averaged 59.5 min per week LTPA; n=28). Blood spots were obtained at birth. Pyrosequencing was used to determine methylation levels of long interspersed nucleotide elements (LINE-1) (global methylation) and peroxisome proliferator-activated receptor-gamma (PPARγ), peroxisome proliferator-activated receptor-gamma coactivator (PGC1-α), insulin-like growth factor 2 (IGF2), pyruvate dehydrogenase kinase, isozyme 4 (PDK4) and transcription factor 7-like 2 (TCF7L2). We found no differences between offspring of high active and low active groups for LINE-1 methylation. The only differences in candidate gene methylation between groups were at two CpG sites in the P2 promoter of IGF2; the offspring of low active group had significantly higher DNA methylation (74.70±2.25% methylation for low active v. 72.83±2.85% methylation for high active; P=0.045). Our results suggest no effect of maternal LTPA on offspring global and candidate gene methylation, with the exception of IGF2. IGF2 has been previously associated with regulation of physical activity, suggesting a possible role of maternal LTPA on regulation of offspring physical activity.
Effective methods to increase awareness of preventable infectious diseases are key components of successful control programmes. Rabies is an example of a disease with significant impact, where public awareness is variable. A recent awareness campaign in a rabies endemic region of Azerbaijan provided a unique opportunity to assess the efficacy of such campaigns. A cluster cross-sectional survey concerning rabies was undertaken following the awareness campaign in 600 households in 38 randomly selected towns, in districts covered by the campaign and matched control regions. This survey demonstrated that the relatively simple awareness campaign was effective at improving knowledge of rabies symptoms and vaccination schedules. Crucially, those in the awareness campaign group were also 1·4 times more likely to report that they had vaccinated their pets, an essential component of human rabies prevention. In addition, low knowledge of appropriate post-exposure treatment and animal sources of rabies provide information useful for future public awareness campaigns in the region and other similar areas.
On August 25, 2017, Hurricane Harvey made landfall near Corpus Christi, Texas. The ensuing unprecedented flooding throughout the Texas coastal region affected millions of individuals.1 The statewide response in Texas included the sheltering of thousands of individuals at considerable distances from their homes. The Dallas area established large-scale general population sheltering as the number of evacuees to the area began to amass. Historically, the Dallas area is one familiar with “mega-sheltering,” beginning with the response to Hurricane Katrina in 2005.2 Through continued efforts and development, the Dallas area had been readying a plan for the largest general population shelter in Texas. (Disaster Med Public Health Preparedness. 2019;13:33–37)
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Paediatric hospital-associated venous thromboembolism is a leading quality and safety concern at children’s hospitals.
The aim of this study was to determine risk factors for hospital-associated venous thromboembolism in critically ill children following cardiothoracic surgery or therapeutic cardiac catheterisation.
We conducted a retrospective, case–control study of children admitted to the cardiovascular intensive care unit at Johns Hopkins All Children’s Hospital (St. Petersburg, Florida, United States of America) from 2006 to 2013. Hospital-associated venous thromboembolism cases were identified based on ICD-9 discharge codes and validated using radiological record review. We randomly selected two contemporaneous cardiovascular intensive care unit controls without hospital-associated venous thromboembolism for each hospital-associated venous thromboembolism case, and limited the study population to patients who had undergone cardiothoracic surgery or therapeutic cardiac catheterisation. Odds ratios and 95% confidence intervals for associations between putative risk factors and hospital-associated venous thromboembolism were determined using univariate and multivariate logistic regression.
Among 2718 admissions to the cardiovascular intensive care unit during the study period, 65 met the criteria for hospital-associated venous thromboembolism (occurrence rate, 2%). Restriction to cases and controls having undergone the procedures of interest yielded a final study population of 57 hospital-associated venous thromboembolism cases and 76 controls. In a multiple logistic regression model, major infection (odds ratio=5.77, 95% confidence interval=1.06–31.4), age ⩽1 year (odds ratio=6.75, 95% confidence interval=1.13–160), and central venous catheterisation (odds ratio=7.36, 95% confidence interval=1.13–47.8) were found to be statistically significant independent risk factors for hospital-associated venous thromboembolism in these children. Patients with all three factors had a markedly increased post-test probability of having hospital-associated venous thromboembolism.
Major infection, infancy, and central venous catheterisation are independent risk factors for hospital-associated venous thromboembolism in critically ill children following cardiothoracic surgery or cardiac catheter-based intervention, which, in combination, define a high-risk group for hospital-associated venous thromboembolism.
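The unadjusted odds ratios reported in case-control studies such as this one derive from 2×2 exposure tables, with confidence intervals computed on the log scale. A minimal sketch using hypothetical counts (not the study's actual data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and approximate 95% CI from a 2x2 table.

    a: exposed cases      b: unexposed cases
    c: exposed controls   d: unexposed controls

    The standard error of ln(OR) is sqrt(1/a + 1/b + 1/c + 1/d)
    (Woolf's method); the CI is exponentiated back from the log scale.
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical example: 10 of 57 cases exposed, 5 of 76 controls exposed.
or_, lo, hi = odds_ratio_ci(a=10, b=47, c=5, d=71)
```

The multivariate odds ratios in the abstract additionally adjust each factor for the others via logistic regression; the wide intervals reported (e.g. 1.13–160) reflect the small case counts in some strata.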
Developing countries are experiencing an increase in total demand for livestock commodities, as populations and per capita demands increase. Increased production is therefore required to meet this demand and maintain food security. Production increases will lead to proportionate increases in greenhouse gas (GHG) emissions unless offset by reductions in the emissions intensity (Ei) (i.e. the amount of GHG emitted per kg of commodity produced) of livestock production. It is therefore important to identify measures that can increase production whilst reducing Ei cost-effectively. This paper seeks to do this for smallholder agro-pastoral cattle systems in Senegal; ranging from low input to semi-intensified, they are representative of a large proportion of the national cattle production. Specifically, it identifies a shortlist of mitigation measures with potential for application to the various herd systems and estimates their GHG emissions abatement potential (using the Global Livestock Environmental Assessment Model) and cost-effectiveness. Limitations and future requirements are identified and discussed. This paper demonstrates that the Ei of meat and milk from livestock systems in a developing region can be reduced through measures that would also benefit food security, many of which are likely to be cost-beneficial. The ability to make such quantification can assist future sustainable development efforts.
To assess relationships between mothers’ feeding practices (food as a reward, food for emotion regulation, modelling of healthy eating) and mothers’ willingness to purchase child-marketed foods and fruits/vegetables (F&V) requested by their children during grocery co-shopping.
Cross-sectional. Mothers completed an online survey that included questions about feeding practices and willingness (i.e. intentions) to purchase child-requested foods during grocery co-shopping. Feeding practices scores were dichotomized at the median. Foods were grouped as nutrient-poor or nutrient-dense (F&V) based on national nutrition guidelines. Regression models compared mothers with above-the-median v. at-or-below-the-median feeding practices scores on their willingness to purchase child-requested food groupings, adjusting for demographic covariates.
Participants completed an online survey generated at a public university in the USA.
Mothers (n 318) of 2- to 7-year-old children.
Mothers who scored above-the-median on using food as a reward were more willing to purchase nutrient-poor foods (β=0·60, P<0·0001), mothers who scored above-the-median on use of food for emotion regulation were more willing to purchase nutrient-poor foods (β=0·29, P<0·0031) and mothers who scored above-the-median on modelling of healthy eating were more willing to purchase nutrient-dense foods (β=0·22, P<0·001) than were mothers with at-or-below-the-median scores, adjusting for demographic covariates.
Mothers who reported using food to control children’s behaviour were more willing to purchase child-requested, nutrient-poor foods. Parental feeding practices may facilitate or limit children’s foods requested in grocery stores. Parent–child food consumer behaviours should be investigated as a route that may contribute to children’s eating patterns.
To determine the scope, source, and mode of transmission of a multifacility outbreak of extensively drug-resistant (XDR) Acinetobacter baumannii.
SETTING AND PARTICIPANTS
Residents and patients in skilled nursing facilities, long-term acute-care hospital, and acute-care hospitals.
A case was defined as the incident isolate from clinical or surveillance cultures of XDR Acinetobacter baumannii resistant to imipenem or meropenem and nonsusceptible to all but 1 or 2 antibiotic classes in a patient in an Oregon healthcare facility during January 2012–December 2014. We queried clinical laboratories, reviewed medical records, oversaw patient and environmental surveillance surveys at 2 facilities, and recommended interventions. Pulsed-field gel electrophoresis (PFGE) and molecular analysis were performed.
We identified 21 cases, highly related by PFGE or healthcare facility exposure. Overall, 17 patients (81%) were admitted to long-term acute-care hospital A (n=8), skilled nursing facility A (n=8), or both (n=1) prior to XDR A. baumannii isolation. Interfacility communication of patient or resident XDR status was not performed during transfer between facilities. The rare plasmid-encoded carbapenemase gene blaOXA-237 was present in 16 outbreak isolates. Contact precautions, chlorhexidine baths, enhanced environmental cleaning, and interfacility communication were implemented for cases to halt transmission.
Interfacility transmission of XDR A. baumannii carrying the rare blaOXA-237 was facilitated by transfer of affected patients without communication to receiving facilities.
Salmonella is a leading cause of bacterial foodborne illness. We report the collaborative investigative efforts of US and Canadian public health officials during the 2013–2014 international outbreak of multiple Salmonella serotype infections linked to sprouted chia seed powder. The investigation included open-ended interviews of ill persons, traceback, product testing, facility inspections, and trace forward. Ninety-four persons infected with outbreak strains from 16 states and four provinces were identified; 21% were hospitalized and none died. Fifty-four (96%) of 56 persons who consumed chia seed powder reported 13 different brands that traced back to a single Canadian firm, distributed by four US and eight Canadian companies. Laboratory testing yielded outbreak strains from leftover and intact product. Contaminated product was recalled. Although chia seed powder is a novel outbreak vehicle, sprouted seeds are recognized as an important cause of foodborne illness; firms should follow available guidance to reduce the risk of bacterial contamination during sprouting.
Historically, community engagement (CE) in research has been implemented in the fields of public health, education and agricultural development. In recent years, international discussions on the ethical and practical goals of CE have been extended to human genomic research and biobanking, particularly in the African context. While there is some consensus on the goals and value of CE generally, questions remain about the effectiveness of CE practices and how to evaluate this. Under the auspices of the Human Heredity and Health in Africa Initiative (H3Africa), the H3Africa CE working group organized a workshop in Stellenbosch, South Africa in March 2016 to explore the extent to which communities should be involved in genomic research and biobanking and to examine various methods of evaluating the effectiveness of CE. In this paper, we present the key themes that emerged from the workshop and make a case for the rigorous application and evaluation of, and learning from, CE approaches that promote a more systematic process of engaging relevant communities. We highlight the key ways in which CE should be embedded into genomic research and biobanking projects.