After officer-involved shootings (OIS), rapid delivery of emergency medical care is critical but may be delayed due to scene safety concerns. The purpose of this study was to describe medical care rendered by law enforcement officers (LEOs) after lethal force incidents.
Methods:
Retrospective analysis of open-source video footage of OIS occurring from February 15, 2013 through December 31, 2020. Frequency and nature of care provided, time until LEO and Emergency Medical Services (EMS) care, and mortality outcomes were evaluated. The study was deemed exempt by the Mayo Clinic Institutional Review Board.
Results:
Three hundred forty-two (342) videos were included in the final analysis; LEOs rendered care in 172 (50.3%) incidents. Average elapsed time from time-of-injury (TOI) to LEO-provided care was 155.8 (SD = 198.8) seconds. Hemorrhage control was the most common intervention performed. An average of 214.2 seconds elapsed between LEO care and EMS arrival. No mortality difference was identified between LEO and EMS care (P = .1631). Subjects with truncal wounds were more likely to die than those with extremity wounds (P < .00001).
Conclusions:
LEOs rendered medical care in one-half of all OIS incidents, initiating care on average 3.5 minutes prior to EMS arrival. Although no significant mortality difference was noted for LEO versus EMS care, this finding must be interpreted cautiously, as specific interventions, such as extremity hemorrhage control, may have benefited select patients. Future studies are needed to determine optimal LEO care for these patients.
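As a rough illustration of the kind of mortality comparison reported above, the sketch below runs a chi-square test on a 2x2 contingency table with SciPy. The counts are invented placeholders; the study reports only the resulting P values.

```python
# Hypothetical mortality comparison between LEO-initiated and EMS-only
# care, as a chi-square test on a 2x2 table. Counts are invented for
# illustration; the abstract reports only P = .1631.
from scipy.stats import chi2_contingency

table = [
    [90, 82],  # LEO-provided care: died, survived (hypothetical)
    [75, 95],  # EMS-only care: died, survived (hypothetical)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```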
Multifetal pregnancies are at risk of adverse maternal, neonatal and long-term health outcomes, and gestational weight gain (GWG) is a potentially modifiable risk factor for several of these. However, studies assessing the associations of GWG with long-term health in twins are rare, and those that do often fail to account for gestational age. Since longer gestations are likely to lead to larger GWG and lower risk of adverse outcomes, adjusting for gestational age is necessary to better understand the association of GWG with twin health outcomes. We aimed to explore long-term associations of GWG-for-gestational-age with twin anthropometric measures. The Peri/Postnatal Epigenetic Twins Study (PETS) is a prospective cohort study, which recruited women pregnant with twins from 2007 to 2009. Twins were followed up at 18 months and 6 years of age. GWG-for-gestational-age z-scores were calculated from pre-pregnancy weight and weight at delivery. We fitted regression models to assess associations of GWG with twin weight, height and BMI at birth, 18 months, and 6 years. Of the 250 women in the PETS, 172 had GWG measured throughout pregnancy. Overall, higher GWG-for-gestational-age z-scores were associated with higher birthweight (β: 0.32 z-scores, 95% Confidence Interval (95% CI): 0.19, 0.45), BMI (β: 0.29 z-scores, 95% CI: 0.14, 0.43) and length (β: 0.27 z-scores, 95% CI: 0.09, 0.45). However, these associations were not observed at 18 months or 6 years of age. GWG was associated with twin length, weight and BMI at birth but not during childhood. Further research is needed to determine the long-term effects of GWG on twin health outcomes.
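A minimal sketch of the kind of regression described above: twin birthweight z-scores are regressed on GWG-for-gestational-age z-scores, with standard errors clustered by twin pair because co-twins share a pregnancy. The file name and column names (pair_id, gwg_z, birthweight_z) are assumptions for illustration, not the study's actual variables.

```python
# Regress birthweight z-scores on GWG-for-gestational-age z-scores,
# clustering standard errors by twin pair. Data file and column names
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pets_gwg.csv")  # hypothetical file: one row per twin

model = smf.ols("birthweight_z ~ gwg_z", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["pair_id"]}
)
print(model.summary())
```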
Although twins often participate in medical research, few clinical trials are conducted entirely in twin populations. The purpose of this review is to demonstrate the substantial benefits and address the key challenges of conducting clinical trials in twin populations, or ‘twin-only trials’. We consider the unique design, analysis, recruitment and ethical issues that arise in such trials. In particular, we describe the different approaches available for randomizing twin pairs, highlight the similarity or correlation that exists between outcomes of twins, and discuss the impact of this correlation on sample size calculations and statistical analysis methods for estimating treatment effects. We also consider the role of both monozygotic and dizygotic twins for studying variation in outcomes, the factors that may affect recruitment of twins, and the ethics of conducting trials entirely in twin populations. The advantages and disadvantages of conducting twin-only trials are also discussed. Finally, we recommend that twin-only trials should be considered more often.
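The impact of within-pair correlation on sample size can be made concrete with a small sketch. Under standard normal-approximation formulas, randomizing co-twins to opposite arms shrinks the required number of pairs by a factor of (1 − ρ) relative to the per-arm n of a parallel-group trial; the effect size, alpha, and power used below are illustrative assumptions.

```python
# Sample size for a twin-only trial versus an ordinary parallel-group
# trial, using normal-approximation formulas for a two-sided test of a
# mean difference delta with common SD sd.
from scipy.stats import norm

def n_per_arm_parallel(delta, sd, alpha=0.05, power=0.8):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z * sd / delta) ** 2

def n_pairs_within_pair(delta, sd, rho, alpha=0.05, power=0.8):
    # Co-twins randomized to opposite arms: Var(within-pair difference)
    # = 2 * sd^2 * (1 - rho), so required pairs shrink as rho grows.
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (1 - rho) * (z * sd / delta) ** 2

print(n_per_arm_parallel(delta=0.5, sd=1.0))            # ~63 per arm
print(n_pairs_within_pair(delta=0.5, sd=1.0, rho=0.6))  # ~25 pairs
```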
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
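A hedged sketch of the kind of post hoc Cox proportional hazards model described above, using the lifelines library. The file and column names are assumptions, with the outcome taken as time in days to the first seropositive test and censoring at the last study visit.

```python
# Cox proportional hazards model for time to seroconversion.
# Data file and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("employee_serosurvey.csv")  # hypothetical file
# Expected columns: time_days, seroconverted (0/1), direct_care (0/1),
# black_race (0/1), community_exposure (0/1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="seroconverted")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```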
Epigenetics is likely to play a role in mediating the effects of genes and environment on risk for many non-communicable diseases (NCDs). The Developmental Origins of Health and Disease (DOHaD) theory presents unique opportunities regarding the possibility of early-life interventions to alter the epigenetic makeup of an individual, thereby modifying their risk for a variety of NCDs. While it is important to determine how we can lower the risk of these NCDs, it is equally important to understand how the public’s knowledge and opinion of DOHaD and epigenetic concepts may influence their willingness to undertake such interventions for themselves and their children. In this review, we provide an overview of epigenetics, DOHaD, NCDs, and the links between them. We explore the issues surrounding the use of epigenetics to identify those at increased risk of NCDs, including the concept of predictive testing of children. We also outline what is currently understood about the public’s understanding and opinion of epigenetics, DOHaD, and their relation to NCDs. In doing so, we demonstrate that it is essential that future research explore the public’s awareness and understanding of epigenetics and epigenetic concepts. This will provide much-needed information to prepare health professionals for the introduction of epigenetic testing into future healthcare.
The field of epigenetics is currently one of the most rapidly expanding in biology and has resulted in increasing public interest in its applications to human health. Epigenetics provides a promising avenue for both targeted individual intervention and public health messaging. However, to develop effective strategies for engagement, it is important to understand the public’s understanding of the relevant concepts. While there has been some research exploring the public’s understanding of genetic and environmental susceptibility to disease, limited research exists on public opinion and understanding of epigenetics and epigenetic concepts. Using an online questionnaire, this study investigated the Australian public’s understanding, views, and opinions of epigenetics and related concepts, including the developmental origins of health and disease (DOHaD) and the first 1000 days. Over 600 questionnaires were completed, with 391 included in the analysis. The survey included questions on knowledge of epigenetics and perceptions of epigenetic concepts for self and for children. Data were analyzed using predominantly descriptive statistics, with free-text responses scored based on concordance with predetermined definitions. While participants’ recognition of epigenetic terms and phrases was high, their understanding was limited. The DOHaD theory was more accurately understood than the first 1000 days or epigenetics itself. Female participants without children were more likely to recognize the term epigenetics, and recognition also varied with age. This research provides a solid foundation for further detailed investigation of these themes, all of which will be important for informing future public health messages regarding epigenetic concepts.
Birthweight has been consistently related to risk of cardiometabolic disorders in later life. Twins are at higher risk of low birthweight than singletons, so understanding the links between birthweight and cardiometabolic health may be particularly important for twins. However, evidence for the association of birthweight with childhood markers of cardiometabolic health in twins is currently lacking. Previous studies have often failed to appropriately adjust for gestational age or fully implement twin regression models. Therefore, we aimed to evaluate the association of birthweight-for-gestational-age z-scores with childhood cardiometabolic health in twins, using within-between regression models. The Peri/Postnatal Epigenetic Twins Study is a Melbourne-based prospective cohort study of 250 twin pairs. Birthweight was recorded at delivery, and childhood anthropometric measures were taken at 18-month and 6-year follow-up visits. Associations of birthweight with markers of cardiometabolic health were assessed at the individual, between- and within-pair level using linear regression with generalised estimating equations. Birthweight-for-gestational-age z-scores were associated with height, weight and BMI at 18 months and 6 years, but not with blood pressure (twins-as-individuals SBP: β = 0.15, 95% CI: −0.81, 1.11; twins-as-individuals DBP: β = 0.22, 95% CI: −0.34, 0.77). We found little evidence to indicate that the within-between models improved on the twins-as-individuals models. Birthweight was associated with childhood anthropometric measures, but not blood pressure, after appropriately adjusting for gestational age. These associations were consistent across the within-between and twins-as-individuals models. After adjusting for gestational age, results from the twins-as-individuals models are consistent with singleton studies, so these results can be applied to the general population.
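A minimal sketch of the within-between decomposition with GEE, assuming a per-twin data file with hypothetical column names: each twin's birthweight z-score is split into the pair mean (the between-pair component) and the deviation from that mean (the within-pair component), and both terms are entered into a GEE model clustered by pair.

```python
# Within-between twin regression fitted with generalised estimating
# equations. Data file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("pets_twins.csv")  # hypothetical: one row per twin
df["bw_between"] = df.groupby("pair_id")["bw_z"].transform("mean")
df["bw_within"] = df["bw_z"] - df["bw_between"]

model = smf.gee(
    "bmi_6y ~ bw_between + bw_within",  # hypothetical outcome column
    groups="pair_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())
```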
To evaluate the hypothesis that a perinatal educational dietary intervention focused on ‘eating for the gut microbiota’ improves diet quality of pregnant women pre- and postnatally.
Design:
The Healthy Parents, Healthy Kids study is a prospectively registered randomised controlled trial designed to evaluate the efficacy of a dietary intervention in altering the maternal and infant gut microbiota and improving perinatal diet quality. Eligible pregnant women were randomised to receive dietary advice from their healthcare provider or to additionally receive a three-session dietary intervention. Dietary data were collected at gestation weeks 26, 31, 36 and postnatal week 4. Outcome measures were diet quality, dietary variety, prebiotic and probiotic food intakes, and energy, fibre, saturated fat and discretionary food intakes. Between-group differential changes from baseline before and after birth in these dietary measures were assessed using generalised estimating equations.
Setting:
Melbourne, Australia.
Participants:
Healthy pregnant women from gestation week 26.
Results:
Forty-five women were randomised (twenty-two control, twenty-three intervention). Compared with the control group, the intervention group improved diet quality prior to birth (5·66 (95 % CI 1·65, 9·67), Cohen’s d: 0·82 (se 0·33)). The intervention also improved dietary variety (1·05 (95 % CI 0·17, 1·94), d: 0·66 (se 0·32)) and increased intakes of prebiotic (0·8 (95 % CI 0·27, 1·33), d: 0·91 (se 0·33)) and probiotic foods (1·05 (95 % CI 0·57, 1·53), d: 1·3 (se 0·35)) over the whole study period compared with the control group.
Conclusion:
A dietary intervention focused on ‘eating for the gut microbiota’ can improve aspects of perinatal diet quality during and after pregnancy.
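For readers unfamiliar with the effect sizes quoted above, the sketch below computes Cohen's d from group summary statistics, together with its approximate standard error. The means and standard deviations are illustrative placeholders chosen only to roughly reproduce the reported diet-quality effect; they are not the trial's data.

```python
# Cohen's d from group summaries, with the usual large-sample
# approximation for its standard error. Inputs are hypothetical.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, se

# Placeholder group means/SDs sized like the trial's arms (23 vs. 22).
d, se = cohens_d(m1=68.0, s1=7.0, n1=23, m2=62.3, s2=7.0, n2=22)
print(f"d = {d:.2f} (SE {se:.2f})")  # ~0.81 (SE 0.31)
```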
Brain imaging studies have shown altered amygdala activity during emotion processing in children and adolescents with oppositional defiant disorder (ODD) and conduct disorder (CD) compared to typically developing children and adolescents (TD). Here we aimed to assess whether aggression-related subtypes (reactive and proactive aggression) and callous-unemotional (CU) traits predicted variation in amygdala activity and skin conductance (SC) response during emotion processing.
Methods
We included 177 participants (n = 108 cases with disruptive behaviour and/or ODD/CD and n = 69 TD), aged 8–18 years, across nine sites in Europe, as part of the EU Aggressotype and MATRICS projects. All participants performed an emotional face-matching functional magnetic resonance imaging task.
Results
Differences between cases and TD in affective processing, as well as the specificity of activation patterns for aggression subtypes and CU traits, were assessed. Simultaneous SC recordings were acquired in a subsample (n = 63). Cases showed higher amygdala activity than TD in response to negative faces (fearful and angry) v. shapes. Subtyping cases according to aggression-related subtypes did not significantly influence amygdala activity, while stratification based on CU traits was more sensitive and revealed decreased amygdala activity in the high-CU group. SC responses were significantly lower in cases and negatively correlated with CU traits and with reactive and proactive aggression.
Conclusions
Our results showed differences in amygdala activity and SC responses to emotional faces between cases with ODD/CD and TD, with CU traits moderating both central (amygdala) and peripheral (SC) responses. These insights into subtype- and trait-specific aggression could be used to improve diagnostics and personalize treatment.
Neurodevelopment is sensitive to genetic and pre/postnatal environmental influences. These effects are likely mediated by epigenetic factors, yet current knowledge is limited. Longitudinal twin studies can delineate the link between genetic and environmental factors, epigenetic state at birth and neurodevelopment later in childhood. Building upon the Peri/Postnatal Epigenetic Twins Study (PETS), which has followed twins from gestation to 6 years of age, here we describe the PETS 11-year follow-up, in which we will use neuroimaging and cognitive testing to examine the relationship between early-life environment, epigenetics and neurocognitive outcomes in mid-childhood. Using a within-pair twin model, the primary aims are to (1) identify early-life epigenetic correlates of neurocognitive outcomes; (2) determine the developmental stability of epigenetic effects and (3) identify modifiable environmental risk factors. Secondary aims are to identify factors influencing the gut microbiota between 6 and 11 years of age and to investigate links between the gut microbiota and neurodevelopmental outcomes in mid-childhood. Approximately 210 twin pairs will undergo an assessment at 11 years of age. This includes a direct child cognitive assessment, multimodal magnetic resonance imaging, biological sampling, anthropometric measurements and a range of questionnaires on health and development, behavior, dietary habits and sleeping patterns. Data from complementary sources, including the National Assessment Program — Literacy and Numeracy and the Australian Early Development Census, will also be sought. Following on from our previous focus on relationships between growth, cardiovascular health and oral health, this next phase of PETS will significantly advance our understanding of the environmental interactions that shape the developing brain.
In 1984, Hrubec and Robinette published what was arguably the first review of the role of twins in medical research. The authors acknowledged a growing distinction between two categories of twin studies: those aimed at assessing genetic contributions to disease and those aimed at assessing environmental contributions while controlling for genetic variation. They concluded with a brief section on recently founded twin registries that had begun to provide unprecedented access to twins for medical research. Here we offer an overview of the twin research that, in our estimation, best represents how the field has progressed since 1984. We start by summarizing what we know about twinning. We then focus on the value of twin study designs for differentiating between genetic and environmental influences on health, and on emerging applications of twins in multiple areas of medical research. We finish by describing how twin registries and networks are accelerating twin research worldwide.
Twins Research Australia (TRA) is a community of twins and researchers working on health research to benefit everyone, including twins. TRA leads multidisciplinary research through the application of twin and family study designs, with the aim of sustaining long-term twin research that, both now and in the future, gives back to the community. This article summarizes TRA’s recent achievements and future directions, including new methodologies addressing causation, linkage to health, economic and educational administrative datasets and to geospatial data to provide insight into health and disease. We also explain how TRA’s knowledge translation and exchange activities are key to communicating the impact of twin studies to twins and the wider community. Building researcher capability, providing registry resources and partnering with all key stakeholders, particularly the participants, are important for how TRA is advancing twin research to improve health outcomes for society. TRA provides researchers with open access to its vibrant volunteer membership of twins, higher order multiples (multiples) and families who are willing to consider participation in research. Established four decades ago, this resource facilitates and supports research across multiple stages and a breadth of health domains.
Monozygotic (MZ) and dizygotic (DZ) twins participate in research that partitions variance in health, disease, and behavior into genetic and environmental components. However, there are other innovative roles for twins in medical research. One such role is involving MZ and/or DZ twins in co-twin control-designed randomized controlled trials (RCTs). To our knowledge, no reviews have been conducted that summarize the involvement of twins in RCTs. Therefore, we conducted a systematic literature search using the U.S. Clinical Trials Database, NHS electronic databases, MEDLINE, EMBASE, and PsycINFO for publications involving MZ and/or DZ twins as RCT participants. Of the 186,027 clinical trials registered in the U.S. clinical trial register ClinicalTrials.gov, only six RCTs used twins as participants. From the 1,598 articles identified in our search, 50 peer-reviewed English-language publications met our predefined inclusion criteria. Total sample sizes ranged from 2 to 1,162 participants; however, 32 (64%) studies had a sample size of 100 or fewer, and of those, 12 (24%) had fewer than 10. Both MZ and DZ twins have been recruited to RCTs. In most instances (33/50), each twin from a pair was assigned to a different study arm, and most of those studies included MZ twins only. Despite the methodological advantages, the use of MZ and DZ twins as participants in interventional RCTs appears limited. The continuous development of innovative twin designs, especially RCTs, indicates that twin research can extend beyond the more widely recognized heritability estimates.
Accurate weed emergence models are valuable tools for scheduling planting, cultivation, and herbicide applications. Multiple models predicting giant ragweed emergence have been developed, but none have been validated in diverse crop rotation and tillage systems, which have the potential to influence weed emergence patterns. This study evaluated the performance of published giant ragweed emergence models across various crop rotations and spring tillage dates in southern Minnesota. Across experiments, the most robust model was a mixed-effects Weibull (flexible sigmoidal function) model predicting emergence in relation to hydrothermal time accumulation with a base temperature of 4.4 C, a base soil matric potential of −2.5 MPa, and two random effects determined by overwinter growing degree days (GDD) (10 C) and precipitation accumulated during seedling recruitment. The deviations in emergence between individual plots and the fixed-effects model were distinguished by the positive association between the lower horizontal asymptote (Drop) and maximum daily soil temperature during seedling recruitment. This finding indicates that crops and management practices that increase soil temperature will have a shorter lag phase at the start of giant ragweed emergence compared with practices promoting cool soil temperatures. Thus, crops with early-season crop canopies such as perennial crops and crops planted in early spring and in narrow rows will likely have a slower progression of giant ragweed emergence. This research provides a valuable assessment of published giant ragweed emergence models and illustrates that accurate emergence models can be used to time field operations and improve giant ragweed control across diverse cropping systems.
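A simplified sketch of the model's two building blocks: hydrothermal time accumulation (base temperature 4.4 C and base matric potential −2.5 MPa, as stated above) and a Weibull emergence curve with a lower asymptote (Drop). The curve's parameter values and the weather series below are illustrative, not the fitted values from the study.

```python
# Hydrothermal time plus a Weibull cumulative-emergence curve of the
# general form used for giant ragweed. Parameters are illustrative.
import numpy as np

T_BASE = 4.4     # base soil temperature, C (from the abstract)
PSI_BASE = -2.5  # base soil matric potential, MPa (from the abstract)

def hydrothermal_time(temp_c, psi_mpa):
    """Accumulate daily hydrothermal time: thermal time above T_BASE is
    counted only on days when matric potential exceeds PSI_BASE."""
    temp_c, psi_mpa = np.asarray(temp_c), np.asarray(psi_mpa)
    daily = np.maximum(temp_c - T_BASE, 0.0) * (psi_mpa > PSI_BASE)
    return np.cumsum(daily)

def weibull_emergence(htt, asym=100.0, drop=5.0, scale=250.0, shape=2.0):
    """Cumulative % emergence rising from the lower asymptote `drop`
    toward `asym` as hydrothermal time accumulates."""
    return drop + (asym - drop) * (1.0 - np.exp(-(htt / scale) ** shape))

# Illustrative 60-day spring weather series with moist soil throughout.
temps = np.array([6.0, 9.0, 12.0, 15.0, 18.0, 20.0] * 10)
psis = np.full_like(temps, -0.5)
print(weibull_emergence(hydrothermal_time(temps, psis))[-1])
```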
Erosion of agricultural croplands is a significant contributor of sedimentation to reservoirs. Here, physiographic and economic models for a large agricultural watershed (2,377 square miles with 27 subwatersheds) are integrated to reduce sedimentation of one Midwestern reservoir. The sediment reduction and cost-effectiveness of implementing three agricultural best management practices (no-till, filter strips, and permanent vegetation) were considered under three modeling scenarios: random assignment; the globally most cost-effective approach; and a cost-effective targeting approach. This study demonstrates how physiographic and economic data can be harnessed to yield readily comprehensible cost-effective targeting maps. Cost-effective targeting may be preferable to watershed managers for its “user-friendliness” without too great a sacrifice relative to the globally most cost-efficient solution.
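The targeting idea can be illustrated with a short sketch: rank candidate (subwatershed, practice) combinations by cost per ton of sediment reduced and fund them in that order until the budget runs out. All numbers below are invented for illustration.

```python
# Greedy cost-effective targeting: cheapest sediment reduction first.
options = [
    # (subwatershed, practice, tons of sediment reduced, annual cost $)
    ("SW-01", "no-till",              1200, 18_000),
    ("SW-02", "filter strip",          400, 10_000),
    ("SW-03", "permanent vegetation",  900, 30_000),
    ("SW-04", "no-till",               650,  9_000),
]

budget = 40_000
chosen = []
# Sort by dollars per ton of sediment reduced, lowest first.
for sw, bmp, tons, cost in sorted(options, key=lambda o: o[3] / o[2]):
    if cost <= budget:
        budget -= cost
        chosen.append((sw, bmp, tons, cost))

for row in chosen:
    print(row)
```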
As herbicide-resistant weed populations become increasingly problematic in crop production, alternative strategies of weed control are necessary. Giant ragweed, one of the most competitive agricultural weeds in row crops, has evolved resistance to multiple herbicide sites of action, necessitating the development of new and integrated methods of weed control. This study assessed the quantity and duration of seed retention of giant ragweed grown in soybean fields and adjacent field margins. Seed retention of giant ragweed was monitored weekly during the 2012 to 2014 harvest seasons using seed collection traps. Giant ragweed plants produced an average of 1,818 seeds per plant, with 66% being potentially viable. Giant ragweed on average began shattering hard (potentially viable) and soft (nonviable) seeds on September 12 and continued through October, at average rates of 0.75% and 0.44% of total seeds per day during September and October, respectively. Giant ragweed seeds remained on the plants well into the Minnesota soybean harvest season, with an average of 80% of the total seeds retained on October 11, when Minnesota soybean harvest was approximately 75% completed in the years of the study. These results suggest that there is sufficient time to remove escaped giant ragweed from production fields and field margins before the seeds shatter by managing weed seed dispersal before or at crop harvest. Controlling weed seed dispersal has potential to manage herbicide-resistant giant ragweed by limiting replenishment of the weed seed bank.
In the midwestern United States, biotypes of giant ragweed resistant to multiple herbicide biochemical sites of action have been identified. Weeds with resistance to multiple herbicides reduce the utility of existing herbicides and necessitate the development of alternative weed control strategies. In two experiments in southeastern Minnesota, we determined the effect of six 3 yr crop-rotation systems containing corn, soybean, wheat, and alfalfa on giant ragweed seedbank depletion and emergence patterns. The six crop-rotation systems included continuous corn, soybean–corn–corn, corn–soybean–corn, soybean–wheat–corn, soybean–alfalfa–corn, and alfalfa–alfalfa–corn. The crop-rotation system had no effect on the amount of seedbank depletion when a zero-weed threshold was maintained, with an average of 96% of the giant ragweed seedbank being depleted within 2 yr. Seedbank depletion occurred primarily through seedling emergence in all crop-rotation systems. However, seedling emergence tended to account for more of the seedbank depletion in rotations containing only corn or soybean compared with rotations with wheat or alfalfa. Giant ragweed emerged early across all treatments, with on average 90% emergence occurring by June 4. Duration of emergence was slightly longer in established alfalfa compared with other cropping systems. These results indicate that corn and soybean rotations are more conducive to giant ragweed emergence than rotations including wheat and alfalfa, and that adopting a zero-weed threshold is a viable approach to depleting the weed seedbank in all crop-rotation systems.