Increased risk donors (IRDs) in paediatric heart transplantation (PHT) have characteristics that may increase the risk of infectious disease transmission despite negative serologic testing. However, the risk of disease transmission is low, and refusing an IRD offer may increase waitlist mortality. We sought to determine the risks of declining an initial IRD organ offer.
Methods and results:
We performed a retrospective analysis of candidates waitlisted for isolated PHT using 2007–2017 United Network for Organ Sharing datasets. Match runs identified candidates receiving IRD offers. Competing risks analysis was used to determine mortality risk for those who declined an initial IRD offer, with stratified Cox regression to estimate the survival benefit associated with accepting initial IRD offers. Overall, 238/1067 (22.3%) initial IRD offers were accepted. Candidates accepting an IRD offer were younger (7.2 versus 9.8 years, p < 0.001), more often female (50 versus 41%, p = 0.021), more often listed status 1A (75.6 versus 61.9%, p < 0.001), and less likely to require mechanical bridge to PHT (16% versus 23%, p = 0.036). At 1- and 5-year follow-up, cumulative mortality was significantly lower for candidates who accepted compared to those who declined (6% versus 13% 1-year mortality and 15% versus 25% 5-year mortality, p = 0.0033). Decline of an IRD offer was associated with an adjusted hazard ratio for mortality of 1.87 (95% CI 1.24, 2.81, p < 0.003).
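To make the modelling step concrete, here is a minimal sketch of the stratified Cox regression described above, under the assumption of a candidate-level extract; all column names (time_to_event, died, declined_ird, match_run_id, and so on) are hypothetical placeholders, not the actual UNOS variable names:

```python
# Minimal sketch: stratified Cox model for mortality after declining an IRD offer.
# All column names are hypothetical placeholders for a UNOS-derived dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pht_waitlist.csv")  # hypothetical 2007-2017 cohort extract

cph = CoxPHFitter()
cph.fit(
    df[["time_to_event", "died", "declined_ird", "age_years",
        "status_1a", "mech_support", "match_run_id"]],
    duration_col="time_to_event",
    event_col="died",
    strata=["match_run_id"],  # compare decliners with acceptors within a match run
)
cph.print_summary()  # the published adjusted HR for declined_ird was 1.87
```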
IRD organ acceptance is associated with a substantial survival benefit. Increasing acceptance of IRD organs may provide a targetable opportunity to decrease waitlist mortality in PHT.
Diet is a modifiable risk factor for chronic disease and a potential modulator of telomere length (TL). The study aim was to investigate associations between diet quality and TL in Australian adults after a 12-week dietary intervention with an almond-enriched diet (AED). Participants (overweight/obese, 50–80 years) were randomised to an AED (n 62) or isoenergetic nut-free diet (NFD, n 62) for 12 weeks. Diet quality was assessed using a Dietary Guideline Index (DGI), applied to weighed food records, that consists of ten components reflecting adequacy, variety and quality of core food components and discretionary choices within the diet. TL was measured by quantitative PCR in samples of lymphocytes, neutrophils, and whole blood. There were no significant associations between DGI scores and TL at baseline. Diet quality improved with AED and decreased with NFD after 12 weeks (change from baseline AED + 9·8 %, NFD − 14·3 %; P < 0·001). TL increased in neutrophils (+9·6 bp, P = 0·009) and decreased in whole blood, to a trivial extent (–12·1 bp, P = 0·001), and was unchanged in lymphocytes. Changes did not differ between intervention groups. There were no significant relationships between changes in diet quality scores and changes in lymphocyte, neutrophil or whole blood TL. The inclusion of almonds in the diet improved diet quality scores but had no impact on TL in mid-age to older Australian adults. Future studies should investigate the impact of more substantial dietary changes over longer periods of time.
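For readers unfamiliar with qPCR-based TL measurement, the underlying quantity is a relative T/S ratio (telomere repeat signal over a single-copy gene, referenced to a control sample); converting to base pairs, as reported above, requires a standard curve not shown here. A minimal worked example with invented Ct values:

```python
# Relative T/S ratio via the 2^-ddCt method; all Ct values below are invented.
ct_tel, ct_scg = 14.2, 21.5      # sample: telomere and single-copy-gene Ct
ref_tel, ref_scg = 14.8, 21.4    # reference-sample Ct values

ddct = (ct_tel - ct_scg) - (ref_tel - ref_scg)
ts_ratio = 2 ** -ddct            # relative telomere length vs. reference
print(f"relative T/S ratio: {ts_ratio:.2f}")
```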
Antisaccade tasks can be used to index cognitive control processes, e.g. attention, behavioral inhibition, working memory, and goal maintenance, in people with brain disorders. Though diagnoses of schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BDP) are typically considered distinct entities, previous work shows patterns of cognitive deficits that differ in degree, rather than in kind, across these syndromes.
Large samples of individuals with psychotic disorders were recruited through the Bipolar-Schizophrenia Network on Intermediate Phenotypes 2 (B-SNIP2) study. Anti- and pro-saccade task performances were evaluated in 189 people with SZ, 185 people with SAD, 96 people with BDP, and 279 healthy comparison participants. Logistic functions were fitted to each group's antisaccade speed-performance tradeoff patterns.
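As a rough illustration of the curve-fitting step, here is a sketch of fitting a logistic speed-performance tradeoff function; the parameterisation and all data values are assumptions for illustration, not the B-SNIP2 specification:

```python
# Sketch: logistic fit of P(correct antisaccade) as a function of latency.
# Bin centres and accuracies below are invented illustrative values.
import numpy as np
from scipy.optimize import curve_fit

def logistic(latency, asymptote, slope, midpoint):
    """P(correct) as a function of antisaccade latency (ms)."""
    return asymptote / (1.0 + np.exp(-slope * (latency - midpoint)))

latencies = np.array([180, 220, 260, 300, 340, 380, 420])        # ms
p_correct = np.array([0.35, 0.48, 0.62, 0.74, 0.81, 0.85, 0.86])

params, _ = curve_fit(logistic, latencies, p_correct, p0=[0.9, 0.02, 260])
asymptote, slope, midpoint = params
print(f"optimal accuracy ~{asymptote:.2f}, gain per ms of delay ~{slope:.3f}")
```

Group differences in the asymptote and slope parameters are what distinguish "more errors at optimal performance" from "less benefit from prolonged latencies" in the results below.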
Psychosis groups had higher antisaccade error rates than the healthy group, with SZ and SAD participants committing 2 times as many errors, and BDP participants committing 1.5 times as many errors. Latencies on correctly performed antisaccade trials in SZ and SAD were longer than in healthy participants, although error trial latencies were preserved. Parameters of speed-performance tradeoff functions indicated that compared to the healthy group, SZ and SAD groups had optimal performance characterized by more errors, as well as less benefit from prolonged response latencies. Prosaccade metrics did not differ between groups.
With basic prosaccade mechanisms intact, the higher speed-performance tradeoff cost for antisaccade performance in psychosis cases indicates a deficit that is specific to the higher-order cognitive aspects of saccade generation.
Clinical trials are a fundamental tool in evaluating the safety and efficacy of new drugs, medical devices, and health system interventions. Clinical trial visits generally involve eligibility assessment, enrollment, intervention administration, data collection, and follow-up, with many of these steps performed during face-to-face visits between participants and the investigative team. Social distancing, which emerged as one of the mainstay strategies for reducing the spread of SARS-CoV-2, has presented a challenge to the traditional model of clinical trial conduct, causing many research teams to halt all in-person contacts except for life-saving research. Nonetheless, clinical research has continued during the pandemic because study teams adapted quickly, turning to virtual visits and other similar methods to complete critical research activities. The purpose of this special communication is to document this rapid transition to virtual methodologies at Clinical and Translational Science Awards hubs and highlight important considerations for future development. Looking beyond the pandemic, we envision that a hybrid approach, which implements remote activities when feasible but also maintains in-person activities as necessary, will be adopted more widely for clinical trials. There will always be a need for in-person aspects of clinical research, but future study designs will need to incorporate remote capabilities.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised control trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration was calculated using linear regression models on 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes with similar estimated validity compared with other FFQ used in children under 2 years.
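A minimal sketch of the calibration regression described above, assuming one row per child with hypothetical column names (ffq_iron, recall_iron, sex, group); the published models were fitted per nutrient and time point:

```python
# Sketch: 24-h recall intake regressed on FFQ-estimated intake,
# adjusted for sex and treatment group. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gumli_9mo.csv")

model = smf.ols("recall_iron ~ ffq_iron + sex + group", data=df).fit()
print(model.params["ffq_iron"])  # calibration factor for Fe
print(model.rsquared)            # share of 24HR variation predicted

# Predicted (calibrated) intake from the FFQ:
df["iron_calibrated"] = model.predict(df)
```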
There has been scant exploration of the social and emotional wellbeing (SEWB) of young Indigenous people who identify as LGBTQA+ (Lesbian, Gay, Bisexual, Transgender, Queer/Questioning, Asexual +). Given the vulnerability of this cohort living in Western settler colonial societies, wider investigation is called for to respond to their needs, experiences and aspirations. This paper summarizes existing research on the topic, highlighting the lack of scholarship at the intersection of youth, Indigeneity, LGBTQA+ identity and SEWB. The paper takes a holistic approach to provide a global perspective that draws on an emerging body of literature and research driven by Indigenous scholars in settler colonial societies. The paper points to the importance of understanding converging colonial influences and ongoing contemporary elements, such as racism and marginalization, that impact young Indigenous LGBTQA+ wellbeing.
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built Markov simulation models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
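To illustrate the flavour of a probabilistic sensitivity analysis over data-related costs, here is a toy Monte Carlo sketch; every distribution, unit cost, and parameter below is invented for illustration, and the published model is considerably more detailed:

```python
# Toy probabilistic sensitivity analysis: data costs of a registry-based
# vs. standard trial. All parameters are invented illustrations.
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000

n_patients   = rng.integers(100, 2000, n_sims)   # trial size
elems_pp     = rng.integers(20, 400, n_sims)     # data elements per patient
abstract_sec = rng.uniform(3, 60, n_sims)        # seconds to hand-abstract one field
wage_per_sec = 30.0 / 3600                       # $30/h coordinator wage (assumed)

standard_cost = n_patients * elems_pp * abstract_sec * wage_per_sec
registry_cost = n_patients * 5.0 + 20_000        # per-record linkage fee + overhead

savings = standard_cost - registry_cost
print(f"registry cheaper in {np.mean(savings > 0):.1%} of simulations")
print(f"median saving ${np.median(savings):,.0f}")
```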
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
There is limited knowledge of how individuals reflect on their involuntary admission.
To investigate, at one year after an involuntary admission,
(i) people's perceptions of the necessity of their involuntary admission;
(ii) the enduring impact on their relationships with family and consultant psychiatrist, and on their employment prospects; and
(iii) readmission rates to hospital and risk factors for readmission.
People who were admitted involuntarily over a 15-month period were re-interviewed one year after discharge.
Sixty-eight people were re-interviewed at one year, a follow-up rate of 84%. Prior to discharge, 72% of people reported that their involuntary admission had been necessary; this proportion fell to 60% after one year. Over one-third of people changed their views, and the majority of these came to regard their involuntary admission negatively.
One quarter of people continued to experience a negative impact on their relationships with a family member and their consultant psychiatrist one year after an involuntary admission, while 13% reported a positive impact. A similar proportion perceived negative consequences for their employment.
Within one year, 43% of all patients involuntarily admitted during the study period were readmitted to hospital, and half of these readmissions were involuntary. Involuntary readmission was associated with a ‘sealing-over’ recovery style.
People's perceptions of the necessity of their involuntary admissions change significantly over time. Involuntary admissions can have a lasting negative impact on relationships with family members and the treating consultant psychiatrist.
The Single Ventricle Reconstruction Trial randomised neonates with hypoplastic left heart syndrome to a shunt strategy but otherwise retained standard of care. We aimed to describe centre-level practice variation at Fontan completion.
Centre-level data are reported as median or median frequency across all centres and range of medians or frequencies across centres. Classification and regression tree analysis assessed the association of centre-level factors with length of stay and percentage of patients with prolonged pleural effusion (>7 days).
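To make the classification-tree step concrete, here is a minimal sketch of a centre-level tree; the file and feature names are assumptions, and with only 14 centres a shallow tree is the appropriate depth:

```python
# Sketch: classification tree for short length of stay at the centre level.
# One row per centre; column names are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

centres = pd.read_csv("fontan_centres.csv")
X = centres[["pct_prolonged_effusion", "pct_major_complications",
             "pct_dhca", "median_waz"]]
y = (centres["median_los_days"] < 10).astype(int)  # short length of stay

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
# The published tree split at <6% prolonged effusion and <41% complications.
```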
The median Fontan age (14 centres, 320 patients) was 3.1 years (range 1.7 to 3.9), and the weight-for-age z-score was −0.56 (−1.35 to +0.44). Extra-cardiac Fontans were performed in 79% (4–100%) of patients at the 13 centres performing this procedure; lateral tunnels were performed in 32% (3–100%) at the 11 centres performing that procedure. Use of deep hypothermic circulatory arrest (nine centres) ranged from 6 to 100%. Major complications occurred in 17% (7–33%). The length of stay was 9.5 days (9–12); 15% (6–33%) had prolonged pleural effusion. Centres with fewer patients (<6%) with prolonged pleural effusion and fewer (<41%) complications had a shorter length of stay (<10 days; sensitivity 1.0; specificity 0.71; area under the curve 0.96). Avoiding deep hypothermic circulatory arrest and higher weight-for-age z-score were associated with a lower percentage of patients with prolonged effusions (<9.5%; sensitivity 1.0; specificity 0.86; area under the curve 0.98).
Fontan perioperative practices varied widely among study centres. Strategies to decrease the duration of pleural effusion and minimise complications may decrease the length of stay. Further research regarding deep hypothermic circulatory arrest is needed to understand its association with prolonged pleural effusion.
Introduction: Variation in image ordering exists across Alberta emergency departments (EDs). Evidence-based, pocket-sized knowledge dissemination tools were developed for two conditions (acute asthma [AA] and benign headache [BHA]) for which imaging (chest x-ray [CXR] and computed tomography [CT], respectively) has limited utility. This study explored tool acceptability among ED patients and emergency physicians (EPs). Methods: Tool feedback was provided by EPs via online survey and by adult patients with AA and BHA via in-person survey. Qualitative interviews with EPs further explored communication tools. Preliminary descriptive analyses of survey responses and content analysis of interview data were conducted. Results: Overall, 55 EPs (55/192; 29%) and 38 consecutive patients participated in the AA study; 73 EPs (73/192; 38%) and 160 patients participated in the BHA study. In both studies, approximately 50% of EPs felt comfortable using the tool; however, they suggested including radiation risk details and imaging indications and removing references to imaging variation and health system cost. In the BHA study, EPs opposed the four Choosing Wisely® campaign questions, fearing they would increase imaging expectations. For both conditions, most patients (>90%) understood the content and 68% felt the information applied to them. Less than half (AA: 45%; BHA: 38%) agreed that they now knew more about when a patient should have an imaging workup done. Following tool review, 71% of AA and 50% of BHA patients stated they would discuss their imaging needs with their ED care provider that day or during a future presentation. Both patient groups suggested adding further imaging details (i.e., indications, risk, clinical utility), removing imaging overuse references, and including instructions that encourage patients to ask their EP questions. EP interviews (n = 12) identified preferences for personalized and interactive tools. Tensions were perceived around ED time pressure, as well as remuneration schemes that fail to prioritize patient conversation. Tool centralization, easy access, and connection with outpatient support were also key themes. Conclusion: Both patients and EPs provided valuable information on how to improve ED knowledge dissemination tools, using two conditions to demonstrate how these changes would improve tool utility. Implementing these recommendations, and considering the preferences of EPs and patients, may improve future tool uptake and impact.
The second year of life is a period of nutritional vulnerability. We aimed to investigate dietary patterns and nutrient intakes from 1 to 2 years of age during the 12-month follow-up period of the Growing Up Milk – Lite (GUMLi) trial. The GUMLi trial was a multi-centre, double-blinded, randomised controlled trial of 160 healthy 1-year-old children in Auckland, New Zealand and Brisbane, Australia. Dietary intakes were collected at baseline and at 3, 6, 9 and 12 months post-randomisation using a validated FFQ. Dietary patterns were identified using principal component analysis of the frequency of food item consumption per d. The effect of the intervention on dietary patterns and on the intake of eleven nutrients over the duration of the trial was investigated using random effects mixed models. A total of three dietary patterns were identified at baseline: ‘junk/snack foods’, ‘healthy/guideline foods’ and ‘breast milk/formula’. A significant group difference was observed in ‘breast milk/formula’ dietary pattern z scores at 12 months post-randomisation, where those in the GUMLi group loaded more positively on this pattern, suggesting more frequent consumption of breast milk. No difference was seen in the other two dietary patterns. Significant intervention effects were seen on nutrient intake between the GUMLi (intervention) and cows’ milk (control) groups, with lower protein and vitamin B12 intake, and higher Fe, vitamin D, vitamin C and Zn intake, in the GUMLi (intervention) group. The consumption of GUMLi did not affect dietary patterns; however, GUMLi participants had lower protein intake and higher Fe, vitamin D, vitamin C and Zn intake at 2 years of age.
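A minimal sketch of the pattern-derivation step: principal component analysis of standardised FFQ consumption frequencies, retaining three components to match the patterns named above. The file and food-group column names are invented:

```python
# Sketch: dietary patterns by PCA of FFQ frequencies (servings per day).
# Rows are children, columns are food items; names are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

ffq = pd.read_csv("gumli_ffq_baseline.csv")
X = StandardScaler().fit_transform(ffq)

pca = PCA(n_components=3)
scores = pca.fit_transform(X)  # per-child z scores on each pattern

loadings = pd.DataFrame(
    pca.components_.T, index=ffq.columns,
    columns=["junk_snack", "healthy_guideline", "breastmilk_formula"])
print(loadings.round(2).head())
```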
The objective of this study was to investigate the impact of the most commonly cited factors that may influence infants’ gut microbiota profiles at one year of age: mode of delivery, breastfeeding duration and antibiotic exposure. Barcoded V3/V4 amplicons of the bacterial 16S rRNA gene were prepared from stool samples of 52 healthy 1-year-old Australian children and sequenced using the Illumina MiSeq platform. Following quality checks, the data were processed using the Quantitative Insights Into Microbial Ecology pipeline and analysed using the Calypso package for microbiome data analysis. The stool microbiota profiles of children still breastfed were significantly different from those of children weaned earlier (P<0.05), independent of the age of solid food introduction. Among children still breastfed, Veillonella spp. abundance was higher. Children no longer breastfed possessed a more ‘mature’ microbiota, with notable increases in Firmicutes. The microbiota profiles of the children could not be differentiated by delivery mode or antibiotic exposure. Further analysis based on children’s feeding patterns found that children who were breastfed alongside solid food had significantly different microbiota profiles from those of children who were receiving both breastmilk and formula milk alongside solid food. This study provided evidence that breastfeeding continues to influence the gut microbial community even in late infancy, when these children are also consuming table foods. At this age, any impacts of mode of delivery or antibiotic exposure did not appear to leave discernible imprints on the microbial community profiles of these healthy children.
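The QIIME/Calypso pipeline itself is not reproduced here; the sketch below only illustrates the kind of per-taxon group comparison reported above, using an invented relative-abundance table and a nonparametric test (the study's exact test is not stated in the abstract):

```python
# Sketch: compare Veillonella relative abundance between feeding groups.
# File and column names are hypothetical.
import pandas as pd
from scipy.stats import mannwhitneyu

tab = pd.read_csv("genus_relative_abundance.csv")  # one row per child
still_bf = tab[tab["still_breastfed"] == 1]["Veillonella"]
weaned   = tab[tab["still_breastfed"] == 0]["Veillonella"]

stat, p = mannwhitneyu(still_bf, weaned, alternative="two-sided")
print(f"Veillonella, breastfed vs weaned: U={stat:.0f}, P={p:.3f}")
```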
Weed suppression is one possible benefit of including cover crops in crop rotations. The late spring planting date of dry beans allows for more growth of cover crops in the spring. We assessed the influence of cover crops on weed dynamics in organic dry beans and on weed seed persistence. Medium red clover, oilseed radish, and cereal rye were planted the year before dry beans; a no-cover-crop control was also included. After cover-crop incorporation, common lambsquarters, giant foxtail, and velvetleaf seeds were buried in the red clover, cereal rye, and no-cover control treatments and then retrieved 0, 1, 2, 4, 6, and 12 mo after cover-crop incorporation. Dry beans were planted in June, and weed emergence and biomass were measured. Eleven or more site-years of data were collected for each cover-crop treatment between 2011 and 2013, allowing for structural equation modeling (SEM) in addition to traditional analyses. Cereal rye residue increased giant foxtail and velvetleaf seed persistence by up to 12%; red clover decreased common lambsquarters seed persistence by 22% in 1 of 2 yr relative to the no-cover-crop control. Oilseed radish and incorporated cereal rye rarely reduced weed densities. When red clover biomass exceeded 5 Mg ha−1, soil inorganic N was often higher (5 of 6 site-years), as were weed density and biomass (5 and 4 of 12 main site sample times, respectively). Using SEM, we identified one causal relationship between cover-crop N content and weed biomass at the first flower stage (R1), mediated through soil N at the time of dry bean planting and at the stage with two fully expanded trifoliates. Increasing cover-crop C : N ratios directly reduced weed biomass at R1, not mediated through changes in soil N. Cover crops that make a significant contribution to soil N may also stimulate weed emergence and growth.
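The following is not the authors' SEM, but a minimal two-step mediation sketch in its spirit (cover-crop N → soil N at planting → weed biomass at R1), using ordinary regressions in place of a full structural model; all column names are hypothetical:

```python
# Sketch: simple mediation decomposition of the cover-crop N -> weed biomass
# pathway via soil N at dry bean planting. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

d = pd.read_csv("cover_crop_siteyears.csv")

path_a = smf.ols("soil_n_at_planting ~ cover_crop_n", data=d).fit()
path_b = smf.ols("weed_biomass_r1 ~ soil_n_at_planting + cover_crop_n",
                 data=d).fit()

indirect = path_a.params["cover_crop_n"] * path_b.params["soil_n_at_planting"]
direct = path_b.params["cover_crop_n"]
print(f"indirect effect (via soil N): {indirect:.2f}, direct effect: {direct:.2f}")
```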
Calcium is considered important in buffering excess stomach acid in mammals, including horses. Control of stomach acid is important in preventing the development of ulcers within the stomach lining, which, in horses, are considered to be caused by acid splashing. Algae supplements contain various minerals in natural form, as found in all plants and feedstuffs. The current trial was conducted to examine whether a high-calcium algae supplement had any impact on gastric ulceration in horses, potentially by buffering stomach acid and raising the pH in a gradual manner, without resorting to medication. Ten horses, of thoroughbred, standardbred or sport horse breed, were selected on the basis of the presence of ulcers in their stomach, as ascertained by endoscopy. The average ulceration score before algae supplementation was 2.2 ± 0.75 according to the EGUC scoring system. The horses were then maintained on their normal diet (unchanged from the initial ulcer scoring) by the owner, with the addition of 40 g per day of the high-calcium, algae-based Maxia Complete® (Seahorse Supplements Ltd, Christchurch, NZ) for thirty days (T30). All horses were then re-endoscoped to assess any change in ulceration score. All horses showed a significant improvement in ulcer score, with seven having a score of zero (fully healed, no evidence of further ulceration) and two a score of one (some residual inflammation or keratinosis in areas of healed ulcers). This resulted in a mean score of 0.3 ± 0.48 (P < 0.0001: T0 versus T30) at the end of the study. This trial demonstrated that feeding an organic form of high calcium from algae reduced ulceration in horses.
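A sketch of one way the T0-versus-T30 comparison could be run: with ten paired ordinal scores, a Wilcoxon signed-rank test is a defensible choice, though the trial's exact test is not stated. The scores below are invented to match the reported group means, not the actual data:

```python
# Sketch: paired pre/post comparison of ulcer scores (T0 vs T30).
# Scores are illustrative, chosen to match the reported means (2.2 and 0.3).
from scipy.stats import wilcoxon

t0  = [3, 2, 2, 3, 1, 2, 3, 2, 2, 2]   # baseline EGUC scores
t30 = [0, 0, 0, 1, 0, 0, 1, 0, 0, 1]   # scores after 30 d of supplementation

stat, p = wilcoxon(t0, t30)
print(f"Wilcoxon W={stat:.0f}, P={p:.4f}")
```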
To determine whether total lifetime physical activity (PA) is associated with better cognitive functioning with aging, and whether cerebrovascular function mediates this association. A sample of 226 (52.2% female) community-dwelling middle-aged and older adults (66.5±6.4 years) in the Brain in Motion Study completed the Lifetime Total Physical Activity Questionnaire and underwent neuropsychological and cerebrovascular blood flow testing. Multiple robust linear regressions were used to model the associations between lifetime PA and global cognition after adjusting for age, sex, North American Adult Reading Test results (i.e., an estimate of premorbid intellectual ability), maximal aerobic capacity, body mass index, and interactions between age, sex, and lifetime PA. Mediation analysis assessed the effect of cerebrovascular measures on the association between lifetime PA and global cognition. Post hoc analyses assessed the relations of past-year PA and current fitness levels to global cognition and cerebrovascular measures. Better global cognitive performance was associated with higher lifetime PA (p=.045), recreational PA (p=.021), vigorous-intensity PA (p=.004), PA between the ages of 0 and 20 years (p=.036), and PA between the ages of 21 and 35 years (p<.0001). Cerebrovascular measures did not mediate the association between PA and global cognition scores (p>.5) but partially mediated the relation between current fitness and global cognition. This study revealed significant associations between higher levels of PA (i.e., total lifetime, recreational, vigorous, and past-year PA) and better cognitive function in later life. The relation of current fitness levels to cognitive function may be partially mediated through current cerebrovascular function. (JINS, 2015, 21, 816–830)
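A minimal sketch of one robust regression from the analysis described above (global cognition on lifetime PA with the listed covariates and interactions), using a Huber M-estimator; the data file and all column names are hypothetical:

```python
# Sketch: robust linear regression of global cognition on lifetime PA.
# Column names (global_cognition, lifetime_pa, naart, vo2max, ...) are
# hypothetical placeholders for the Brain in Motion variables.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

d = pd.read_csv("brain_in_motion.csv")

m = smf.rlm(
    "global_cognition ~ lifetime_pa + age + sex + naart + vo2max + bmi"
    " + age:lifetime_pa + sex:lifetime_pa",
    data=d,
    M=sm.robust.norms.HuberT(),
).fit()
print(m.summary())
```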
Considerable research has documented that exposure to traumatic events has negative effects on physical and mental health. Much less research has examined the predictors of traumatic event exposure. Increased understanding of risk factors for exposure to traumatic events could be of considerable value in targeting preventive interventions and anticipating service needs.
General population surveys in 24 countries with a combined sample of 68 894 adult respondents across six continents assessed exposure to 29 traumatic event types. Differences in prevalence were examined with cross-tabulations. Exploratory factor analysis was conducted to determine whether traumatic event types clustered into interpretable factors. Survival analysis was carried out to examine associations of sociodemographic characteristics and prior traumatic events with subsequent exposure.
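A minimal sketch of the factor-analysis step: binary exposure indicators for the 29 event types, factored to look for interpretable clusters. The number of factors, the rotation, and the plain Gaussian factor model are all simplifying assumptions for illustration:

```python
# Sketch: exploratory factor analysis of 0/1 trauma-exposure indicators.
# File name, column layout, and factor count are hypothetical.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

events = pd.read_csv("wmh_trauma_exposure.csv")  # 0/1 matrix, 29 columns
fa = FactorAnalysis(n_components=6, rotation="varimax").fit(events)

loadings = pd.DataFrame(fa.components_.T, index=events.columns)
print(loadings.round(2))
```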
Over 70% of respondents reported a traumatic event; 30.5% were exposed to four or more. Five types – witnessing death or serious injury, the unexpected death of a loved one, being mugged, being in a life-threatening automobile accident, and experiencing a life-threatening illness or injury – accounted for over half of all exposures. Exposure varied by country, sociodemographics and history of prior traumatic events. Being married was the most consistent protective factor. Exposure to interpersonal violence had the strongest associations with subsequent traumatic events.
Given the near ubiquity of exposure, limited resources may best be dedicated to those who are most likely to be further exposed, such as victims of interpersonal violence. Identifying the mechanisms that account for the associations of prior interpersonal violence with subsequent trauma is critical to developing interventions to prevent revictimization.
Cryptosporidium, a parasite known to cause large drinking and recreational water outbreaks, is tolerant of chlorine concentrations used for drinking water treatment. Human laboratory-based surveillance for enteric pathogens detected a cryptosporidiosis outbreak in Baker City, Oregon during July 2013 associated with municipal drinking water. Objectives of the investigation were to confirm the outbreak source and assess outbreak extent. The watershed was inspected and city water was tested for contamination. To determine the community attack rate, a standardized questionnaire was administered to randomly sampled households. Weighted attack rates and confidence intervals (CIs) were calculated. Water samples tested positive for Cryptosporidium species; a Cryptosporidium parvum subtype common in cattle was detected in human stool specimens. Cattle were observed grazing along watershed borders; cattle faeces were observed within watershed barriers. The city water treatment facility chlorinated, but did not filter, water. The community attack rate was 28·3% (95% CI 22·1–33·6), sickening an estimated 2780 persons. Watershed contamination by cattle probably caused this outbreak; water treatments effective against Cryptosporidium were not in place. This outbreak highlights vulnerability of drinking water systems to pathogen contamination and underscores the need for communities to invest in system improvements to maintain multiple barriers to drinking water contamination.
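As an illustration of the weighted attack-rate calculation, here is a minimal sketch using a design-weighted proportion with a normal-approximation confidence interval; the file, column names, and the simple Kish effective-sample-size variance (which ignores any household clustering) are assumptions:

```python
# Sketch: design-weighted attack rate with an approximate 95% CI.
# One row per respondent; 'ill' is 0/1, 'weight' is the sampling weight.
import numpy as np
import pandas as pd

d = pd.read_csv("bakercity_survey.csv")

w, y = d["weight"].to_numpy(), d["ill"].to_numpy()
p = np.sum(w * y) / np.sum(w)              # weighted attack rate

n_eff = np.sum(w) ** 2 / np.sum(w ** 2)    # Kish effective sample size
se = np.sqrt(p * (1 - p) / n_eff)
print(f"attack rate {p:.1%} (95% CI {p - 1.96*se:.1%} to {p + 1.96*se:.1%})")
```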