Surveillance of healthcare-associated infections is often performed by manual chart review. Semiautomated surveillance may substantially reduce workload and subjective data interpretation. We assessed the validity of a previously published algorithm for semiautomated surveillance of deep surgical site infections (SSIs) after total hip arthroplasty (THA) or total knee arthroplasty (TKA) in Dutch hospitals. In addition, we explored the ability of a hospital to automatically select the patients under surveillance.
Multicenter retrospective cohort study.
Hospitals identified patients who underwent THA or TKA either by procedure codes or by conventional surveillance. For these patients, routine care data regarding microbiology results, antibiotics, (re)admissions, and surgeries within 120 days following THA or TKA were extracted from electronic health records. Patient selection was compared with conventional surveillance and patients were retrospectively classified as low or high probability of having developed deep SSI by the algorithm. Sensitivity, positive predictive value (PPV), and workload reduction were calculated and compared to conventional surveillance.
Of 9,554 extracted THA and TKA surgeries, 1,175 (12.3%) were revisions, and 8,378 primary surgeries remained for algorithm validation (95 deep SSIs, 1.1%). Sensitivity ranged from 93.6% to 100% and PPV ranged from 55.8% to 72.2%. Workload was reduced by ≥98%. The 2 deep SSIs (2.1%) missed by the algorithm were explained by flaws in data selection.
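For readers unfamiliar with these validation metrics, a minimal sketch of how sensitivity, PPV, and workload reduction are computed against a manual-review reference standard. The counts below are illustrative, chosen only to be consistent in scale with the abstract; they are not the study's actual classification table.

```python
# Hedged sketch: validation metrics for a semiautomated surveillance
# algorithm versus manual chart review (reference standard).
# Counts are illustrative, not the study's data.

def surveillance_metrics(tp, fp, fn, total_records):
    """tp/fp/fn: algorithm calls vs. reference standard;
    total_records: all surgeries under conventional surveillance."""
    sensitivity = tp / (tp + fn)          # share of true deep SSIs flagged
    ppv = tp / (tp + fp)                  # share of flagged cases that are true SSIs
    # Only flagged (high-probability) records still need manual review:
    workload_reduction = 1 - (tp + fp) / total_records
    return sensitivity, ppv, workload_reduction

sens, ppv, reduction = surveillance_metrics(tp=93, fp=40, fn=2,
                                            total_records=8378)
```

The workload-reduction figure assumes that only algorithm-flagged records require manual chart review, which is the premise of semiautomated surveillance.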
This algorithm reliably detects patients with a high probability of having developed deep SSI after THA or TKA in Dutch hospitals. Our results provide essential information for successful implementation of semiautomated surveillance for deep SSIs after THA or TKA.
We have previously shown that higher intake of cruciferous vegetables is inversely associated with carotid artery intima-media thickness. To further test the hypothesis that an increased consumption of cruciferous vegetables is associated with reduced indicators of structural vascular disease elsewhere in the vascular tree, we investigated the cross-sectional association between cruciferous vegetable intake and extensive calcification in the abdominal aorta. Dietary intake was assessed, using a FFQ, in 684 older women from the Calcium Intake Fracture Outcome Study. Cruciferous vegetables included cabbage, Brussels sprouts, cauliflower and broccoli. Abdominal aortic calcification (AAC) was scored using the Kauppila AAC24 scale on dual-energy X-ray absorptiometry lateral spine images and was categorised as ‘not extensive’ (0–5) or ‘extensive’ (≥6). Mean age was 74·9 (sd 2·6) years, median cruciferous vegetable intake was 28·2 (interquartile range 15·0–44·7) g/d and 128/684 (18·7 %) women had extensive AAC scores. Participants with higher intakes of cruciferous vegetables (>44·6 g/d) had 46 % lower odds of extensive AAC than those with lower intakes (<15·0 g/d), after adjustment for lifestyle, dietary and CVD risk factors (OR Q4 v. Q1 0·54, 95 % CI 0·30, 0·97, P = 0·036). Total vegetable intake and intakes of the other vegetable types were not related to extensive AAC (P > 0·05 for all). This study strengthens the hypothesis that higher intake of cruciferous vegetables may protect against vascular calcification.
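As an illustration of the quartile contrast behind an "OR Q4 v. Q1" estimate, the sketch below splits a synthetic intake distribution at quartile cut points and compares the odds of extensive AAC between top and bottom quartiles. All data here are simulated; the published estimate (OR 0·54) was additionally covariate-adjusted, which this crude sketch omits.

```python
# Crude, unadjusted sketch of a Q4-vs-Q1 odds ratio on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
intake = rng.gamma(shape=2.0, scale=15.0, size=684)   # g/d, synthetic
q1_cut, q4_cut = np.quantile(intake, [0.25, 0.75])

# Synthetic outcome: extensive AAC made less likely at higher intake
p = 0.25 - 0.10 * (intake > q4_cut)
aac = rng.random(684) < p

def odds(mask):
    cases = np.sum(aac & mask)
    noncases = np.sum(~aac & mask)
    return cases / noncases

or_q4_vs_q1 = odds(intake > q4_cut) / odds(intake < q1_cut)
```

In the study itself the OR came from a logistic regression with covariates; the 2×2-table version above only conveys what the quartile comparison measures.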
Individuals with schizophrenia are more likely to smoke and less likely to quit smoking than those without schizophrenia. Because task persistence is lower in smokers with schizophrenia than in smokers without it, lower task persistence may contribute to the greater difficulty in quitting observed among smokers with schizophrenia.
To develop a feasible and acceptable intervention for smokers with schizophrenia.
Participants (N = 24) attended eight weekly individual cognitive behavioral therapy sessions for tobacco use disorder with a focus on increasing task persistence and received 10 weeks of nicotine patch.
In total, 93.8% of participants rated the intervention as at least a 6 out of 7 regarding how ‘easy to understand’ it was and 81.3% rated the treatment as at least a 6 out of 7 regarding how helpful it was to them. A total of 62.5% attended at least six of the eight sessions and session attendance was positively related to nicotine dependence and age and negatively related to self-efficacy for quitting.
This intervention was feasible and acceptable to smokers with schizophrenia. Future research will examine questions appropriate for later stages of therapy development such as initial efficacy of the intervention and task persistence as a mediator of treatment outcome.
Many studies demonstrate that marriage protects against risky alcohol use and moderates genetic influences on alcohol outcomes; however, previous work has not considered these effects from a developmental perspective or in high-risk individuals. These represent important gaps, as it cannot be assumed that marriage has uniform effects across development or in high-risk samples. We took a longitudinal developmental approach to examine whether marital status was associated with heavy episodic drinking (HED), and whether marital status moderated polygenic influences on HED. Our sample included 937 individuals (53.25% female) from the Collaborative Study on the Genetics of Alcoholism who reported their HED and marital status biennially between the ages of 21 and 25. Polygenic risk scores (PRS) were derived from a genome-wide association study of alcohol consumption. Marital status was not associated with HED; however, we observed pathogenic gene-by-environment effects that changed across young adulthood. Among those who married young (age 21), individuals with higher PRS reported more HED; however, these effects decayed over time. The same pattern was found in supplementary analyses using parental history of alcohol use disorder as the index of genetic liability. Our findings indicate that early marriage may exacerbate risk for those with higher polygenic load.
Observational studies have linked elevated homocysteine to vascular conditions. Folate intake has been associated with lower homocysteine concentration, although randomised controlled trials of folic acid supplementation to decrease the incidence of vascular conditions have been inconclusive. We investigated determinants of maternal homocysteine during pregnancy, particularly in a folic acid-fortified population.
Data were from the Ottawa and Kingston Birth Cohort of 8085 participants. We used multivariable regression analyses to identify factors associated with maternal homocysteine, adjusted for gestational age at bloodwork. Continuous factors were modelled using restricted cubic splines. A subgroup analysis examined the modifying effect of MTHFR 677C>T genotype on folate, in determining homocysteine concentration.
Participants were recruited in Ottawa and Kingston, Canada, from 2002 to 2009.
Women were recruited when presenting for prenatal care in the early second trimester.
In 7587 participants, factors significantly associated with higher homocysteine concentration were nulliparity, smoking and chronic hypertension, while factors significantly associated with lower homocysteine concentration were non-Caucasian race, history of a placenta-mediated complication and folic acid supplementation. Maternal age and BMI demonstrated U-shaped associations. Folic acid supplementation of >1 mg/d during pregnancy did not substantially increase folate concentration. In the subgroup analysis, MTHFR 677C>T genotype modified the effect of folate status on homocysteine concentration.
We identified determinants of maternal homocysteine relevant to the lowering of homocysteine in the post-folic acid fortification era, characterised by folate-replete populations. A focus on periconceptional folic acid supplementation and improving health status may form an effective approach to lower homocysteine.
Contact guidance is vital to many physiological processes, yet it is still poorly understood. This is partly due to the variability of experimental platforms, which makes comparisons difficult. To combat this, a multiplexed approach was used to fabricate topographical cues on single quartz coverslips for high-throughput screening. Furthermore, this method offers control of surface roughness and characterization of protein adsorption, two critical aspects of the in vitro environment often overlooked in contact guidance platforms. The quartz surface can be regenerated, is compatible with versatile microscopy modes, and can scale up for manufacturing, offering a novel platform that could serve as a potential standard assay.
Nutritional therapy is a cornerstone of burns management. The optimal macronutrient intake for wound healing after burn injury has not been identified, although high-energy, high-protein diets are favoured. The present study aimed to identify the optimal macronutrient intake for burn wound healing. The geometric framework (GF) was used to analyse wound healing after a 10 % total body surface area contact burn in mice fed ad libitum one of eleven high-energy diets varying in macronutrient composition, with protein (P, 5–60 %), carbohydrate (C, 20–75 %) and fat (F, 20–75 %). In the GF study, the optimal ratio for wound healing was identified as a moderate-protein, high-carbohydrate diet with a protein:carbohydrate:fat (P:C:F) ratio of 1:4:2. High carbohydrate intake was associated with lower mortality, improved body weight and a beneficial pattern of body fat reserves. Protein intake was essential to prevent weight loss and mortality, but a protein intake target of about 7 kJ/d (about 15 % of energy intake) was identified, above which no further benefit was gained. High protein intake was associated with delayed wound healing and increased liver and spleen weight. As the GF study demonstrated that an initial very high protein intake prevented mortality, a very high-protein, moderate-carbohydrate diet (P40:C42:F18) was specifically designed. A dynamic feeding study was then designed to combine and validate the benefits of an initial very high protein intake for preventing mortality and of a subsequent moderate-protein, high-carbohydrate intake for optimal wound healing. This experiment showed that switching from the initial very high-protein diet to the optimal moderate-protein, high-carbohydrate diet accelerated wound healing whilst preventing mortality and liver enlargement.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R² = 0.47–0.68%, p = 2.0 × 10⁻⁸–1.0 × 10⁻¹⁰), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10⁻⁸); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R² = 0.96%, p = 4.8 × 10⁻⁶). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R² = 0.27%, p = 5.5 × 10⁻¹¹), while AUDIT-P PRS was more strongly associated with problem drinking (R² = 0.40%, p = 9.0 × 10⁻⁷). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R² = 0.18%, p < 2.0 × 10⁻¹⁶).
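The R² values reported for a PRS are typically incremental: the variance in the phenotype explained by the score over and above covariates. A simulated sketch of that computation follows; all data, covariates and effect sizes below are invented for illustration, not taken from these cohorts.

```python
# Incremental R^2 of a polygenic risk score: R^2(covariates + PRS)
# minus R^2(covariates alone). Simulated data throughout.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
age = rng.normal(50, 10, n)
sex = rng.integers(0, 2, n)
prs = rng.normal(0, 1, n)
# Simulated phenotype with a small true PRS effect (~0.5% of variance)
pheno = 0.02 * age + 0.1 * sex + 0.07 * prs + rng.normal(0, 1, n)

def r2(X, y):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

base = r2(np.column_stack([age, sex]), pheno)
full = r2(np.column_stack([age, sex, prs]), pheno)
incremental_r2 = full - base   # variance explained by the PRS itself
```

Sub-1% figures like those in the abstract are typical for current alcohol-use PRS, which is why they are reported to two decimal places.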
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
A consensus workshop on low-calorie sweeteners (LCS) was held in November 2018 where seventeen experts (the panel) discussed three themes identified as key to the science and policy of LCS: (1) weight management and glucose control; (2) consumption, safety and perception; (3) nutrition policy. The aims were to identify the reliable facts on LCS, suggest research gaps and propose future actions. The panel agreed that the safety of LCS is demonstrated by a substantial body of evidence reviewed by regulatory experts and current levels of consumption, even for high users, are within agreed safety margins. However, better risk communication is needed. More emphasis is required on the role of LCS in helping individuals reduce their sugar and energy intake, which is a public health priority. Based on reviews of clinical evidence to date, the panel concluded that LCS can be beneficial for weight management when they are used to replace sugar in products consumed in the diet (without energy substitution). The available evidence suggests no grounds for concerns about adverse effects of LCS on sweet preference, appetite or glucose control; indeed, LCS may improve diabetic control and dietary compliance. Regarding effects on the human gut microbiota, data are limited and do not provide adequate evidence that LCS affect gut health at doses relevant to human use. The panel identified research priorities, including collation of the totality of evidence on LCS and body weight control, monitoring and modelling of LCS intakes, impacts on sugar reduction and diet quality and developing effective communication strategies to foster informed choice. There is also a need to reconcile policy discrepancies between organisations and reduce regulatory hurdles that impede low-energy product development and reformulation.
This study used data from 12 cultural groups in 9 countries (China, Colombia, Italy, Jordan, Kenya, Philippines, Sweden, Thailand, and United States; N = 1,315) to investigate bidirectional associations between parental warmth and control and child externalizing and internalizing behaviors. In addition, we examined the extent to which these associations held across mothers and fathers and across cultures with differing normative levels of parental warmth and control. Mothers, fathers, and children completed measures when children were ages 8 to 13. Multiple-group autoregressive cross-lagged structural equation models revealed that evocative child-driven effects of externalizing and internalizing behavior on warmth and control are ubiquitous across development, cultures, mothers, and fathers. Results also revealed that parenting effects on child externalizing and internalizing behaviors, though rarer than child effects, extend into adolescence when examined separately in mothers and fathers. Father-based parent effects were more frequent than mother effects. Most parent- and child-driven effects appear to emerge consistently across cultures. The rare culture-specific parenting effects suggested that, occasionally, the effects of parenting behaviors that run counter to cultural norms may be delayed in rendering their protective effect against deleterious child outcomes.
Neurodevelopment is sensitive to genetic and pre/postnatal environmental influences. These effects are likely mediated by epigenetic factors, yet current knowledge is limited. Longitudinal twin studies can delineate the link between genetic and environmental factors, epigenetic state at birth and neurodevelopment later in childhood. Building upon our study of the Peri/postnatal Epigenetic Twin Study (PETS) from gestation to 6 years of age, here we describe the PETS 11-year follow-up in which we will use neuroimaging and cognitive testing to examine the relationship between early-life environment, epigenetics and neurocognitive outcomes in mid-childhood. Using a within-pair twin model, the primary aims are to (1) identify early-life epigenetic correlates of neurocognitive outcomes; (2) determine the developmental stability of epigenetic effects and (3) identify modifiable environmental risk factors. Secondary aims are to identify factors influencing gut microbiota between 6 and 11 years of age to investigate links between gut microbiota and neurodevelopmental outcomes in mid-childhood. Approximately 210 twin pairs will undergo an assessment at 11 years of age. This includes a direct child cognitive assessment, multimodal magnetic resonance imaging, biological sampling, anthropometric measurements and a range of questionnaires on health and development, behavior, dietary habits and sleeping patterns. Data from complementary data sources, including the National Assessment Program — Literacy and Numeracy and the Australian Early Development Census, will also be sought. Following on from our previous focus on relationships between growth, cardiovascular health and oral health, this next phase of PETS will significantly advance our understanding of the environmental interactions that shape the developing brain.
The Single Ventricle Reconstruction Trial randomised neonates with hypoplastic left heart syndrome to a shunt strategy but otherwise retained standard of care. We aimed to describe centre-level practice variation at Fontan completion.
Centre-level data are reported as median or median frequency across all centres and range of medians or frequencies across centres. Classification and regression tree analysis assessed the association of centre-level factors with length of stay and percentage of patients with prolonged pleural effusion (>7 days).
The median Fontan age (14 centres, 320 patients) was 3.1 years (range, 1.7–3.9 years), and the median weight-for-age z-score was −0.56 (range, −1.35 to +0.44). Extra-cardiac Fontans were performed in 79% (4–100%) of patients at the 13 centres performing this procedure; lateral tunnels were performed in 32% (3–100%) at the 11 centres performing it. Use of deep hypothermic circulatory arrest (nine centres) ranged from 6% to 100%. Major complications occurred in 17% (7–33%). The length of stay was 9.5 days (9–12); 15% (6–33%) had prolonged pleural effusion. Centres with fewer patients (<6%) with prolonged pleural effusion and fewer (<41%) complications had a shorter length of stay (<10 days; sensitivity 1.0; specificity 0.71; area under the curve 0.96). Avoiding deep hypothermic circulatory arrest and higher weight-for-age z-score were associated with a lower percentage of patients with prolonged effusions (<9.5%; sensitivity 1.0; specificity 0.86; area under the curve 0.98).
Fontan perioperative practices varied widely among study centres. Strategies to decrease the duration of pleural effusion and minimise complications may decrease the length of stay. Further research regarding deep hypothermic circulatory arrest is needed to understand its association with prolonged pleural effusion.
The National Visitor Use Monitoring (NVUM) program data underlie estimates of the volume of recreation use of the National Forest System. The data also enable estimation of both the local economic contributions and nonmarket benefits of that visitation. Applications include evaluating the effects of natural disasters, site characteristics, and climate change, as well as expenditure and benefit transfers. This article describes the history and science background of the NVUM program, outlines the methods used in estimating market and nonmarket economic outcomes, and lists some examples of results found in the literature.
Experimental studies have reported anti-inflammatory properties of polyphenols. However, results from epidemiological investigations have been inconsistent, and studies using biomarkers to assess polyphenol intake have been especially scarce. We aimed to characterise the association between plasma concentrations of thirty-five polyphenol compounds and low-grade systemic inflammation state as measured by high-sensitivity C-reactive protein (hsCRP). A cross-sectional data analysis was performed based on 315 participants in the European Prospective Investigation into Cancer and Nutrition cohort with available measurements of plasma polyphenols and hsCRP. In logistic regression analysis, the OR and 95 % CI of elevated serum hsCRP (>3 mg/l) were calculated within quartiles and per standard deviation higher level of plasma polyphenol concentrations. In a multivariable-adjusted model, the sum of plasma concentrations of all polyphenols measured (per standard deviation) was associated with 29 (95 % CI 50, 1) % lower odds of elevated hsCRP. In the class of flavonoids, daidzein was inversely associated with elevated hsCRP (OR 0·66, 95 % CI 0·46, 0·96). Among phenolic acids, statistically significant associations were observed for 3,5-dihydroxyphenylpropionic acid (OR 0·58, 95 % CI 0·39, 0·86), 3,4-dihydroxyphenylpropionic acid (OR 0·63, 95 % CI 0·46, 0·87), ferulic acid (OR 0·65, 95 % CI 0·44, 0·96) and caffeic acid (OR 0·69, 95 % CI 0·51, 0·93). The odds of elevated hsCRP were significantly reduced for hydroxytyrosol (OR 0·67, 95 % CI 0·48, 0·93). The present study showed that polyphenol biomarkers are associated with lower odds of elevated hsCRP. Whether a diet rich in bioactive polyphenol compounds could be an effective strategy to prevent or modulate deleterious health effects of inflammation should be addressed by further well-powered longitudinal studies.
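A note on the "per standard deviation" odds ratios used here: when the exposure is standardized, exp(β × SD) converts a per-unit log-odds coefficient into a per-SD OR. The coefficient and SD below are assumed illustrative values, not the study's estimates.

```python
# Converting a per-unit logistic regression coefficient into a
# per-SD odds ratio. Both numbers below are illustrative assumptions.
import math

beta_per_unit = -0.0137   # assumed log-odds change per unit of exposure
sd = 25.0                 # assumed SD of the summed polyphenol concentration

or_per_sd = math.exp(beta_per_unit * sd)   # ~0.71, i.e. ~29% lower odds per SD
```

An OR of about 0.71 per SD corresponds to the "29 % lower odds" phrasing in the abstract: the percentage reduction is 1 minus the OR.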
The association between schizophrenia and decreased vitamin D levels is well documented. Low maternal and postnatal vitamin D levels suggest a possible etiological mechanism. Alternatively, vitamin D deficiency in patients with schizophrenia is presumably (also) the result of disease-related factors or demographic risk factors such as urbanicity.
In a study population of 347 patients with psychotic disorder and 282 controls, group differences in vitamin D concentration were examined. Within the patient group, associations between vitamin D, symptom levels and clinical variables were analyzed. Group × urbanicity interactions in the model of vitamin D concentration were examined. Both current urbanicity and urbanicity at birth were assessed.
Vitamin D concentrations were significantly lower in patients (B = −8.05; 95% confidence interval (CI) −13.68 to −2.42; p = 0.005). In patients, higher vitamin D concentration was associated with lower positive (B = −0.02; 95% CI −0.04 to 0.00; p = 0.049) and negative symptom levels (B = −0.03; 95% CI −0.05 to −0.01; p = 0.008). Group differences were moderated by urbanicity at birth (χ2 = 6.76 and p = 0.001), but not by current urbanicity (χ2 = 1.50 and p = 0.224). Urbanicity at birth was negatively associated with vitamin D concentration in patients (B = −5.11; 95% CI −9.41 to −0.81; p = 0.020), but not in controls (B = 0.72; 95% CI −4.02 to 5.46; p = 0.765).
Lower vitamin D levels in patients with psychotic disorder may in part reflect the effect of psychosis risk mediated by early environmental adversity. The data also suggest that lower vitamin D and psychopathology may be related through direct or indirect mechanisms.
There are a variety of causes of acute heart failure in children including myocarditis, genetic/metabolic conditions, and congenital heart defects. In cases with a structurally normal heart and a negative personal and family history, myocarditis is often presumed to be the cause, but we hypothesise that genetic disorders contribute to a significant portion of these cases. We reviewed our cases of children who presented with acute heart failure and underwent genetic testing from 2008 to 2017. Eighty-seven percent of these individuals were found to have either a genetic syndrome or a pathogenic or likely pathogenic variant in a cardiac-related gene. None of these individuals had a personal or family history of cardiomyopathy that was suggestive of a genetic aetiology prior to presentation. All of these individuals either died or were listed for cardiac transplantation, indicating that genetic testing may provide important information regarding prognosis in addition to information critical to the assessment of family members.
The widespread use of herbicides in cropping systems has led to the evolution of resistance in major weeds. The resultant loss of herbicide efficacy is compounded by a lack of new herbicide sites of action, driving demand for alternative weed control technologies. While there are many alternative methods for control, identifying the most appropriate method to pursue for commercial development has been hampered by the inability to compare techniques in a fair and equitable manner. Given that all currently available and alternative weed control methods share an intrinsic energy consumption, the aim of this review was to compare methods based on energy consumption. Energy consumption was compared for chemical, mechanical, and thermal weed control technologies when applied as broadcast (whole-field) and site-specific treatments. Tillage systems, such as flex-tine harrow (4.2 to 5.5 MJ ha⁻¹), sweep cultivator (13 to 14 MJ ha⁻¹), and rotary hoe (12 to 17 MJ ha⁻¹), consumed the least energy of broadcast weed control treatments. Thermal-based approaches, including flaming (1,008 to 4,334 MJ ha⁻¹) and infrared (2,000 to 3,887 MJ ha⁻¹), are more appropriate for use in conservation cropping systems; however, their energy requirements are 100- to 1,000-fold greater than those of tillage treatments. The site-specific application of weed control treatments to control 2-leaf-stage broadleaf weeds at a density of 5 plants m⁻² reduced energy consumption of herbicidal, thermal, and mechanical treatments by 97%, 99%, and 97%, respectively. Significantly, this site-specific approach resulted in similar energy requirements for current and alternative technologies (e.g., electrocution [15 to 19 MJ ha⁻¹], laser pyrolysis [15 to 249 MJ ha⁻¹], hoeing [17 MJ ha⁻¹], and herbicides [15 MJ ha⁻¹]).
Because the methods share common energy sources, a standardized energy comparison enables estimation of weed control costs and suggests that site-specific weed management is critical to the economically realistic implementation of alternative technologies.
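The arithmetic behind the site-specific savings is simple proportional scaling: if treatment is applied only to the weed-occupied fraction of the field, energy use scales roughly with the treated area. A sketch follows; the 1% treated fraction is an assumption chosen to be consistent with the ~99% reduction reported for thermal treatments, not a figure from the review.

```python
# Site-specific energy as proportional scaling of broadcast energy.
# The treated fraction (1%) is an illustrative assumption.

def site_specific_energy(broadcast_mj_ha, treated_fraction):
    """Energy per hectare when only the weed-occupied fraction is treated."""
    return broadcast_mj_ha * treated_fraction

# e.g., flaming at ~2,000 MJ/ha broadcast, treating 1% of the area
flaming_ss = site_specific_energy(2000, 0.01)   # 20 MJ/ha
reduction = 1 - flaming_ss / 2000               # 0.99, i.e. 99%
```

Under this scaling, a thermal treatment drops from thousands of MJ ha⁻¹ into the same tens-of-MJ range as tillage and herbicides, which is the review's central point.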
BACTOT, Quebec’s healthcare-associated bloodstream infection (HABSI) surveillance program, has been operating since 2007. In this study, we evaluated changes in HABSI rates across 10 years of BACTOT surveillance under a Bayesian framework.
We conducted a retrospective cohort study of eligible hospitals that had participated in BACTOT for at least 3 years, regardless of their entry date. Multilevel Poisson regressions were fitted independently for cases of HABSI, catheter-associated bloodstream infections (CA-BSIs), non–catheter-associated primary BSIs (NCA-BSIs), and BSIs secondary to urinary tract infections (BSI-UTIs) as the outcome, with the log of patient days as the offset. The log of the mean Poisson rate was decomposed as the sum of a surveillance-year effect, a period effect, and a hospital effect. The main estimate of interest was the cohort-level rate in years 2–10 of surveillance relative to year 1.
Overall, 17,479 cases and 33,029,870 patient days were recorded for the cohort of 77 hospitals. The pooled 10-year HABSI rate was 5.20 per 10,000 patient days (95% CI, 5.12–5.28). For HABSI, CA-BSI, and BSI-UTI, there was no difference between the estimated posterior rates of years 2–10 compared to year 1. The posterior means of the NCA-BSI rate ratios increased from the seventh year until the tenth year, when the rate was 29% (95% CI, 1%–89%) higher than the first-year rate.
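A sketch of how an incidence rate per 10,000 patient days and a normal-approximation confidence interval are computed from surveillance counts. The counts below are round illustrative numbers, not the BACTOT totals, and the Bayesian posterior intervals in the study come from the multilevel model rather than this simple formula.

```python
# Incidence rate per 10,000 patient days with a normal-approximation
# CI based on the Poisson standard error of the count.
# Counts are illustrative only.
import math

def rate_per_10k(cases, patient_days):
    rate = cases / patient_days * 10_000
    se = math.sqrt(cases) / patient_days * 10_000   # Poisson SE scaled to rate
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

rate, (lo, hi) = rate_per_10k(cases=5200, patient_days=10_000_000)
```

With large denominators such as millions of patient days, the interval is very narrow around the point estimate, as in the abstract's 5.12–5.28 band.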
HABSI rates and those of the most frequent subtypes remained stable over the surveillance period. To achieve reductions in incidence, we recommend that more effort be expended in active interventions against HABSI alongside surveillance.