We aimed to identify groups of children presenting distinct perinatal adversity profiles and test the association between profiles and later risk of suicide attempt.
Data were from the Québec Longitudinal Study of Child Development (QLSCD, N = 1623), and the Avon Longitudinal Study of Parents and Children (ALSPAC, N = 5734). Exposures to 32 perinatal adversities (e.g. fetal, obstetric, psychosocial, and parental psychopathology) were modeled using latent class analysis, and associations with a self-reported suicide attempt by age 20 were investigated with logistic regression. We investigated to what extent childhood emotional and behavioral problems, victimization, and cognition explained the associations.
In both cohorts, we identified five profiles: No perinatal risk, Poor fetal growth, Socioeconomic adversity, Delivery complications, Parental mental health problems (ALSPAC only). Compared to children with No perinatal risk, children in the Poor fetal growth (pooled estimate QLSCD-ALSPAC, OR 1.89, 95% CI 1.04–3.44), Socioeconomic adversity (pooled-OR 1.42, 95% CI 1.08–1.85), and Parental mental health problems (OR 1.74, 95% CI 1.27–2.40), but not Delivery complications, profiles were more likely to attempt suicide. The proportion of this effect mediated by the putative mediators was larger for the Socioeconomic adversity profile compared to the others.
Perinatal adversities associated with suicide attempt cluster in distinct profiles. Suicide prevention may begin early in life and requires a multidisciplinary approach targeting a constellation of factors from different domains (psychiatric, obstetric, socioeconomic), rather than a single factor, to effectively reduce suicide vulnerability. The way these factors cluster together also shapes the pathways leading to a suicide attempt, which can guide decision-making on personalized suicide prevention strategies.
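The pooled QLSCD–ALSPAC estimates reported above combine cohort-specific odds ratios. A minimal sketch of fixed-effect inverse-variance pooling on the log-odds scale, a standard way to combine two such estimates (the inputs below are illustrative, not the study's data):

```python
import math

def pooled_or(ors, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of odds ratios.

    ors: list of study odds ratios
    cis: list of (lower, upper) 95% confidence bounds
    Returns (pooled OR, lower bound, upper bound).
    """
    logs = [math.log(o) for o in ors]
    # Recover each standard error from the 95% CI width on the log scale.
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    ws = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    pooled_log = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    se_pooled = math.sqrt(1.0 / sum(ws))
    return (math.exp(pooled_log),
            math.exp(pooled_log - z * se_pooled),
            math.exp(pooled_log + z * se_pooled))
```

Pooling two identical hypothetical studies (OR 2.0, CI 1.0–4.0) returns the same point estimate with a narrower interval, as expected from the doubled precision.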
Giant miscanthus has the potential to move beyond cultivated fields and invade noncrop areas, but this risk can be overshadowed by its aesthetic appeal and monetary value as a biofuel crop. Most research on giant miscanthus has focused on herbicide tolerance for establishment and production rather than on terminating an existing stand. This study was conducted to evaluate herbicide options for controlling or terminating a stand of giant miscanthus. In 2013 and 2014, field experiments were conducted on established stands of the giant miscanthus cultivars ‘Nagara’ and ‘Freedom.’ Herbicides evaluated in both years included glyphosate, hexazinone, imazapic, imazapyr, clethodim, fluazifop, and glyphosate plus fluazifop. All treatments were applied in summer (June or July) and in September. In both years, biomass reduction ranged from 85% to 100% when glyphosate was applied in June or July at 4.5 or 7.3 kg ae ha−1. No other treatment applied at this timing provided more than 50% giant miscanthus biomass reduction 1 yr after application. September applications of glyphosate were not consistent: treatments in 2013 reduced biomass by 40% or less, whereas in 2014 all rates provided at least 78% biomass reduction. Glyphosate applied in June or July was the only treatment that provided effective and consistent control of giant miscanthus 1 yr after treatment.
Although death by neurologic criteria (brain death) is legally recognized throughout the United States, state laws and clinical practice vary concerning three key issues: (1) the medical standards used to determine death by neurologic criteria, (2) management of family objections before determination of death by neurologic criteria, and (3) management of religious objections to declaration of death by neurologic criteria. The American Academy of Neurology and other medical stakeholder organizations involved in the determination of death by neurologic criteria have undertaken concerted action to address variation in clinical practice in order to ensure the integrity of brain death determination. To complement this effort, state policymakers must revise legislation on the use of neurologic criteria to declare death. We review the legal history and current laws regarding neurologic criteria to declare death and offer proposed revisions to the Uniform Determination of Death Act (UDDA) and the rationale for these recommendations.
Objective: Detection of cognitive impairment suggestive of risk for Alzheimer’s disease (AD) progression is crucial to the prevention of incipient dementia. This study was performed to determine if performance on a novel object discrimination task improved identification of earlier deficits in older adults at risk for AD. Method: In total, 135 participants from the 1Florida Alzheimer’s Disease Research Center [cognitively normal (CN), Pre-mild cognitive impairment (PreMCI), amnestic mild cognitive impairment (aMCI), and dementia] completed a test of object discrimination and traditional memory measures in the context of a larger neuropsychological and clinical evaluation. Results: The Object Recognition and Discrimination Task (ORDT) revealed significant differences between the PreMCI, aMCI, and dementia groups versus CN individuals. Moreover, relative risk of being classified as PreMCI rather than CN increased as an inverse function of ORDT score. Discussion: Overall, the obtained results suggest that a novel object discrimination task improves the detection of very early AD-related cognitive impairment, increasing the window for therapeutic intervention. (JINS, 2019, 25, 688–698)
Cognitive impairment is a core feature of psychotic disorders, but the profile of impairment across adulthood, particularly in African-American populations, remains unclear.
Using cross-sectional data from a case–control study of African-American adults with affective (n = 59) and nonaffective (n = 68) psychotic disorders, we examined cognitive functioning between early and middle adulthood (ages 20–60) on measures of general cognitive ability, language, abstract reasoning, processing speed, executive function, verbal memory, and working memory.
Both affective and nonaffective psychosis patients showed substantial and widespread cognitive impairments. However, comparison of cognitive functioning between controls and psychosis groups throughout early (ages 20–40) and middle (ages 40–60) adulthood also revealed age-associated group differences. During early adulthood, the nonaffective psychosis group showed increasing impairments with age on measures of general cognitive ability and executive function, while the affective psychosis group showed increasing impairment on a measure of language ability. Impairments on other cognitive measures remained mostly stable, although decreasing impairments on measures of processing speed, memory and working memory were also observed.
These findings suggest similarities, but also differences in the profile of cognitive dysfunction in adults with affective and nonaffective psychotic disorders. Both affective and nonaffective patients showed substantial and relatively stable impairments across adulthood. The nonaffective group also showed increasing impairments with age in general and executive functions, and the affective group showed an increasing impairment in verbal functions, possibly suggesting different underlying etiopathogenic mechanisms.
In 2017, the Public Health England South East Health Protection Team (HPT) was involved in the management of an outbreak of Mycobacterium bovis (the causative agent of bovine tuberculosis) in a pack of working foxhounds. This paper summarises the actions taken by the team in managing the public health aspects of the outbreak, and lessons learned to improve the management of future potential outbreaks. A literature search was conducted to identify relevant publications on M. bovis. Clinical notes from the Public Health England (PHE) health protection database were reviewed and key points extracted. Animal and public health stakeholders involved in the management of the situation provided further evidence through unstructured interviews and personal communications. The PHE South East team initially provided ‘inform and advise’ letters to human contacts whilst awaiting laboratory confirmation to identify the infectious agent. Once M. bovis had been confirmed in the hounds, an in-depth risk assessment was conducted, and contacts were stratified into risk pools. Eleven of the 20 exposed persons with the greatest risk of exposure were recommended to attend TB screening; one tested positive but had no evidence of active TB infection. The number of human contacts working with foxhound packs can be large and varied. HPTs should undertake a comprehensive risk assessment of all potential routes of exposure, involve all other relevant stakeholders from an early stage, and undertake regular risk assessments. Current guidance should be revised to account for the unique risks to human health posed by exposure to infected working dogs.
The causative agent of urogenital schistosomiasis, Schistosoma haematobium, was thought to be the only schistosome species transmitted through Bulinus snails on Unguja and Pemba Island (Zanzibar, United Republic of Tanzania). For insights into the environmental risk of S. haematobium transmission on Pemba Island, malacological surveys collecting Bulinus globosus and B. nasutus, two closely related potential intermediate hosts of S. haematobium, were conducted across the island in November 2016. Of 1317 B. globosus/B. nasutus collected, seven B. globosus, identified through sequencing a DNA region of the mitochondrial cytochrome oxidase subunit 1 (cox1), were observed with patent infections assumed to be S. haematobium. However, when the collected cercariae were identified through sequencing a region of the cox1 and the nuclear internal transcribed spacer (ITS1 + 2), schistosomes from five of these B. globosus, collected from a single locality, were in fact S. bovis. The identified presence of S. bovis raises concerns for animal health on Pemba and complicates future transmission monitoring of S. haematobium. These results underscore the need for not only sensitive but also species-specific markers when identifying cercariae during transmission monitoring, and provide the first molecular confirmation of B. globosus transmitting S. bovis in East Africa.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
Adult schistosomes live in the blood vessels and cannot easily be sampled from humans, so archived miracidia larvae hatched from eggs expelled in feces or urine are commonly used for population genetic studies. Large collections of archived miracidia on FTA cards are now available through the Schistosomiasis Collection at the Natural History Museum (SCAN). Here we describe protocols for whole genome amplification of Schistosoma mansoni and Schistosoma haematobium miracidia from these cards, as well as real-time PCR quantification of amplified schistosome DNA. The microgram quantities of DNA obtained were used for exome capture and sequencing of single miracidia, generating dense polymorphism data across the exome. These methods will facilitate the transition from population genetics, using limited numbers of markers, to population genomics, using genome-wide marker information, maximising the value of collections such as SCAN.
To evaluate the impact of discontinuing routine contact precautions (CP) for endemic methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) on hospital adverse events.
Academic medical center with single-occupancy rooms.
We compared hospital reportable adverse events 1 year before and 1 year after discontinuation of routine CP for endemic MRSA and VRE (preintervention and postintervention periods, respectively). Throughout the preintervention period, daily chlorhexidine gluconate bathing was expanded to nearly all inpatients. Chart reviews were performed to identify which patients and events were associated with CP for MRSA/VRE in the preintervention period, as well as the patients who would have met prior criteria for MRSA/VRE CP but were not isolated in the postintervention period. Adverse events during the 2 periods were compared using segmented and mixed-effects Poisson regression models.
There were 24,732 admissions in the preintervention period and 25,536 in the postintervention period. Noninfectious adverse events (ie, postoperative respiratory failure, hemorrhage/hematoma, thrombosis, wound dehiscence, pressure ulcers, and falls or trauma) decreased by 19% (12.3 to 10.0 per 1,000 admissions, P=.022) from the preintervention to the postintervention period. There was no significant difference in the rate of infectious adverse events after CP discontinuation (20.7 to 19.4 per 1,000 admissions, P=.33). Patients with MRSA/VRE showed the largest reduction in noninfectious adverse events after CP discontinuation, with a 72% reduction (21.4 to 6.08 per 1,000 MRSA/VRE admissions; P<.001).
After discontinuing routine CP for endemic MRSA/VRE, the rate of noninfectious adverse events declined, especially in patients who no longer required isolation. This suggests that elimination of CP may substantially reduce noninfectious adverse events.
PRE and POST herbicide options were evaluated to control perilla mint, a potentially deadly plant for livestock. The germination requirements of seed from weedy populations were also investigated to better understand and predict emergence timing. POST applications of aminocyclopyrachlor blends, glyphosate, picloram+2,4-D, aminopyralid+2,4-D, and 2,4-D alone provided superior control of perilla mint when applied in the early reproductive growth stage. Picloram+2,4-D and aminocyclopyrachlor+chlorsulfuron also provided soil residual activity and the most effective PRE control, followed by pendimethalin and aminopyralid+2,4-D. Seed from weedy populations tends to germinate over a range of night/day soil temperatures from 10–15 C to 25–30 C. Therefore, application and activation of the most effective PRE treatments should be made before these temperatures occur in areas where weedy perilla mint populations are found.
Control of noxious weeds such as cogongrass depends heavily on chemical treatment, but success is limited unless it is integrated with other practices. Utilization of cover crops in the system is ideal to avoid the use of excess herbicide and to replace vegetation that will resist cogongrass reinvasion. Greenhouse studies were conducted from 2013 through 2015 at Mississippi State University with the objective of evaluating ‘AG4934’ RR/STS soybean, Korean lespedeza, crimson clover, and ‘Durana’ white clover tolerance to soil-applied imazapyr at selected rates and various planting times after application. Plastic containers filled with a 2:1 sand:topsoil mixture were treated with imazapyr at 0, 70, 140, and 280 g ae ha–1. Legume species were planted 0, 1, 3, and 6 months after treatment (MAT). The factorial experimental design included legume species, imazapyr rate, and planting time. At 6 weeks after each planting, the number of seedlings, average plant height, and shoot biomass were measured. Statistical analysis revealed that the imazapyr rate × planting time interaction was significant for number of emerged seedlings, average height, and shoot biomass per plant for each species. Imazapyr at 70 g ae ha–1 or higher reduced seedling emergence, average height, and biomass production of legumes planted 0 MAT. In general, seeds planted 1 MAT or later at these same herbicide rates showed smaller growth reductions than those seeded 0 MAT. In conclusion, sites treated with imazapyr at 70 to 280 g ae ha–1 for weed control should not be seeded with legume ground covers less than 1 month after treatment, to reduce emergence failure and reductions in plant height and biomass production.
We have previously shown that the minor alleles of vascular endothelial growth factor A (VEGFA) single-nucleotide polymorphism rs833069 and superoxide dismutase 2 (SOD2) single-nucleotide polymorphism rs2758331 are both associated with improved transplant-free survival after surgery for CHD in infants, but the underlying mechanisms are unknown. We hypothesised that one or both of these minor alleles are associated with better systemic ventricular function, resulting in improved survival.
This study is a follow-up analysis of 422 non-syndromic CHD patients who underwent neonatal cardiac surgery with cardiopulmonary bypass. Echocardiographic reports were reviewed. Systemic ventricular function was subjectively categorised as normal, or as mildly, moderately, or severely depressed. The change in function was calculated as the change from the preoperative study to the last available study. Stepwise linear regression, adjusting for covariates, was performed for the outcome of change in ventricular function. Model comparison was performed using Akaike’s information criterion. Only variables that improved the model prediction of change in systemic ventricular function were retained in the final model.
Genetic and echocardiographic data were available for 335/422 subjects (79%). Of them, 33 (9.9%) developed worse systemic ventricular function during a mean follow-up period of 13.5 years. After covariate adjustment, the presence of the VEGFA minor allele was associated with preserved ventricular function (p=0.011).
These data support the hypothesis that the mechanism by which the VEGFA single-nucleotide polymorphism rs833069 minor allele improves survival may be the preservation of ventricular function. Further studies are needed to validate this genotype–phenotype association and to determine whether this mechanism is related to increased vascular endothelial growth factor production.
To evaluate the impact of discontinuation of contact precautions (CP) for methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) and expansion of chlorhexidine gluconate (CHG) use on the health system.
We compared hospital-wide laboratory-identified clinical culture rates (as a marker of healthcare-associated infections) 1 year before and after routine CP for endemic MRSA and VRE were discontinued and CHG bathing was expanded to all units. Culture data from patients and cost data on material utilization were collected. Nursing time spent donning personal protective equipment was assessed and quantified using time-driven activity-based costing.
Average positive culture rates before and after discontinuing CP were 0.40 and 0.32 cultures/100 admissions for MRSA (P=.09), and 0.48 and 0.40 cultures/100 admissions for VRE (P=.14). When combining isolation gown and CHG costs, the health system saved $643,776 in 1 year. Before the change, 28.5% of intensive care unit beds and 19% of medicine/surgery beds were on CP for MRSA/VRE. On the basis of average room entries and donning time, estimated nursing time spent donning personal protective equipment for MRSA/VRE before the change was 45,277 hours/year (estimated cost, $4.6 million).
Discontinuing routine CP for endemic MRSA and VRE did not result in increased rates of MRSA or VRE after 1 year. With cost savings on materials, decreased healthcare worker time, and no concomitant increase in possible infections, elimination of routine CP may add substantial value to inpatient care delivery.
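The nursing-time estimate above follows time-driven activity-based costing logic: beds on precautions, multiplied by daily room entries, donning time per entry, and an hourly labor cost. A hedged sketch of that arithmetic with wholly hypothetical inputs (the study's actual bed counts, entry rates, donning times, and wage figures are not reproduced here):

```python
def ppe_donning_cost(beds_on_cp, entries_per_bed_day, donning_min,
                     days=365, hourly_rate=100.0):
    """Time-driven activity-based costing sketch for PPE donning.

    beds_on_cp: number of beds on contact precautions (hypothetical)
    entries_per_bed_day: average room entries per bed per day (hypothetical)
    donning_min: minutes spent donning PPE per entry (hypothetical)
    hourly_rate: fully loaded nursing cost per hour (hypothetical)
    Returns (total hours per year, total cost per year).
    """
    hours = beds_on_cp * entries_per_bed_day * donning_min / 60.0 * days
    return hours, hours * hourly_rate
```

For example, 50 isolated beds with 30 entries/day at 2 minutes each works out to 18,250 hours/year; scaling the inputs shows how quickly the annual burden reaches the tens of thousands of hours reported in the abstract.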
The importance of chronic low-grade inflammation in the pathology of numerous age-related chronic conditions is now clear. An unresolved inflammatory response is likely to be involved from the early stages of disease development. The present position paper is the most recent in a series produced by the International Life Sciences Institute's European Branch (ILSI Europe). It is co-authored by the speakers from a 2013 workshop led by the Obesity and Diabetes Task Force entitled ‘Low-grade inflammation, a high-grade challenge: biomarkers and modulation by dietary strategies’. The latest research in the areas of acute and chronic inflammation and cardiometabolic, gut, and cognitive health is presented, along with the cellular and molecular mechanisms underlying inflammation–health/disease associations. The evidence relating diet composition and early-life nutrition to inflammatory status is reviewed. Human epidemiological and intervention data are thus far heavily reliant on the measurement of inflammatory markers in the circulation, in particular cytokines in the fasting state, which are recognised as an insensitive and highly variable index of tissue inflammation. Potential novel kinetic and integrated approaches to capture inflammatory status in humans are discussed. Such approaches are likely to provide a more discriminating means of quantifying inflammation–health/disease associations and the ability of diet to positively modulate inflammation, and to provide the much-needed evidence to develop research portfolios that will inform new product development and associated health claims.
Attitudes to aging have been investigated in non-carer populations and found to have important relationships with physical and mental health. However, these have not been explored in an older carer sample, although it is becoming increasingly important to clarify variables which are linked with positive carer outcomes. This is one of the first studies to report on older carers, their attitudes to aging, and the relationship with carer-related factors.
A cross-sectional study of 202 carers with a mean age of 70.8 years was conducted in Victoria, Australia, using carer demographic data, carer factors such as depression (using the Geriatric Depression Scale), burden (using the Zarit Burden Inventory, ZBI), physical health, personality, and attitudes to aging (using the Attitudes to Aging Questionnaire, AAQ). Spearman rank correlation and hierarchical regression analyses were used.
This study showed that carers had overall positive attitudes to aging in spite of their caring role. It also identified that carer factors, including depression and burden, contributed a significant amount of the variance in attitudes to aging in terms of physical change and psychosocial loss. Personality traits, specifically neuroticism and extraversion, were also important contributors to attitudes to aging.
Results from this study demonstrated that, in spite of moderate levels of depression and significant time spent caring, carers reported positive attitudes to aging. Treating depression, decreasing burden, and investigating the benefits of caring may help older carers maintain their well-being.