We explored the acceptability of a personalised proteomic risk intervention for patients at increased risk of type 2 diabetes and their healthcare providers, as well as their experience of participating in the delivery of proteomic-based risk feedback in UK primary care.
Advances in proteomics now allow the provision of personalised proteomic risk reports, with the intention of achieving positive behaviour change. This technology has the potential to encourage behaviour change in people at risk of developing type 2 diabetes.
A semi-structured interview study was carried out with patients at risk of type 2 diabetes and their healthcare providers in primary care in the North of England. Patients (n = 17) and healthcare providers (n = 4) were interviewed either face to face or via telephone. Data were analysed using thematic analysis. This qualitative study was nested within a single-arm pilot trial and undertaken in primary care.
The personalised proteomic risk intervention was generally acceptable and the experience was positive. The personalised nature of the report was welcomed, especially the way it provided a holistic approach to risks of organ damage and lifestyle factors. Insights were provided as to how this may change behaviour. Some participants reported difficulties in understanding the format of the presentation of risk and expressed surprise at receiving risk estimates for conditions other than type 2 diabetes. Personalised proteomic risk interventions have the potential to provide holistic and comprehensive assessments of risk factors and lifestyle factors which may lead to positive behaviour change.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
On any reading of 2 Samuel 9–20, Joab’s ruse with the Tekoite woman in 14:2–21 is a pivotal scene. It portrays the moral reasoning which David adopted in order to allow Absalom to return to Jerusalem from exile in Geshur and a new development in the ethics of David’s administration. The formal character of the king’s decision as an official royal pronouncement sets it apart and makes it especially significant. Within the narrative, the account marks a turning point from which ethical thinking in David’s court never seems to recover. The characters construct a theological ethic that places a premium on the communal “togetherness” of God’s people, and the crown accepts it as justification for overlooking the bloodguilt of one of its most marginalized members. The half-life of this ethic in the ensuing narrative wreaks havoc on the kingdom, through two of its most maniacally vulnerable agents. Once bloodguilt can be overlooked for the sake of togetherness, little remains to prevent members of the community from sanctioning the bloodshed of any member perceived to threaten that togetherness. The troubles that follow Absalom’s return are not due simply to the fact that a maniacal member of God’s estate was returned to run amok. Rather, they arise from the problematic ethic employed to justify his restoration. The king’s response further exacerbates the situation. Contextualization within the outworking of divine judgment on the king for his own acts of oppression adds another element to the narrative’s portrayal of causality, part of 2 Sam 8:15–20:26’s critical dramatization of David’s efforts to establish “justice and righteousness for all of his people.”
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
A single radiocarbon date derived from the Buhl burial in south-central Idaho has frequently been used as a data point for the interpretation of the Western Stemmed Tradition (WST) chronology and technology because of the stemmed biface found in situ with the human remains. AMS dating of bone collagen in 1991 produced an age of 10,675 ± 95 ¹⁴C BP, immediately postdating the most widely accepted age range for Clovis. The Buhl burial has been cited as evidence that stemmed point technology may have overlapped with Clovis technology in the Intermountain West. We discuss concerns about the radiocarbon date, arguing that even at face value, the calibrated date has minimal overlap with Clovis at the 95.4% range. Furthermore, the C:N ratio of 3.69 in the analyzed collagen is outside the typical range for well-preserved samples, indicating a postdepositional change in carbon composition, which may make the date erroneously older or younger than the age of the skeleton. Finally, the potential dietary incorporation of small amounts of anadromous fish may indicate that the burial is younger than traditionally accepted. For these reasons, we argue that the Buhl burial cannot be used as evidence of overlap between WST and Clovis.
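The collagen-quality screen underlying the argument above can be sketched as a simple range check. The 2.9–3.6 atomic C:N bounds below are the conventional literature values for well-preserved collagen (after DeNiro), not figures taken from the Buhl study itself:

```python
# Sketch: screening a bone-collagen sample's atomic C:N ratio against the
# conventionally accepted range for well-preserved collagen (~2.9-3.6).
# These bounds are standard literature values, assumed here for illustration.

def collagen_cn_acceptable(cn_ratio, low=2.9, high=3.6):
    """Return True if the atomic C:N ratio falls within the accepted range."""
    return low <= cn_ratio <= high

# The Buhl collagen's reported C:N of 3.69 falls just outside the range,
# flagging possible diagenetic alteration of the carbon fraction.
print(collagen_cn_acceptable(3.69))  # False
print(collagen_cn_acceptable(3.2))   # True
```

A ratio outside this window does not date the sample; it only signals that the measured ¹⁴C age may not reflect the original collagen.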
Cover crops are increasingly being used for weed management, and planting them as diverse mixtures has become an increasingly popular strategy for their implementation. While ecological theory suggests that cover crop mixtures should be more weed suppressive than cover crop monocultures, few experiments have explicitly tested this for more than a single temporal niche. We assessed the effects of cover crop mixtures (5- or 6-species and 14-species mixtures) and monocultures on weed abundance (weed biomass) and weed suppression at the time of cover crop termination. Separate experiments were conducted in Madbury, NH, from 2014 to 2017 for each of three temporal cover-cropping niches: summer (spring planting–summer termination), fall (summer planting–fall termination), and spring (fall planting–subsequent spring termination). Regardless of temporal niche, mixtures were never more weed suppressive than the most weed-suppressive cover crop grown as a monoculture, and the more diverse mixture (14 species) never outperformed the less diverse mixture. Mean weed-suppression levels of the best-performing monocultures in each temporal niche ranged from 97% to 98% for buckwheat (Fagopyrum esculentum Moench) in the summer niche and forage radish (Raphanus sativus L. var. niger J. Kern.) in the fall niche, and 83% to 100% for triticale (×Triticosecale Wittm. ex A. Camus [Secale × Triticum]) in the winter–spring niche. In comparison, weed-suppression levels for the mixtures ranged from 66% to 97%, 70% to 90%, and 67% to 99% in the summer, fall, and spring niches, respectively. Stability of weed suppression, measured as the coefficient of variation, was two to six times greater in the best-performing monoculture compared with the most stable mixture, depending on the temporal niche. 
Results of this study suggest that when weed suppression is the sole objective, farmers are more likely to achieve better results planting the most weed-suppressive cover crop as a monoculture than a mixture.
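The suppression and stability metrics reported above can be sketched as follows. The biomass values are hypothetical, and the exact formulas used in the study (percent reduction relative to a no-cover control; coefficient of variation across site-years) are assumptions:

```python
from statistics import mean, stdev

def pct_suppression(weed_biomass_cover, weed_biomass_control):
    """Percent reduction in weed biomass relative to a no-cover-crop control."""
    return 100.0 * (1.0 - weed_biomass_cover / weed_biomass_control)

def cv(values):
    """Coefficient of variation (%); a lower CV means more stable suppression."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical weed biomass (g m^-2) across three site-years, vs. a 500 g m^-2 control
monoculture = [pct_suppression(c, 500.0) for c in (10.0, 12.0, 15.0)]
mixture = [pct_suppression(c, 500.0) for c in (15.0, 100.0, 165.0)]
print(cv(monoculture) < cv(mixture))  # True: the monoculture is more stable here
```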
Poor physical health in severe mental illness (SMI) remains a major issue for clinical practice.
To use electronic health records of routinely collected clinical data to determine levels of screening for cardiometabolic disease and adverse health outcomes in a large sample (n = 7718) of patients with SMI, predominantly schizophrenia and bipolar disorder.
We linked data from the Glasgow Psychosis Clinical Information System (PsyCIS) to morbidity records, routine blood results and prescribing data.
There was no record of routine blood monitoring during the preceding 2 years for 16.9% of the cohort. Monitoring was poorer for male patients, younger patients (aged 16–44), those with schizophrenia, and for tests of cholesterol, triglycerides and glycosylated haemoglobin. We estimated that 8.0% of participants had diabetes and that lipid levels, and use of lipid-lowering medication, were generally high.
Electronic record linkage identified poor health screening and adverse health outcomes in this vulnerable patient group. This approach can inform the design of future interventions and health policy.
Weed management is a major challenge in organic crop production, and organic farms generally harbor larger weed populations and more diverse communities compared with conventional farms. However, little research has been conducted on the effects of different organic management practices on weed communities and crop yields. In 2014 and 2015, we measured weed community structure and soybean [Glycine max (L.) Merr.] yield in a long-term experiment that compared four organic cropping systems that differed in nutrient inputs, tillage, and weed management intensity: (1) high fertility (HF), (2) low fertility (LF), (3) enhanced weed management (EWM), and (4) reduced tillage (RT). In addition, we created weed-free subplots within each system to assess the impact of weeds on soybean yield. Weed density was greater in the LF and RT systems compared with the EWM system, but weed biomass did not differ among systems. Weed species richness was greater in the RT system compared with the EWM system, and weed community composition differed between RT and other systems. Our results show that differences in weed community structure were primarily related to differences in tillage intensity, rather than nutrient inputs. Soybean yield was lower in the EWM system compared with the HF and RT systems. When averaged across all four cropping systems and both years, soybean yield in weed-free subplots was 10% greater than soybean yield in the ambient weed subplots that received standard management practices for the systems in which they were located. Although weed competition limited soybean yield across all systems, the EWM system, which had the lowest weed density, also had the lowest soybean yield. Future research should aim to overcome such trade-offs between weed control and yield potential, while conserving weed species richness and the ecosystem services associated with increased weed diversity.
Background: Observational studies have reported an association between childhood obesity and a higher risk of multiple sclerosis (MS). However, the difficulty of fully accounting for confounding and the long recall periods involved make causal inference from these studies challenging. The objective of this study was to assess the contribution of childhood obesity to the development of MS through Mendelian randomization, which uses genetic associations to minimize the risk of confounding. Methods: We selected 23 independent genetic variants strongly associated with childhood body mass index (BMI) in a genome-wide association study (GWAS) which included 47,541 children. The corresponding effects of these variants on risk of MS were obtained from a GWAS of 14,802 MS cases and 26,703 controls. Standard two-sample Mendelian randomization methods were performed, with additional sensitivity analyses to assess the likelihood of bias from genetic pleiotropy. Results: The inverse-variance weighted MR analysis revealed that a one standard deviation increase in childhood BMI increased the odds of MS by 26% (odds ratio=1.26, 95% confidence interval 1.10-1.45, p=0.001). There was no significant heterogeneity across the individual estimates. Sensitivity analyses were consistent with the main findings and provided no evidence of pleiotropy. Conclusions: This study provides genetic support for a role of increased childhood BMI in the development of MS.
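The inverse-variance weighted (IVW) estimator mentioned above pools per-variant causal estimates into a single effect. A minimal fixed-effect sketch, using made-up per-variant values rather than the study's data:

```python
from math import sqrt, exp

def ivw_mr(ratio_estimates, ratio_ses):
    """Fixed-effect inverse-variance weighted pooling of per-variant Wald
    ratios (variant-outcome effect divided by variant-exposure effect)."""
    weights = [1.0 / se**2 for se in ratio_ses]
    beta = sum(w * b for w, b in zip(weights, ratio_estimates)) / sum(weights)
    se = sqrt(1.0 / sum(weights))
    return beta, se

# Illustrative (made-up) per-variant log-odds of MS per SD of childhood BMI
betas = [0.30, 0.15, 0.18, 0.25]
ses = [0.10, 0.08, 0.12, 0.09]
beta, se = ivw_mr(betas, ses)
print(round(exp(beta), 2))  # pooled odds ratio per SD increase: 1.24
```

Exponentiating the pooled log-odds gives the odds ratio scale on which the study reports its result.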
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance for both the identification of cross-transmission events and the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise to reduce healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
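One common way candidate transmission clusters are called from WGS data is to group isolates whose pairwise core-genome SNV distances fall below a species-appropriate threshold (single-linkage). The threshold and distances below are illustrative assumptions, not values or methods taken from this study:

```python
# Sketch: single-linkage grouping of isolates by pairwise SNV distance.
# Threshold and distances are hypothetical, for illustration only.

def snv_clusters(pairwise, isolates, threshold):
    """Group isolates whose pairwise SNV distance <= threshold (union-find)."""
    parent = {i: i for i in isolates}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for (a, b), dist in pairwise.items():
        if dist <= threshold:
            parent[find(a)] = find(b)

    clusters = {}
    for i in isolates:
        clusters.setdefault(find(i), set()).add(i)
    return [c for c in clusters.values() if len(c) > 1]

isolates = ["A", "B", "C", "D"]
pairwise = {("A", "B"): 3, ("A", "C"): 250, ("A", "D"): 310,
            ("B", "C"): 248, ("B", "D"): 305, ("C", "D"): 12}
print(snv_clusters(pairwise, isolates, threshold=20))
# two candidate clusters: {A, B} and {C, D}
```

In practice such genomic clusters are then triaged against clinical epidemiologic data (ward, dates, procedures) before being treated as true cross-transmission events.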
High-residue cover crops can facilitate organic no-till vegetable production when cover crop biomass production is sufficient to suppress weeds (>8000 kg ha−1), and cash crop growth is not limited by soil temperature, nutrient availability, or cover crop regrowth. In cool climates, however, both cover crop biomass production and soil temperature can be limiting for organic no-till. In addition, successful termination of cover crops can be a challenge, particularly when cover crops are grown as mixtures. We tested whether reusable plastic tarps, an increasingly popular tool for small-scale vegetable farmers, could be used to augment organic no-till cover crop termination and weed suppression. We no-till transplanted cabbage into a winter rye (Secale cereale L.)-hairy vetch (Vicia villosa Roth) cover crop mulch that was terminated with either a roller-crimper alone or a roller-crimper plus black or clear tarps. Tarps were applied for durations of 2, 4 and 5 weeks. Across tarp durations, black tarps increased the mean cabbage head weight by 58% compared with the no tarp treatment. This was likely due to a combination of improved weed suppression and nutrient availability. Although soil nutrients and biological activity were not directly measured, remaining cover crop mulch in the black tarp treatments was reduced by more than 1100 kg ha−1 when tarps were removed compared with clear and no tarp treatments. We interpret this as an indirect measurement of biological activity perhaps accelerated by lower daily soil temperature fluctuations and more constant volumetric water content under black tarps. The edges of both tarp types were held down, rather than buried, but moisture losses from the clear tarps were greater and this may have affected the efficacy of clear tarps. Plastic tarps effectively killed the vetch cover crop, whereas it readily regrew in the crimped but uncovered plots. However, emergence of large and smooth crabgrass (Digitaria spp.) 
appeared to be enhanced in the clear tarp treatment. Although this experiment was limited to a single site-year in New Hampshire, it shows that the use of black tarps can overcome some of the obstacles to implementing cover crop-based no-till vegetable production in northern climates.
Many Spanish chroniclers detail violent cultural practices of the indigenous populations they encountered in the Isthmo-Colombian Area; however, lack of physical evidence of interpersonal violence from archaeological contexts has made uncertain the veracity of these claims. At the precolumbian site of Playa Venado in Panama, these accounts of violent mortuary rituals may have influenced the interpretation of the burials encountered in excavations, leading to claims of mutilations and sacrifice, with little or no supporting evidence. This paper considers the physical evidence for interpersonal violence and sacrificial death at Playa Venado based on the burial positioning, demographic composition, and trauma present on the human remains recovered from the site. Analysis of field notes, excavation photos, and the 77 individuals available for study from the site yielded no evidence of perimortem trauma or abnormal body positioning unexplained by taphonomy. The demography at the site tracked with normal patterns of natural age-at-death at the non-elite site of Cerro Juan Díaz rather than the abnormal patterns seen at the large ceremonial sites of Sitio Conte and El Caño. Therefore, we propose an alternative interpretation of the site as a non-elite cemetery containing evidence of re-use and secondary burial practices associated with ancestor veneration rituals.
Objectives: Agenesis of the corpus callosum (AgCC), characterized by developmental absence of the corpus callosum, is one of the most common congenital brain malformations. To date, there are limited data on the neuropsychological consequences of AgCC and factors that modulate different outcomes, especially in children. This study aimed to describe general intellectual, academic, executive, social and behavioral functioning in a cohort of school-aged children presenting for clinical services to a hospital and diagnosed with AgCC. The influences of age, social risk and neurological factors were examined. Methods: Twenty-eight school-aged children (8 to 17 years) diagnosed with AgCC completed tests of general intelligence (IQ) and academic functioning. Executive, social and behavioral functioning in daily life, and social risk, were estimated from parent and teacher rated questionnaires. MRI findings reviewed by a pediatric neurologist confirmed diagnosis and identified brain characteristics. Clinical details including the presence of epilepsy and diagnosed genetic condition were obtained from medical records. Results: In our cohort, ~50% of children experienced general intellectual, academic, executive, social and/or behavioral difficulties and ~20% were functioning at a level comparable to typically developing children. Social risk was important for understanding variability in neuropsychological outcomes. Brain anomalies and complete AgCC were associated with lower mathematics performance and poorer executive functioning. Conclusions: This is the first comprehensive report of the general intellectual, academic, executive, social and behavioral consequences of AgCC in school-aged children. The findings have important clinical implications, suggesting that support to families and targeted intervention could promote positive neuropsychological functioning in children with AgCC who come to clinical attention. (JINS, 2018, 24, 445–455)
The northern New England region includes the states of Vermont, New Hampshire, and Maine and encompasses a large degree of climate and edaphic variation across a relatively small spatial area, making it ideal for studying climate change impacts on agricultural weed communities. We sampled weed seedbanks and measured soil physical and chemical characteristics on 77 organic farms across the region and analyzed the relationships between weed community parameters and select geographic, climatic, and edaphic variables using multivariate procedures. Temperature-related variables (latitude, longitude, mean maximum and minimum temperature) were the strongest and most consistent correlates with weed seedbank composition. Edaphic variables were, for the most part, relatively weaker and inconsistent correlates with weed seedbanks. Our analyses also indicate that a number of agriculturally important weed species are associated with specific U.S. Department of Agriculture plant hardiness zones, implying that future changes in climate factors that result in geographic shifts in these zones will likely be accompanied by changes in the composition of weed communities and therefore new management challenges for farmers.
Tillage is a foundational management practice in many cropping systems. Although effective at reducing weed populations and preparing a crop seedbed, tillage and cultivation can also dramatically alter weed community composition. We examined the impact of soil tillage timing on weed community structure at four sites across the northeastern United States. Soil was tilled every 2 wk throughout the growing season (late April to late September 2013), and weed seedling density was quantified by species 6 wk after each tillage event. We used a randomized complete block design with four replicates for each tillage-timing treatment; a total of 196 plots were sampled. The timing of tillage was an important factor in shaping weed community composition and structure at all sites. We identified three main periods of tillage timing that resulted in similar communities. Across all sites, total weed density tended to be greatest and weed evenness tended to be lowest when soils were tilled early in the growing season. From the earliest to latest group of timings, total abundance decreased on average from 428±393 to 159±189 plants m−2, and evenness increased from 0.53±0.25 to 0.72±0.20. The effect of tillage timing on weed species richness varied by site. Our results show that tillage timing affects weed community structure, suggesting that farmers can manage weed communities and the potential for weed interference by adjusting the timing of their tillage and cropping practices.
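The evenness values reported above (0.53 to 0.72) are typically computed with an index such as Pielou's J, Shannon diversity divided by the log of species richness; whether the study used this exact index is an assumption. A minimal sketch with illustrative seedling counts whose totals echo the reported densities:

```python
from math import log

def pielou_evenness(counts):
    """Pielou's J = Shannon H' / ln(richness S); ranges from 0 to 1."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    shannon = -sum(p * log(p) for p in props)
    return shannon / log(len(props))

# Hypothetical seedling counts (plants m^-2) for four weed species
early_till = [400, 20, 5, 3]   # one species dominates -> low evenness
late_till = [60, 40, 35, 24]   # more balanced community -> higher evenness
print(pielou_evenness(early_till) < pielou_evenness(late_till))  # True
```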
Agricultural expansion contributes to the degradation of biodiverse ecosystems and the services these systems provide. Expansion of urban and peri-urban agriculture (UPA), on the other hand, may hold promise both to expand the portfolio of ecosystem services (ES) available in built environments, where ES are typically low, and to reduce pressure to convert sensitive non-urban, non-agricultural ecosystems to agriculture. However, few data are available to support these hypotheses. Here we review and summarize the research conducted on UPA from 320 peer-reviewed papers published between 2000 and 2014. Specifically, we explored the availability of data regarding UPA's impact on ES and disservices. We also assessed the literature for evidence that UPA can contribute to land sparing. We find that the growth in UPA research over this time period points to the emerging recognition of the potential role that UPA systems play in food production worldwide. However, few studies (n = 15) place UPA in the context of ES, and no studies in our review explicitly quantify the land sparing potential of UPA. Additionally, although only a few studies (n = 19) quantify the production potential of UPA (data necessary to accurately estimate the role these systems can play in land sparing), our rough estimates suggest that agricultural extensification into the world's urban environments via UPA could spare an area approximately twice the size of the US state of Massachusetts. Expanding future UPA research to include quantification of ES and functions would shed light on the ecological tradeoffs associated with agricultural production in the built environment. As food demand increases and urban populations continue to grow, it will be critical to better understand the role urban environments can play in global agricultural production and ecosystem preservation.
Weed resistance to herbicides arises when herbicides are overused and can be mitigated by reducing their use. However, reaching consensus on herbicide resistance management strategies is difficult given the weed science discipline's strong links to industry profit motives.