Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
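To make the polygenic score (PGS) analysis above concrete, the following is a minimal sketch of regressing age at onset on a standardized PGS with covariates, so the coefficient reads as years per standard deviation of PGS, matching the effect sizes reported (e.g. β = −0.39 years). It is not the consortium's pipeline; the file name, the column names (`aao`, `pgs_scz`, `sex`, `cohort`) and the use of ordinary least squares are assumptions, and real analyses would also adjust for genetic ancestry principal components.

```python
# Minimal sketch of a polygenic-score association test for age at onset (AAO).
# Assumed columns: 'aao' (years), 'pgs_scz' (raw polygenic score), plus
# 'sex' and 'cohort' as covariates. Ancestry principal components would
# normally be included as well.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bd_onset_phenotypes.csv")  # hypothetical input file

# Standardize the PGS so beta is in years per SD of polygenic score.
df["pgs_z"] = (df["pgs_scz"] - df["pgs_scz"].mean()) / df["pgs_scz"].std()

model = smf.ols("aao ~ pgs_z + C(sex) + C(cohort)", data=df).fit()
print(model.params["pgs_z"], model.bse["pgs_z"])  # beta (years/SD) and s.e.
```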
The First Episode Rapid Early Intervention for Eating Disorders (FREED) service model is associated with significant reductions in wait times and improved clinical outcomes for emerging adults with recent-onset eating disorders. An understanding of how FREED is implemented is a necessary precondition to enable an attribution of these findings to key components of the model, namely the wait-time targets and care package.
This study evaluated fidelity to the FREED service model during the multicentre FREED-Up study.
Participants were 259 emerging adults (aged 16–25 years) with an eating disorder of <3 years duration, offered treatment through the FREED care pathway. Patient journey records documented patient care from screening to end of treatment. Adherence to wait-time targets (engagement call within 48 h, assessment within 2 weeks, treatment within 4 weeks) and care package, and differences in adherence across diagnosis and treatment group were examined.
There were significant increases (16–40%) in adherence to the wait-time targets following the introduction of FREED, irrespective of diagnosis. Receiving FREED under optimal conditions also increased adherence to the targets. Care package use differed by component and diagnosis. The most used care package activities were psychoeducation and dietary change. Attention to transitions was less well used.
This study provides an indication of adherence levels to key components of the FREED model. These adherence rates can tentatively be considered as clinically meaningful thresholds. Results highlight aspects of the model and its implementation that warrant future examination.
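As an illustration of how adherence to the FREED wait-time targets could be computed from patient journey records, the sketch below checks each patient against the three targets named in the abstract (engagement call within 48 h, assessment within 2 weeks, treatment within 4 weeks). The CSV layout and column names are assumptions, not the FREED-Up study's actual data structure.

```python
# Sketch of wait-time adherence from patient journey records (assumed columns).
import pandas as pd

records = pd.read_csv(
    "patient_journey_records.csv",
    parse_dates=["referral", "engagement_call", "assessment", "treatment_start"],
)

targets = {
    "engagement_call": pd.Timedelta(hours=48),
    "assessment": pd.Timedelta(weeks=2),
    "treatment_start": pd.Timedelta(weeks=4),
}

for step, limit in targets.items():
    met = (records[step] - records["referral"]) <= limit
    print(f"{step}: {met.mean():.0%} of patients within target")
```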
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
A single radiocarbon date derived from the Buhl burial in south-central Idaho has frequently been used as a data point for the interpretation of the Western Stemmed Tradition (WST) chronology and technology because of the stemmed biface found in situ with the human remains. AMS dating of bone collagen in 1991 produced an age of 10,675 ± 95 14C BP, immediately postdating the most widely accepted age range for Clovis. The Buhl burial has been cited as evidence that stemmed point technology may have overlapped with Clovis technology in the Intermountain West. We discuss concerns about the radiocarbon date, arguing that even at face value, the calibrated date has minimal overlap with Clovis at the 95.4% range. Furthermore, the C:N ratio of 3.69 in the analyzed collagen is outside of the typical range for well-preserved samples, indicating a postdepositional change in carbon composition, which may make the date erroneously older or younger than the age of the skeleton. Finally, the potential dietary incorporation of small amounts of anadromous fish may indicate that the burial is younger than traditionally accepted. For these reasons, we argue that the Buhl burial cannot be used as evidence of overlap between WST and Clovis.
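The collagen quality criterion invoked above is conventionally expressed as an atomic C:N ratio of roughly 2.9 to 3.6 for well-preserved bone collagen (DeNiro 1985); a value of 3.69 falls just outside that window. The sketch below shows the standard calculation of the atomic ratio from weight-percent carbon and nitrogen; the example percentages are illustrative, not the Buhl measurements.

```python
# Atomic C:N ratio from weight-percent C and N, checked against the commonly
# used 2.9-3.6 range for well-preserved bone collagen (DeNiro 1985).
C_ATOMIC_MASS = 12.011
N_ATOMIC_MASS = 14.007

def atomic_cn_ratio(pct_c: float, pct_n: float) -> float:
    """Atomic C:N ratio from weight-percent carbon and nitrogen."""
    return (pct_c / C_ATOMIC_MASS) / (pct_n / N_ATOMIC_MASS)

ratio = atomic_cn_ratio(pct_c=43.0, pct_n=15.5)  # hypothetical measurements
print(f"C:N = {ratio:.2f}, within accepted range: {2.9 <= ratio <= 3.6}")
```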
The increasing prevalence of overweight and obesity in England has led policymakers to consider regulating the use of price promotions on foods high in fat, sugar and salt. In January 2019, the government opened a consultation programme for a policy proposal that significantly restricts the use of price promotions that can induce consumers to buy higher volumes of unhealthy foods and beverages. These proposed policies are the first of their kind in public health and are believed to reduce excess purchasing and, therefore, overconsumption of unhealthy products. This study summarises evidence relating price promotions to the purchasing of food and drink for home consumption and places it in the context of the proposed policy.
Non-systematic review of quantitative analyses of price promotions in food and drink published in peer-reviewed journals and indexed in PubMed, ScienceDirect and EBSCOhost between 1980 and January 2018.
While the impact of price promotions on sales has long been of interest to marketing academics, with modelling studies showing that their use has increased food and drink sales by 12–43%, it is only now being picked up in the public health sphere. However, existing evidence does not consider the effects of removing or restricting the use of price promotions across the food sector. In this commentary, we discuss the existing evidence, how it deals with the complexity of shoppers’ behaviour in reacting to price promotions on foods and, importantly, what can be learned from it in this policy context.
The current evidence base supports the notion that price promotions increase purchasing of unhealthy food, and while the proposed restriction policy is yet to be evaluated for consumption and health effects, there is arguably sufficient evidence to proceed. This evidence is not restricted to volume-based promotions. Close monitoring and proper evaluation should follow to provide empirical evidence of its intended and unintended effects.
Teachers of traditional courses in Western civilization have often found it difficult to be enthusiastic about the three centuries between 600 and 900 CE. These were the “Dark Ages,” and Europe was still feeling the reverberations from the momentous “fall” of the Roman empire, now characterized more as a “transition,” that had been transpiring since the third century. Migration and invasion by tribal peoples from the hinterland, institutional breakdown, and a decline in public order had all been manifestations of this process. Underlying it were larger considerations including climatic deterioration and a demographic downturn that became a catastrophe in the sixth century with a pandemic often referred to as the Plague of Justinian. Fewer people in a pre-industrial economy meant fewer producers and consumers and thus a drop in economic activity. And the decline was across the board. The demand for artisanal and industrial goods plunged, and the amount of cultivated land contracted. Prosperity decreased, and poverty increased. The surviving urban areas were generally administrative or ecclesiastical rather than commercial centres. Infrastructure, especially roads, was not maintained, and much of the transportation system collapsed. The market economy itself was disassembled and in some places practically disintegrated; advanced sectors like banking disappeared. The monetary system floundered, and more primitive forms of exchange like barter and gift-giving were revived. Long-distance trade was confined to luxury and prestige goods, and even interregional markets disappeared.
Or so this worst-case scenario version of the Early Middle Ages has often been presented. Historians who have taken a closer look have tried to paint a more nuanced view. Different places at different times had different experiences. Along with the big downswing there were many smaller upswings, and deep down new forces for change were starting to bubble. The real problem with this discussion is that Europe just wasn't a very important place for trade and commerce or most other matters historians deal with in the period between 600 and 900. In more important places there was nothing dark about this time; for trade and commerce, it was one of history's most luminous ages. Driving this trade, both overland and maritime, were two great empires on either side of Eurasia: Tang China in the east and the Islamic caliphate in the west.
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows. However, weed control within the crop rows is necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row. These cultivators do not differentiate crops and weeds and do not work well among high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. Results from field trials in marked tomato and lettuce found that the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. The accurate crop and weed differentiation described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
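The following toy sketch illustrates the crop-marking idea described above: anything carrying a machine-detectable mark is treated as crop and spared, and all other vegetation becomes a weed candidate. This is not the Robovator's algorithm or the markers used in the trials; the marker hue range, the generic green range and the image file are assumptions for illustration only.

```python
# Toy illustration of marker-based crop/weed differentiation via colour
# thresholding. All thresholds and the input image are hypothetical.
import numpy as np
import cv2

frame = cv2.imread("row_image.jpg")  # hypothetical camera frame
assert frame is not None, "could not read image"
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Assume a topical marker with a distinctive magenta hue identifies the crop.
marker_mask = cv2.inRange(hsv, (140, 80, 80), (170, 255, 255))

# Generic green vegetation; anything green but unmarked is a weed candidate.
plant_mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
weed_mask = cv2.bitwise_and(plant_mask, cv2.bitwise_not(marker_mask))
print("weed-candidate pixels:", int(np.count_nonzero(weed_mask)))
```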
Weed management is a major challenge in organic crop production, and organic farms generally harbor larger weed populations and more diverse communities compared with conventional farms. However, little research has been conducted on the effects of different organic management practices on weed communities and crop yields. In 2014 and 2015, we measured weed community structure and soybean [Glycine max (L.) Merr.] yield in a long-term experiment that compared four organic cropping systems that differed in nutrient inputs, tillage, and weed management intensity: (1) high fertility (HF), (2) low fertility (LF), (3) enhanced weed management (EWM), and (4) reduced tillage (RT). In addition, we created weed-free subplots within each system to assess the impact of weeds on soybean yield. Weed density was greater in the LF and RT systems compared with the EWM system, but weed biomass did not differ among systems. Weed species richness was greater in the RT system compared with the EWM system, and weed community composition differed between RT and other systems. Our results show that differences in weed community structure were primarily related to differences in tillage intensity, rather than nutrient inputs. Soybean yield was lower in the EWM system compared with the HF and RT systems. When averaged across all four cropping systems and both years, soybean yield in weed-free subplots was 10% greater than soybean yield in the ambient weed subplots that received standard management practices for the systems in which they were located. Although weed competition limited soybean yield across all systems, the EWM system, which had the lowest weed density, also had the lowest soybean yield. Future research should aim to overcome such trade-offs between weed control and yield potential, while conserving weed species richness and the ecosystem services associated with increased weed diversity.
Cougar Mountain Cave is located in Oregon's Fort Rock Basin. In 1958, avocationalist John Cowles excavated most of the cave's deposits and recovered abundant fiber, lithic, wood, and osseous artifacts. A crew from the University of California, Davis returned to the site in 1966 to evaluate the potential for further research, collecting additional lithic and fiber artifacts from disturbed deposits and in situ charcoal from apparently undisturbed deposits. Because Cowles took few notes or photographs, the Cougar Mountain Cave collection—most of which is housed at the Favell Museum in Klamath Falls, Oregon—has largely gone unstudied even though it contains diagnostic artifacts spanning the Holocene and, potentially, the terminal Pleistocene. We recently submitted charcoal and basketry from the site for radiocarbon dating, providing the first reliable sense of when Cougar Mountain Cave was first occupied. Our results indicate at least a Younger Dryas age for initial occupation. The directly dated basketry has provided new information about the age ranges and spatial distributions of diagnostic textile types in the northwestern Great Basin.
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both for the identification of cross-transmission events and for the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
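As a rough illustration of how WGS results can be turned into candidate clusters for epidemiologic review, the sketch below groups isolates whose pairwise single-nucleotide variant (SNV) distance falls under a threshold and reports each connected group. The 15-SNV cut-off, isolate names and distances are illustrative assumptions, not the study's pipeline or thresholds.

```python
# Sketch of flagging potential transmission clusters from pairwise SNV
# distances; threshold and data are illustrative only.
import networkx as nx

SNV_THRESHOLD = 15

# (isolate_a, isolate_b, pairwise SNV distance) -- hypothetical values
pairwise = [
    ("MRSA-01", "MRSA-02", 4),
    ("MRSA-02", "MRSA-07", 9),
    ("MRSA-03", "MRSA-11", 220),
    ("KPN-04", "KPN-05", 2),
]

g = nx.Graph()
g.add_edges_from((a, b) for a, b, d in pairwise if d <= SNV_THRESHOLD)

# Each connected component of closely related isolates is a candidate
# cluster to hand to infection-control staff for review.
for cluster in nx.connected_components(g):
    if len(cluster) > 1:
        print(sorted(cluster))
```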
Syndromic surveillance is a form of surveillance that generates information for public health action by collecting, analysing and interpreting routine health-related data on symptoms and clinical signs reported by patients and clinicians, rather than being based on microbiologically or clinically confirmed cases. In England, a suite of national real-time syndromic surveillance systems (SSS) has been developed over the last 20 years, utilising data from a variety of health care settings (a telehealth triage system, general practice and emergency departments). The real-time systems in England have been used for early detection (e.g. seasonal influenza), for situational awareness (e.g. describing the size and demographics of the impact of a heatwave) and for reassurance of lack of impact on population health of mass gatherings (e.g. the London 2012 Olympic and Paralympic Games). We highlight the lessons learnt from running SSS for nearly two decades, and propose questions and issues still to be addressed. We feel that syndromic surveillance is an example of the use of ‘big data’, but contend that the focus for sustainable and useful systems should be on the added value of such systems and the importance of people working together to maximise the value for the public health of syndromic surveillance services.
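One common building block of such real-time systems is statistical exceedance detection on daily symptom counts. The sketch below flags any day whose count exceeds the mean of a trailing baseline window by more than z standard deviations; the window length, z value and synthetic counts are assumptions for illustration and are not the algorithms actually used by the English systems described above.

```python
# Simple exceedance detection over a trailing baseline window (illustrative).
import numpy as np

def exceedances(counts, baseline_days=28, z=3.0):
    """Return indices of days whose count exceeds baseline mean + z*SD."""
    counts = np.asarray(counts, dtype=float)
    flagged = []
    for day in range(baseline_days, len(counts)):
        baseline = counts[day - baseline_days:day]
        threshold = baseline.mean() + z * baseline.std(ddof=1)
        if counts[day] > threshold:
            flagged.append(day)
    return flagged

daily_calls = np.random.default_rng(0).poisson(50, 120)  # synthetic call counts
daily_calls[100] = 130                                    # injected signal
print(exceedances(daily_calls))
```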
We identified a pseudo-outbreak of Mycobacterium avium in an outpatient bronchoscopy clinic following an increase in clinic procedure volume. We terminated the pseudo-outbreak by increasing the frequency of automated endoscope reprocessor (AER) filter changes from quarterly to monthly. Filter-changing schedules should depend on use rather than fixed time intervals.
The use of underground geological repositories, such as in radioactive waste disposal (RWD) and in carbon capture (widely known as Carbon Capture and Storage; CCS), constitutes a key environmental priority for the 21st century. Based on the identification of key scientific questions relating to the geophysics, geochemistry and geobiology of geodisposal of wastes, this paper describes the possibility of technology transfer from high-technology areas of the space exploration sector, including astrobiology, planetary sciences, astronomy, and also particle and nuclear physics, into geodisposal. Synergies exist between high technology used in the space sector and in the characterization of underground environments such as repositories, because of common objectives with respect to instrument miniaturization, low power requirements, durability under extreme conditions (in temperature and mechanical loads) and operation in remote or otherwise difficult to access environments.
Cognitive impairment in heart failure (HF) is believed to stem in part from structural brain alterations, including shrinkage of subcortical regions. Fortunately, neurocognitive dysfunction in HF can be mitigated by physical activity (PA), though the mechanisms for this phenomenon are unclear. PA is protective against age-related cognitive decline, possibly through improved structural integrity of brain regions sensitive to aging (e.g., subcortical structures). Yet no study has examined the benefits of PA on the brain in HF; we sought to do so and to clarify the related cognitive implications. Fifty older adults with HF completed a neuropsychological battery and wore an accelerometer for 7 days. All participants underwent brain MRI. This study targeted subcortical brain volume, given that subcortical alterations are often observed in HF and that PA has been linked to subcortical structures in other patient populations. Participants averaged 4348.49 (SD=2092.08) steps per day, and greater daily steps predicted better attention/executive function, episodic memory, and language abilities, p’s<.05. Regression analyses adjusted for medical and demographic factors revealed that a higher daily step count predicted greater subcortical volume, with specific effects for the thalamus and ventral diencephalon, p’s<.05. Greater subcortical volume was associated with better attention/executive function, p<.05. Higher daily PA was associated with increased subcortical brain volume and better cognition in older adults with HF. Longitudinal work is needed to clarify whether daily PA can attenuate brain atrophy in HF to reduce accelerated cognitive decline in this population. (JINS, 2015, 21, 851–860)
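As an illustration of the adjusted analysis described above, the sketch below regresses a subcortical volume on average daily steps while controlling for demographic and medical covariates. The input file, column names and covariate set are assumptions for illustration, not the study's actual model.

```python
# Sketch of an adjusted regression of subcortical volume on daily steps.
# Variable names, covariates, and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hf_pa_mri.csv")  # hypothetical merged accelerometer/MRI data

model = smf.ols(
    "thalamus_volume ~ mean_daily_steps + age + C(sex) + ejection_fraction",
    data=df,
).fit()
print(model.summary().tables[1])   # coefficient table, incl. mean_daily_steps
```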