The scientific community and most mainstream agriculturalists typically design fertilizer recommendations to provide a ‘sufficient level of available nutrients’ to meet the annual N, P and K requirements of common field crops. Soil balancing is another approach to managing soil fertility that focuses on the levels of Ca, Mg and K to achieve a desired base cation saturation ratio (BCSR). Soil balancing is believed to be practiced frequently by organic and other alternative farmers but is viewed skeptically by conventional agricultural scientists due to a lack of support for the idea in the published scientific literature. This study represents a pioneering effort to collect systematic data on the extent of soil balancing, how it is practiced and the types of outcomes reported by organic farmers. Our survey of over 850 farmers who grow certified organic corn in Indiana, Michigan, Ohio and Pennsylvania found that over half report using a soil-balancing approach based on BCSR. Their practice of soil balancing frequently extends beyond management of base cations to include a wide range of soil amendment products (such as purchased organic NPK fertilizers, micronutrients, microbial stimulants and soil inoculants) other than those applied specifically for cation balance. Farms that rely on vegetable and dairy production for most of their income, and Amish farmers who rely on horses for fieldwork, were more likely to report using a soil-balancing program. Self-described soil balancers perceived positive agronomic outcomes from the use of a BCSR program, including improvements in soil physical and biological properties and improved crop health and quality. Although farmers in our study report extensive use and positive perceived outcomes from soil-balancing methods, the scientific research literature has found no reproducible evidence that manipulating soil base cation levels has any systematic effect on crop yield.
Future research could consider the interacting effects of BCSR with other field management practices to more closely approximate the actual practices of farmers.
Understanding differences in social-emotional behavior can help identify atypical development. This study examined differences in social-emotional development in children at increased risk of an autism spectrum disorder (ASD) diagnosis (infant siblings of children diagnosed with the disorder). Parents completed the Brief Infant-Toddler Social-Emotional Assessment (BITSEA) to determine its ability to flag children with later-diagnosed ASD in a high-risk (HR) sibling population. Parents of HR (n = 311) and low-risk (LR; no family history of ASD; n = 127) children completed the BITSEA when their children were 18 months old, and all children underwent a diagnostic assessment for ASD at age 3 years. All six subscales of the BITSEA (Problems, Competence, ASD Problems, ASD Competence, Total ASD Score, and Red Flags) distinguished those in the HR group who were diagnosed with ASD (n = 84) from non-ASD-diagnosed children (both non-diagnosed HR and LR). One subscale (BITSEA Competence) differentiated between the HR children not diagnosed with ASD and the LR group. The results suggest that tracking early social-emotional development may have implications for all HR children, as they are at increased risk of ASD but also other developmental or mental health conditions.
It is not clear to what extent associations between schizophrenia, cannabis use and cigarette use are due to a shared genetic etiology. We, therefore, examined whether schizophrenia genetic risk associates with longitudinal patterns of cigarette and cannabis use in adolescence and mediating pathways for any association to inform potential reduction strategies.
Associations between schizophrenia polygenic scores and longitudinal latent classes of cigarette and cannabis use from ages 14 to 19 years were investigated in up to 3925 individuals in the Avon Longitudinal Study of Parents and Children. Mediation models were estimated to assess the potential mediating effects of a range of cognitive, emotional, and behavioral phenotypes.
The schizophrenia polygenic score, based on single nucleotide polymorphisms meeting a training-set p threshold of 0.05, was associated with late-onset cannabis use (OR 1.23; 95% CI 1.08–1.41), but not with cigarette or early-onset cannabis use classes. This association was not mediated through lower IQ, victimization, emotional difficulties, antisocial behavior, impulsivity, or poorer social relationships during childhood. Sensitivity analyses adjusting for genetic liability to cannabis or cigarette use, using polygenic scores excluding the CHRNA5-A3-B4 gene cluster, or basing scores on a 0.5 training-set p threshold, provided results consistent with our main analyses.
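As a hypothetical illustration (not the study's actual pipeline), a polygenic score of the kind described above is typically computed by summing each individual's risk-allele counts weighted by training-set effect sizes, keeping only SNPs whose training-set p-value passes the chosen threshold. All numbers below are made up:

```python
import numpy as np

# Toy data: 4 individuals x 6 SNPs, coded as risk-allele counts (0/1/2).
genotypes = np.array([
    [0, 1, 2, 0, 1, 2],
    [1, 1, 0, 2, 0, 1],
    [2, 0, 1, 1, 2, 0],
    [0, 2, 2, 1, 1, 1],
])
betas = np.array([0.12, -0.05, 0.30, 0.08, -0.11, 0.20])  # training-set effect sizes
pvals = np.array([0.01, 0.40, 0.003, 0.20, 0.04, 0.60])   # training-set p-values

threshold = 0.05                       # the study's main threshold
keep = pvals < threshold               # SNPs meeting the p-value threshold
scores = genotypes[:, keep] @ betas[keep]  # one polygenic score per person
print(scores)                          # [0.49 0.12 0.32 0.49]
```

Raising the threshold to 0.5 (as in the study's sensitivity analysis) simply admits more SNPs into the weighted sum.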
Our study provides evidence that genetic risk for schizophrenia is associated with patterns of cannabis use during adolescence. Investigation of pathways other than the cognitive, emotional, and behavioral phenotypes examined here is required to identify modifiable targets to reduce the public health burden of cannabis use in the population.
The Great Parchment Book project is a prime example of a successful relationship between different disciplines and professions. The research brought together archivists, a paleographer, and conservators from London Metropolitan Archives (LMA) and experts in digital technologies from the University College London (UCL) Department of Computer Science and UCL Centre for Digital Humanities. This joint effort supported a four-year Engineering Doctorate (EngD) in the UCL Virtual Environments, Imaging, and Visualization program funded by the Engineering and Physical Sciences Research Council and LMA. The conservation work was funded by the National Manuscript Conservation Trust.
The aim was to create a digital copy that revealed the content of a manuscript whose text was illegible due to its fragile physical condition. The book (LMA reference CLA/049/EM/02/018) was unavailable for access due to the extreme fragility of its support. Traditional conservation alone could not return the document to a condition in which it could be handled safely. Only the work of the UCL digitization team could reveal the valuable information the document held.
Today the volume is available for access online without the need to retrieve and handle the document unnecessarily.
The Great Parchment Book, owned by the Honourable The Irish Society, was commissioned in 1639 by Charles I with the aim of surveying all the estates in Derry∼Londonderry managed by the City of London through The Irish Society and the City of London livery companies. This survey was compiled at a time of great political and social change and provides important information about the role of the City of London in the Protestant colonization and administration of Ulster, as well as about the local population.
Since 1639 the book has been held in London. In February 1786, a fire in the Chamber of London at the Guildhall in the City of London destroyed most of the early records of The Irish Society, and only very few of the seventeenth-century documents remained. Among those which survived is the Great Parchment Book.
The volume was severely damaged by fire and subsequently by water, leaving it so distorted and fragile that for over 200 years it was unavailable for research.
Worldwide, early intervention services for young people with recent-onset psychosis have been associated with improved outcomes, including reduced hospitalization, reduced symptoms, and improved treatment engagement and work/school participation. States have received federal mental health block grant funding to implement team-based, multi-element, evidence-based early intervention services, now called coordinated specialty care (CSC) in the USA. New York State’s CSC program, OnTrackNY, has grown into a 23-site, statewide network, serving over 1800 individuals since its 2013 inception. A state-supported intermediary organization, OnTrackCentral, has overseen the growth of OnTrackNY. OnTrackNY has been committed to quality improvement since its inception. In 2019, OnTrackNY was awarded a regional hub within the National Institute of Mental Health-sponsored Early Psychosis Intervention Network (EPINET). Participation in the national EPINET initiative reframes and expands OnTrackNY’s quality improvement activities. The national EPINET initiative aims to develop a learning healthcare system (LHS); OnTrackNY’s participation will support the development of infrastructure, including a systematic approach to gathering stakeholder input and enhanced data and informatics capacity to promote quality improvement. Additionally, this infrastructure will support practice-based research to improve care. EPINET’s investment in building regional and national LHSs will accelerate innovations that improve quality of care.
Psychiatric disorders are associated with increased risk of ischaemic heart disease (IHD) and stroke, but it is not known whether the associations or the role of sociodemographic factors have changed over time.
To investigate the association between psychiatric disorders and IHD and stroke, by time period and sociodemographic factors.
We used Scottish population-based records from 1991 to 2015 to create retrospective cohorts with a hospital record for psychiatric disorders of interest (schizophrenia, bipolar disorder or depression) or no record of hospital admission for mental illness. We estimated incidence and relative risks of IHD and stroke in people with versus without psychiatric disorders by calendar year, age, gender and area-based deprivation level.
In all cohorts, incidence of IHD (645 393 events) and stroke (276 073 events) decreased over time, but relative risks decreased for depression only. In 2015, at the mean age at event onset, relative risks were 2- to 2.5-fold higher in people with versus without a psychiatric disorder. Age at incidence of outcome differed by cohort, gender and socioeconomic status. Relative but not absolute risks were generally higher in women than men. Increasing deprivation conveyed a greater absolute risk of IHD for people with bipolar disorder or depression.
Despite declines in absolute rates of IHD and stroke, relative risks remain high in those with versus without psychiatric disorders. Cardiovascular disease monitoring and prevention approaches may need to be tailored by psychiatric disorder and cardiovascular outcome, and be targeted, for example, by age and deprivation level.
BACTOT, Quebec’s healthcare-associated bloodstream infection (HABSI) surveillance program, has been operating since 2007. In this study, we evaluated changes in HABSI rates across 10 years of BACTOT surveillance under a Bayesian framework.
We conducted a retrospective cohort study of eligible hospitals that had participated in BACTOT for at least 3 years, regardless of their entry date. Multilevel Poisson regressions were fitted independently for cases of HABSI, catheter-associated bloodstream infections (CA-BSIs), non–catheter-associated primary BSIs (NCA-BSIs), and BSIs secondary to urinary tract infections (BSI-UTIs) as the outcome, with the log of patient days as the offset. The log of the mean Poisson rate was decomposed as the sum of a surveillance-year effect, a period effect, and a hospital effect. The main estimate of interest was the cohort-level rate in years 2–10 of surveillance relative to year 1.
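The estimand here, rates in later surveillance years relative to year 1, can be sketched with toy numbers. In a Poisson model with only year effects and a log patient-days offset, the fitted year-specific rates reduce to the crude cases-per-patient-day ratios, so a minimal illustration (purely invented data, not the study's multilevel Bayesian model) is:

```python
import numpy as np

# Toy pooled surveillance data, one row per surveillance year
# (numbers are illustrative, not the study's).
cases = np.array([520, 505, 498, 530, 560])
patient_days = np.array([1_000_000, 990_000, 1_010_000, 1_000_000, 980_000])

# Rate per 10,000 patient-days, the unit used in the study.
rates = cases / patient_days * 10_000   # values: 5.2, 5.1, 4.93, 5.3, 5.71

# Rate ratios of each later year relative to year 1 (the estimand of
# interest); the study estimated these within a hierarchical model that
# also included period and hospital effects.
rate_ratios = rates / rates[0]
```

A rate ratio above 1 for a later year would indicate a rate increase relative to the first surveillance year, as reported below for NCA-BSIs.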
Overall, 17,479 cases and 33,029,870 patient days were recorded for the cohort of 77 hospitals. The pooled 10-year HABSI rate was 5.20 per 10,000 patient days (95% CI, 5.12–5.28). For HABSI, CA-BSI, and BSI-UTI, there was no difference between the estimated posterior rates of years 2–10 compared to year 1. The posterior means of the NCA-BSI rate ratios increased from the seventh year until the tenth year, when the rate was 29% (95% CI, 1%–89%) higher than the first-year rate.
HABSI rates and those of the most frequent subtypes remained stable over the surveillance period. To achieve reductions in incidence, we recommend that more effort be expended in active interventions against HABSI alongside surveillance.
In this article, we outline the key principles of education for sustainability (EfS) that enable us to question the enthusiastic and uncritical promotion of STEM (science, technology, engineering and mathematics) and its offshoot, STEM education, as key contributors to an environmentally sustainable future. We examine the framing of STEM and STEM education as situated in an unproblematised, neoliberal growthist paradigm, in contrast to the more critical ecological paradigm of EfS. We conclude that STEM, and hence STEM education, need to include critical reflection and futures perspectives if they are to align themselves with a flourishing economic, social and environmental future. We provide examples for the classroom that illustrate our contention.
Healthcare-associated bloodstream infections (HABSI) are a significant cause of morbidity and mortality worldwide. In Québec, Canada, HABSI arising from acute-care hospitals have been monitored since April 2007 through the Surveillance des bactériémies nosocomiales panhospitalières (BACTOT) program, but this is the first detailed description of HABSI epidemiology.
This retrospective, descriptive study was conducted using BACTOT surveillance data from hospitals that participated continuously between April 1, 2007, and March 31, 2017. HABSI cases and rates were stratified by hospital type and/or infection source. Temporal trends of rates were analyzed by fitting generalized estimating equation Poisson models, and they were stratified by infection source.
For 40 hospitals, 13,024 HABSI cases and 23,313,959 patient days were recorded, for an overall rate of 5.59 per 10,000 patient days (95% CI, 5.54–5.63). The most common infection sources were catheter-associated BSIs (23.0%), BSIs secondary to a urinary focus (21.5%), and non–catheter-associated primary BSIs (18.1%). Teaching hospitals and nonteaching hospitals with ICUs often had rates higher than nonteaching hospitals without ICUs. Annual HABSI rates did not exhibit statistically significant changes from year to year. Non–catheter-associated primary BSIs were the only HABSI type that exhibited a sustained change across the 10 years, increasing from 0.69 per 10,000 patient days (95% CI, 0.59–0.80) in 2007–2008 to 1.42 per 10,000 patient days (95% CI, 1.27–1.58) in 2016–2017.
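The overall rate quoted above follows directly from the stated case and patient-day totals; a quick arithmetic check:

```python
# Reproducing the overall HABSI rate reported above from the stated totals.
cases = 13_024
patient_days = 23_313_959

rate_per_10k = cases / patient_days * 10_000
print(round(rate_per_10k, 2))  # 5.59
```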
Despite ongoing surveillance, overall HABSI rates have not decreased. The effect of BACTOT participation should be more closely investigated, and targeted interventions alongside alternative surveillance modalities should be considered, prioritizing high-burden and potentially preventable BSI types.
There is a broad set of human beliefs, attitudes and behaviours around magical animals, a term covering both mythical animals not recognized by science and extant animals that are recognized by science but are ascribed magical properties. The issue ranges from spiritual beliefs around mythical animals living in Malagasy forests to the cultural heritage associated with the Loch Ness Monster in Scotland. Beliefs and behaviours around magical animals can have positive and negative impacts on biodiversity conservation goals. Yet, so far, the discipline of conservation biology has not adequately considered magical animals, neglecting the broader knowledge from outside the natural sciences on this issue and taking a narrow, utilitarian approach to how magical animals should be managed, without necessarily considering the broader impacts on conservation goals or ethics. Here we explore how magical animals can influence conservation goals, how conservation biology and practice have thought about magical animals, and some of the limitations of current approaches, particularly the failure to consider magical animals as part of wider systems of belief and culture. We argue that magical animals and their implications for conservation merit wider consideration.
Early-onset conduct problems (CP) are a key predictor of adult criminality and poor mental health. While previous studies suggest that both genetic and environmental risks play an important role in the development of early-onset CP, little is known about potential biological processes underlying these associations. In this study, we examined prospective associations between DNA methylation (cord blood at birth) and trajectories of CP (4–13 years), using data drawn from the Avon Longitudinal Study of Parents and Children. Methylomic variation at seven loci across the genome (false discovery rate < 0.05) differentiated children who went on to develop early-onset (n = 174) versus low (n = 86) CP, including sites in the vicinity of the monoglyceride lipase (MGLL) gene (involved in endocannabinoid signaling and pain perception). Subthreshold associations in the vicinity of three candidate genes for CP (monoamine oxidase A [MAOA], brain-derived neurotrophic factor [BDNF], and FK506 binding protein 5 [FKBP5]) were also identified. Within the early-onset CP group, methylation levels at the identified sites did not distinguish children who went on to persist versus desist in CP behavior over time. Overall, we found that several of the identified sites correlated with prenatal exposures, and none were linked to known genetic methylation quantitative trait loci. Findings contribute to a better understanding of epigenetic patterns associated with early-onset CP.
Validation-study data were analysed to investigate retention interval (RI) and prompt effects on the accuracy of fourth-grade children’s reports of school-breakfast and school-lunch (in 24-h recalls), and the accuracy of school-breakfast reports by breakfast location (classroom; cafeteria). Randomly selected fourth-grade children at ten schools in four districts were observed eating school-provided breakfast and lunch, and were interviewed under one of eight conditions created by crossing two RIs (‘short’ – prior-24-hour recall obtained in the afternoon and ‘long’ – previous-day recall obtained in the morning) with four prompts (‘forward’ – distant to recent, ‘meal name’ – breakfast, etc., ‘open’ – no instructions, and ‘reverse’ – recent to distant). Each condition had sixty children (half were girls). Of 480 children, 355 and 409 reported meals satisfying criteria for reports of school-breakfast and school-lunch, respectively. For breakfast and lunch separately, a conventional measure – report rate – and reporting-error-sensitive measures – correspondence rate and inflation ratio – were calculated for energy per meal-reporting child. Correspondence rate and inflation ratio – but not report rate – showed better accuracy for school-breakfast and school-lunch reports with the short RI than with the long RI; this pattern was not found for some prompts for each sex. Correspondence rate and inflation ratio showed better school-breakfast report accuracy for the classroom than for cafeteria location for each prompt, but report rate showed the opposite. For each RI, correspondence rate and inflation ratio showed better accuracy for lunch than for breakfast, but report rate showed the opposite. When choosing RI and prompts for recalls, researchers and practitioners should select a short RI to maximise accuracy. Recommendations for prompt selections are less clear. 
As report rates distort validation-study accuracy conclusions, reporting-error-sensitive measures are recommended.
Non-invasive survey in the Stonehenge ‘Triangle’, Amesbury, Wiltshire, has highlighted a number of features that have a significant bearing on the interpretation of the site. Geophysical anomalies may signal the position of buried stones, adding to the possibility of former stone arrangements, while laser scanning has provided detail on the manner in which the stones were dressed, some subsequently carved with axe and dagger symbols. The probability that a lintelled bluestone trilithon formed an entrance in the north-east is signposted. This work has added detail that allows discussion of whether the sarsen circle was a completed structure, although it is by no means conclusive in this respect. Instead, it is suggested that the monument was built as a façade, with other parts of the circuit added later and with an entrance in the south.
Integrated non-invasive survey in the Stonehenge ‘triangle’, Amesbury, Wiltshire, has highlighted a number of features that have a significant bearing on the interpretation of the site. Among them are periglacial and natural topographical structures, including a chalk mound that may have influenced site development. Some geophysical anomalies are similar to the post-holes of known Mesolithic date in the car park, while others beneath the barrows to the west may point to activity contemporary with Stonehenge itself. Evidence that the ‘North Barrow’ may be earlier in the accepted sequence is presented, and the difference between the eastern and western parts of the enclosure ditch is highlighted, while new data relating to the Y and Z Holes and to the presence of internal banks that mirror their respective circuits are also outlined.