During their northward migration, Red Knots Calidris canutus rufa stop at the Lagoa do Peixe National Park in the extreme south of Brazil to build up fat reserves for the journey to their Canadian breeding grounds. We tracked five Red Knots with PinPoint Argos-75 GPS transmitters to investigate differences in migration strategies from this stopover. Tracked birds used two different routes: the Central Brazil route and the Brazilian Atlantic Coast route. One bird flew 8,300 km nonstop from Lagoa do Peixe to Delaware Bay (USA). Another bird stopped in Maranhão (north-east Brazil), and a third used an environment previously unrecorded for the species, the mouth of the Amazon River at Baía Santa Rosa, Brazil. These two birds made shorter flights, covering stretches of 1,600 km to 3,600 km between stopovers, where they stayed for 4 to 18 days. Our study highlights the occurrence of intrapopulation variation in migratory strategies and reveals the connectivity of environments that are essential for the viability of rufa Red Knot populations.
Various water-based heater-cooler devices (HCDs) have been implicated in nontuberculous mycobacteria outbreaks. Ongoing rigorous surveillance for healthcare-associated M. abscessus (HA-Mab) put in place following a prior institutional outbreak of M. abscessus alerted investigators to a cluster of 3 extrapulmonary M. abscessus infections among patients who had undergone cardiothoracic surgery.
Methods:
Investigators convened a multidisciplinary team and launched a comprehensive investigation to identify potential sources of M. abscessus in the healthcare setting. Adherence to tap water avoidance protocols during patient care and HCD cleaning, disinfection, and maintenance practices were reviewed. Relevant environmental samples were obtained. Patient and environmental M. abscessus isolates were compared using multilocus-sequence typing and pulsed-field gel electrophoresis. Smoke testing was performed to evaluate the potential for aerosol generation and dispersion during HCD use. The entire HCD fleet was replaced to mitigate continued transmission.
Results:
Clinical presentations of case patients and epidemiologic data supported intraoperative acquisition. M. abscessus was isolated from HCDs used on patients, and molecular comparison with patient isolates demonstrated clonality. Smoke testing simulated aerosolization of M. abscessus from HCDs during device operation. Since the HCD fleet was replaced, no additional extrapulmonary HA-Mab infections due to the unique clone identified in this cluster have been detected.
Conclusions:
Despite adhering to HCD cleaning and disinfection strategies beyond manufacturer instructions for use, HCDs became colonized with and ultimately transmitted M. abscessus to 3 patients. Design modifications to better contain aerosols or filter exhaust during device operation are needed to prevent NTM transmission events from water-based HCDs.
The extent to which weed species vary in their ability to acquire and use different forms of nitrogen (N) (inorganic and organic) has not been investigated but could have important implications for weed survival and weed–crop competition in agroecosystems. We conducted a controlled environment experiment using stable isotopes to determine the uptake and partitioning of organic and inorganic N (amino acids, ammonium, and nitrate) by seven common weed and non-weed species. All species took up inorganic and organic N, including as intact amino acids. Concentrations of 15N derived from both ammonium and amino acids in shoot tissues were higher in large crabgrass [Digitaria sanguinalis (L.) Scop.] and barnyardgrass [Echinochloa crus-galli (L.) P. Beauv] than in common lambsquarters (Chenopodium album L.), redroot pigweed (Amaranthus retroflexus L.), and sorghum-sudangrass [Sorghum bicolor (L.) Moench × Sorghum bicolor (L.) ssp. drummondii (Nees ex Steud.) de Wet & Harlan]. In contrast, the concentration of 15N derived from nitrate was higher in wild mustard (Sinapis arvensis L.) shoots than in wild oat (Avena fatua L.) shoots. Root concentration of 15N derived from ammonium was lower in sorghum-sudangrass compared with other species, except for A. retroflexus and A. fatua, while root concentration of 15N derived from nitrate was lower in A. retroflexus compared with other species, except for C. album and S. arvensis. Discriminant analysis classified species based on their uptake and partitioning of all three labeled N forms. These results suggest that common agricultural weeds can access and use organic N and differentially take up inorganic N forms. Additional research is needed to determine whether species-specific differences in organic and inorganic N uptake influence the intensity of competition for soil N.
Research has shown that 30–40% of people who have experienced traumatic injury are at risk of developing mental illness. Some injuries may be the result of mental ill health, including self-inflicted injury. Furthermore, the development of psychopathology after injury appears to be a major determinant of long-term disability. Early intervention can reduce symptom severity and prevent the development of mental illness.
Ireland’s National Trauma System Implementation Programme, announced in April 2021, highlights the need for screening for mental disorders.
The Mater Misericordiae University Hospital (MMUH) is designated as one of two national Major Trauma Centres in Ireland. Its trauma service will expand, with an expected additional 450–500 major trauma patients over the next three years.
The Consultation Liaison Psychiatry Service (CLP) currently provides expert mental health input to medical and surgical teams, in managing a range of patients with mental illnesses or psychological difficulties, including those with experience of major trauma.
Objectives
To examine the current mental health service provision for trauma patients over a six-month period. We aimed to identify areas of need to inform future development of a psychiatry-led MDT service for trauma patients.
Methods
A review of all patients admitted on the MMUH trauma pathway between January 2021 and June 2021 was performed. The following data were recorded: demographics, mechanism of injury and information on referrals to the liaison psychiatry service.
Results
There were 105 trauma cases over the six-month period: 46 females and 59 males. The mean age was 58.4 years (SD 22.16). Twelve individuals (11.4%) were recorded as ‘No Fixed Abode’ or living in homeless accommodation.
In terms of mechanism of injury, 20 patients were assaulted, of which 8 were stabbing/knife injuries. There were 65 falls and 12 road traffic accidents. In 3 cases (2.8%), the mechanism of injury was self-inflicted. Twenty patients (19%) were admitted to critical care.
Of the 105 trauma patients, 19 (18%) were referred to the CLP service; 2 (10.5%) were seen in the outpatient setting and the rest (89.5%) as inpatients. At least one repeat review was indicated in 10 of the 19 patients (52.6%).
Conclusions
Trauma patients have a high rate of comorbid mental illness. Nearly one in five are currently referred to the CLP service, which is likely an underestimate of the actual burden of mental health disorders and may be explained by the lack of dedicated services. The liaison psychiatry team provides valuable input into the multidisciplinary care of trauma patients, and demand for its services is likely to increase with the expansion under the Major Trauma Strategy for Ireland.
Human infection with antimicrobial-resistant Campylobacter species is an important public health concern due to the potentially increased severity of illness and risk of death. Our objective was to synthesise the knowledge of factors associated with human infections with antimicrobial-resistant strains of Campylobacter. This scoping review followed systematic methods, including a protocol developed a priori. Comprehensive literature searches were developed in consultation with a research librarian and performed in five primary and three grey literature databases. Criteria for inclusion were analytical and English-language publications investigating human infections with an antimicrobial-resistant (macrolides, tetracyclines, fluoroquinolones, and/or quinolones) Campylobacter that reported factors potentially linked with the infection. The primary and secondary screening were completed by two independent reviewers using Distiller SR®. The search identified 8,527 unique articles, of which 27 were included in the review. Factors were broadly categorised into animal contact, prior antimicrobial use, participant characteristics, food consumption and handling, travel, underlying health conditions, and water consumption/exposure. Important factors linked to an increased risk of infection with a fluoroquinolone-resistant strain included foreign travel and prior antimicrobial use. Identifying consistent risk factors was challenging due to the heterogeneity of results, inconsistent analysis, and the lack of data in low- and middle-income countries, highlighting the need for future research.
The developmental absence (agenesis) of the corpus callosum (AgCC) is a congenital brain malformation associated with risk for a range of neuropsychological difficulties. Inhibitory control outcomes, including interference control and response inhibition, in children with AgCC are unclear. This study examined interference control and response inhibition: 1) in children with AgCC compared with typically developing (TD) children, 2) in children with different anatomical features of AgCC (complete vs. partial, isolated vs. complex), and 3) associations with white matter volume and microstructure of the anterior (AC) and posterior commissures (PC) and any remnant corpus callosum (CC).
Methods:
Participants were 27 children with AgCC and 32 TD children 8–16 years who completed inhibitory control assessments and brain MRI to define AgCC anatomical features and measure white matter volume and microstructure.
Results:
The AgCC cohort had poorer performance and higher rates of below average performance on inhibitory control measures than TD children. Children with complex AgCC had poorer response inhibition performance than children with isolated AgCC. While not statistically significant, there were select medium to large effect sizes for better inhibitory control associated with greater volume and microstructure of the AC and PC, and with reduced volume and microstructure of the remnant CC in partial AgCC.
Conclusions:
This study provides evidence of inhibitory control difficulties in children with AgCC. While the sample was small, the study found preliminary evidence that the AC (f2 = 0.18) and PC (f2 = 0.30) may play a compensatory role for inhibitory control outcomes in the absence of the CC.
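For readers translating the effect sizes above into more familiar terms, Cohen's f2 can be derived from the proportion of variance explained (R2); by Cohen's conventional benchmarks (0.02 small, 0.15 medium, 0.35 large), the AC and PC estimates fall in the medium-to-large range. A minimal sketch of the conversion (the example value is illustrative, not study data):

```python
# Cohen's f^2 from the proportion of variance explained (R^2):
#   f^2 = R^2 / (1 - R^2)
def f_squared(r2):
    return r2 / (1 - r2)

# Benchmarks (Cohen): ~0.02 small, ~0.15 medium, ~0.35 large.
# An illustrative R^2 of 0.23 corresponds to f^2 of roughly 0.30.
print(f_squared(0.23))
```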
We explored the acceptability of a personalised proteomic risk intervention for patients at increased risk of type 2 diabetes and their healthcare providers, as well as their experience of participating in the delivery of proteomic-based risk feedback in UK primary care.
Background:
Advances in proteomics now allow the provision of personalised proteomic risk reports, with the intention of achieving positive behaviour change. This technology has the potential to encourage behaviour change in people at risk of developing type 2 diabetes.
Methods:
A semi-structured interview study was carried out with patients at risk of type 2 diabetes and their healthcare providers in primary care in the North of England. Patients (n = 17) and healthcare providers (n = 4) were interviewed either face to face or via telephone. Data were analysed using thematic analysis. This qualitative study was nested within a single-arm pilot trial and undertaken in primary care.
Findings:
The personalised proteomic risk intervention was generally acceptable and the experience was positive. The personalised nature of the report was welcomed, especially the way it provided a holistic approach to risks of organ damage and lifestyle factors. Insights were provided as to how this may change behaviour. Some participants reported difficulties in understanding the format of the presentation of risk and expressed surprise at receiving risk estimates for conditions other than type 2 diabetes. Personalised proteomic risk interventions have the potential to provide holistic and comprehensive assessments of risk factors and lifestyle factors which may lead to positive behaviour change.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
On any reading of 2 Samuel 9–20, Joab’s ruse with the Tekoite woman in 14:2–21 is a pivotal scene. It portrays the moral reasoning which David adopted in order to allow Absalom to return to Jerusalem from exile in Geshur and a new development in the ethics of David’s administration. The formal character of the king’s decision as an official royal pronouncement sets it apart and makes it especially significant. Within the narrative, the account marks a turning point from which ethical thinking in David’s court never seems to recover. The characters construct a theological ethic that places a premium on the communal “togetherness” of God’s people, and the crown accepts it as justification for overlooking the bloodguilt of one of its most marginalized members. The half-life of this ethic in the ensuing narrative wreaks havoc on the kingdom, through two of its most maniacally vulnerable agents. Once bloodguilt can be overlooked for the sake of togetherness, little remains to prevent members of the community from sanctioning the bloodshed of any member perceived to threaten that togetherness. The troubles that follow after Absalom’s return are not simply due to the mere fact that a maniacal member of God’s estate was returned to run amok. Rather, they arise from the problematic ethic employed to justify his restoration. The king’s response further exacerbates the situation. Contextualization within the outworking of divine judgment on the king for his own acts of oppression adds another element to the narrative’s portrayal of causality, part of 2 Sam 8:15–20:26’s critical dramatization of David’s efforts to establish “justice and righteousness for all of his people.”
A single radiocarbon date derived from the Buhl burial in south-central Idaho has frequently been used as a data point for the interpretation of the Western Stemmed Tradition (WST) chronology and technology because of the stemmed biface found in situ with the human remains. AMS dating of bone collagen in 1991 produced an age of 10,675 ± 95 14C BP, immediately postdating the most widely accepted age range for Clovis. The Buhl burial has been cited as evidence that stemmed point technology may have overlapped with Clovis technology in the Intermountain West. We discuss concerns about the radiocarbon date, arguing that even at face value, the calibrated date has minimal overlap with Clovis at the 95.4% range. Furthermore, the C:N ratio of 3.69 in the analyzed collagen is outside of the typical range for well-preserved samples, indicating a postdepositional change in carbon composition, which may make the date erroneously older or younger than the age of the skeleton. Finally, the potential dietary incorporation of small amounts of anadromous fish may indicate that the burial is younger than traditionally accepted. For these reasons, we argue that the Buhl burial cannot be used as evidence of overlap between WST and Clovis.
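The C:N screening criterion invoked above is commonly operationalised as a simple range check: atomic C:N ratios for well-preserved bone collagen are typically cited as roughly 2.9–3.6 (after DeNiro), though exact cutoffs vary by laboratory. A minimal sketch of such a check (the function name and default bounds are illustrative assumptions):

```python
def collagen_well_preserved(cn_ratio, low=2.9, high=3.6):
    """Return True if an atomic C:N ratio falls within the commonly
    cited acceptable range (~2.9-3.6) for well-preserved bone collagen."""
    return low <= cn_ratio <= high

# The Buhl collagen ratio of 3.69 falls outside this range,
# flagging possible postdepositional alteration of carbon composition.
print(collagen_well_preserved(3.69))  # False
```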
Cover crops are increasingly being used for weed management, and planting them as diverse mixtures has become an increasingly popular strategy for their implementation. While ecological theory suggests that cover crop mixtures should be more weed suppressive than cover crop monocultures, few experiments have explicitly tested this for more than a single temporal niche. We assessed the effects of cover crop mixtures (5- or 6-species and 14-species mixtures) and monocultures on weed abundance (weed biomass) and weed suppression at the time of cover crop termination. Separate experiments were conducted in Madbury, NH, from 2014 to 2017 for each of three temporal cover-cropping niches: summer (spring planting–summer termination), fall (summer planting–fall termination), and spring (fall planting–subsequent spring termination). Regardless of temporal niche, mixtures were never more weed suppressive than the most weed-suppressive cover crop grown as a monoculture, and the more diverse mixture (14 species) never outperformed the less diverse mixture. Mean weed-suppression levels of the best-performing monocultures in each temporal niche ranged from 97% to 98% for buckwheat (Fagopyrum esculentum Moench) in the summer niche and forage radish (Raphanus sativus L. var. niger J. Kern.) in the fall niche, and 83% to 100% for triticale (×Triticosecale Wittm. ex A. Camus [Secale × Triticum]) in the winter–spring niche. In comparison, weed-suppression levels for the mixtures ranged from 66% to 97%, 70% to 90%, and 67% to 99% in the summer, fall, and spring niches, respectively. Stability of weed suppression, measured as the coefficient of variation, was two to six times greater in the best-performing monoculture compared with the most stable mixture, depending on the temporal niche. 
Results of this study suggest that when weed suppression is the sole objective, farmers are more likely to achieve better results planting the most weed-suppressive cover crop as a monoculture than a mixture.
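Stability in the preceding results is quantified with the coefficient of variation (CV = standard deviation / mean), so a lower CV means more consistent weed suppression across site-years. A minimal sketch with made-up suppression percentages (not the study's data):

```python
import statistics

def cv(values):
    """Coefficient of variation: sample standard deviation over the mean."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical weed-suppression percentages across four site-years
monoculture = [97, 98, 97, 98]   # consistently high suppression
mixture = [66, 97, 70, 90]       # more variable suppression

# A lower CV for the monoculture indicates more stable suppression
print(cv(monoculture) < cv(mixture))  # True
```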
Poor physical health in severe mental illness (SMI) remains a major issue for clinical practice.
Aims
To use electronic health records of routinely collected clinical data to determine levels of screening for cardiometabolic disease and adverse health outcomes in a large sample (n = 7718) of patients with SMI, predominantly schizophrenia and bipolar disorder.
Method
We linked data from the Glasgow Psychosis Clinical Information System (PsyCIS) to morbidity records, routine blood results and prescribing data.
Results
There was no record of routine blood monitoring during the preceding 2 years for 16.9% of the cohort. Monitoring was poorer for male patients, younger patients (aged 16–44), those with schizophrenia, and for tests of cholesterol, triglycerides and glycosylated haemoglobin. We estimated that 8.0% of participants had diabetes, and that lipid levels, and use of lipid-lowering medication, were generally high.
Conclusions
Electronic record linkage identified poor health screening and adverse health outcomes in this vulnerable patient group. This approach can inform the design of future interventions and health policy.
Weed management is a major challenge in organic crop production, and organic farms generally harbor larger weed populations and more diverse communities compared with conventional farms. However, little research has been conducted on the effects of different organic management practices on weed communities and crop yields. In 2014 and 2015, we measured weed community structure and soybean [Glycine max (L.) Merr.] yield in a long-term experiment that compared four organic cropping systems that differed in nutrient inputs, tillage, and weed management intensity: (1) high fertility (HF), (2) low fertility (LF), (3) enhanced weed management (EWM), and (4) reduced tillage (RT). In addition, we created weed-free subplots within each system to assess the impact of weeds on soybean yield. Weed density was greater in the LF and RT systems compared with the EWM system, but weed biomass did not differ among systems. Weed species richness was greater in the RT system compared with the EWM system, and weed community composition differed between RT and other systems. Our results show that differences in weed community structure were primarily related to differences in tillage intensity, rather than nutrient inputs. Soybean yield was lower in the EWM system compared with the HF and RT systems. When averaged across all four cropping systems and both years, soybean yield in weed-free subplots was 10% greater than soybean yield in the ambient weed subplots that received standard management practices for the systems in which they were located. Although weed competition limited soybean yield across all systems, the EWM system, which had the lowest weed density, also had the lowest soybean yield. Future research should aim to overcome such trade-offs between weed control and yield potential, while conserving weed species richness and the ecosystem services associated with increased weed diversity.
Background: Observational studies have reported an association between childhood obesity and a higher risk of multiple sclerosis (MS). However, the difficulty of fully accounting for confounding and the long recall periods make causal inference from these studies challenging. The objective of this study was to assess the contribution of childhood obesity to the development of MS through Mendelian randomization, which uses genetic associations to minimize the risk of confounding. Methods: We selected 23 independent genetic variants strongly associated with childhood body mass index (BMI) in a genome-wide association study (GWAS) which included 47,541 children. The corresponding effects of these variants on risk of MS were obtained from a GWAS of 14,802 MS cases and 26,703 controls. Standard two-sample Mendelian randomization methods were performed, with additional sensitivity analyses to assess the likelihood of bias from genetic pleiotropy. Results: The inverse-variance weighted MR analysis revealed that a one standard deviation increase in childhood BMI increased the odds of MS by 26% (odds ratio = 1.26, 95% confidence interval 1.10–1.45, p = 0.001). There was no significant heterogeneity across the individual estimates. Sensitivity analyses were consistent with the main findings and provided no evidence of pleiotropy. Conclusions: This study provides genetic support for a role of increased childhood BMI in the development of MS.
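The inverse-variance weighted (IVW) estimate referenced above pools per-variant Wald ratios (the variant-outcome effect divided by the variant-exposure effect), weighting each by the inverse of its approximate variance. A minimal sketch using first-order weights, with made-up summary statistics rather than the study's data:

```python
import numpy as np

# Hypothetical summary statistics for four variants (illustrative only):
beta_bmi = np.array([0.05, 0.08, 0.03, 0.06])     # SNP -> childhood BMI (SD units)
beta_ms = np.array([0.012, 0.020, 0.006, 0.015])  # SNP -> MS (log odds)
se_ms = np.array([0.004, 0.006, 0.003, 0.005])    # standard error of beta_ms

# Wald ratio per variant and first-order inverse-variance weights
ratio = beta_ms / beta_bmi
weights = (beta_bmi / se_ms) ** 2

beta_ivw = np.sum(weights * ratio) / np.sum(weights)  # pooled log odds per SD of BMI
or_per_sd = np.exp(beta_ivw)  # odds ratio per SD increase in childhood BMI
print(or_per_sd)
```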
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
Objective:
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Methods:
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Results:
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Conclusions:
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both for the identification of cross-transmission events and for the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
High-residue cover crops can facilitate organic no-till vegetable production when cover crop biomass production is sufficient to suppress weeds (>8000 kg ha−1), and cash crop growth is not limited by soil temperature, nutrient availability, or cover crop regrowth. In cool climates, however, both cover crop biomass production and soil temperature can be limiting for organic no-till. In addition, successful termination of cover crops can be a challenge, particularly when cover crops are grown as mixtures. We tested whether reusable plastic tarps, an increasingly popular tool for small-scale vegetable farmers, could be used to augment organic no-till cover crop termination and weed suppression. We no-till transplanted cabbage into a winter rye (Secale cereale L.)-hairy vetch (Vicia villosa Roth) cover crop mulch that was terminated with either a roller-crimper alone or a roller-crimper plus black or clear tarps. Tarps were applied for durations of 2, 4 and 5 weeks. Across tarp durations, black tarps increased the mean cabbage head weight by 58% compared with the no tarp treatment. This was likely due to a combination of improved weed suppression and nutrient availability. Although soil nutrients and biological activity were not directly measured, remaining cover crop mulch in the black tarp treatments was reduced by more than 1100 kg ha−1 when tarps were removed compared with clear and no tarp treatments. We interpret this as an indirect measurement of biological activity perhaps accelerated by lower daily soil temperature fluctuations and more constant volumetric water content under black tarps. The edges of both tarp types were held down, rather than buried, but moisture losses from the clear tarps were greater and this may have affected the efficacy of clear tarps. Plastic tarps effectively killed the vetch cover crop, whereas it readily regrew in the crimped but uncovered plots. However, emergence of large and smooth crabgrass (Digitaria spp.) 
appeared to be enhanced in the clear tarp treatment. Although this experiment was limited to a single site-year in New Hampshire, it shows that the use of black tarps can overcome some of the obstacles to implementing cover crop-based no-till vegetable production in northern climates.
Many Spanish chroniclers detail violent cultural practices of the indigenous populations they encountered in the Isthmo-Colombian Area; however, lack of physical evidence of interpersonal violence from archaeological contexts has made uncertain the veracity of these claims. At the precolumbian site of Playa Venado in Panama, these accounts of violent mortuary rituals may have influenced the interpretation of the burials encountered in excavations, leading to claims of mutilations and sacrifice, with little or no supporting evidence. This paper considers the physical evidence for interpersonal violence and sacrificial death at Playa Venado based on the burial positioning, demographic composition, and trauma present on the human remains recovered from the site. Analysis of field notes, excavation photos, and the 77 individuals available for study from the site yielded no evidence of perimortem trauma nor abnormal body positioning unexplained by taphonomy. The demography at the site tracked with normal patterns of natural age-at-death at the non-elite site of Cerro Juan Díaz rather than the abnormal patterns seen at the large ceremonial sites of Sitio Conte and El Caño. Therefore, we propose an alternative interpretation of the site as a non-elite cemetery containing evidence of re-use and secondary burial practices associated with ancestor veneration rituals.