Background: In November 2020, bamlanivimab received emergency use authorization (EUA) to treat patients with early, mild-to-moderate COVID-19 who are at high risk of progression (https://www.fda.gov/media/143605/download). Montefiore Medical Center serves an economically underserved community of >1.4 million residents in the Bronx, New York. Montefiore’s antimicrobial stewardship team (AST) developed a multidisciplinary treatment pathway for patients meeting EUA criteria: (1) outpatients and hospital associates and (2) acute-care patients (in emergency departments [EDs] or inpatient units). Methods: The Montefiore AST established a centralized process for screening high-risk COVID-19 patients 7 days a week. Referrals were sent by e-mail from occupational health, primary care practices, specialty practices, EDs, and urgent care centers. Patients were screened in real time and were treated in the ED or a newly established infusion center within 24 hours. After infusion, all patients received phone calls from nurses and had an infectious diseases televisit. Demographics, clinical symptoms, subsequent ED visits or hospital admissions, and time from infusion to ED visit or hospitalization were obtained from the electronic health record. Results: In total, 281 high-risk patients (median age, 62 years; 57% female) received bamlanivimab at the infusion center or in the acute-care setting between December 2, 2020, and January 27, 2021 (Table 1). The number of treated patients increased weekly (Figure 1). Of these patients, 62% were Hispanic or Black, and 96% met EUA criteria. Of the referrals, 51 (18%) came from occupational health, 205 (73%) came from the community, and 25 (9%) were inpatients. All patients were infused successfully without adverse reactions. Within 30 days of treatment, 23 patients (8.2%) were hospitalized and 6 (2.1%) visited EDs. The average time between symptom onset and infusion was 4.9 days.
The median age of admitted versus nonadmitted patients was 68 years versus 61.5 years (P = .07). Conclusions: An AST-coordinated bamlanivimab treatment program successfully treated many high-risk COVID-19 patients and potentially reduced hospitalizations. However, the effort, personnel, and resources required are significant, and dedicated hospital investment is necessary for maximal success.
All patients discharged from our Paediatric Liaison Team will have an electronic discharge summary sent to their GP within 24 hours by January 2020.
Writing a GP discharge summary is an essential part of patient care, and late completion is a patient safety issue. The NHS England Standard Contract states that discharge summaries should be completed and sent to the GP within 24 hours of discharge. Baseline data showed a median time of 3 days between discharge and a GP summary being sent, and a baseline survey of staff in our team rated our discharge summary process as inefficient and time-consuming. At baseline, our discharge summary was typed in a Word document and emailed to administrative staff, who would print and post it to the GP. Our electronic patient record had an inbuilt discharge notification function, already used by other teams in the trust, that generates summaries and sends them to the GP by email.
We used the Model for Improvement quality improvement methodology. Initially we created a driver diagram breaking the process of discharge summary writing into its constituent components to generate change ideas. We then tested these ideas in plan-do-study-act (PDSA) cycles while continually collecting data in a shared team spreadsheet to monitor for change.
We found that switching to electronically sent discharge notifications reduced the median time from discharge to a summary being sent to the GP from 3 days to 1 day. Alongside a shared team spreadsheet monitoring when summaries were written, we also reduced the variation in time between discharge and a summary being sent from a range of 0–27 days (with an outlier of 161 days) to 0–9 days.
On average, the time from discharge to a summary being written met the standard, and we reduced the variability of the delay by using an electronic notification. However, only 56% of summaries were sent within the 24-hour limit. Key factors for continued variability identified during regular team meetings included the overall patient caseload, the number of staff on shift, and technical issues with the form. Our plan for sustainability is to discuss at the monthly team meeting any discharges that took longer than 1 day and to target further PDSA cycles at these issues.
To describe and compare psychoactive substance misuse help-seeking among transgender (trans) and cisgender (cis) participants from a large multi-national cross-sectional survey.
Trans people experience stressors related to their minority status which have been associated with increased rates of psychoactive substance use and related harm. Despite this, there is a paucity of evidence relating to the treatment needs of trans people who use psychoactive substances, beyond a small body of literature describing a culture of transphobic hostility in general substance misuse services. This paper aims to describe and compare psychoactive substance misuse help-seeking among trans and cis participants from a large multi-national cross-sectional survey.
Over 180,000 participants, recruited during 2018 and 2019 from the world's largest annual survey of drug use, the Global Drug Survey (GDS), reported use of a range of psychoactive substances in the preceding 12 months. Five gender groups (118,157 cis men; 64,319 cis women; 369 trans men; 353 trans women; and 1,857 non-binary people) were compared, using chi-square and z-tests with Bonferroni correction, on items relating to the desire to reduce psychoactive substance use and the need for help to achieve this. Respondents from GDS 2018 were also assessed for substance dependence. Binary logistic regression was used to compare gender groups on self-reported substance dependence to frame the help-seeking analyses.
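The pairwise group comparisons described above can be sketched as a two-proportion z-test with a Bonferroni-adjusted significance threshold. This is a minimal illustration, not the survey's actual analysis code, and all counts below are hypothetical.

```python
# Sketch of a two-proportion z-test with Bonferroni correction, as used to
# compare gender groups on help-seeking items. All counts below are
# hypothetical illustrations, not figures from the survey.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, two-sided p) for H0: p1 == p2, using the pooled estimate."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)               # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # 2 * P(Z > |z|)
    return z, p_two_sided

# With 5 gender groups there are 10 pairwise comparisons per item, so the
# Bonferroni-adjusted significance threshold is alpha / 10.
alpha, n_comparisons = 0.05, 10
threshold = alpha / n_comparisons  # 0.005

z, p = two_proportion_z(120, 400, 60, 400)  # hypothetical help-seeking counts
significant = p < threshold
```

The Bonferroni step simply divides the family-wise alpha by the number of comparisons, which keeps the overall false-positive rate at or below alpha at the cost of statistical power.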
Trans respondents (n = 1,710) to GDS 2018 were significantly more likely than cis respondents to report use of illicit substances (OR = 1.66-2.93) and dependence on cannabis (OR = 2.39), alcohol (OR = 3.28) and novel psychoactive substances (OR = 4.60). In the combined GDS 2018 and 2019 dataset, there were no significant differences between trans (n = 2,579) and cis (n = 182,476) participants on the desire to reduce substance use. However, among those who did report wanting to use less, non-binary people and trans women were most likely to want help to achieve this.
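The odds ratios reported above can be illustrated with the standard 2×2-table calculation and a Woolf (log-based) 95% confidence interval. The counts below are hypothetical, chosen only to yield an OR of similar magnitude to the alcohol-dependence result, and are not taken from the GDS data.

```python
# Sketch of an odds ratio with a Woolf (log-based) 95% CI, the kind of
# statistic behind the reported ORs (e.g., OR = 3.28 for alcohol dependence).
# The 2x2 counts below are hypothetical, not taken from the GDS data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: exposed with/without outcome; c/d: unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 160, 200, 2600)  # hypothetical counts
```

Because the sampling distribution of log(OR) is approximately normal, the interval is built on the log scale and then exponentiated, which is why such intervals are asymmetric around the point estimate.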
Trans respondents reported a greater need for help with reducing substance use than cis respondents. Given the deficit of specialist services for psychoactive substance users who are trans, there is a need for a more thorough understanding of the barriers and facilitators to their engagement in general substance misuse services. In the interim, substance misuse service providers require education about gender minority status to help meet the needs of trans clients.
While donor-conceived children have mental health outcomes similar to those of spontaneously conceived children, studies of mental health outcomes in donor-conceived adults are inconsistent. This study was an online health survey completed by 272 donor sperm-conceived adults and 877 spontaneously conceived adults from around the world. Donor sperm-conceived adults had increased diagnoses of attention deficit disorder (P = 0.004) and autism (P = 0.044) in comparison to those conceived spontaneously. Donor sperm-conceived adults self-reported increased incidences of seeing a mental health professional (P < 0.001), identity formation problems (P < 0.001), learning difficulties (P < 0.001), panic attacks (P = 0.038), recurrent nightmares (P = 0.038), and alcohol/drug dependency (P = 0.037). DASS-21 analysis revealed that donor sperm-conceived adults were also more stressed than those conceived spontaneously (P = 0.013). The donor sperm-conceived and spontaneously conceived cohorts were matched for sex, age, height, alcohol consumption, smoking, exercise, own fertility, and maternal smoking. The increase in adverse mental health outcomes is consistent with some studies of donor-conceived adult mental health outcomes. These results are also consistent with the Developmental Origins of Health and Disease (DOHaD) phenomenon, which has linked adverse perinatal outcomes, such as those observed in donor-conceived neonates, to increased risk of chronic disease, including poor mental health. Further work is required to reconcile our observations in adults with the contrary observations reported in donor-conceived children.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Donor-conceived neonates have poorer birth outcomes, including low birth weight and preterm delivery, which the developmental origins of health and disease (DOHaD) theory links to poorer long-term health in adulthood. The aim of this study was to conduct the first investigation of the adult health outcomes of donor-conceived people. An online health survey was completed by 272 donor sperm-conceived adults and 877 spontaneously conceived adults from around the world. The donor-conceived and spontaneously conceived groups were matched for age, sex, height, smoking, alcohol consumption, exercise, own fertility and maternal smoking. Donor sperm-conceived adults reported significantly higher rates of diagnosed type 1 diabetes (P = 0.031), thyroid disease (P = 0.031), acute bronchitis (P = 0.008), environmental allergies (P = 0.046), sleep apnoea (P = 0.037) and surgically implanted ear tubes/grommets (P = 0.046). This is the first study to investigate the health outcomes of adult donor sperm-conceived people. Donor sperm-conceived adults self-reported elevated frequencies of various health conditions. These outcomes are consistent with birth defect data from donor sperm treatment and with the DOHaD theory linking perturbed early growth to chronic disease in adulthood.
From an evolutionary perspective, aggression is viewed as a flexible context-specific adaption that was selected for because it enhanced the survival and reproductive success of ancestral humans. Evolutionary pressures have impinged differentially on the sexes, leading to the hypothesis that sex differences should be manifest in aggressive behavior. Evidence to date supports key predictions made from sexual selection theory that women direct their aggression primarily toward same-sex competitors, which peaks as mate competition intensifies. Women demonstrate a notable preference across cultures for more indirect, as opposed to direct, forms of intrasexual rivalry as a likely consequence of heightened obligatory parental investment, lower lifetime reproductive potential, and the greater importance of maternal survival for the health and longevity of offspring. An evolutionary approach can yield unique insights into the sex-differentiated functions, development, and outcomes of aggressive behavior.
UK Biobank is a well-characterised cohort of over 500 000 participants including genetics, environmental data and imaging. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
Describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 years (53% were ≥65 years), and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study showed a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was a common finding: 24% (37 434) of participants met the criteria, and 21% (32 602) met the criteria for current hazardous/harmful alcohol use, whereas each of the other criteria was met by less than 8% of participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
Harvest weed seed control (HWSC) technology, such as impact mills that destroy weed seeds in seed-bearing chaff material during grain crop harvest, has been highly effective in Australian cropping systems. However, the impact mill has never been tested in soybeans [Glycine max (L.) Merr.] or on weeds common to soybean production systems in the midwestern and mid-Atlantic United States. We conducted stationary testing of the Harrington Seed Destructor (HSD) impact mill and winter burial studies during 2015 to 2016 and 2017 to 2018 to determine (1) the efficacy of the impact mill against weed seeds of seven common weeds in the midwestern and five in the mid-Atlantic United States, and (2) the fate of impact mill–processed weed seeds after winter burial. The impact mill was highly effective in destroying seeds of all the species tested, with 93.5% to 99.8% weed seed destruction in 2015 and 85.6% to 100% in 2017. The weak relationships (positive or negative) between seed size and seed destruction by the impact mill, together with the high percentage of weed seed destruction across all seed sizes, indicate that the biological or practical effect of seed size is limited. The impact mill–processed weed seeds that retained at least 50% of their original size, labeled potentially viable seed (PVS), were buried for 90 d overwinter to determine the fate of weed seeds after winter burial. At 90 d after burial, the impact mill–processed PVS were significantly less viable than unprocessed control seeds, indicating that impact mill processing physically damaged the PVS and promoted seed mortality overwinter. A very small fraction (<0.4%) of the total weed seed processed by the impact mill remained viable after winter burial. These results demonstrate that the impact mill is highly effective in increasing seed mortality and could potentially be used as an HWSC tactic for weed management in this region.
Major depressive disorder and neuroticism (Neu) share a large genetic basis. We sought to determine whether this shared basis could be decomposed to identify genetic factors that are specific to depression.
We analysed summary statistics from genome-wide association studies (GWAS) of depression (from the Psychiatric Genomics Consortium, 23andMe and UK Biobank) and compared them with a GWAS of Neu (from UK Biobank). First, we used a pairwise GWAS analysis to classify variants as associated with only depression, with only Neu or with both. Second, we estimated partial genetic correlations to test whether depression's genetic links with other phenotypes were explained by shared overlap with Neu.
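The partial-correlation step can be illustrated with the classic first-order partial-correlation formula, which removes the variance two traits share with a third. Real partial genetic correlations are estimated from GWAS summary statistics with specialised methods, so this is only a conceptual sketch with hypothetical correlation values.

```python
# Sketch of the partial-correlation idea behind "partial genetic correlations":
# the correlation between depression (x) and a phenotype (y) after removing
# the variance both share with neuroticism (z). Inputs are ordinary
# correlation coefficients; the values below are hypothetical illustrations.
import math

def partial_correlation(r_xy, r_xz, r_yz):
    """First-order partial correlation r_xy.z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical: depression-phenotype r = 0.40, and both traits correlate
# r = 0.50 with neuroticism; the Neu-adjusted correlation is noticeably smaller.
r_adj = partial_correlation(0.40, 0.50, 0.50)
```

When the third trait explains most of the shared variance, the adjusted correlation shrinks toward zero, which is exactly the pattern used to separate Neu-shared from depression-specific genetic associations.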
We found evidence that most genomic regions (25/37) associated with depression are likely to be shared with Neu. The overlapping common genetic variance of depression and Neu was genetically correlated primarily with psychiatric disorders. We found that the genetic contributions to depression, that were not shared with Neu, were positively correlated with metabolic phenotypes and cardiovascular disease, and negatively correlated with the personality trait conscientiousness. After removing shared genetic overlap with Neu, depression still had a specific association with schizophrenia, bipolar disorder, coronary artery disease and age of first birth. Independent of depression, Neu had specific genetic correlates in ulcerative colitis, pubertal growth, anorexia and education.
Our findings demonstrate that, while genetic risk factors for depression are largely shared with Neu, there are also non-Neu-related features of depression that may be useful for further patient or phenotypic stratification.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and to more recent anthropogenic climate change. Inter- and intraspecific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, thereby affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Field experiments were conducted in 2016 and 2017 in Champaign County, IL, to study a waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] population (CHR) resistant to 2,4-D and 4-hydroxyphenylpyruvate dioxygenase (HPPD)-, photosystem II–, acetolactate synthase (ALS)-, and protoporphyrinogen oxidase–inhibiting herbicides. Two field experiments were designed to investigate the efficacy of very-long-chain fatty-acid (VLCFA)-inhibiting herbicides, including a comparison of active ingredients at labeled use rates and a rate titration experiment. Amaranthus tuberculatus density and control were evaluated at 28 and 42 d after treatment (DAT). Nonencapsulated acetochlor, alachlor, and pyroxasulfone provided the greatest PRE control of CHR (56% to 75%) at 28 DAT, while metolachlor, S-metolachlor, dimethenamid-P, and encapsulated acetochlor provided less than 27% control. In the rate titration study, nonencapsulated acetochlor controlled CHR more than equivalent field use rates of S-metolachlor. Subsequent dose–response experiments with acetochlor, S-metolachlor, dimethenamid-P, and pyroxasulfone in the greenhouse included three multiple herbicide–resistant (MHR) A. tuberculatus populations: CHR-M6 (progeny generated from CHR), MCR-NH40 (progeny generated from McLean County, IL), and ACR (Adams County, IL), in comparison with a sensitive population (WUS). Both CHR-M6 and MCR-NH40 are MHR to atrazine and to HPPD- and ALS-inhibiting herbicides, and they demonstrated higher LD50 values for S-metolachlor, acetochlor, dimethenamid-P, and pyroxasulfone than ACR (atrazine resistant but HPPD-inhibitor sensitive) and WUS. Based on biomass reduction (GR50), resistant-to-sensitive (R:S) ratios between CHR-M6 and WUS were 7.5, 6.1, 5.5, and 2.9 for S-metolachlor, acetochlor, dimethenamid-P, and pyroxasulfone, respectively. Values were greater for MCR-NH40 than CHR-M6, and ACR was the most sensitive to all VLCFA inhibitors tested.
Complete control of all populations was achieved at or below a field use rate of acetochlor. In summary, field studies demonstrated CHR is not controlled by several VLCFA-inhibiting herbicides. Greenhouse dose–response experiments corroborated field results and generated R:S ratios (LD50) ranging from 4.5 to 64 for CHR-M6 and MCR-NH40 among the four VLCFA-inhibiting herbicides evaluated.
Experiments were initiated to characterize a waterhemp population (CHR) discovered in a central Illinois corn field after it was not controlled by the 4-hydroxyphenylpyruvate dioxygenase (HPPD) inhibitor topramezone. Field experiments conducted during 2014–2015 indicated that acetolactate synthase (ALS)-, protoporphyrinogen oxidase (PPO)-, photosystem II (PSII)-, and HPPD-inhibiting herbicides and the synthetic auxin 2,4-D did not control the CHR population. Laboratory experiments confirmed target site–based resistance mechanisms to ALS- and PPO-inhibiting herbicides. Herbicide doses required to reduce dry biomass 50% (GR50) were determined in greenhouse dose–response experiments, which indicated 16-fold resistance to the HPPD inhibitor mesotrione, 9.5-fold resistance to the synthetic auxin 2,4-D, and 252-fold resistance to the PSII inhibitor atrazine. Complementary results from field, laboratory, and greenhouse investigations indicate that the CHR population has evolved resistance to herbicides from five sites of action (SOAs): ALS-, PPO-, PSII-, and HPPD-inhibiting herbicides and 2,4-D. Herbicide use history for the field in which CHR was discovered indicates no previous use of 2,4-D.
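The GR50 values and fold-resistance (R:S) ratios reported in these abstracts typically come from fitting a log-logistic dose-response curve to biomass data. The sketch below uses the standard four-parameter log-logistic form with hypothetical parameter values, not values fitted to the CHR data.

```python
# Sketch of the four-parameter log-logistic dose-response model commonly used
# to estimate GR50 (the dose reducing biomass 50%) and fold resistance.
# All parameter values below are hypothetical, not fitted to the CHR data.

def log_logistic(dose, b, c, d, gr50):
    """Response at a given dose: c = lower limit, d = upper limit,
    b = slope, gr50 = dose giving a response midway between c and d."""
    if dose == 0:
        return d  # untreated control sits at the upper limit
    return c + (d - c) / (1 + (dose / gr50) ** b)

def fold_resistance(gr50_resistant, gr50_sensitive):
    """R:S ratio: how many times more herbicide the resistant population needs."""
    return gr50_resistant / gr50_sensitive

# At the GR50 dose the model returns the midpoint of c and d by construction.
midpoint = log_logistic(50.0, b=2.0, c=0.0, d=100.0, gr50=50.0)
rs_ratio = fold_resistance(160.0, 10.0)  # hypothetical GR50s -> 16-fold
```

An R:S ratio is simply the ratio of the two fitted GR50 values, so a "16-fold resistance" statement means the resistant population's fitted GR50 is 16 times that of the sensitive reference.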
We sought to define the prevalence of echocardiographic abnormalities in long-term survivors of paediatric hematopoietic stem cell transplantation and determine the utility of screening in asymptomatic patients. We analysed echocardiograms performed on survivors who underwent hematopoietic stem cell transplantation from 1982 to 2006. A total of 389 patients were alive in 2017, with 114 having an echocardiogram obtained ≥5 years post-infusion. A total of 95 patients had an echocardiogram performed for routine surveillance. The mean time post-hematopoietic stem cell transplantation was 13 years. Of these 95 patients, 77 (82.1%) had ejection fraction measured, and 10/77 (13.0%) had ejection fraction z-scores ≤−2.0, which is abnormally low. Patients with an abnormal ejection fraction were significantly more likely to have been exposed to anthracyclines or total body irradiation. Among individuals who received neither anthracyclines nor total body irradiation, only 1/31 (3.2%) was found to have an abnormal ejection fraction (51.4%; z-score, −2.73). In the cohort of 77 patients, the negative predictive value of having a normal ejection fraction given no exposure to total body irradiation or anthracyclines was 96.7% (95% confidence interval, 83.3–99.8%). Systolic dysfunction is relatively common in long-term survivors of paediatric hematopoietic stem cell transplantation who have received anthracyclines or total body irradiation. Survivors who are asymptomatic and did not receive radiation or anthracyclines likely do not require surveillance echocardiograms, unless otherwise indicated.
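The negative predictive value quoted above follows directly from the counts in the abstract: of the 31 patients with neither anthracycline nor total-body-irradiation exposure, 30 had a normal ejection fraction.

```python
# The negative-predictive-value calculation behind the reported 96.7%:
# among 31 unexposed patients, 30 had a normal ejection fraction and 1 did not.
def negative_predictive_value(true_negatives, false_negatives):
    """NPV = TN / (TN + FN): probability of a normal EF given no exposure."""
    return true_negatives / (true_negatives + false_negatives)

npv = negative_predictive_value(30, 1)  # 30/31, roughly 96.7%
```

The confidence interval the authors report (83.3–99.8%) would come from an exact binomial method applied to the same 30/31 proportion; that step is not reproduced here.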
Greenhouse experiments were conducted to quantify resistance levels to the 4-hydroxyphenylpyruvate dioxygenase (HPPD)-inhibiting herbicides mesotrione (MES) and isoxaflutole (IFT) in NEB (Nebraska HPPD- and atrazine-resistant) and SIR (Stanford, IL, HPPD- and atrazine-resistant) waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] populations. These populations differ in their field-use histories and resistance levels to MES. Foliar growth responses were compared with ACR (HPPD sensitive; metabolic atrazine-resistant) and SEN (sensitive to HPPD and photosystem II [PSII] inhibitors). A greenhouse dose–response study was conducted with each herbicide at two POST timings: early (EPOST) (5 cm; 4 to 5 true leaves) and POST (10 cm; 8 to 9 true leaves). At the EPOST timing, SIR was 10-fold resistant to IFT and 32-fold resistant to MES, while NEB was 4-fold resistant to IFT and 7-fold resistant to MES when compared with ACR. At the POST timing, SIR was 17-fold resistant to IFT and 21-fold resistant to MES, while NEB was 3-fold resistant to IFT and 7-fold resistant to MES when compared with ACR. Results overall indicated greater fold-resistance levels to MES relative to IFT at each timing. However, POST treatments to SIR showed contrasting effects on resistance levels relative to EPOST. To investigate potential management strategies for resistant A. tuberculatus populations, a POST interaction study was conducted using combinations of metribuzin and either IFT or MES. A metribuzin rate (191 g ai ha−1) causing an approximately 20% biomass reduction was chosen for the interaction studies and combined with varying rates of either IFT or MES. Results indicated that 52.5 g ai ha−1 of MES combined with metribuzin displayed a synergistic effect on biomass reduction in SIR. However, other combinations of either MES or IFT and metribuzin resulted in additive effects on biomass reduction in both HPPD-resistant populations.
These results provide insights into the joint activity between HPPD and PSII inhibitors for controlling metabolism-based, multiple herbicide–resistant A. tuberculatus.
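Synergistic versus additive mixture effects of this kind are conventionally judged against Colby's expected response in weed science. The abstract does not name its method, so treating Colby's equation as the benchmark here is an assumption, and the percent reductions below are hypothetical.

```python
# Sketch of Colby's equation, a standard way (assumed here; the abstract does
# not name its method) to classify herbicide-mixture effects as additive,
# synergistic, or antagonistic from percent biomass reduction.

def colby_expected(x, y):
    """Expected % reduction for a mixture of two herbicides that individually
    give x% and y% reduction, under the assumption of additive action."""
    return x + y - (x * y) / 100.0

def classify(observed, x, y, tolerance=5.0):
    """Synergistic if observed exceeds Colby's expectation, antagonistic if it
    falls short; 'tolerance' is an illustrative margin, not a formal test."""
    expected = colby_expected(x, y)
    if observed > expected + tolerance:
        return "synergistic"
    if observed < expected - tolerance:
        return "antagonistic"
    return "additive"

# Hypothetical: metribuzin alone 20%, mesotrione alone 40% biomass reduction.
expected = colby_expected(20.0, 40.0)  # 52.0% expected under additivity
label = classify(70.0, 20.0, 40.0)     # observed 70% exceeds expectation
```

In published studies the observed-versus-expected comparison is made with a statistical test rather than a fixed margin; the margin here only keeps the sketch self-contained.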