Monoclonal antibody therapeutics to treat coronavirus disease 2019 (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess what is needed to incorporate monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, batch preparation of the therapy, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staffing, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter their seed before harvest risk escaping both early-season management and HWSC.
Although potential links between oxytocin (OT), vasopressin (AVP), and social cognition are well grounded theoretically, most studies have used all-male samples, and few have demonstrated consistent effects of either neuropeptide on mentalizing (i.e., understanding the mental states of others). To understand the potential of either neuropeptide as a pharmacological treatment for individuals with impairments in social cognition, it is important to demonstrate the beneficial effects of OT and AVP on mentalizing in healthy individuals.
In the present randomized, double-blind, placebo-controlled study (n = 186) of healthy individuals, we examined the effects of OT and AVP administration on behavioral responses and neural activity in response to a mentalizing task.
Relative to placebo, neither drug showed an effect on task reaction time or accuracy, nor on whole-brain neural activation or functional connectivity within brain networks associated with mentalizing. Exploratory analyses of several variables previously shown to moderate OT's effects on social processes (e.g., self-reported empathy, alexithymia) revealed no significant interaction effects.
Results add to a growing literature demonstrating that intranasal administration of OT and AVP may have a more limited effect on social cognition, at both the behavioral and neural level, than initially assumed. Randomized controlled trial registrations: ClinicalTrials.gov; NCT02393443; NCT02393456; NCT02394054.
Ethnohistoric accounts indicate that the people of Australia's Channel Country engaged in activities rarely recorded elsewhere on the continent, including food storage, aquaculture and possible cultivation, yet there has been little archaeological fieldwork to verify these accounts. Here, the authors report on a collaborative research project initiated by the Mithaka people addressing this lack of archaeological investigation. The results show that Mithaka Country has a substantial and diverse archaeological record, including numerous large stone quarries, multiple ritual structures and substantial dwellings. Our archaeological research revealed previously unknown aspects of this record, such as the scale of Mithaka quarrying, which could stimulate re-evaluation of Aboriginal socio-economic systems in parts of ancient Australia.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1,435 people testing positive, for a positivity rate of 2.28%. A total of 1,670 COVID-19 cases were identified, of which 235 were self-reported. The mean number of tests per week was 3,500, with approximately 80 of these positive (~11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, a ratio of roughly 1 tracer per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
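As a quick sanity check, the rates reported above can be reproduced with a few lines of arithmetic. This is a minimal illustrative sketch: the ~18.5-week duration is derived from the August 1 to December 8 window, not quoted from the report.

```python
# Reproduce the reported surveillance figures (illustrative only).
total_tests = 62_970
positives = 1_435
community_size = 22_700
tracers = 60            # "more than 60 student tracers"
weeks = 18.5            # Aug 1 - Dec 8, 2020 (~129 days); assumed, not reported

positivity = positives / total_tests * 100
print(f"Positivity rate: {positivity:.2f}%")                  # ~2.28%, as reported

print(f"Tests/week: {total_tests / weeks:,.0f}")              # ~3,400, near the reported 3,500
print(f"Positives/day: {positives / (weeks * 7):.1f}")        # ~11/d, as reported

print(f"Members per tracer: {community_size / tracers:.0f}")  # ~380, i.e. roughly 1 per 400
```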
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of Earth's sensitivity to carbon dioxide, found that permafrost thaw could release more carbon emissions than expected, and showed that the uptake of carbon in tropical ecosystems is weakening. Adverse impacts on human society include increasing water shortages and impacts on mental health. Options for solutions emerge from rethinking economic models, rights-based litigation, strengthened governance systems and a new social contract. The disruption caused by COVID-19 could be seized as an opportunity for positive change, directing economic stimulus towards sustainable investments.
A synthesis is made of ten fields within climate science where there have been significant advances since mid-2019, through an expert elicitation process with broad disciplinary scope. Findings include: (1) a better understanding of equilibrium climate sensitivity; (2) abrupt thaw as an accelerator of carbon release from permafrost; (3) changes to global and regional land carbon sinks; (4) impacts of climate change on water crises, including equity perspectives; (5) adverse effects on mental health from climate change; (6) immediate effects on climate of the COVID-19 pandemic and requirements for recovery packages to deliver on the Paris Agreement; (7) suggested long-term changes to governance and a social contract to address climate change, learning from the current pandemic; (8) an updated positive cost–benefit ratio and new perspectives on the potential for green growth in the short and long term; (9) urban electrification as a strategy to move towards low-carbon energy systems; and (10) rights-based litigation as an increasingly important method to address climate change, with recent clarifications on the legal standing and representation of future generations.
Social media summary
Stronger permafrost thaw, COVID-19 effects and growing mental health impacts among highlights of latest climate science.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Weeds at southern latitudes retained greater proportions of their seeds, and shatter rates increased at more northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, percent seed shatter ranged from 1% to 70%; by 25 d after maturity, the range had shifted to 5% to 100% (mean: 42%). There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could affect their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Analysis of food webs is important for defining functional components of ecosystems, but dietary data are often difficult to obtain and coarsely characterised. We compared three methods of rainbow trout (Oncorhynchus mykiss (Walbaum); Salmoniformes: Salmonidae) and prickly sculpin (Cottus asper Richardson; Scorpaeniformes: Cottidae) gut content analysis: traditional morphological taxonomy of prey items, genetic sequencing of individual prey items, and next-generation sequencing of homogenised gut contents. Prey analysis of invertebrates by morphological identification allowed order-level classifications and produced ecologically important count and mass data. Sequencing individual specimens provided greater taxonomic resolution, while next-generation sequencing of stomach contents revealed more prey diversity in the diets of both fish species as it was possible to detect prey that were degraded beyond visual recognition. Both fish species exhibited generalist feeding characteristics; however, terrestrial Insecta were a large diet component for rainbow trout. This study demonstrates an efficient approach for prey analysis using molecular techniques that complement traditional taxonomy.
Personal protective equipment (PPE) is worn by prehospital providers (PHPs) for protection from hazardous exposures. Evidence regarding the ability of PHPs to perform resuscitation procedures has been described in adult but not pediatric models. This study examined the effects of PPE on the ability of PHPs to perform resuscitation procedures on pediatric patients.
This prospective study was conducted at a US simulation center. Paramedics wore normal attire at the baseline session and donned full Level B PPE for the second session. During each session, they performed timed sets of psychomotor tasks simulating clinical care of a critically ill pediatric patient. The difference in time to completion between baseline and PPE sessions per task was examined using Wilcoxon signed-rank tests.
A total of 50 paramedics completed both sessions. Median times for task completion at the PPE sessions increased significantly from baseline for several procedures: tracheal intubation (+4.5 s; P = 0.01), automated external defibrillator (AED) placement (+9.5 s; P = 0.01), intraosseous line insertion (+7 s; P < 0.0001), tourniquet application (+8.5 s; P < 0.0001), intramuscular injection (+21–23 s; P < 0.0001), and pulse oximetry (+4 s; P < 0.0001). There was no significant increase in completion time for bag-mask ventilation or autoinjector use.
Although PPE increased completion times for several procedures, the delays were small and PHPs remained able to perform critical tasks while caring for a pediatric patient with a highly infectious disease or chemical exposure. This information may guide PHPs faced with the situation of resuscitating children while wearing Level B PPE.
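For readers interested in the statistics, the paired comparison described above can be sketched as follows. The data here are synthetic; only the method, a Wilcoxon signed-rank test on per-paramedic baseline-versus-PPE completion times, mirrors the study design.

```python
# Sketch of the paired analysis: baseline vs. PPE completion times for the
# same 50 paramedics, compared with a Wilcoxon signed-rank test.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n = 50  # paramedics completing both sessions

# Synthetic intubation times (seconds); PPE adds a few seconds on average.
baseline = rng.normal(30, 5, n)
ppe = baseline + rng.normal(4.5, 3, n)

stat, p = wilcoxon(ppe, baseline)  # paired, non-parametric
print(f"Median increase: {np.median(ppe - baseline):+.1f} s, P = {p:.4g}")
```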
The SCN5A gene is implicated in many arrhythmogenic and cardiomyopathic processes. We identified a novel SCN5A variant in a family with significant segregation in individuals affected with progressive sinus and atrioventricular nodal disease, atrial arrhythmia, dilated cardiomyopathy, and early sudden cardiac arrest.
A patient pedigree was created following the clinical evaluation of three affected individuals (monozygotic twins and their paternal half-brother), which led to the evaluation of a paternal half-sister (four siblings in total, with the same father and three different mothers), all of whom experienced varying degrees of atrial arrhythmias, conduction disease, and dilated cardiomyopathy; the father himself died of unexplained causes in his 50s, with similar autopsy findings. The index male underwent sequencing of 58 genes associated with cardiomyopathies. Sanger sequencing was used to provide data for bases with insufficient coverage and for bases in some known regions of genomic segmental duplications. All clinically significant and novel variants were confirmed by independent Sanger sequencing.
All relatives tested were shown to have the same SCN5A variant of unknown significance (p.Asp197His), and the monozygotic twins shared a co-occurring NEXN variant (p.Glu575*). Segregation analysis suggests a likely pathogenic role for the SCN5A variant, with a possible additional contribution from the co-occurring NEXN variant.
There is compelling clinical evidence suggesting that the SCN5A variant p.Asp197His may be re-classified as likely pathogenic based on the segregation analysis of our family of interest. Molecular mechanism studies are pending.
The initial classic Fontan, utilising a direct right atrial appendage-to-pulmonary artery anastomosis, led to numerous complications. Adults with such complications may benefit from conversion to a total cavo-pulmonary connection, the current standard palliation for children with univentricular hearts.
A single institution, retrospective chart review was conducted for all Fontan conversion procedures performed from July, 1999 through January, 2017. Variables analysed included age, sex, reason for Fontan conversion, age at Fontan conversion, and early mortality or heart transplant within 1 year after Fontan conversion.
A total of 41 Fontan conversion patients were identified. Average age at Fontan conversion was 24.5 ± 9.2 years. Dominant left ventricular physiology was present in 37/41 (90.2%) patients. Right-sided heart failure occurred in 39/41 (95.1%) patients and right atrial dilation was present in 33/41 (80.5%) patients. The most common indications for Fontan conversion were atrial arrhythmia in 37/41 (90.2%), NYHA class II or greater heart failure in 31/41 (75.6%), ventricular dysfunction in 23/41 (56.1%), and cirrhosis or fibrosis in 7/41 (17.1%) patients. Mean post-surgical follow-up was 6.2 ± 4.9 years. Survival rates at 30 days, 1 year, and greater than 1 year post-Fontan conversion were 95.1%, 92.7%, and 87.8%, respectively. Two patients underwent heart transplant: the first within 1 year of Fontan conversion for heart failure and the second at 5.3 years for liver failure.
Fontan conversion should be considered early, when atrial arrhythmias become common, rather than waiting for severe heart failure to ensue; it can be accomplished with an acceptable risk profile.
Elevated left ventricular end diastolic pressure is a risk factor for ventricular arrhythmias in patients with tetralogy of Fallot. The objective of this retrospective study was to identify echocardiographic measures associated with left ventricular end diastolic pressure >12 mmHg in this population. Patients with repaired tetralogy of Fallot aged ≥13 years who underwent a left heart catheterisation within 7 days of an echocardiogram were evaluated. Univariate comparisons of echocardiographic and clinical variables were made between patients with left ventricular end diastolic pressure >12 versus ≤12 mmHg. Ninety-four patients (54% male) with a median age of 24.6 years were included. Thirty-four (36%) had left ventricular end diastolic pressure >12 mmHg. Patients with left ventricular end diastolic pressure >12 mmHg were older (median 32.9 versus 24.0 years, p = 0.02), more likely to have a history of an aortopulmonary shunt (62% versus 38%, p = 0.03), and more likely to have a diagnosis of hypertension (24% versus 7%, p = 0.03) compared with those with left ventricular end diastolic pressure ≤12 mmHg. There were no significant differences in mitral valve E/A ratio, annular e’ velocity, or E/e’ ratio between patients with left ventricular end diastolic pressure >12 versus ≤12 mmHg. Patients with left ventricular end diastolic pressure >12 mmHg had larger left atrial area (mean 17.7 versus 14.0 cm2, p = 0.03) and larger left atrium anterior–posterior diameter (mean 36.0 versus 30.6 mm, p = 0.004). In conclusion, typical echocardiographic measures of left ventricular diastolic dysfunction may not be reliable in tetralogy of Fallot patients. Prospective studies with the use of novel echocardiographic measures are needed.
To determine the scope, source, and mode of transmission of a multifacility outbreak of extensively drug-resistant (XDR) Acinetobacter baumannii.
SETTING AND PARTICIPANTS
Residents and patients in skilled nursing facilities, a long-term acute-care hospital, and acute-care hospitals.
A case was defined as the incident isolate from clinical or surveillance cultures of XDR Acinetobacter baumannii resistant to imipenem or meropenem and nonsusceptible to all but 1 or 2 antibiotic classes in a patient in an Oregon healthcare facility during January 2012–December 2014. We queried clinical laboratories, reviewed medical records, oversaw patient and environmental surveillance surveys at 2 facilities, and recommended interventions. Pulsed-field gel electrophoresis (PFGE) and molecular analysis were performed.
We identified 21 cases, highly related by PFGE or healthcare facility exposure. Overall, 17 patients (81%) were admitted to long-term acute-care hospital A (n=8), skilled nursing facility A (n=8), or both (n=1) prior to XDR A. baumannii isolation. Interfacility communication of patient or resident XDR status was not performed during transfer between facilities. The rare plasmid-encoded carbapenemase gene blaOXA-237 was present in 16 outbreak isolates. Contact precautions, chlorhexidine baths, enhanced environmental cleaning, and interfacility communication were implemented for cases to halt transmission.
Interfacility transmission of XDR A. baumannii carrying the rare carbapenemase gene blaOXA-237 was facilitated by transfer of affected patients without communication to receiving facilities.
OBJECTIVES/SPECIFIC AIMS: Clinical guidelines recommend using predicted atherosclerotic cardiovascular disease (ASCVD) risk to inform treatment decisions. The objective was to compare the contribution of changes in modifiable risk factors versus aging to the development of high 10-year predicted ASCVD risk. METHODS/STUDY POPULATION: Prospective follow-up of the Jackson Heart Study, an exclusively African-American cohort, at visit 1 (2000–2004) and visit 3 (2009–2012). Analyses included 1115 African-American participants without a high 10-year predicted ASCVD risk (<7.5%), hypertension, diabetes, or ASCVD at visit 1. We used the Pooled Cohort equations to calculate the incidence of high (≥7.5%) 10-year predicted ASCVD risk at visit 3. We recalculated the percentage with a high 10-year predicted ASCVD risk at visit 3 assuming each risk factor [age, systolic blood pressure (SBP), antihypertensive medication use, diabetes, smoking, total and high-density lipoprotein cholesterol], one at a time, did not change from visit 1. RESULTS/ANTICIPATED RESULTS: The mean age at visit 1 was 45.2 ± 9.5 years. Overall, 30.9% (95% CI 28.3%–33.4%) of participants developed high 10-year predicted ASCVD risk. Aging accounted for 59.7% (95% CI 54.2%–65.1%) of the development of high 10-year predicted ASCVD risk compared with 32.8% (95% CI 27.0%–38.2%) for increases in SBP or antihypertensive medication initiation and 12.8% (95% CI 9.6%–16.5%) for incident diabetes. Among participants <50 years, the contribution of increases in SBP or antihypertensive medication initiation was similar to that of aging. DISCUSSION/SIGNIFICANCE OF IMPACT: Increases in SBP and antihypertensive medication initiation are major contributors to the development of high 10-year predicted ASCVD risk in African Americans, particularly among younger adults.
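A sketch of the hold-one-factor-constant counterfactual analysis described above. Everything here is a hypothetical illustration: `predicted_risk` stands in for the sex- and race-specific Pooled Cohort equations (which use published ACC/AHA coefficients), and the record layout is invented.

```python
# Counterfactual attribution: among participants who developed high predicted
# risk at visit 3, what share would not have if one risk factor had stayed at
# its visit-1 value?
from typing import Callable, Dict, List

def attribution(
    visit1: List[Dict], visit3: List[Dict],
    predicted_risk: Callable[[Dict], float],
    factor: str, threshold: float = 0.075,
) -> float:
    """Share of incident high-risk cases attributable to changes in `factor`."""
    incident, prevented = 0, 0
    for p1, p3 in zip(visit1, visit3):
        if predicted_risk(p3) < threshold:
            continue  # did not develop high predicted risk
        incident += 1
        counterfactual = {**p3, factor: p1[factor]}  # freeze this one factor
        if predicted_risk(counterfactual) < threshold:
            prevented += 1  # high risk disappears when the factor is frozen
    return prevented / incident if incident else 0.0
```

Applying this once per risk factor, as the authors did, yields each factor's share of incident high-risk cases; because freezing different factors can prevent the same case, the shares need not sum to 100%.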
High-temperature X-ray diffraction with concurrent gas chromatography (GC) was used to study cobalt disulfide cathode pellets disassembled from thermal batteries. When CoS2 cathode materials were analyzed in an air environment, oxidation of the K(Br, Cl) salt phase in the cathode led to the formation of K2SO4, which subsequently reacted with the pyrite-type CoS2 phase, leading to cathode decomposition between ~260 and 450 °C. Independent thermal analysis experiments, i.e., simultaneous thermogravimetric analysis/differential scanning calorimetry/mass spectrometry (MS), augmented the diffraction results and supported the overall picture of CoS2 decomposition. Both gas analysis measurements (i.e., GC and MS) from the independent experiments confirmed the formation of SO2 off-gas species during breakdown of the CoS2. In contrast, characterization of the same cathode material under inert conditions showed the presence of CoS2 throughout the entire temperature range of analysis.
Decontaminating patients who have been exposed to hazardous chemicals can directly benefit the patients’ health by saving lives and reducing the severity of toxicity. While the importance of decontaminating patients to prevent the spread of contamination has long been recognized, its role in improving patient health outcomes has not been as widely appreciated. Acute chemical toxicity may manifest rapidly—often minutes to hours after exposure. Patient decontamination and emergency medical treatment must be initiated as early as possible to terminate further exposure and treat the effects of the dose already absorbed. In a mass exposure chemical incident, responders and receivers are faced with the challenges of determining the type of care that each patient needs (including medical treatment, decontamination, and behavioral health support), providing that care within the effective window of time, and protecting themselves from harm. The US Department of Health and Human Services and Department of Homeland Security have led the development of national planning guidance for mass patient decontamination in a chemical incident to help local communities meet these multiple, time-sensitive health demands. This report summarizes the science on which the guidance is based and the principles that form the core of the updated approach. (Disaster Med Public Health Preparedness. 2014;0:1–7)
This study aimed to replicate a previous study which showed that endogenous opioid release, following an oral dose of amphetamine, can be detected in the living human brain using [11C]carfentanil positron emission tomography (PET) imaging. Nine healthy volunteers underwent two [11C]carfentanil PET scans, one before and one 3 h following oral amphetamine administration (0.5 mg/kg). Regional changes in [11C]carfentanil BPND from pre- to post-amphetamine were assessed. The amphetamine challenge led to significant reductions in [11C]carfentanil BPND in the putamen, thalamus, frontal lobe, nucleus accumbens, anterior cingulate and insular cortices, and cerebellum, replicating our earlier findings. None of the participants experienced significant euphoria/‘high’, supporting the use of oral amphetamine to characterize in vivo endogenous opioid release following a pharmacological challenge. [11C]carfentanil PET is able to detect changes in binding following an oral amphetamine challenge that reflect endogenous opioid release and is suitable for characterizing the opioid system in neuropsychiatric disorders.
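For illustration, the regional change assessed in such paired scans is typically expressed as the fractional reduction in BPND from the pre- to the post-drug scan, with larger reductions interpreted as greater endogenous opioid release. A minimal sketch with made-up values:

```python
# Fractional change in non-displaceable binding potential (BPND) per region.
# The numbers below are invented for illustration, not study data.
bp_pre = {"putamen": 2.10, "thalamus": 1.85, "frontal lobe": 1.20}
bp_post = {"putamen": 1.89, "thalamus": 1.70, "frontal lobe": 1.12}

for region in bp_pre:
    delta = (bp_pre[region] - bp_post[region]) / bp_pre[region] * 100
    print(f"{region}: BPND reduced by {delta:.1f}%")  # reduction ~ opioid release
```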