Herbicide resistance has been studied extensively in agronomic crops across North America but is rarely examined in vegetables. It is widely assumed that the limited number of registered herbicides combined with the adoption of diverse weed management strategies in most vegetable crops effectively inhibits the development of resistance. It is difficult to determine if resistance is truly less common in vegetable crops or if the lack of reported cases is due to the lack of resources focused on detection. This review highlights cases of resistance that are thought to have arisen within vegetable crops. It also includes situations where herbicide-resistant weeds were likely selected for within agronomic crops but became a problem when vegetables were grown in sequence or in adjacent fields. The occurrence of herbicide resistance can have severe consequences for vegetable growers, and resistance management plans should be adopted to limit selection pressure. This review also highlights resistance management techniques that should slow the development and spread of herbicide resistance in vegetable crops.
The COVID-19 pandemic has increased rates of psychological distress and burnout in healthcare staff. How can we understand our experiences of the pandemic? We reflect on the experiences of psychiatry trainees in two north London mental health trusts. From a psychoanalytic understanding, states of extreme anxiety can lead to a manic defence and functioning in the paranoid–schizoid position. This position is derived from object relations theory and is characterised by binary thinking, splitting, projection, defensiveness and ‘knee-jerk’ decision-making. This can affect our perceptions, responses to others, relationships and ability to function and, therefore, our clinical practice and well-being. We consider the importance of recognising these processes and of organisational containment and having space to reflect. This supports functioning in the depressive position, a state of mind where we can tolerate anxiety, address difficult realities and develop new ideas. We hope these understandings are helpful to our colleagues in all professions.
Among outpatients with coronavirus disease 2019 (COVID-19) due to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) δ (delta) variant who did and did not receive 2 vaccine doses, there was no difference in viral shedding at 7 days after symptom onset (cycle threshold difference, −0.59; 95% CI, −4.68 to 3.50; P = .77), with SARS-CoV-2 cultured from 2 (7%) of 28 and 1 (4%) of 26 outpatients, respectively.
Background: Candida auris is an emerging multidrug-resistant yeast that is transmitted in healthcare facilities and is associated with substantial morbidity and mortality. Environmental contamination is suspected to play an important role in transmission, but additional information is needed to inform environmental cleaning recommendations to prevent spread. Methods: We conducted a multiregional (Chicago, IL; Irvine, CA) prospective study of environmental contamination associated with C. auris colonization of patients and residents of 4 long-term care facilities and 1 acute-care hospital. Participants were identified by screening or clinical cultures. Samples were collected from participants’ body sites (eg, nares, axillae, inguinal creases, palms and fingertips, and perianal skin) and their environment before room cleaning. Daily room cleaning and disinfection by facility environmental service workers was followed by targeted cleaning of high-touch surfaces by research staff using hydrogen peroxide wipes (an EPA-approved product for C. auris, List P). Samples were collected immediately after cleaning from high-touch surfaces and repeated at 4-hour intervals up to 12 hours. A pilot phase (n = 12 patients) was conducted to identify the value of testing specific high-touch surfaces to assess environmental contamination. High-yield surfaces were included in the full evaluation phase (n = 20 patients) (Fig. 1). Samples were submitted for semiquantitative culture of C. auris and other multidrug-resistant organisms (MDROs), including methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase–producing Enterobacterales (ESBLs), and carbapenem-resistant Enterobacterales (CRE). Times to room surface contamination with C. auris and other MDROs after effective cleaning were analyzed. Results: Candida auris colonization was most frequently detected in the nares (72%) and palms and fingertips (72%). Cocolonization of body sites with other MDROs was common (Fig. 2). Surfaces located close to the patient were commonly recontaminated with C. auris by 4 hours after cleaning, including the overbed table (24%), bed handrail (24%), and TV remote or call button (19%). Environmental cocontamination was more common with resistant gram-positive organisms (MRSA and VRE) than resistant gram-negative organisms (Fig. 3). C. auris was rarely detected on surfaces located outside a patient’s room (1 of 120 swabs; <1%). Conclusions: Environmental surfaces near C. auris–colonized patients were rapidly recontaminated after cleaning and disinfection. Cocolonization of skin and environment with other MDROs was common, with resistant gram-positive organisms predominating over gram-negative organisms on environmental surfaces. Limitations include lack of organism sequencing or typing to confirm that environmental contamination was from the room resident. Rapid recontamination of environmental surfaces after manual cleaning and disinfection suggests that alternative mitigation strategies should be evaluated.
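The time-to-recontamination analysis described above, with surfaces sampled every 4 hours after cleaning, lends itself to a product-limit (Kaplan-Meier) summary. Below is a minimal Python sketch on hypothetical surface data; the study's actual analytic method is not specified in the abstract, and surfaces still culture-negative at 12 hours are treated as right-censored.

```python
import numpy as np

# Hypothetical illustration (not study data): hours at which C. auris was
# first re-detected on each high-touch surface after cleaning; np.nan marks
# surfaces still culture-negative at the final 12-hour sample (censored).
recontam_hours = np.array([4, 4, 8, 12, np.nan, np.nan, 8, 4, np.nan, 12])
sampling_times = [4.0, 8.0, 12.0]  # 4-hour sampling intervals, as in the study

surv, at_risk = 1.0, len(recontam_hours)
for t in sampling_times:
    events = int(np.sum(recontam_hours == t))  # surfaces recontaminated at t
    surv *= (at_risk - events) / at_risk       # Kaplan-Meier product-limit step
    print(f"P(surface still clean at {t:g} h) = {surv:.2f}")
    at_risk -= events
```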
Background: Outpatient parenteral antimicrobial therapy (OPAT) is used in the outpatient setting to treat infectious conditions that require a prolonged course of antimicrobials. OPAT has been shown to decrease length of hospital stay and healthcare costs without compromising patient care and has become a widely accepted practice nationally. Given this trend, the study of OPAT is of vital importance and will continue to be relevant. Currently, few studies have explored risk factors associated with OPAT complications, and most are limited in their analysis by indication. Further work should be performed to expand upon what is currently known. We characterized factors associated with increased OPAT complication risk. Methods: We conducted a retrospective cohort study at 4 sites across NYU Langone Health in patients admitted from 2017 to 2020. We applied the following inclusion criteria: aged ≥18 years and discharged with OPAT. Complications were defined as follows: vascular-access related (line occlusion, thrombosis, dislodgement, central-line-associated bloodstream infection [CLABSI]), antimicrobial related (laboratory derangement, drug reaction, Clostridioides difficile infection), all-cause 30-day readmission, and OPAT-related readmission. Data were obtained from electronic medical records and the OPAT database. This study was granted a waiver of informed consent by the NYU Institutional Review Board. Multivariable logistic regression was performed, adjusting for confounding variables (sex, age, hospital of admission, history of chronic medical conditions, line type, and line duration). Results: Overall, 1,846 patient encounters of 5,951 reviewed met inclusion criteria. The median age was 66 years (IQR, 26), and 42.2% were female. Moreover, 810 (44%) received a peripherally inserted central catheter (PICC) and 1,036 (56%) received a midline catheter. Also, 563 (30.5%) were discharged to subacute rehabilitation (SAR). The most frequent complications were line dislodgement (4.2% of all patients), laboratory derangement (3.0%), and drug reaction (2.4%). Furthermore, 27 patients (1.5%) developed CLABSI. Patients discharged to SAR were more likely to develop CLABSI (OR, 4.11; P = .005), and they had higher rates of OPAT-related 30-day readmissions (OR, 2.675; P = .004) compared to those who were discharged home, after adjusting for key confounders. Conclusions: Discharge to SAR is strongly associated with increased risk of readmission for OPAT-related complications and CLABSI, after adjusting for key confounders. CLABSI prevention during SAR admission is a critically needed public health intervention.
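As a sketch of the kind of adjusted analysis described (multivariable logistic regression of a complication on discharge disposition plus confounders), here is a hedged Python example. The dataset, variable names, and effect sizes are simulated placeholders, not the NYU OPAT data or schema.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Hypothetical encounter-level data; column names are illustrative only.
df = pd.DataFrame({
    "discharged_to_sar": rng.binomial(1, 0.30, n),
    "age": rng.integers(18, 95, n),
    "female": rng.binomial(1, 0.42, n),
    "picc": rng.binomial(1, 0.44, n),
    "line_days": rng.integers(3, 60, n),
})
# Simulate a rare outcome with an assumed SAR effect so the model has signal.
logit_p = -4.5 + 1.4 * df["discharged_to_sar"] + 0.01 * df["age"]
df["clabsi"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariable logistic regression of CLABSI on SAR discharge, adjusting
# for confounders, in the spirit of the analysis the abstract describes.
fit = smf.logit("clabsi ~ discharged_to_sar + age + female + picc + line_days",
                data=df).fit(disp=0)
print(np.exp(fit.params).round(2))      # adjusted odds ratios
print(np.exp(fit.conf_int()).round(2))  # 95% CIs on the OR scale
```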
This study reports water capacity estimates for four reservoirs within the Classic Maya city of El Perú-Waka’, Guatemala. Combining field survey, soil analysis, and a variety of GIS interpolation methods, it illustrates interdisciplinary ways to more fully quantify a challenging resource, water, and its availability. This is accomplished by comparing surface interpolation methods for estimating reservoir capacities to demonstrate that most provide reliable estimates. Reported estimates are further enhanced by analyzing internal reservoir soil morphology to better understand and quantify formation processes and refine estimates from field survey. These analyses document a multiscalar organization of water management within the Waka’ urban core that likely ran the gamut from individuals up to civic and state institutions. Although intricacies remain to be fully elucidated, this example offers an alternative to traditional binary approaches for theorizing about water management practices.
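One common GIS surface-interpolation method of the kind compared in the study is inverse distance weighting (IDW). The sketch below estimates a reservoir's capacity by interpolating depth soundings onto a grid and summing depth times cell area; all coordinates, depths, and grid parameters are invented for illustration, not the El Perú-Waka’ survey data.

```python
import numpy as np

# Hypothetical reservoir survey: (x, y) sample points in metres with
# measured depths in metres (illustrative values only).
pts = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5], [2, 8]])
depths = np.array([0.2, 0.4, 0.3, 0.5, 1.6, 0.8])

# Inverse-distance-weighted (IDW) interpolation onto a 1 m grid.
gx, gy = np.meshgrid(np.arange(0, 10.5, 1.0), np.arange(0, 10.5, 1.0))
grid = np.column_stack([gx.ravel(), gy.ravel()])
d = np.linalg.norm(grid[:, None, :] - pts[None, :, :], axis=2)
w = 1.0 / np.maximum(d, 1e-9) ** 2    # power-2 IDW weights
z = (w @ depths) / w.sum(axis=1)      # interpolated depth at each grid node

cell_area = 1.0 * 1.0                 # grid resolution in m^2
volume_m3 = z.sum() * cell_area       # capacity = sum of depth * cell area
print(f"Estimated reservoir capacity: {volume_m3:.1f} m^3")
```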
OBJECTIVES/GOALS: In a familial case in which 10 of 17 members inherited EA/LVNC in an autosomal dominant pattern, we discovered a novel, damaging missense variant in the gene KLHL26 that segregates with disease and exhibits an altered electrostatic surface profile, likely decoupling the CUL3-interactome. We hypothesize that this KLHL26 variant is etiologic of EA/LVNC. METHODS/STUDY POPULATION: We differentiated induced pluripotent stem cells (iPSCs) from a family trio (a heart-healthy daughter and the EA/LVNC-affected mother and daughter) into cardiomyocytes (iPSC-CMs) in a blinded manner, using three iPSC clones per subject. Using flow cytometry, immunofluorescence, and biomechanical, electrophysiological, and automated contraction methods, we investigated iPSC-CM differentiation efficiency between D10 and D20, contractility and cell cycle regulation at D20, and sarcomere organization at D60. We further conducted differential analyses following label-free protein and RNA-Seq quantification at D20. Via CRISPR-Cas9 gene editing, we plan to characterize KLHL26 variant-specific iPSC-CM alterations and connect findings to discoveries from patient-specific studies. RESULTS/ANTICIPATED RESULTS: All iPSC lines differentiated into CMs, with an increased percentage of cTnT+ cells in the affected daughter line. Compared with the unaffected line, affected iPSC-CMs had fewer contractions per minute and altered calcium transients: mainly higher total calcium release and faster rates of rise and fall. The affected daughter line further had shorter shortening and relaxation times, higher proliferation, lower apoptosis, and a smaller cell surface area per cardiac nucleus. The affected mother line trended in a similar direction to the affected daughter line. There were no gross differences in sarcomere organization between the lines. We also discovered differential expression of candidate proteins such as the kinase VRK1 and the collagen COL5A1 from proteomic profiling. DISCUSSION/SIGNIFICANCE: These discoveries suggest that EA/LVNC characteristics or pathogenesis may result from decreased contractile ability, altered calcium transients, and cell cycle dysregulation. Through KLHL26 variant correction and introduction in the daughter lines, we will build on this understanding to inform exploration of critical clinical targets.
We consider the near-critical Erdős–Rényi random graph G(n, p) and provide a new probabilistic proof of the known bound on the probability that $|\mathcal{C}_{\max}| > An^{2/3}$, where $\mathcal{C}_{\max}$ is the largest connected component of the graph, when p is of the form $p=p(n)=1/n+\lambda/n^{4/3}$ and A is large. Our result allows A and $\lambda$ to depend on n. While this result is already known, our proof relies only on conceptual and adaptable tools such as ballot theorems, whereas the existing proof relies on a combinatorial formula specific to Erdős–Rényi graphs, together with analytic estimates.
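For intuition about the $n^{2/3}$ scaling of the largest component in the near-critical window (a classical fact; this simulation is purely illustrative and is not part of the paper's argument), one can sample G(n, p) directly. Parameter choices below are arbitrary.

```python
import networkx as nx
import numpy as np

# In the near-critical window p = 1/n + lambda * n**(-4/3), the largest
# component of G(n, p) has size of order n**(2/3).
n, lam = 100_000, 1.0
p = 1 / n + lam * n ** (-4 / 3)

sizes = []
for seed in range(5):
    G = nx.fast_gnp_random_graph(n, p, seed=seed)
    c_max = max(len(c) for c in nx.connected_components(G))
    sizes.append(c_max / n ** (2 / 3))

print("|C_max| / n^(2/3) across runs:", np.round(sizes, 2))
```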
Let T be the regular tree in which every vertex has exactly $d\ge 3$ neighbours. Run a branching random walk on T, in which at each time step every particle gives birth to a random number of children with mean d and finite variance, and each of these children moves independently to a uniformly chosen neighbour of its parent. We show that, starting with one particle at some vertex 0 and conditionally on survival of the process, the time it takes for every vertex within distance r of 0 to be hit by a particle of the branching random walk is $r + ({2}/{\log(3/2)})\log\log r + {\mathrm{o}}(\log\log r)$.
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
Objective:
To describe the epidemiology of patients with nonintestinal carbapenem-resistant Enterobacterales (CRE) colonization and to compare clinical outcomes of these patients to those with CRE infection.
Design:
A secondary analysis of Consortium on Resistance Against Carbapenems in Klebsiella and other Enterobacteriaceae 2 (CRACKLE-2), a prospective observational cohort.
Setting:
A total of 49 US short-term acute-care hospitals.
Patients:
Patients hospitalized with CRE isolated from clinical cultures, April 30, 2016, through August 31, 2017.
Methods:
We described characteristics of patients in CRACKLE-2 with nonintestinal CRE colonization and assessed the impact of site of colonization on clinical outcomes. We then compared outcomes of patients defined as having nonintestinal CRE colonization to all those defined as having infection. The primary outcome was a desirability of outcome ranking (DOOR) at 30 days. Secondary outcomes were 30-day mortality and 90-day readmission.
Results:
Of 547 patients with nonintestinal CRE colonization, 275 (50%) were from the urinary tract, 201 (37%) were from the respiratory tract, and 71 (13%) were from a wound. Patients with urinary tract colonization were more likely to have a more desirable clinical outcome at 30 days than those with respiratory tract colonization, with a DOOR probability of better outcome of 61% (95% confidence interval [CI], 53%–71%). When compared to 255 patients with CRE infection, patients with CRE colonization had a similar overall clinical outcome, as well as 30-day mortality and 90-day readmission rates when analyzed in aggregate or by culture site. Sensitivity analyses demonstrated similar results using different definitions of infection.
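The DOOR probability reported above has a simple nonparametric interpretation: the chance that a randomly chosen patient from one group has a better outcome rank than a randomly chosen patient from the other, with ties split evenly. A minimal Python sketch with hypothetical ranks (not CRACKLE-2 data):

```python
import numpy as np

# Hypothetical DOOR ranks (1 = best, 5 = worst) for two colonization groups;
# values are illustrative only.
urinary = np.array([1, 1, 2, 2, 3, 1, 2, 4])
respiratory = np.array([2, 3, 3, 4, 5, 2, 4, 3])

# Probability that a random urinary-tract patient has a better (lower) rank
# than a random respiratory-tract patient, counting ties as one half.
diff = urinary[:, None] - respiratory[None, :]
p_better = np.mean(diff < 0) + 0.5 * np.mean(diff == 0)
print(f"DOOR probability of better outcome: {p_better:.2f}")
```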
Conclusions:
Patients with nonintestinal CRE colonization had outcomes similar to those with CRE infection. Clinical outcomes may be influenced more by culture site than classification as “colonized” or “infected.”
Ambulance offload delay (AOD) occurs when ambulance patients cannot be quickly transferred to an emergency department (ED) bed. ED crowding and the associated AOD are exacerbated by multiple factors, including infectious disease outbreaks such as the coronavirus disease 2019 (COVID-19) pandemic. Initiatives to address AOD present an opportunity to streamline ambulance offload procedures while improving patient outcomes.
Study Objective:
The goal of this study was to evaluate the initial outcomes and impact of a novel Emergency Medical Service (EMS)-based Hospital Liaison Program (HLP) on ambulance offload times (AOTs).
Methods:
Ambulance offload times associated with EMS patients transported to a community hospital six months before and after HLP implementation were retrospectively analyzed using proportional significance tests, t-tests, and multiple regression analysis.
Results:
A proportional increase in incidents in the zero to <30 minutes time category after program implementation (+2.96%; P <.01) and a commensurate decrease in the proportion of incidents in the 30 to <60 minutes category (−2.65%; P <.01) were seen. The fully adjusted regression model showed AOT was 16.31% lower (P <.001) after HLP program implementation, holding all other variables constant.
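A percentage effect like the 16.31% reduction typically comes from regressing log-transformed offload time on an implementation indicator, with exp(b) − 1 giving the percent change. A hedged Python sketch on simulated data (the variable names, covariate, and effect sizes are assumptions, not the study's model):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000

# Hypothetical transport records: a post-implementation indicator and one
# illustrative covariate; offload times are simulated on the log scale.
df = pd.DataFrame({
    "post_hlp": rng.binomial(1, 0.5, n),
    "acuity_high": rng.binomial(1, 0.2, n),
})
log_aot = (3.4 - 0.18 * df["post_hlp"] + 0.1 * df["acuity_high"]
           + rng.normal(0, 0.5, n))
df["aot_min"] = np.exp(log_aot)  # offload time in minutes

# Regress log(AOT) on the implementation indicator; the coefficient b on
# post_hlp translates to a (exp(b) - 1) * 100% change in offload time.
fit = smf.ols("np.log(aot_min) ~ post_hlp + acuity_high", data=df).fit()
pct_change = (np.exp(fit.params["post_hlp"]) - 1) * 100
print(f"Estimated change in AOT after HLP: {pct_change:.1f}%")
```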
Conclusion:
The HLP is an innovative initiative that constitutes a novel pathway for EMS and hospital systems to synergistically enhance ambulance offload procedures. The greatest effect was demonstrated in patients exhibiting potentially life-threatening symptoms, with a reduction of approximately three minutes. While small, this outcome was a statistically significant decrease from the pre-intervention period. Ultimately, the HLP represents an additional strategy to complement existing approaches to mitigate AOD.
Sudden onset severe headache is usually caused by a primary headache disorder but may be secondary to a more serious problem, such as subarachnoid hemorrhage (SAH). Very few patients who present to hospital with headache have suffered a SAH, but early identification is important to improve patient outcomes. A systematic review was undertaken to assess the clinical effectiveness of different care pathways for the management of headache, suspicious for SAH, in the Emergency Department. Capturing the perspective of patients was an important part of the research.
Methods
The project team included a patient collaborator with experience of presenting to the Emergency Department with sudden onset severe headache. Three additional patients were recruited to our advisory group. The patient perspective was collected at various points throughout the project, including at team meetings, during protocol development, and when interpreting the results of the systematic review and drawing conclusions.
Results
Patients were reassured by the very high diagnostic accuracy of computed tomography (CT) for detecting SAH. Patients and clinicians emphasized the importance of shared decision making about whether to undergo additional tests to rule out SAH, after a negative CT result. When lumbar puncture was necessary, patients expressed a preference to have it on an ambulatory basis; further research on the safety and acceptability of ambulatory lumbar puncture was recommended.
Conclusions
Patient input at the protocol development stage helped researchers understand the patient experience and highlighted important outcomes for assessment. Patient involvement added context to the review findings and highlighted the preferences of patients regarding the management of headache.
Sudden onset severe headache is usually caused by a primary headache disorder but occasionally is secondary to a more serious problem, such as subarachnoid hemorrhage (SAH). Guidelines recommend non-contrast brain computed tomography (CT) followed by lumbar puncture (LP) to exclude SAH. However, guidelines pre-date the introduction of more sensitive modern CT scanners. A systematic review was undertaken to assess the clinical effectiveness of different care pathways for the management of headache in the Emergency Department.
Methods
Eighteen databases (including MEDLINE and Embase) were searched to February 2020. Studies were quality assessed using criteria relevant to the study design; most studies were assessed using the QUADAS-2 tool for diagnostic accuracy studies. Where sufficient information was reported, diagnostic accuracy data were extracted into 2 × 2 tables to calculate sensitivity, specificity, false-positive and false-negative rates. Where possible, hierarchical bivariate meta-analysis was used to synthesize results, otherwise studies were synthesized narratively.
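The 2 × 2 diagnostic-accuracy computations described in the Methods are straightforward. The sketch below uses counts chosen to roughly reproduce the pooled Ottawa rule figures reported under Results; the counts themselves are illustrative, not the review's extracted data.

```python
# Hypothetical 2x2 table for a diagnostic test against a reference standard
# (counts are illustrative only).
tp, fn = 199, 1    # SAH present: test positive / test negative
fp, tn = 763, 237  # SAH absent:  test positive / test negative

sensitivity = tp / (tp + fn)      # P(test+ | disease present)
specificity = tn / (tn + fp)      # P(test- | disease absent)
false_neg_rate = fn / (tp + fn)   # 1 - sensitivity
false_pos_rate = fp / (fp + tn)   # 1 - specificity

print(f"Sensitivity: {sensitivity:.1%}  Specificity: {specificity:.1%}")
print(f"FNR: {false_neg_rate:.1%}  FPR: {false_pos_rate:.1%}")
```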
Results
Fifty-one studies were included in the review. Eight studies assessing the accuracy of the Ottawa SAH clinical decision rule were pooled; sensitivity was 99.5 percent and specificity was 23.7 percent. The high false-positive rate suggests that 76.3 percent of SAH-negative patients would undergo further investigation unnecessarily. Four studies assessing the accuracy of CT within six hours of headache onset were pooled; sensitivity was 98.7 percent and specificity was 100 percent. CT sensitivity beyond six hours was considerably lower (≤90%; 2 studies). Three studies assessing LP following negative CT were pooled; sensitivity was 100 percent and specificity was 95.2 percent. LP-related adverse events were reported in 5.3–9.5 percent of patients.
Conclusions
The evidence suggests that the Ottawa SAH Rule is not sufficiently accurate for ruling out SAH and does little to aid clinical decision making. Modern CT within six hours of headache onset (with images assessed by a neuroradiologist) is highly accurate, but sensitivity reduces considerably over time. The CT-LP pathway is highly sensitive for detecting SAH, although LP resulted in some false-positives and adverse events.
Catatonia, a severe neuropsychiatric syndrome, has few studies of sufficient scale to clarify its epidemiology or pathophysiology. We aimed to characterise demographic associations, peripheral inflammatory markers and outcome of catatonia.
Methods
Electronic healthcare records were searched for validated clinical diagnoses of catatonia. In a case–control study, demographics and inflammatory markers were compared in psychiatric inpatients with and without catatonia. In a cohort study, the two groups were compared in terms of their duration of admission and mortality.
Results
We identified 1456 patients with catatonia (of whom 25.1% had two or more episodes) and 24 956 psychiatric inpatients without catatonia. Incidence was 10.6 episodes of catatonia per 100 000 person-years. Compared with psychiatric inpatients without catatonia, patients with catatonia were similar in sex distribution but younger and more likely to be of Black ethnicity. Serum iron was reduced in patients with catatonia [11.6 v. 14.2 μmol/L, odds ratio (OR) 0.65 (95% confidence interval (CI) 0.45–0.95), p = 0.03] and creatine kinase was raised [2545 v. 459 IU/L, OR 1.53 (95% CI 1.29–1.81), p < 0.001], but there was no difference in C-reactive protein or white cell count. N-Methyl-d-aspartate receptor antibodies were significantly associated with catatonia, but the number of positive results was small. Duration of hospitalisation was greater in the catatonia group (median: 43 v. 25 days), but there was no difference in mortality after adjustment.
Conclusions
In the largest clinical study of catatonia, we found catatonia occurred in approximately 1 per 10 000 person-years. Evidence for a proinflammatory state was mixed. Catatonia was associated with prolonged inpatient admission but not with increased mortality.
In 2015, an international outbreak of Mycobacterium chimaera infections among patients undergoing cardiothoracic surgeries was associated with exposure to contaminated LivaNova 3T heater-cooler devices (HCDs). From June 2017 to October 2020, the Centers for Disease Control and Prevention was notified of 18 patients with M. chimaera infections who had undergone cardiothoracic surgeries at 2 hospitals in Kansas (14 patients) and California (4 patients); 17 had exposure to 3T HCDs. Whole-genome sequencing of the clinical and environmental isolates matched the global outbreak strain identified in 2015.
Methods:
Investigations were conducted at each hospital to determine the cause of ongoing infections. Investigative methods included query of microbiologic records to identify additional cases, medical chart review, observations of operating room setup, HCD use and maintenance practices, and collection of HCD and environmental samples.
Results:
Onsite observations identified deviations in the positioning and maintenance of the 3T HCDs from the US Food and Drug Administration (FDA) recommendations and the manufacturer’s updated cleaning and disinfection protocols. Additionally, most 3T HCDs had not undergone the recommended vacuum and sealing upgrades by the manufacturer to decrease the dispersal of M. chimaera–containing aerosols into the operating room, despite hospital requests to the manufacturer.
Conclusions:
These findings highlight the need for continued awareness of the risk of M. chimaera infections associated with 3T HCDs, even if the devices are newly manufactured. Hospitals should maintain vigilance in adhering to FDA recommendations and the manufacturer’s protocols and in identifying patients with potential M. chimaera infections with exposure to these devices.
The neural mechanisms contributing to the social problems of pediatric brain tumor survivors (PBTS) are unknown. Face processing is important to social communication, social behavior, and peer acceptance. Research with other populations with social difficulties, namely autism spectrum disorder, suggests atypical brain activation in areas important for face processing. This case-control functional magnetic resonance imaging (fMRI) study compared brain activation during face processing in PBTS and typically developing (TD) youth.
Methods:
Participants included 36 age-, gender-, and IQ-matched youth (N = 18 per group). PBTS were at least 5 years from diagnosis and 2 years from the completion of tumor therapy. fMRI data were acquired during a face identity task and a control condition. Groups were compared on activation magnitude within the fusiform gyrus for the faces condition compared to the control condition. Correlational analyses evaluated associations between neuroimaging metrics and indices of social behavior for PBTS participants.
Results:
Both groups demonstrated face-specific activation within the social brain for the faces condition compared to the control condition. PBTS showed significantly decreased activation for faces in the medial portions of the fusiform gyrus bilaterally compared to TD youth, ps ≤ .004. Higher peak activity in the left fusiform gyrus was associated with better socialization (r = .53, p < .05).
Conclusions:
This study offers initial evidence of atypical activation in a key face processing area in PBTS. Such atypical activation may underlie some of the social difficulties of PBTS. Social cognitive neuroscience methodologies may elucidate the neurobiological bases for PBTS social behavior.
Electroencephalographic (EEG) abnormalities are greater in mild cognitive impairment (MCI) with Lewy bodies (MCI-LB) than in MCI due to Alzheimer’s disease (MCI-AD) and may anticipate the onset of dementia. We aimed to assess whether quantitative EEG (qEEG) slowing would predict a higher annual hazard of dementia in MCI across these etiologies. MCI patients (n = 92) and healthy comparators (n = 31) provided qEEG recordings and underwent longitudinal clinical and cognitive follow-up. Associations between qEEG slowing, measured by increased theta/alpha ratio, and clinical progression from MCI to dementia were estimated with a multistate transition model to account for death as a competing risk, while controlling for age, cognitive function, and etiology classified by an expert consensus panel.
Over a mean follow-up of 1.5 years (SD = 0.5), 14 cases of incident dementia and 5 deaths were observed. Increased theta/alpha ratio on qEEG was associated with increased annual hazard of dementia (hazard ratio = 1.84, 95% CI: 1.01–3.35). This extends previous findings that MCI-LB features early functional changes, showing that qEEG slowing may anticipate the onset of dementia in prospectively identified MCI.
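The theta/alpha ratio used here as the qEEG slowing measure can be computed from a power spectral density estimate. A minimal Python sketch on synthetic data follows; the band definitions, sampling rate, and Welch parameters are common conventions, not necessarily those of the study.

```python
import numpy as np
from scipy.signal import welch

# Synthetic single-channel EEG segment: 60 s at 256 Hz with mixed theta
# (6 Hz) and alpha (10 Hz) activity plus noise (illustrative only).
fs = 256
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
eeg = (0.8 * np.sin(2 * np.pi * 6 * t) + 1.0 * np.sin(2 * np.pi * 10 * t)
       + rng.normal(0, 0.5, t.size))

# Welch power spectral density, then band power by trapezoidal integration.
f, psd = welch(eeg, fs=fs, nperseg=4 * fs)

def band_power(f, psd, lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])

theta = band_power(f, psd, 4.0, 8.0)   # theta band: 4-8 Hz
alpha = band_power(f, psd, 8.0, 13.0)  # alpha band: 8-13 Hz
print(f"theta/alpha ratio: {theta / alpha:.2f}")
```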