A ubiquitous arrangement in nature is a free-flowing fluid coupled to a porous medium, for example a river or lake lying above a porous bed. Depending on the environmental conditions, thermal convection can occur and may be confined to the clear fluid region, forming shallow convection cells, or it can penetrate into the porous medium, forming deep cells. Here, we combine three complementary approaches – linear stability analysis, fully nonlinear numerical simulations and a coarse-grained model – to determine the circumstances that lead to each configuration. The coarse-grained model yields an explicit formula for the transition between deep and shallow convection in the physically relevant limit of small Darcy number. Near the onset of convection, all three of the approaches agree, validating the predictive capability of the explicit formula. The numerical simulations extend these results into the strongly nonlinear regime, revealing novel hybrid configurations in which the flow exhibits a dynamic shift from shallow to deep convection. This hybrid shallow-to-deep convection begins with small, random initial data, progresses through a metastable shallow state and arrives at the preferred steady state of deep convection. We construct a phase diagram that incorporates information from all three approaches and depicts the regions in parameter space that give rise to each convective state.
Background: Candida auris is an emerging multidrug-resistant yeast that is transmitted in healthcare facilities and is associated with substantial morbidity and mortality. Environmental contamination is suspected to play an important role in transmission but additional information is needed to inform environmental cleaning recommendations to prevent spread. Methods: We conducted a multiregional (Chicago, IL; Irvine, CA) prospective study of environmental contamination associated with C. auris colonization of patients and residents of 4 long-term care facilities and 1 acute-care hospital. Participants were identified by screening or clinical cultures. Samples were collected from participants’ body sites (eg, nares, axillae, inguinal creases, palms and fingertips, and perianal skin) and their environment before room cleaning. Daily room cleaning and disinfection by facility environmental service workers was followed by targeted cleaning of high-touch surfaces by research staff using hydrogen peroxide wipes (an EPA-approved product for C. auris, List P). Samples were collected immediately after cleaning from high-touch surfaces and repeated at 4-hour intervals up to 12 hours. A pilot phase (n = 12 patients) was conducted to identify the value of testing specific high-touch surfaces to assess environmental contamination. High-yield surfaces were included in the full evaluation phase (n = 20 patients) (Fig. 1). Samples were submitted for semiquantitative culture of C. auris and other multidrug-resistant organisms (MDROs) including methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase–producing Enterobacterales (ESBLs), and carbapenem-resistant Enterobacterales (CRE). Times to room surface contamination with C. auris and other MDROs after effective cleaning were analyzed. Results: Candida auris colonization was most frequently detected in the nares (72%) and palms and fingertips (72%). 
Cocolonization of body sites with other MDROs was common (Fig. 2). Surfaces located close to the patient were commonly recontaminated with C. auris by 4 hours after cleaning, including the overbed table (24%), bed handrail (24%), and TV remote or call button (19%). Environmental cocontamination was more common with resistant gram-positive organisms (MRSA and VRE) than resistant gram-negative organisms (Fig. 3). C. auris was rarely detected on surfaces located outside a patient’s room (1 of 120 swabs; <1%). Conclusions: Environmental surfaces near C. auris–colonized patients were rapidly recontaminated after cleaning and disinfection. Cocolonization of skin and environment with other MDROs was common, with resistant gram-positive organisms predominating over gram-negative organisms on environmental surfaces. Limitations include lack of organism sequencing or typing to confirm that environmental contamination was from the room resident. Rapid recontamination of environmental surfaces after manual cleaning and disinfection suggests that alternate mitigation strategies should be evaluated.
Background: Contact tracing alone is often inadequate to determine the source of healthcare personnel (HCP) COVID-19 when SARS-CoV-2 is widespread in the community. We combined whole-genome sequencing (WGS) with traditional epidemiologic analysis to investigate the frequency with which patients or other HCP with symptomatic COVID-19 acted as the source of HCP infection at a large tertiary-care center early in the pandemic. Methods: Cohort samples were selected from patients and HCP with PCR-positive SARS-CoV-2 infection from a period with complete retention of samples (March 14–April 10, 2020) at Rush University Medical Center, a 664-bed hospital in Chicago, Illinois. During this period, testing was limited to symptomatic patients and HCP. Recommended respiratory equipment for HCP evolved under guidance, including a 19-day period when medical face masks were recommended for COVID-19 care except for aerosol-generating procedures. Viral RNA was extracted and sequenced (NovaSeq, Illumina) from remnant nasopharyngeal swab samples in M4RT viral transport medium. Genomes with >90% coverage underwent cluster detection using a 2-single-nucleotide-variant (SNV) genetic distance cutoff. Genomic clusters were independently evaluated for valid epidemiologic links by 2 infectious diseases physicians (with a third adjudicator) using metadata extracted from the electronic medical record and according to predetermined criteria (Table 1). Results: In total, 1,031 SARS-CoV-2 sequences were analyzed, identifying 49 genomic clusters that included HCP (median, 8 members per cluster; range, 2–43; total, 268 patients and 115 HCP) (Fig. 1). Also, 20,190 flowsheet activities were documented for cohort HCP and patient interactions, including 686 instances in which a cohort HCP contributed to a cohort patient’s chart. Most HCP infections were considered not healthcare associated (88 of 115, 76.5%). We did not identify any strong linkages for patient-to-HCP transmission. 
Moreover, 13 HCP cases (11.3%) were attributed to patient source (weak linkage). Also, 14 HCP cases (12.2%) were attributed to HCP source (11 strong and 3 weak linkages). Weak linkages were due to lack of epidemiologic data for HCP location, particularly nonclinical staff (eg, an environmental service worker who lacked location documentation to rule out patient-specific contact). Agreement for epidemiologic linkage between the 2 evaluators was high (κ, 0.91). Conclusions: Using genomic and epidemiologic data, we found that most HCP COVID-19 infections were not healthcare associated. We found weak evidence to support symptomatic patient-to-HCP transmission of SARS-CoV-2 and stronger evidence for HCP-to-HCP transmission. Large genomic clusters without plausible epidemiologic links were identified, reflecting the limited utility of genomic surveillance alone to characterize chains of transmission of SARS-CoV-2 during extensive community spread.
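The genomic clustering step described in this abstract — grouping genomes whose pairwise distance falls within a 2-SNV cutoff — amounts to single-linkage clustering. The sketch below illustrates that general idea only; the genome names and distances are hypothetical, and the study's actual bioinformatic pipeline is not described in the abstract.

```python
# Minimal sketch of genomic cluster detection by pairwise SNV distance
# (2-SNV cutoff, single-linkage). Data are illustrative, not study data.

def cluster_genomes(distances, cutoff=2):
    """Place genomes whose pairwise SNV distance is <= cutoff in the
    same cluster (transitively), via a small union-find structure."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for (a, b), d in distances.items():
        find(a); find(b)          # register both genomes
        if d <= cutoff:
            union(a, b)

    clusters = {}
    for g in parent:
        clusters.setdefault(find(g), set()).add(g)
    # Report only clusters with >= 2 members, as in the abstract
    return [c for c in clusters.values() if len(c) >= 2]

# Hypothetical pairwise SNV distances among five genomes
d = {("g1", "g2"): 1, ("g2", "g3"): 2, ("g1", "g3"): 3,
     ("g4", "g5"): 8, ("g3", "g4"): 15}
print(cluster_genomes(d))  # g1, g2, g3 form one cluster
```

Because linkage is transitive, g1 and g3 end up in the same cluster even though their direct distance (3 SNVs) exceeds the cutoff — a property worth keeping in mind when interpreting large genomic clusters that lack plausible epidemiologic links.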
Numerous scholars have argued that in Luke-Acts the location of sacred space or divine presence passes from the Jerusalem temple to Jesus, Christian believers, or both; in Acts, this transfer is understood as integral to the universal mission. The present article argues that such studies overlook the important motif of heaven as temple, which plays a role in Jesus’ trial and crucifixion and in the Stephen and Cornelius episodes. Drawing on Edward Soja's spatial theory, the article critiques previous studies’ binary categorisation of temple space. The heavenly temple disrupts and reconstitutes understandings of sacred space, and thus undergirds the universal spread of the Way.
Background: Identification of hospitalized patients with enteric multidrug-resistant organism (MDRO) carriage, combined with implementation of targeted infection control interventions, may help reduce MDRO transmission. However, the optimal surveillance approach has not been defined. We sought to determine whether daily serial rectal surveillance for MDROs detects more incident cases (acquisition) of MDRO colonization in medical intensive care unit (MICU) patients than admission and discharge surveillance alone. Methods: Prospective longitudinal observational single-center study from January 11, 2017, to January 11, 2018. Inclusion criteria were ≥3 consecutive MICU days and ≥2 rectal or stool swabs per MICU admission. Daily rectal or stool swabs were collected from patients and cultured for MDROs, including vancomycin-resistant Enterococcus (VRE), carbapenem-resistant Enterobacterales (CRE), third-generation cephalosporin-resistant Enterobacterales (3GCR), and extended-spectrum β-lactamase–producing Enterobacterales (ESBL-E) (as a subset of 3GCR). MDRO detection at any time during the MICU stay was used to calculate prevalent colonization. Incident colonization (acquisition) was defined as new detection of an MDRO after at least 1 prior negative swab. We then determined the proportion of prevalent and incident cases detected by daily testing that were also detected when only first swabs (admission) and last swabs (discharge) were tested. Data were analyzed using SAS version 9.4 software. Results: In total, 939 MICU stays of 842 patients were analyzed. Patient characteristics were median age 64 years (interquartile range [IQR], 51–74), median MICU length of stay 5 days (IQR, 3–8), median number of samples per admission 3 (IQR, 2–5), and median Charlson index 4 (IQR, 2–7). Prevalent colonization with any MDRO was detected by daily swabbing in 401 stays (42.7%). 
Compared to daily serial swabbing, an admission- and discharge-only approach detected ≥86% of overall prevalent MDRO colonization. However, this approach would have detected fewer incident colonizations (acquisitions) than daily swabbing (Figure 1); ≥34% of total MDRO acquisitions would have been missed. Conclusions: Testing patients upon admission to and discharge from an MICU may fail to detect MDRO acquisition in more than one-third of patients, thereby reducing the effectiveness of MDRO control programs that are targeted against known MDRO carriers. The poor performance of a single discharge swab may be due to intermittent or low-level MDRO shedding, inadequate sampling, or transient MDRO colonization. Additional research is needed to determine the optimal surveillance approach for enteric MDRO carriage.
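The definitions used in this abstract — prevalent colonization (any positive swab during the stay), incident acquisition (first detection after at least one prior negative swab), and detection under admission/discharge-only sampling — can be expressed compactly. The sketch below is purely illustrative, with hypothetical swab series rather than study data:

```python
# Illustrative classification of a single stay's ordered daily swab
# results for one MDRO, per the definitions in the abstract.

def classify_stay(swabs):
    """swabs: ordered list of booleans (True = MDRO detected).
    Returns (prevalent, incident, detected_by_admit_discharge)."""
    prevalent = any(swabs)
    # Incident (acquisition): first detection follows >= 1 negative swab
    first_pos = next((i for i, s in enumerate(swabs) if s), None)
    incident = first_pos is not None and first_pos >= 1
    detected = swabs[0] or swabs[-1]  # admission- and discharge-only testing
    return prevalent, incident, detected

stays = [
    [False, True, True, True],     # acquisition, caught at discharge
    [False, True, False, False],   # acquisition, missed by admit/discharge
    [True, True, True, True],      # prevalent on admission
    [False, False, False, False],  # never colonized
]
for s in stays:
    print(classify_stay(s))
```

The second hypothetical stay shows how intermittent shedding lets an acquisition slip past admission/discharge-only testing, matching the abstract's observation that more than one-third of acquisitions would have been missed.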
Infectious diseases outbreaks are a cause of significant morbidity and mortality among hospitalized patients. Infants admitted to the neonatal intensive care unit (NICU) are particularly vulnerable to infectious complications during hospitalization. Thus, rapid recognition of and response to outbreaks in the NICU is essential. At Rush University Medical Center, whole-genome sequencing (WGS) has been utilized since early 2016 as an adjunctive method for outbreak investigations. The use of WGS and potential lessons learned are illustrated for 3 different NICU outbreak investigations involving methicillin-resistant Staphylococcus aureus (MRSA), group B Streptococcus (GBS), and Serratia marcescens. WGS has contributed to the understanding of the epidemiology of outbreaks in our NICU, and it has also provided further insight in settings of unusual diseases or when lower-resolution typing methods have been inadequate. WGS has emerged as the new gold standard for evaluating strain relatedness. As barriers to implementation are overcome, WGS has the potential to transform outbreak investigation in healthcare settings.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Background: Carbapenemase-producing Enterobacterales (CPE) have become an increasingly common cause of hospital-acquired infections, while their reservoirs within the clinical setting remain poorly understood. Outbreaks have been linked to hospital sinks, which have been shown to harbor and, under certain conditions, disperse CPE to surrounding surfaces. Hospital and laboratory studies have proposed that Gram-negative organisms, including CPE, can migrate through plumbing biofilms, leading to widespread contamination of the drainage system. Methods: To assess the prevalence of CPE in hospital sinks, drain swabs and waste trap water samples were taken from sinks in 10 hospitals. Hospitals were in different regions of England; 4 had reported recent cases of CPE infection. To investigate spread and dispersal of CPE, waste traps from a single hospital were installed in a laboratory model sink system. Built to simulate a clinical setting, the model incorporated 12 sinks, 6 of which were connected through a common waste pipe. All 12 taps were automatically flushed, and drainage was automatically controlled. Nutrients were provided daily to maintain the bacterial populations, which were regularly sampled to monitor their composition. At 3 weeks after installation, the waste traps were subjected to a drainage backflow event. Waste trap water populations continued to be monitored, and when transfer between sinks was suspected, isolates were characterized and compared using whole-genome sequencing. Results: Between January and June 2019, 200 samples were taken from 103 sinks. In total, 24 (23%) sinks (in 8 hospitals) harbored CRE, of which 10 (in 5 hospitals) harbored at least 1 CPE. Immediately after a backflow event in the laboratory model system, 2 KPC-producing E. cloacae isolates were recovered from a waste trap in which CPE had not been previously detected. 
The isolates were identified as ST501 and ST31 and were genetically indistinguishable from those colonizing sinks elsewhere in the system. Following intersink transfer, KPC-producing E. cloacae ST501 successfully integrated into the microbiome of the recipient sink and was detected in the waste trap water at least 6 months after the backflow event. At 2 and 3 months after the backflow, other intersink transfers involving Escherichia coli and KPC-producing E. cloacae were also observed. Conclusions: Sink waste traps and drains are a reservoir for CPE in hospitals. Once established, CPE contamination might not be confined to a single sink and could spread through wastewater plumbing. Hospitals frequently report drainage problems, which could cause or facilitate CPE transfer between sinks and could lead to long-term establishment.
The Antipsychotic Long-acTing injection in schizOphrenia (ALTO) study was a non-interventional study across several European countries examining prescription of long-acting injectable (LAI) antipsychotics to identify sociodemographic and clinical characteristics of patients receiving and physicians prescribing LAIs. ALTO was also the first large-scale study in Europe to report on the use of both first- or second-generation antipsychotic (FGA- or SGA-) LAIs.
Patients with schizophrenia receiving a FGA- or SGA-LAI were enrolled between June 2013 and July 2014 and categorized as incident or prevalent users. Assessments included measures of disease severity, functioning, insight, well-being, attitudes towards antipsychotics, and quality of life.
Disease severity among the 572 enrolled patients was generally mild to moderate, and the majority were unemployed and/or socially withdrawn. In total, 331 of 572 patients were prevalent LAI antipsychotic users, of whom 209 were prescribed a FGA-LAI. Paliperidone was the most commonly prescribed SGA-LAI (56% of incident users, 21% of prevalent users). Overall, 337 of 572 patients (58.9%) were considered at risk of non-adherence. Prevalent LAI users had a tendency towards better insight levels (PANSS G12 item). Incident FGA-LAI users had more severe disease, poorer global functioning, lower quality of life, higher rates of non-adherence, and were more likely to have physician-reported lack of insight.
These results indicate a declining pattern of FGA-LAI use, with FGA-LAIs reserved by prescribers for seemingly more difficult-to-treat patients and those least likely to adhere to oral medication.
Oceanic anoxic events (OAEs) are contemporaneous with 11 of the 18 largest Phanerozoic extinction events, but the magnitude and selectivity of their paleoecological impact remain disputed. OAEs are associated with abrupt, rapid warming and increased CO2 flux to the atmosphere; thus, insights from this study may clarify the impact of current anthropogenic climate change on the biosphere. We investigated the influence of the Late Cretaceous Bonarelli event (OAE2; Cenomanian/Turonian stage boundary; ~94 Ma) on generic- and species-level molluscan diversity, extinction rates, and ecological turnover. Cenomanian/Turonian results were compared with changes across all Cretaceous stage boundaries, some of which are coincident with less severe OAEs. We found increased generic turnover, but not species-level turnover, associated with several Cretaceous OAEs. The absence of a species-level pattern may reflect species occurrence data that are too temporally coarse to robustly detect patterns. Five hypotheses of ecological selectivity relating anoxia to survivorship were tested across stage boundaries with respect to faunality, mobility, and diet using generalized linear models. Interestingly, benthic taxa were consistently selected against throughout the Cretaceous regardless of the presence or absence of OAEs. These results suggest that: (1) the Cenomanian/Turonian boundary (OAE2) was associated with a decline in molluscan diversity and increase in extinction rate that were significantly more severe than Cretaceous background levels; and (2) no differential ecological selectivity was associated with OAE-related diversity declines among the variables tested here.
We read with interest the recent editorial, “The Hennepin Ketamine Study,” by Dr. Samuel Stratton commenting on the research ethics, methodology, and the current public controversy surrounding this study.1 As researchers and investigators of this study, we strongly agree that prospective clinical research in the prehospital environment is necessary to advance the science of Emergency Medical Services (EMS) and emergency medicine. We also agree that accomplishing this is challenging as the prehospital environment often encounters patient populations who cannot provide meaningful informed consent due to their emergent conditions. To ensure that fellow emergency medicine researchers understand the facts of our work so they may plan future studies, and to address some of the questions and concerns in Dr. Stratton’s editorial, the lay press, and in social media,2 we would like to call attention to some inaccuracies in Dr. Stratton’s editorial, and to the lay media stories on which it appears to be based.
Ho JD, Cole JB, Klein LR, Olives TD, Driver BE, Moore JC, Nystrom PC, Arens AM, Simpson NS, Hick JL, Chavez RA, Lynch WL, Miner JR. The Hennepin Ketamine Study investigators’ reply. Prehosp Disaster Med. 2019;34(2):111–113.
OBJECTIVES/SPECIFIC AIMS: The goal of this project is to determine the feasibility of using a novel multi-photon fiber-coupled microscope to aid surgeons in localizing the subthalamic nucleus (STN) during surgeries. In order to accomplish this goal, we needed to identify the source of a strong autofluorescent signal in the STN and determine whether we could use image classification methods to automatically distinguish STN from surrounding brain regions. METHODS/STUDY POPULATION: We acquired 3 cadaveric brains from the University of Colorado Anschutz Medical Campus, Department of Pathology. Two of these brains were from non-Parkinson’s disease (PD) controls, whereas 1 was from a donor diagnosed with PD. We dissected a 10-cm² region of midbrain surrounding the STN, then prepared this tissue for slicing on a vibratome or cryostat. Samples were immuno-labeled for various cellular markers for identification, or left unlabeled in order to observe the autofluorescence for image classification. RESULTS/ANTICIPATED RESULTS: The border of the STN is clearly visible based on the density of a strong autofluorescent signal. The autofluorescent signal is visible using 2-photon (850–1040 nm excitation) and conventional confocal microscopy (488–647 nm excitation). We were also able to visualize blood vessels with second harmonic generation. The autofluorescent signal is quenched by high concentrations of Sudan black B (0.5%–5%) and is primarily localized in microtubule-associated protein-2 (MAP2)+ cells, indicating that it is likely lipofuscin accumulation in neurons. Smaller lipofuscin particles also accumulate in microglia, identified based on ionized calcium-binding adapter molecule 1 (Iba1)+ labeling. We anticipate that colocalization analysis will confirm these qualitative observations. Using 2-photon images of the endogenous autofluorescent signal in these samples, we trained a logistic regression-based image classifier using features derived from gray-level co-occurrence matrices. 
Preliminary testing indicates that our classifier performed well, with a mean accuracy of 0.89 (standard deviation, 0.11) and a Cohen’s kappa of 0.76 (standard deviation, 0.24). We are currently using coherent anti-Stokes Raman scattering and third-harmonic imaging to identify features of myelin that can be used to distinguish between these regions, and we expect similar results. DISCUSSION/SIGNIFICANCE OF IMPACT: Traditional methods for localizing STN during deep brain stimulation (DBS) surgery include the use of stereotactic coordinates and multi-electrode recording (MER) during implantation. MERs are incredibly useful in DBS surgeries but require penetration of brain structures in order to infer location. Using multi-photon microscopy techniques to aid identification of STN during DBS surgeries offers a number of advantages over traditional methods. For example, blood vessels can be clearly identified with second harmonic generation, something that is not possible with MER. Multi-photon microscopy also allows visualization deep into tissue without actually penetrating it. This ability to look within a depth of field is useful for detection of STN borders based on autofluorescent cell density. When combined with traditional stereotactic information, our preliminary image classification methods are a fast, reliable way to provide surgeons with extra information concerning their location in the midbrain. We anticipate that future advancements and refinements to our image classifier will only increase its accuracy, potential applications, and value. In summary, these preliminary data support the feasibility of multi-photon microscopy to aid in the identification of target brain regions during DBS surgeries. The techniques described here complement and enhance current stereotactic and electrophysiological methods for DBS surgeries.
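As a rough illustration of the classification approach described here — gray-level co-occurrence matrix (GLCM) texture features feeding a logistic regression classifier — the self-contained sketch below trains on synthetic patches. The feature set (contrast, energy, homogeneity), the quantization, and the training details are assumptions for demonstration and are not taken from the study:

```python
# Hedged sketch of GLCM texture features + logistic regression, trained on
# synthetic patches (NOT the study's images or pipeline).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def glcm_features(img, levels=8):
    """Quantize img (values in [0,1)) to `levels` gray levels, build a
    symmetric, normalized GLCM for the horizontal-neighbor offset, and
    return (contrast, energy, homogeneity)."""
    q = np.floor(img * levels).clip(0, levels - 1).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
        glcm[b, a] += 1  # symmetric
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    energy = np.sqrt(np.sum(glcm ** 2))
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return np.array([contrast, energy, homogeneity])

def train_logistic(X, y, lr=0.5, steps=2000):
    """Plain gradient-descent logistic regression (illustrative only)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(0)
# Synthetic stand-ins: coarse/noisy texture vs smooth, uniform texture
coarse = [rng.random((16, 16)) for _ in range(20)]
smooth = [np.full((16, 16), 0.5) + 0.02 * rng.random((16, 16)) for _ in range(20)]
X = np.array([glcm_features(p) for p in coarse + smooth])
y = np.array([1] * 20 + [0] * 20)
w = train_logistic(X, y)
Xb = np.hstack([X, np.ones((len(X), 1))])
acc = np.mean((sigmoid(Xb @ w) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Here the two synthetic classes differ sharply in GLCM contrast, so a linear decision boundary separates them easily; real autofluorescence images would need cross-validated evaluation, as the reported accuracy and kappa values imply.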
Objective: To identify modifiable risk factors for acquisition of Klebsiella pneumoniae carbapenemase-producing Enterobacteriaceae (KPC) colonization among long-term acute-care hospital (LTACH) patients.
Design: Multicenter, matched case-control study.
Setting: Four LTACHs in Chicago, Illinois.
Patients: Each case patient included in this study had a KPC-negative rectal surveillance culture on admission followed by a KPC-positive surveillance culture later in the hospital stay. Each matched control patient had a KPC-negative rectal surveillance culture on admission and no KPC isolated during the hospital stay.
Results: From June 2012 to June 2013, 2,575 patients were admitted to 4 LTACHs; 217 of 2,144 KPC-negative patients (10.1%) acquired KPC. In total, 100 of these patients were selected at random and matched to 100 controls by LTACH facility, admission date, and censored length of stay. Acquisitions occurred a median of 16.5 days after admission. On multivariate analysis, we found that exposure to higher colonization pressure (OR, 1.02; 95% CI, 1.01–1.04; P=.002), exposure to a carbapenem (OR, 2.25; 95% CI, 1.06–4.77; P=.04), and higher Charlson comorbidity index (OR, 1.14; 95% CI, 1.01–1.29; P=.04) were independent risk factors for KPC acquisition; the odds of KPC acquisition increased by 2% for each 1% increase in colonization pressure.
Conclusions: Higher colonization pressure, exposure to carbapenems, and a higher Charlson comorbidity index independently increased the odds of KPC acquisition among LTACH patients. Reducing colonization pressure (through separation of KPC-positive patients from KPC-negative patients using strict cohorts or private rooms) and reducing carbapenem exposure may prevent KPC cross-transmission in this high-risk patient population.
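A note on interpreting the per-unit odds ratio reported in this abstract: an OR of 1.02 per 1% of colonization pressure compounds multiplicatively over larger differences, as the short illustration below shows (the increments chosen are arbitrary):

```python
# How a per-unit odds ratio compounds: OR 1.02 per 1% colonization pressure
# implies the odds multiply by 1.02**k for a k-point increase (illustration).
or_per_point = 1.02
for k in (1, 10, 25):
    print(f"{k:>2}-point increase in colonization pressure -> odds x {or_per_point ** k:.2f}")
```

So a 10-point difference in colonization pressure corresponds to roughly 1.22 times the odds of acquisition, not a simple additive 20% on the probability scale.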
The final effort of the CLIMAP project was a study of the last interglaciation, a time of minimum ice volume some 122,000 yr ago coincident with the Substage 5e oxygen isotopic minimum. Based on detailed oxygen isotope analyses and biotic census counts in 52 cores across the world ocean, last interglacial sea-surface temperatures (SST) were compared with those today. There are small SST departures in the mid-latitude North Atlantic (warmer) and the Gulf of Mexico (cooler). The eastern boundary currents of the South Atlantic and Pacific oceans are marked by large SST anomalies in individual cores, but their interpretations are precluded by no-analog problems and by discordancies among estimates from different biotic groups. In general, the last interglacial ocean was not significantly different from the modern ocean. The relative sequencing of ice decay versus oceanic warming on the Stage 6/5 oxygen isotopic transition and of ice growth versus oceanic cooling on the Stage 5e/5d transition was also studied. In most of the Southern Hemisphere, the oceanic response marked by the biotic census counts preceded (led) the global ice-volume response marked by the oxygen-isotope signal by several thousand years. The reverse pattern is evident in the North Atlantic Ocean and the Gulf of Mexico, where the oceanic response lagged that of global ice volume by several thousand years. As a result, the very warm temperatures associated with the last interglaciation were regionally diachronous by several thousand years. These regional lead-lag relationships agree with those observed on other transitions and in long-term phase relationships; they cannot be explained simply as artifacts of bioturbational translations of the original signals.
Using the concept of “orbital tuning”, a continuous, high-resolution deep-sea chronostratigraphy has been developed spanning the last 300,000 yr. The chronology is developed using a stacked oxygen-isotope stratigraphy and four different orbital tuning approaches, each of which is based upon a different assumption concerning the response of the orbital signal recorded in the data. Each approach yields a separate chronology. The error measured by the standard deviation about the average of these four results (which represents the “best” chronology) has an average magnitude of only 2500 yr. This small value indicates that the chronology produced is insensitive to the specific orbital tuning technique used. Excellent convergence between chronologies developed using each of five different paleoclimatological indicators (from a single core) is also obtained. The resultant chronology is also insensitive to the specific indicator used. The error associated with each tuning approach is estimated independently and propagated through to the average result. The resulting error estimate is independent of that associated with the degree of convergence and has an average magnitude of 3500 yr, in excellent agreement with the 2500-yr estimate. Transfer of the final chronology to the stacked record leads to an estimated error of ±1500 yr. Thus the final chronology has an average error of ±5000 yr.
Wind-borne diseases can spread rapidly and cause large losses. Producers may have little incentive to prevent disease spread because prevention may not be welfare-maximizing. This study proposes a market-based mitigation program that indemnifies producers against disease-related losses and provides an incentive to neighboring producers to take preventive action, which can substantially mitigate infestations, reduce the likelihood of catastrophic losses, and increase social welfare. An equilibrium displacement model simulates introduction of the program for U.S. soybeans. Simulations reveal that the market-based solution contributes to minor market distortions but also reduces social welfare losses and could succeed for other at-risk commodities.