The reported incidence of Clostridioides difficile infection (CDI) has increased in recent years, partly due to the broadening adoption of nucleic acid amplification tests (NAATs) in place of enzyme immunoassay (EIA) methods. Our aim was to quantify the impact of this switch on reported CDI rates using a large, multihospital, empirical dataset.
We analyzed 9 years of retrospective CDI data (2009–2017) from 47 hospitals in the southeastern United States; 37 hospitals switched to NAAT during this period, including 24 with sufficient pre- and post-switch data for statistical analyses. Poisson regression was used to quantify the NAAT-over-EIA incidence rate ratio (IRR) at hospital and network levels while controlling for longitudinal trends, the proportion of intensive care unit patient days, changes in surveillance methodology, and previously detected infection cluster periods. We additionally used change-point detection methods to identify shifts in the mean and/or slope of hospital-level CDI rates, and we compared results to recorded switch dates.
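The core quantity in the analysis above, the incidence rate ratio, can be illustrated with a minimal sketch. The counts below are hypothetical, chosen only to reproduce the unadjusted rates reported in the results (10.9 and 23.9 per 10,000 patient days); the study itself fit Poisson regression models with adjustment for trends and other covariates, which this sketch does not attempt.

```python
import math

def incidence_rate_ratio(pre_cases, pre_days, post_cases, post_days):
    """Unadjusted incidence rate ratio (IRR) of post- vs pre-switch rates,
    with a 95% Wald confidence interval computed on the log scale
    (SE = sqrt(1/pre_cases + 1/post_cases))."""
    irr = (post_cases / post_days) / (pre_cases / pre_days)
    se = math.sqrt(1 / pre_cases + 1 / post_cases)
    lo, hi = (math.exp(math.log(irr) + z * se) for z in (-1.96, 1.96))
    return irr, lo, hi

# Hypothetical counts chosen only to match the unadjusted rates in the text
# (10.9 and 23.9 per 10,000 patient days); not the study's raw data.
irr, lo, hi = incidence_rate_ratio(109, 100_000, 239, 100_000)
print(f"IRR = {irr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

The unadjusted ratio here exceeds the adjusted network-wide IRR of 1.75 reported below, which is exactly why the regression adjustment for trends and case mix matters.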
For hospitals that transitioned to NAAT, average unadjusted CDI rates increased substantially after the test switch, from 10.9 to 23.9 per 10,000 patient days. Individual hospital IRRs ranged from 0.75 to 5.47, with a network-wide IRR of 1.75 (95% confidence interval, 1.62–1.89). Reported CDI rates changed significantly an average of 1.6 months after the switch to NAAT testing (standard deviation, 1.9 months).
Hospitals that switched from EIA to NAAT testing experienced an average postswitch increase of 75% in reported CDI rates after adjusting for other factors, and this increase was often gradual or delayed.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc/cm3. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event (VOEvent) standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system can trigger observations both in the regular correlator mode (limited to ≥0.5 s integrations) and using the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and it represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–December 2018), and the VCS and buffered-mode triggers will become available for observing in a future semester.
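The delay figure quoted above follows from the standard cold-plasma dispersion relation, Δt ≈ 4.149 ms × DM × (ν_lo⁻² − ν_hi⁻²), with frequencies in GHz and the dispersion measure (DM) in pc/cm3. A minimal sketch of the arithmetic:

```python
def dispersion_delay_s(dm, f_lo_ghz=0.1, f_hi_ghz=1.0):
    """Cold-plasma dispersion delay in seconds between two observing
    frequencies, using the standard approximation
    dt ~= 4.149 ms * DM * (f_lo**-2 - f_hi**-2),
    with DM in pc/cm^3 and frequencies in GHz."""
    return 4.149e-3 * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

# Delay of a 100 MHz pulse relative to 1 GHz over the DM range in the text
for dm in (100, 2000):
    print(f"DM = {dm:4d} pc/cm^3 -> delay = {dispersion_delay_s(dm) / 60:.1f} min")
```

For the quoted DM range of 100–2000 pc/cm3 this gives delays from under a minute to roughly a quarter of an hour, the window that makes an 8 s slew fast enough to catch prompt emission.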
An antimicrobial screen was applied to the cell phones of 26 resident physicians to determine its effects on the phone microbiome and its potential to serve as a selective agent for antibiotic or silver resistance genes. No increase in these genes was observed, nor was there a shift in the overall microbial community.
Langmuir circulation, a key turbulent process in the upper ocean, is mechanistically driven and sustained by imposed atmospheric wind stress and surface wave drift. In addition, and specifically in coastal zones, the presence of a mean current – whether associated with tidal currents or large-scale eddies – generates bottom-boundary-layer shear, which further modulates the physical attributes of coastal-zone Langmuir turbulence. We show that the presence of bottom-boundary-layer shear generated by oblique forcing between the mean current, atmospheric drag, and monochromatic wave field direction changes the orientation of the resultant, large-scale Langmuir cells. A model to predict this resultant orientation, based on salient parameters defining the forcing obliquity, is proposed. We also perform a systematic parametric study to isolate the ‘turning’ influence of salient parameters, which reveals that the resultant Langmuir cell orientation is always intermediate to the imposed forces. In order to provide a rigorous basis for the results, we study terms responsible for sustenance of streamwise vorticity, and provide a theoretical justification for the observed results.
We analyzed antibiotic use data from 29 southeastern US hospitals over a 5-year period to determine changes in antibiotic use after the fluoroquinolone US Food and Drug Administration (FDA) advisory update in 2016. Fluoroquinolone use declined both before and after the FDA announcement, and the use of select, alternative antibiotics increased after the announcement.
Fluoroquinolones are among the 4 most commonly prescribed antibiotic classes.1,2 Postmarketing reports of serious adverse events linked to fluoroquinolones include tendonitis, neuropathy, hypoglycemia, psychiatric side effects, and possible aortic vessel rupture, leading to safety label changes in July 2008 and August 2013.3 In July 2016, the US Food and Drug Administration (FDA) strengthened the “black box” warning following an initial safety announcement in May 2016, recommending avoidance of fluoroquinolones for uncomplicated infections such as acute exacerbation of chronic bronchitis, uncomplicated urinary tract infections, and acute bacterial sinusitis.4 Concerns over safety and the association with Clostridioides difficile infection have led inpatient antimicrobial stewardship programs (ASPs) to develop initiatives promoting avoidance of fluoroquinolones. The objective of this study was to quantify the effect of the 2016 FDA “black box” update on inpatient antibiotic use among a cohort of southeastern US hospitals.
Recent years have seen an exponential increase in the variety of healthcare data captured across numerous sources. However, mechanisms to leverage these data sources to support scientific investigation have remained limited. In 2013 the Pediatric Heart Network (PHN), funded by the National Heart, Lung, and Blood Institute, developed the Integrated CARdiac Data and Outcomes (iCARD) Collaborative with the goals of leveraging available data sources to aid in efficiently planning and conducting PHN studies; supporting integration of PHN data with other sources to foster novel research otherwise not possible; and mentoring young investigators in these areas. This review describes lessons learned through the development of iCARD, initial efforts and scientific output, challenges, and future directions. This information can aid in the use and optimisation of data integration methodologies across other research networks and organisations.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg2 with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg2 with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which include spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and the IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
To assess the feasibility of electronic data capture of postdischarge durations and evaluate total durations of antimicrobial exposure related to inpatient hospital stays.
Multicenter, retrospective cohort study.
Two community hospitals and 1 academic medical center.
Hospitalized patients who received ≥1 dose of a systemic antimicrobial agent.
We collected and reviewed electronic data on inpatient and discharge antimicrobial prescribing from April to September 2016 in 3 pilot hospitals. Inpatient antimicrobial use was obtained from electronic medication administration records. Postdischarge antimicrobial use was calculated from electronic discharge prescriptions. We completed a manual validation to evaluate the ability of electronic prescriptions to capture intended postdischarge antibiotics. Inpatient, postdischarge, and total lengths of therapy (LOT) per admission were calculated to assess durations of antimicrobial therapy attributed to hospitalization.
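The length-of-therapy calculation described above can be sketched minimally as follows. The field names and example dates are hypothetical; the detail worth showing is that LOT counts calendar days with any antimicrobial exposure, so a day on which inpatient dosing and a discharge prescription overlap is counted once in the total.

```python
from datetime import date, timedelta

def length_of_therapy(inpatient_dose_days, rx_start, rx_days_supply):
    """Inpatient, postdischarge, and total length of therapy (LOT) for one
    admission. LOT counts calendar days with any antimicrobial exposure,
    so overlapping agents (or an inpatient dose plus a discharge
    prescription on the same day) count once.

    inpatient_dose_days: dates on which any inpatient dose was given
    rx_start, rx_days_supply: discharge prescription (hypothetical fields)
    """
    inpatient = set(inpatient_dose_days)
    postdischarge = {rx_start + timedelta(days=i) for i in range(rx_days_supply)}
    return len(inpatient), len(postdischarge), len(inpatient | postdischarge)

# Example: 3 inpatient days of therapy, then a 7-day discharge prescription
# starting on the day of discharge, which overlaps the last inpatient day.
admits = [date(2016, 4, 1), date(2016, 4, 2), date(2016, 4, 3)]
print(length_of_therapy(admits, date(2016, 4, 3), 7))
```

Using set union rather than simple addition is what keeps the total LOT per admission from double-counting the transition day.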
A total of 45,693 inpatient admissions were evaluated. Antimicrobials were given during 23,447 admissions (51%), and electronic discharge prescriptions were captured in 7,442 admissions (16%). Manual validation revealed incomplete data capture in scenarios in which prescribers avoided the electronic system. The median postdischarge LOT among admissions with discharge antimicrobials was 8 days (range, 1–360), with peaks at 5, 7, 10, and 14 days. Postdischarge days accounted for 38% of antimicrobial exposure days.
Discharge antimicrobial therapy accounted for a large portion of antimicrobial exposure related to inpatient hospital stays. Discharge prescription data can feasibly be captured through electronic prescribing records and may aid in designing stewardship interventions at transitions of care.
Bacterial community composition and the presence of antibiotic resistance genes (mecA, tetK, and vanA) on personal mobile devices (PMDs) of nurses in intensive care units (ICUs) were evaluated. Antibiotic resistance genes on PMDs decreased at the end of the shift, and several microbial genera changed.
We evaluated whether a diagnostic stewardship initiative consisting of antimicrobial stewardship program (ASP) preauthorization paired with education could reduce false-positive hospital-onset (HO) Clostridioides difficile infection (CDI).
Single center, quasi-experimental study.
Tertiary academic medical center in Chicago, Illinois.
Adult inpatients were included in the intervention if they were admitted between October 1, 2016, and April 30, 2018, and were eligible for C. difficile preauthorization review. Patients admitted to the stem cell transplant (SCT) unit were not included in the intervention and were therefore considered a contemporaneous noninterventional control group.
The intervention consisted of required prescriber attestation that diarrhea had met CDI clinical criteria, ASP preauthorization, and verbal clinician feedback. Data from the 33 months before and the 19 months after implementation were compared. Facility-wide HO-CDI incidence rates (IR) per 10,000 patient days (PD) and standardized infection ratios (SIR) were extracted from hospital infection prevention reports.
During the entire 52-month period, the mean facility-wide HO-CDI-IR was 7.8 per 10,000 PD and the overall SIR was 0.9. The mean ± SD HO-CDI-IR (8.5 ± 2.0 vs 6.5 ± 2.3; P < .001) and SIR (0.97 ± 0.23 vs 0.78 ± 0.26; P = .015) decreased from baseline during the intervention. Segmented regression models identified significant decreases in HO-CDI-IR (Pstep = .06; Ptrend = .008) and SIR (Pstep = .1; Ptrend = .017) trends, concurrent with decreases in oral vancomycin use (Pstep < .001; Ptrend < .001). The HO-CDI-IR within a noninterventional control unit did not change (Pstep = .125; Ptrend = .115).
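The segmented regression structure behind the Pstep and Ptrend values above (a level-change "step" term and a slope-change "trend" term at the intervention month) can be sketched as an ordinary least-squares fit. The synthetic monthly rates below are hypothetical and merely mimic the shape of the reported decrease; the study's actual models and inference are not reproduced here.

```python
import numpy as np

def segmented_fit(y, t0):
    """Fit y = b0 + b1*t + b2*step + b3*(t - t0)*step by least squares,
    where step = 1 for months t >= t0. b2 is the level change ("step")
    and b3 the slope change ("trend") at the intervention."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta  # [intercept, baseline slope, level change, slope change]

# Synthetic monthly HO-CDI rates: flat baseline of 8.5, then a drop of 1.0
# and a downward slope of -0.1/month after month 33 (values hypothetical).
t = np.arange(52)
y = 8.5 + np.where(t >= 33, -1.0 - 0.1 * (t - 33), 0.0)
b0, b1, b2, b3 = segmented_fit(y, t0=33)
print(f"level change = {b2:.2f}, slope change = {b3:.2f}")
```

On real data the coefficients would be accompanied by standard errors and p values (the Pstep/Ptrend reported above); this sketch only shows how the two effects are encoded in the design matrix.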
A multidisciplinary, multifaceted intervention leveraging clinician education and feedback reduced the HO-CDI-IR and the SIR in select populations. Institutions may consider interventions like ours to reduce false-positive C. difficile NAAT tests.
A point-prevalence study of antimicrobial use among inpatients at 5 public hospitals in Sri Lanka revealed that 54.6% were receiving antimicrobials: 43.1% in medical wards, 68.0% in surgical wards, and 97.6% in intensive care wards. Amoxicillin-clavulanate was most commonly used for major indications. Among patients receiving antimicrobials, 31.0% received potentially inappropriate therapy.
Contemporary ice stream flow is directly linked to conditions at the ice/bed interface, yet this environment is logistically difficult to access. Instead, we investigate subglacial processes important for ice stream flow by studying tills on the deglaciated Antarctic continental shelf. We test currently accepted hypotheses surrounding subglacial processes and till properties with a Ross Sea dataset. Till shear strengths indicate a continuum of simultaneous processes acting at the bed, rather than discrete ‘deformation’ and ‘lodgement’ end-members. We identify a threshold water content representing saturated pore spaces, leading to basal sliding and meltwater channelization. Based on observations of till properties relative to glacial landforms, we challenge the assumption that low shear strength is linked to intense deformation. Spatial variability in landform morphology reflects variability in deforming processes at the sub-ice-stream scale and suggests a maximum deforming bed thickness of 2 m at the grounding line. Regional till properties generally correlate with seafloor geology and deglacial history; the western Ross Sea is characterized by higher and more variable shear strengths and water contents, whereas lower-shear-strength till was preserved in the Eastern Basin. These observations inform till interpretation and provide context for deforming beds beneath the modern ice sheet and on glaciated continental shelves.
Hospital environmental surfaces are frequently contaminated by microorganisms. However, whether bacterial contamination of the environment acts as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Prospective cohort study at 2 academic medical centers.
We conducted a prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each week the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environment samples and patient sources.
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Microbiological Bacterial Transfer events to the patient, the environment, or both occurred in 12 patient encounters (18.5%) from the microbiologically evaluable cohort.
Microbiological Bacterial Transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
To determine the feasibility and value of developing a regional antibiogram for community hospitals.
Multicenter retrospective analysis of antibiograms.
A total of 20 community hospitals in central and eastern North Carolina and south central Virginia participated in this study.
We combined antibiogram data from participating hospitals for 13 clinically relevant gram-negative pathogen–antibiotic combinations. From this combined antibiogram, we developed a regional antibiogram based on the mean susceptibilities of the combined data.
We combined a total of 69,778 bacterial isolates across 13 clinically relevant gram-negative pathogen–antibiotic combinations (median for each combination, 1100; range, 174–27,428). Across all pathogen–antibiotic combinations, 69% of local susceptibility rates fell within 1 SD of the regional mean susceptibility rate, and 97% of local susceptibilities fell within 2 SD of the regional mean susceptibility rate. No individual hospital had >1 pathogen–antibiotic combination with a local susceptibility rate >2 SD of the regional mean susceptibility rate. All hospitals’ local susceptibility rates were within 2 SD of the regional mean susceptibility rate for low-prevalence pathogens (<500 isolates cumulative for the region).
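The 1 SD / 2 SD comparisons above can be sketched as follows. The hospital susceptibility rates are hypothetical, and this simple unweighted version ignores isolate counts per hospital, which a real combined antibiogram would need to account for.

```python
from statistics import mean, stdev

def flag_outliers(local_rates, k=2):
    """Given local susceptibility rates (%) for one pathogen-antibiotic
    combination, return the regional mean, the SD, and the hospitals
    whose local rate falls more than k SD from that mean."""
    mu = mean(local_rates.values())
    sd = stdev(local_rates.values())
    flagged = {h: r for h, r in local_rates.items() if abs(r - mu) > k * sd}
    return mu, sd, flagged

# Hypothetical E. coli / ceftriaxone susceptibility rates for 5 hospitals
rates = {"A": 88.0, "B": 90.0, "C": 86.0, "D": 89.0, "E": 87.0}
mu, sd, flagged = flag_outliers(rates)
print(f"regional mean = {mu:.1f}%, SD = {sd:.2f}; outside 2 SD: {sorted(flagged)}")
```

In this toy example no hospital falls outside 2 SD, mirroring the finding above that local rates clustered tightly around the regional mean.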
Small community hospitals frequently cannot develop an accurate antibiogram due to a paucity of local data. A regional antibiogram is likely to provide clinically useful information to community hospitals for low-prevalence pathogens.
The ‘Digital Index of North American Archaeology’ (DINAA) project demonstrates how the aggregation and publication of government-held archaeological data can help to document human activity over millennia and at a continental scale. These data can provide a valuable link between specific categories of information available from publications, museum collections and online databases. Integration improves the discovery and retrieval of records of archaeological research currently held by multiple institutions within different information systems. It also aids in the preservation of those data and makes efforts to archive these research results more resilient to political turmoil. While DINAA focuses on North America, its methods have global applicability.
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
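The gap between the two denominators is easiest to see on a single short stay. A minimal sketch, assuming the common NHSN-style definitions (patient days from the midnight census; days present counting every calendar day during any portion of which the patient is in house); the abstract does not define the metrics, so these definitions are an assumption here.

```python
from datetime import datetime

def patient_days_and_days_present(admit, discharge):
    """Both denominator metrics for a single stay, assuming NHSN-style
    definitions (an assumption, not taken from the abstract):
      patient days  = number of midnights in house (midnight census)
      days present  = every calendar day with any portion of the stay"""
    patient_days = (discharge.date() - admit.date()).days
    days_present = patient_days + 1  # partial first and last days both count
    return patient_days, days_present

# Admitted Monday 14:00, discharged Thursday 10:00: 3 patient days but
# 4 days present, so the shorter the stay, the larger the relative gap.
pd_, dp_ = patient_days_and_days_present(datetime(2017, 5, 1, 14),
                                         datetime(2017, 5, 4, 10))
print(pd_, dp_)
```

For this 3-night stay the difference is one-third, consistent with the abstract's observation that the gap between metrics is driven by short lengths of stay.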
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial.
Nine hospitals in the southeastern United States.
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed ultraviolet (UV) disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.