The IntCal family of radiocarbon (14C) calibration curves is based on research spanning more than three decades. The IntCal group have collated the 14C and calendar age data (mostly derived from primary publications with other types of data and meta-data) and, since 2010, made them available for other sorts of analysis through an open-access database. This has ensured transparency in terms of the data used in the construction of the ratified calibration curves. As the IntCal database expands, work is underway to facilitate best practice for new data submissions, make more of the associated metadata available in a structured form, and help those wishing to process the data with programming languages such as R, Python, and MATLAB. The data and metadata are complex because of the range of different types of archives. A restructured interface, based on the “IntChron” open-access data model, includes tools which allow the data to be plotted and compared without the need for export. The intention is to include complementary information which can be used alongside the main 14C series to provide new insights into the global carbon cycle, as well as facilitating access to the data for other research applications. Overall, this work aims to streamline the generation of new calibration curves.
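For readers who want to work with the curve data directly in one of the languages mentioned above, a minimal Python sketch follows. It assumes a local copy of the published intcal20.14c file and the standard five-column layout (calendar age, 14C age, 14C uncertainty, Δ14C, Δ14C uncertainty); the column names are ours and the file path is a placeholder, so verify both against your copy.

```python
# Minimal sketch: load and plot the IntCal20 curve with pandas/matplotlib.
# Assumes a local copy of intcal20.14c whose header lines start with '#'
# and whose five comma-separated columns follow the published layout.
import pandas as pd
import matplotlib.pyplot as plt

cols = ["cal_bp", "c14_age", "c14_sigma", "delta14c", "delta14c_sigma"]
curve = pd.read_csv("intcal20.14c", comment="#", names=cols)

fig, ax = plt.subplots()
ax.plot(curve["cal_bp"], curve["c14_age"], lw=0.8)
ax.fill_between(curve["cal_bp"],
                curve["c14_age"] - curve["c14_sigma"],
                curve["c14_age"] + curve["c14_sigma"],
                alpha=0.3)
ax.invert_xaxis()  # conventionally plotted with time running toward the present
ax.set_xlabel("Calendar age (cal BP)")
ax.set_ylabel("Radiocarbon age (14C BP)")
plt.show()
```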
To evaluate the incidence of a candidate definition of healthcare facility-onset, treated Clostridioides difficile (CD) infection (cHT-CDI), and to identify variables and the best model fit of a risk-adjusted cHT-CDI metric using extractable electronic health data.
We analyzed 9,134,276 admissions from 265 hospitals during 2015–2020. The cHT-CDI events were defined based on the first positive laboratory identification of CD after day 3 of hospitalization, accompanied by use of a CD drug. The generalized linear model method via negative binomial regression was used to identify predictors. Standardized infection ratios (SIRs) were calculated based on 2 risk-adjusted models: a simple model using descriptive variables and a complex model using descriptive variables and CD testing practices. The performance of each model was compared against unadjusted cHT-CDI rates.
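As an illustration of the observed-to-predicted logic behind an SIR, here is a minimal Python sketch using statsmodels. The synthetic data, variable names, and the two illustrative predictors are ours; they do not reproduce the study's model specification.

```python
# Minimal sketch: fit a negative binomial GLM to hospital-level event
# counts with admissions as exposure, then form SIR = observed/predicted.
# All data and predictors below are synthetic and illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "admissions": rng.integers(5_000, 60_000, n),
    "co_cdi_prev": rng.uniform(0.1, 1.5, n),   # community-onset CDI prevalence
    "test_intensity": rng.uniform(5, 40, n),   # CD tests per 100 admissions
})
mu = 0.0013 * df["admissions"] * np.exp(0.4 * df["co_cdi_prev"])
df["events"] = rng.poisson(mu)

X = sm.add_constant(df[["co_cdi_prev", "test_intensity"]])
fit = sm.GLM(df["events"], X,
             family=sm.families.NegativeBinomial(),
             exposure=df["admissions"]).fit()

df["predicted"] = fit.fittedvalues          # expected counts per hospital
df["sir"] = df["events"] / df["predicted"]  # standardized infection ratio
print(fit.summary())
print(df["sir"].describe())
```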
The median rate of cHT-CDI events per 100 admissions was 0.134 (interquartile range, 0.023–0.243). Hospital variables associated with cHT-CDI included the following: higher community-onset CDI (CO-CDI) prevalence; highest-quartile length of stay; bed size; percentage of male patients; teaching hospital status; increased CD testing intensity; and CD testing prevalence. The complex model demonstrated better model performance and identified the most influential predictors: hospital-onset testing intensity and prevalence, CO-CDI rate, and community-onset testing intensity (negative correlation). Moreover, 78% of the hospitals ranked in the highest quartile based on raw rate shifted to lower percentiles when we applied the SIR from the complex model.
Hospital descriptors, aggregate patient characteristics, CO-CDI burden, and clinical testing practices significantly influence incidence of cHT-CDI. Benchmarking a cHT-CDI metric is feasible and should include facility and clinical variables.
To examine temporal changes in coverage with a complete primary series of coronavirus disease 2019 (COVID-19) vaccination and staffing shortages among healthcare personnel (HCP) working in nursing homes in the United States before, during, and after the implementation of jurisdiction-based COVID-19 vaccination mandates for HCP.
Sample and setting: HCP in nursing homes from 15 US jurisdictions.
We analyzed weekly COVID-19 vaccination data reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network from June 7, 2021, through January 2, 2022. We assessed 3 periods (preintervention, intervention, and postintervention) based on the announcement of vaccination mandates for HCP in 15 jurisdictions. We used interrupted time-series models to estimate the weekly percentage change in vaccination with complete primary series and the odds of reporting a staffing shortage for each period.
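The following is a minimal Python sketch of a segmented (interrupted) time-series regression of the kind described, fit on a log scale so slopes read as weekly percentage changes. The breakpoints, data, and model form are illustrative placeholders, not the study's fitted model.

```python
# Minimal sketch of a segmented (interrupted) time-series regression:
# log-linear model so coefficients read as weekly percentage changes.
# Breakpoints and data are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

weeks = np.arange(30)
t0, t1 = 10, 20                       # mandate announcement and deadline weeks (illustrative)
cov = 66.7 * np.exp(0.004 * weeks
                    + 0.012 * np.clip(weeks - t0, 0, None)
                    + np.random.default_rng(1).normal(0, 0.005, 30))
df = pd.DataFrame({
    "logcov": np.log(cov),
    "week": weeks,
    "post1": np.clip(weeks - t0, 0, None),   # slope change after announcement
    "post2": np.clip(weeks - t1, 0, None),   # slope change after deadline
})
fit = smf.ols("logcov ~ week + post1 + post2", data=df).fit()

# Weekly % change in each period: baseline, intervention, post-intervention.
b = fit.params
for name, slope in [("pre", b["week"]),
                    ("intervention", b["week"] + b["post1"]),
                    ("post", b["week"] + b["post1"] + b["post2"])]:
    print(f"{name}: {100 * (np.exp(slope) - 1):+.2f}% per week")
```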
Complete primary series vaccination among HCP increased from 66.7% at baseline to 94.3% at the end of the study period and increased at the fastest rate during the intervention period for 12 of 15 jurisdictions. The odds of reporting a staffing shortage were lowest after the intervention.
These findings demonstrate that COVID-19 vaccination mandates may be an effective strategy for improving HCP vaccination coverage in nursing homes without exacerbating staffing shortages. These data suggest that mandates can be considered to improve COVID-19 vaccination coverage among HCP in nursing homes to protect both HCP and vulnerable nursing home residents.
To evaluate the prevalence of hospital-onset bacteremia and fungemia (HOB), to identify hospital-level predictors, and to assess the feasibility of an HOB metric.
We analyzed 9,202,650 admissions from 267 hospitals during 2015–2020. An HOB event was defined as the first positive blood-culture pathogen on day 3 of admission or later. We used the generalized linear model method via negative binomial regression to identify variables and risk markers for HOB. Standardized infection ratios (SIRs) were calculated based on 2 risk-adjusted models: a simple model using descriptive variables and a complex model using descriptive variables plus additional measures of blood-culture testing practices. Performance of each model was compared against the unadjusted rate of HOB.
The overall median rate of HOB per 100 admissions was 0.124 (interquartile range, 0.00–0.22). Facility-level predictors included bed size, sex, ICU admissions, community-onset (CO) blood-culture testing intensity, hospital-onset (HO) testing intensity, and HO testing prevalence (all P < .001). In the complex model, CO bacteremia prevalence, HO testing intensity, and HO testing prevalence were the predictors most associated with HOB. The complex model demonstrated better model performance; 55% of hospitals that ranked in the highest quartile based on their raw rate shifted to a lower quartile when the SIR from the complex model was applied.
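To illustrate the quartile-shift comparison reported here and in the companion cHT-CDI analysis, a small Python sketch follows; the raw rates and SIRs are synthetic.

```python
# Minimal sketch: compare hospital rankings by raw rate vs by SIR and
# report how many top-quartile hospitals (by raw rate) rank lower once
# risk adjustment is applied. Inputs are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "raw_rate": rng.gamma(2, 0.06, 400),
    "sir": rng.lognormal(0, 0.4, 400),
})
df["raw_q"] = pd.qcut(df["raw_rate"], 4, labels=[1, 2, 3, 4])
df["sir_q"] = pd.qcut(df["sir"], 4, labels=[1, 2, 3, 4])

top_raw = df[df["raw_q"] == 4]
shifted = (top_raw["sir_q"].astype(int) < 4).mean()
print(f"{100 * shifted:.0f}% of top-quartile (raw) hospitals rank lower by SIR")
```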
Hospital descriptors, aggregate patient characteristics, community bacteremia and/or fungemia burden, and clinical blood-culture testing practices influence rates of HOB. Benchmarking an HOB metric is feasible and should endeavor to include both facility and clinical variables.
To evaluate hospital-level variation in using first-line antibiotics for Clostridioides difficile infection (CDI) based on the burden of laboratory-identified (LabID) CDI.
Using data on hospital-level LabID CDI events and antimicrobial use (AU) for CDI (oral/rectal vancomycin or fidaxomicin) submitted to the National Healthcare Safety Network in 2019, we assessed the association between hospital-level CDI prevalence (per 100 patient admissions) and rate of CDI AU (days of therapy per 1,000 days present) to generate a predicted value of AU based on CDI prevalence and CDI test type using negative binomial regression. The ratio of the observed to predicted AU was then used to identify hospitals with extreme discordance between CDI prevalence and CDI AU, defined as hospitals with a ratio outside of the intervigintile range.
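A minimal Python sketch of this discordance screen follows: predict AU from CDI prevalence with a negative binomial model, take the observed-to-predicted ratio, and flag hospitals outside the intervigintile range (taken here as the 5th-95th percentiles of the ratio). Data and model form are illustrative.

```python
# Minimal sketch of the observed-to-predicted discordance screen.
# All hospital data below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
df = pd.DataFrame({"cdi_prev": rng.uniform(0.05, 1.2, 963),
                   "days_present": rng.integers(20_000, 200_000, 963)})
df["au_days"] = rng.poisson(np.exp(2.0 + 1.1 * df["cdi_prev"])
                            * df["days_present"] / 1000)

X = sm.add_constant(df[["cdi_prev"]])
fit = sm.GLM(df["au_days"], X,
             family=sm.families.NegativeBinomial(),
             exposure=df["days_present"]).fit()

df["ratio"] = df["au_days"] / fit.fittedvalues   # observed / predicted AU
lo, hi = df["ratio"].quantile([0.05, 0.95])      # intervigintile bounds
df["extreme"] = (df["ratio"] < lo) | (df["ratio"] > hi)
print(df["extreme"].sum(), "hospitals with extreme discordance")
```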
Among 963 acute-care hospitals, rate of CDI prevalence demonstrated a positive dose–response relationship with rate of CDI AU. Compared with hospitals without extreme discordance (n = 902), hospitals with lower-than-expected CDI AU (n = 31) had, on average, fewer beds (median, 106 vs 208), shorter length of stay (median, 3.8 vs 4.2 days), and higher proportion of undergraduate or nonteaching medical school affiliation (48% vs 39%). Hospitals with higher-than-expected CDI AU (n = 30) were similar overall to hospitals without extreme discordance.
The prevalence rate of LabID CDI had a significant dose–response association with first-line antibiotics for treating CDI. We identified hospitals with extreme discordance between CDI prevalence and CDI AU, highlighting potential opportunities for data validation and improvements in diagnostic and treatment practices for CDI.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of the remaining options to achieve the Paris Agreement goals, through overcoming political barriers to carbon pricing, taking into account non-CO2 factors, well-designed implementation of demand-side and nature-based solutions, building the resilience of ecosystems, and the recognition that climate change mitigation costs can be justified by benefits to the health of humans and nature alone. We also consider new insights into what to expect if we fail, including a new dimension of fire extremes and the prospect of cascading climate tipping elements.
A synthesis is made of 10 topics within climate research, where there have been significant advances since January 2020. The insights are based on input from an international open call with broad disciplinary scope. Findings include: (1) the options to still keep global warming below 1.5 °C; (2) the impact of non-CO2 factors in global warming; (3) a new dimension of fire extremes forced by climate change; (4) the increasing pressure on interconnected climate tipping elements; (5) the dimensions of climate justice; (6) political challenges impeding the effectiveness of carbon pricing; (7) demand-side solutions as vehicles of climate mitigation; (8) the potentials and caveats of nature-based solutions; (9) how building resilience of marine ecosystems is possible; and (10) that the costs of climate change mitigation policies can be more than justified by the benefits to the health of humans and nature.
Social media summary
How do we limit global warming to 1.5 °C and why is it crucial? See highlights of latest climate science.
To determine the impact of the coronavirus disease 2019 (COVID-19) pandemic on healthcare-associated infection (HAI) incidence in US hospitals, national- and state-level standardized infection ratios (SIRs) were calculated for each quarter in 2020 and compared to those from 2019.
Central-line–associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated events (VAEs), select surgical site infections, and Clostridioides difficile and methicillin-resistant Staphylococcus aureus (MRSA) bacteremia laboratory-identified events reported to the National Healthcare Safety Network for 2019 and 2020 by acute-care hospitals were analyzed. SIRs were calculated for each HAI and quarter by dividing the number of reported infections by the number of predicted infections, calculated using 2015 national baseline data. Percentage changes between 2019 and 2020 SIRs were calculated. Supporting analyses, such as an assessment of device utilization in 2020 compared to 2019, were also performed.
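The two calculations described above reduce to simple ratios; a short Python sketch with illustrative numbers follows.

```python
# Minimal sketch of the arithmetic described above: an SIR is
# observed/predicted infections, and the year-over-year change is the
# percentage difference between SIRs. All numbers are illustrative.
def sir(observed: int, predicted: float) -> float:
    return observed / predicted

sir_2019 = sir(480, 520.0)
sir_2020 = sir(610, 515.0)
pct_change = 100 * (sir_2020 - sir_2019) / sir_2019
print(f"SIR 2019={sir_2019:.2f}, 2020={sir_2020:.2f}, change={pct_change:+.1f}%")
```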
Significant increases in the national SIRs for CLABSI, CAUTI, VAE, and MRSA bacteremia were observed in 2020. Changes in the SIR varied by quarter and state. The largest increase was observed for CLABSI, and significant increases in VAE incidence and ventilator utilization were seen across all 4 quarters of 2020.
This report provides a national view of the increases in HAI incidence in 2020. These data highlight the need to return to conventional infection prevention and control practices and build resiliency in these programs to withstand future pandemics.
During March 27–July 14, 2020, the Centers for Disease Control and Prevention’s National Healthcare Safety Network extended its surveillance to hospital capacities in response to the COVID-19 pandemic. The data showed wide variations across hospitals in case burden, bed occupancy, ventilator use, and healthcare personnel and supply status. These data were used to inform emergency responses.
We analyzed 2017 healthcare facility-onset (HO) vancomycin-resistant Enterococcus (VRE) bacteremia data to identify hospital-level factors that were significant predictors of HO-VRE using the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) multidrug-resistant organism and Clostridioides difficile reporting module. A risk-adjusted model that can be used to calculate the number of predicted HO-VRE bacteremia events in a facility was developed, thus enabling the calculation of VRE standardized infection ratios (SIRs).
Acute-care hospitals reporting at least 1 month of 2017 VRE bacteremia data were included in the analysis. Various hospital-level characteristics were assessed to develop a best-fit model and subsequently derive the 2018 national and state SIRs.
In 2017, 470 facilities in 35 states participated in VRE bacteremia surveillance. Inpatient VRE community-onset prevalence rate, average length of patient stay, outpatient VRE community-onset prevalence rate, and presence of an oncology unit were all significantly associated (all 95% likelihood ratio confidence limits excluded the nominal value of zero) with HO-VRE bacteremia. The 2018 national SIR was 1.01 (95% CI, 0.93–1.09) with 577 HO bacteremia events reported.
The creation of an SIR enables national-, state-, and facility-level monitoring of VRE bacteremia while controlling for individual hospital-level factors. Hospitals can compare their VRE burden to a national benchmark to help them determine the effectiveness of infection prevention efforts over time.
Data reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (CDC NHSN) were analyzed to understand the potential impact of the COVID-19 pandemic on central-line–associated bloodstream infections (CLABSIs) in acute-care hospitals. Descriptive analysis of the standardized infection ratio (SIR) was conducted by location, location type, geographic area, and bed size.
The rapid spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) throughout key regions of the United States in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital coronavirus disease 2019 (COVID-19) data. We present time-series estimates of critical hospital capacity indicators from April 1 to July 14, 2020.
From March 27 to July 14, 2020, the NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and the availability and/or use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near–real-time daily national and state estimates to be computed.
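As a toy illustration of survey weighting (only one ingredient of the NHSN procedure, which also involves multiple imputation), a weighted national total can be sketched in Python as follows; all counts and weights are synthetic.

```python
# Minimal sketch of a survey-weighted national estimate: each reporting
# hospital carries a weight (e.g., the inverse of its response
# probability), and the national total is the weighted sum. The actual
# NHSN estimation procedure is more involved than this.
import numpy as np

rng = np.random.default_rng(4)
covid_inpatients = rng.poisson(30, size=3000)      # per reporting hospital
weights = 1.0 / rng.uniform(0.5, 0.95, size=3000)  # inverse response propensity

national_estimate = np.sum(weights * covid_inpatients)
print(f"Estimated national COVID-19 inpatients: {national_estimate:,.0f}")
```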
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased from April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest regions after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
The NHSN hospital capacity estimates served as important, near–real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after they declined from a peak in April. Patient outcomes appeared to improve from early April to mid-July.
Background: The NHSN is the nation’s largest surveillance system for healthcare-associated infections. Since 2011, acute-care hospitals (ACHs) have been required to report intensive care unit (ICU) central-line–associated bloodstream infections (CLABSIs) to the NHSN pursuant to CMS requirements. In 2015, this requirement was extended to include general medical, surgical, and medical-surgical wards. Also in 2015, the NHSN implemented a repeat infection timeframe (RIT) that required repeat CLABSIs, in the same patient and admission, to be excluded if onset was within 14 days. This analysis is the first at the national level to describe repeat CLABSIs. Methods: Index CLABSIs reported in ACH ICUs and select wards during 2015–2018 were included, in addition to repeat CLABSIs occurring at any location during the same period. CLABSIs were stratified into 2 groups: single and repeat CLABSIs. The repeat CLABSI group included the index CLABSI and subsequent CLABSI(s) reported for the same patient. Up to 5 CLABSIs were included for a single patient. Pathogen analyses were limited to the first pathogen reported for each CLABSI, which is considered to be the most important cause of the event. Likelihood ratio χ2 tests were used to determine differences in proportions. Results: Of the 70,214 CLABSIs reported, 5,983 (8.5%) were repeat CLABSIs. Of 3,264 nonindex CLABSIs, 425 (13%) were identified in non-ICU or non-select ward locations. Staphylococcus aureus was the most common pathogen in both the single and repeat CLABSI groups (14.2% and 12%, respectively) (Fig. 1). Compared to all other pathogens, CLABSIs reported with Candida spp were less likely in a repeat CLABSI event than in a single CLABSI event (P < .0001). Insertion-related organisms were more likely to be associated with single CLABSIs than repeat CLABSIs (P < .0001) (Fig. 2). Alternatively, Enterococcus spp or Klebsiella pneumoniae and K. oxytoca were more likely to be associated with repeat CLABSIs than single CLABSIs (P < .0001). Conclusions: This analysis highlights differences in the aggregate pathogen distributions comparing single versus repeat CLABSIs. Assessing the pathogens associated with repeat CLABSIs may offer another way to assess the success of CLABSI prevention efforts (eg, clean insertion practices). Pathogens such as Enterococcus spp and Klebsiella spp demonstrate a greater association with repeat CLABSIs. Thus, prevention efforts focused on these organisms may warrant greater attention and could reduce the likelihood of repeat CLABSIs. Additional analysis of patient-specific pathogens identified in the repeat CLABSI group may yield further clarification.
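The 14-day repeat infection timeframe described in the background lends itself to a short sketch. The Python function below drops, within one patient admission, any CLABSI whose onset falls inside the RIT of the previous reportable event; NHSN counts the day of the event as RIT day 1, so an onset 14 or more days later starts a new reportable event. This is our reading of the rule, not NHSN code.

```python
# Minimal sketch of the 14-day repeat infection timeframe (RIT): within
# one patient admission, an event whose onset falls inside the previous
# reportable event's RIT is excluded. The day of the event counts as RIT
# day 1, so a difference of >= 14 days falls outside the window.
from datetime import date, timedelta

def apply_rit(onsets: list[date], window_days: int = 14) -> list[date]:
    """Return reportable events, dropping onsets inside the RIT window."""
    reportable: list[date] = []
    for onset in sorted(onsets):
        if not reportable or onset - reportable[-1] >= timedelta(days=window_days):
            reportable.append(onset)
    return reportable

events = [date(2016, 3, 1), date(2016, 3, 9), date(2016, 3, 20)]
print(apply_rit(events))  # 3/9 falls inside the RIT of 3/1 and is excluded
```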
Background: Staphylococcus aureus has long been an important cause of healthcare-associated infections (HAIs) and remains the second most common HAI pathogen in the United States. Often resistant to several antibiotics, S. aureus infections are difficult to treat and can leave patients at risk for serious complications such as pneumonia and sepsis. HAI pathogens and their antimicrobial susceptibility testing (AST) results have been reported to NHSN since its inception in 2005. Previous NHSN surveillance reports have presented national annual benchmarks for antimicrobial resistance phenotypes, such as methicillin-resistant S. aureus (MRSA). Whether there have been any significant changes over time in the prevalence of methicillin resistance among S. aureus infections reported to NHSN has not been previously assessed. Methods: S. aureus AST data from central-line–associated bloodstream infections, catheter-associated urinary tract infections, and inpatient surgical site infections reported from acute-care hospitals between 2009 and 2018 were analyzed. S. aureus was defined as MRSA if it was reported as resistant to oxacillin, cefoxitin, or methicillin. A national percentage resistant (%R) was calculated for each year as the number of resistant pathogens divided by the number of pathogens tested for susceptibility, multiplied by 100. A generalized linear mixed model with a logistic link function was created to evaluate annual changes in the percentage resistant. Several patient-level and hospital-level characteristics were assessed as potential covariates. To account for differential baseline %R values between individual hospitals, random intercepts and slopes were specified during model creation. Differences in the trend of %R between HAI types were assessed using interaction terms. Data were analyzed using SAS v 9.3 software, and P < .05 was considered significant. Results: Overall, 3,317 hospitals reported at least 1 S. aureus pathogen tested for susceptibility between 2009 and 2018. The national unadjusted %R decreased from 49.2% (2009) to 41.2% (2018), with similar decreases seen in each HAI type (Table 1). After adjusting for significant covariates, a statistically significant annual 3% decrease in the prevalence of resistance was observed (Fig. 1). No significant differences between HAI types were observed. Conclusions: The percentage of healthcare-associated S. aureus resistant to oxacillin, cefoxitin, or methicillin has declined consistently over the past 10 years. Continued efforts in infection prevention and antimicrobial stewardship are vital to sustaining this decline.
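A minimal Python sketch of the %R calculation and trend estimate follows. The abstract describes a logistic mixed model with hospital-level random intercepts and slopes; for brevity, this sketch fits a plain binomial GLM of resistance on year, so it illustrates the arithmetic rather than the full model. All counts are synthetic.

```python
# Minimal sketch: national %R by year and a simplified trend model.
# A plain binomial GLM stands in for the mixed model used in the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
years = np.arange(2009, 2019)
tested = rng.integers(8_000, 12_000, len(years))
true_p = 1 / (1 + np.exp(-(0.0 - 0.03 * (years - 2009))))  # declining resistance
resistant = rng.binomial(tested, true_p)

pct_r = 100 * resistant / tested              # national unadjusted %R by year
print(dict(zip(years, np.round(pct_r, 1))))

# Grouped-binomial GLM: endog is (successes, failures) per year.
X = sm.add_constant(years - 2009)
fit = sm.GLM(np.column_stack([resistant, tested - resistant]), X,
             family=sm.families.Binomial()).fit()
print(f"annual change in odds of resistance: "
      f"{100 * (np.exp(fit.params[1]) - 1):+.1f}%")
```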
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluations of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
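For a single measurement, calibration against a curve of this kind reduces to a simple posterior over calendar age: a normal density of the measured 14C age around the curve, with measurement and curve uncertainties added in quadrature. A minimal Python sketch follows; the inline "curve" is a toy stand-in for the real IntCal20 file.

```python
# Minimal sketch of single-sample calibration: the posterior over
# calendar age theta is proportional to a normal density of the measured
# 14C age around the curve mean, with the measurement and curve
# uncertainties added in quadrature. The tiny inline curve is a toy;
# use the real intcal20.14c in practice.
import numpy as np

cal_bp = np.arange(3000, 3500)                      # candidate calendar ages
curve_mu = 2900 + 0.35 * (cal_bp - 3000)            # toy curve: 14C age BP
curve_sig = np.full_like(cal_bp, 15.0, dtype=float)

r, sigma = 3010.0, 25.0                             # measured 14C age +/- error
var = sigma**2 + curve_sig**2
post = np.exp(-0.5 * (r - curve_mu) ** 2 / var) / np.sqrt(var)
post /= post.sum()                                  # normalize over the grid

mean_age = np.sum(cal_bp * post)
print(f"calibrated mean: {mean_age:.0f} cal BP")
```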
Surgical site infections (SSIs) are among the most common healthcare-associated infections in low- and middle-income countries. To encourage the establishment of actionable and standardized SSI surveillance in these countries, we propose simplified surveillance case definitions. Here, we use NHSN reports to explore the concordance of these simplified definitions with NHSN definitions as the ‘reference standard.’
Prevention of Clostridioides difficile infection (CDI) is a national priority and may be facilitated by deployment of the Targeted Assessment for Prevention (TAP) Strategy, a quality improvement framework providing a focused approach to infection prevention. This article describes the process and outcomes of TAP Strategy implementation for CDI prevention in a healthcare system.
Hospital A was identified based on CDI surveillance data indicating an excess burden of infections above the national goal; hospitals B and C participated as part of systemwide deployment. TAP facility assessments were administered to staff to identify infection control gaps and inform CDI prevention interventions. Retrospective analysis was performed using negative-binomial, interrupted time series (ITS) regression to assess overall effect of targeted CDI prevention efforts. Analysis included hospital-onset, laboratory-identified C. difficile event data for 18 months before and after implementation of the TAP facility assessments.
The systemwide monthly CDI rate significantly decreased at the intervention (β2, −44%; P = .017), and the postintervention CDI rate trend showed a sustained decrease (β1 + β3; −12% per month; P = .008). At an individual hospital level, the CDI rate trend significantly decreased in the postintervention period at hospital A only (β1 + β3, −26% per month; P = .003).
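The percentages quoted above are the usual transformation of log-scale ITS coefficients, percent change = (exp(β) − 1) × 100. A short Python sketch makes the conversion explicit; the coefficient values are back-solved from the reported percentages for illustration only.

```python
# Minimal sketch: converting log-scale ITS coefficients to the quoted
# percentage changes via percent change = (exp(beta) - 1) * 100.
# Coefficients below are back-solved from the reported percentages.
import numpy as np

beta2 = np.log(1 - 0.44)              # level change at intervention -> -44%
beta1_plus_beta3 = np.log(1 - 0.12)   # post-period slope -> -12% per month

for label, b in [("level change at intervention", beta2),
                 ("post-intervention monthly trend", beta1_plus_beta3)]:
    print(f"{label}: {100 * (np.exp(b) - 1):+.0f}%")
```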
This project demonstrates TAP Strategy implementation in a healthcare system, yielding significant decrease in the laboratory-identified C. difficile rate trend in the postintervention period at the system level and in hospital A. This project highlights the potential benefit of directing prevention efforts to facilities with the highest burden of excess infections to more efficiently reduce CDI rates.
To describe pathogen distribution and rates for central-line–associated bloodstream infections (CLABSIs) from different acute-care locations during 2011–2017 to inform prevention efforts.
CLABSI data from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) were analyzed. Percentages and pooled mean incidence density rates were calculated for a variety of pathogens and stratified by acute-care location groups (adult intensive care units [ICUs], pediatric ICUs [PICUs], adult wards, pediatric wards, and oncology wards).
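As a small illustration, a pooled mean incidence density rate in the NHSN sense is total events over total central-line days, expressed per 1,000 line days; the sketch below uses illustrative numbers.

```python
# Minimal sketch of a pooled mean incidence density rate: total events
# divided by total central-line days, per 1,000 line days, pooled across
# the locations in a stratum. All numbers are illustrative.
def pooled_rate(events: list[int], line_days: list[int]) -> float:
    return 1000 * sum(events) / sum(line_days)

icu_events = [12, 7, 21]              # CLABSIs per reporting ICU
icu_line_days = [9_500, 6_200, 14_800]
print(f"{pooled_rate(icu_events, icu_line_days):.2f} per 1,000 line days")
```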
From 2011 to 2017, 136,264 CLABSIs were reported to the NHSN by adult and pediatric acute-care locations; adult ICUs and wards reported the most CLABSIs: 59,461 (44%) and 40,763 (30%), respectively. In 2017, the most common pathogens were Candida spp/yeast in adult ICUs (27%) and Enterobacteriaceae in adult wards, pediatric wards, oncology wards, and PICUs (23%–31%). Most pathogen-specific CLABSI rates decreased over time; exceptions were Candida spp/yeast rates in adult ICUs and Enterobacteriaceae rates in oncology wards, which increased, and Staphylococcus aureus rates in pediatric locations, which did not change.
The pathogens associated with CLABSIs differ across acute-care location groups. Pathogen-targeted prevention efforts, such as those aimed at preventing Candida spp/yeast and Enterobacteriaceae CLABSIs, could augment current prevention strategies and might further reduce national rates.
Sulfur-bearing monazite-(Ce) occurs in silicified carbonatite at Eureka, Namibia, forming rims up to ~0.5 mm thick on earlier-formed monazite-(Ce) megacrysts. We present X-ray photoelectron spectroscopy data demonstrating that sulfur is accommodated predominantly in monazite-(Ce) as sulfate, via a clino-anhydrite-type coupled substitution mechanism. Minor sulfide and sulfite peaks in the X-ray photoelectron spectra, however, also indicate that more complex substitution mechanisms incorporating S2– and S4+ are possible. Incorporation of S6+ through clino-anhydrite-type substitution results in an excess of M2+ cations, which previous workers have suggested is accommodated by auxiliary substitution of OH– for O2–. However, Raman data show no indication of OH–, and instead we suggest charge imbalance is accommodated through F– substituting for O2–. The accommodation of S in the monazite-(Ce) results in considerable structural distortion that may account for relatively high contents of ions with radii beyond those normally found in monazite-(Ce), such as the heavy rare earth elements, Mo, Zr and V. In contrast to S-bearing monazite-(Ce) in other carbonatites, S-bearing monazite-(Ce) at Eureka formed via a dissolution–precipitation mechanism during prolonged weathering, with S derived from an aeolian source. While large S-bearing monazite-(Ce) grains are likely to be rare in the geological record, secondary S-bearing monazite-(Ce) formed under these conditions may be a feasible mineral for dating palaeo-weathering horizons.