We analyzed 2017 healthcare facility-onset (HO) vancomycin-resistant Enterococcus (VRE) bacteremia data to identify hospital-level factors that were significant predictors of HO-VRE using the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) multidrug-resistant organism and Clostridioides difficile reporting module. A risk-adjusted model that can be used to calculate the number of predicted HO-VRE bacteremia events in a facility was developed, thus enabling the calculation of VRE standardized infection ratios (SIRs).
Acute-care hospitals reporting at least 1 month of 2017 VRE bacteremia data were included in the analysis. Various hospital-level characteristics were assessed to develop a best-fit model and subsequently derive the 2018 national and state SIRs.
In 2017, 470 facilities in 35 states participated in VRE bacteremia surveillance. Inpatient VRE community-onset prevalence rate, average length of patient stay, outpatient VRE community-onset prevalence rate, and presence of an oncology unit were all significantly associated (all 95% likelihood ratio confidence limits excluded the nominal value of zero) with HO-VRE bacteremia. The 2018 national SIR was 1.01 (95% CI, 0.93–1.09) with 577 HO bacteremia events reported.
The creation of an SIR enables national-, state-, and facility-level monitoring of VRE bacteremia while controlling for individual hospital-level factors. Hospitals can compare their VRE burden to a national benchmark to help them determine the effectiveness of infection prevention efforts over time.
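The SIR described above is simply the ratio of observed to model-predicted events. A minimal sketch in Python; the predicted count and the Poisson-based confidence interval shortcut below are illustrative assumptions, not NHSN's exact method:

```python
import math

def sir(observed: int, predicted: float) -> float:
    """Standardized infection ratio: observed events divided by the
    number of events predicted by the risk-adjusted model."""
    return observed / predicted

def sir_ci_95(observed: int, predicted: float):
    """Rough 95% CI treating the observed count as Poisson, via a
    normal approximation on the log scale. An illustrative shortcut,
    not the exact interval method NHSN uses."""
    if observed == 0:
        return (0.0, 3.0 / predicted)  # rule-of-three style upper bound
    point = observed / predicted
    half_width = 1.96 / math.sqrt(observed)
    return (point * math.exp(-half_width), point * math.exp(half_width))

# Illustrative: 577 observed HO-VRE events against ~571 predicted
# events reproduces an SIR of about 1.01, as in the abstract.
print(round(sir(577, 571.3), 2))
```

An SIR near 1 indicates a facility experienced about as many events as predicted; values below 1 indicate fewer than predicted.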
Vesicular monoamine transporter 2 (VMAT2) inhibitors including valbenazine are first-line therapies for tardive dyskinesia (TD), a persistent movement disorder associated with antipsychotic exposure. This real-world study was performed to assess the association between patient awareness of TD symptoms and clinician-assessed symptom severity.
Clinicians who treated antipsychotic-induced TD with a VMAT2 inhibitor within the past 24 months were asked to extract demographic/clinical data from patients’ charts and complete a survey for additional data, including patient awareness of TD (yes/no) and TD symptom severity (mild/moderate/severe).
Data for 601 patients were provided by 163 clinicians (113 psychiatrists; 46 neurologists; 4 primary care physicians). Patient demographics: 50% male; mean age 50.6 years; 55% schizophrenia/schizoaffective disorder; 29% bipolar disorder; 16% other psychiatric diagnoses. Positive relationships were seen between patient awareness and clinician-assessed symptom severity. Awareness was highest in patients with severe symptoms in specific body regions: face (88% vs 78%/69% [awareness by severe vs moderate/mild symptoms]); jaw (90% vs 80%/67%); wrists (90% vs 69%/63%). In other regions, awareness was similar in patients with severe or moderate symptoms: lips (85%/86% vs 68% [severe/moderate vs mild]); tongue (81%/80% vs 73%); neck (80%/78% vs 68%); arms (67%/66% vs 62%); knees (67%/67% vs 53%).
In patients prescribed a VMAT2 inhibitor for TD, patient awareness was generally higher in those determined to have moderate-to-severe symptom severity as assessed by the clinician. More research is needed to understand how awareness and severity contribute to TD burden, and whether different treatment strategies are needed based on these factors.
Depression and overweight are each associated with abnormal immune system activation. We sought to disentangle the extent to which depressive symptoms and overweight status contributed to increased inflammation and abnormal cortisol levels.
Participants were recruited through the Wellcome Trust NIMA Consortium. The sample of 216 participants consisted of 69 overweight patients with depression; 35 overweight controls; 55 normal-weight patients with depression and 57 normal-weight controls. Peripheral inflammation was measured as high-sensitivity C-Reactive Protein (hsCRP) in serum. Salivary cortisol was collected at multiple points throughout the day to measure cortisol awakening response and diurnal cortisol levels.
Overweight patients with depression had significantly higher hsCRP compared with overweight controls (p = 0.042), normal-weight depressed patients (p < 0.001) and normal-weight controls (p < 0.001), after controlling for age and gender. Multivariable logistic regression showed that comorbid depression and overweight significantly increased the risk of clinically elevated hsCRP levels ⩾3 mg/L (OR 2.44, 95% CI 1.28–3.94). In a separate multivariable logistic regression model, overweight status contributed most to the risk of having hsCRP levels ⩾3 mg/L (OR 1.52, 95% CI 0.7–2.41), while depression also contributed a significant risk (OR 1.09, 95% CI 0.27–2). There were no significant differences between groups in cortisol awakening response and diurnal cortisol levels.
Comorbid depression and overweight status are associated with increased hsCRP, and the coexistence of these conditions amplified the risk of clinically elevated hsCRP levels. Overweight status contributed most to the risk of clinically elevated hsCRP levels, but depression also contributed to a significant risk. We observed no differences in cortisol levels between groups.
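The odds ratios reported above come from logistic regression, where a fitted coefficient β on the log-odds scale maps to an odds ratio via OR = exp(β). A minimal illustration; the coefficient value is back-calculated purely for demonstration and is not taken from the study:

```python
import math

def odds_ratio(beta: float) -> float:
    """A logistic-regression coefficient (log-odds scale) converts
    to an odds ratio by exponentiation: OR = exp(beta)."""
    return math.exp(beta)

# A coefficient of about 0.892 corresponds to an OR near 2.44, the
# magnitude reported for comorbid depression and overweight (this
# coefficient is back-calculated here purely for illustration).
print(round(odds_ratio(0.892), 2))
```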
Data reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (CDC NHSN) were analyzed to understand the potential impact of the COVID-19 pandemic on central-line–associated bloodstream infections (CLABSIs) in acute-care hospitals. Descriptive analysis of the standardized infection ratio (SIR) was conducted by location, location type, geographic area, and bed size.
The rapid spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) throughout key regions of the United States in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital coronavirus disease 2019 (COVID-19) data. We present time-series estimates of the critical hospital capacity indicators from April 1 to July 14, 2020.
From March 27 to July 14, 2020, the NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and the availability and/or use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near–real-time daily national and state estimates to be computed.
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased from April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest regions after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
The NHSN hospital capacity estimates served as important, near–real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after they declined from a peak in April. Patient outcomes appeared to improve from early April to mid-July.
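The survey-weighting step in the estimation described above can be illustrated with a toy Horvitz-Thompson-style weighted total. The hospital counts and weights below are hypothetical, and the real NHSN estimation also involves multiple imputation for missing reporting days:

```python
def weighted_total(counts, weights):
    """Horvitz-Thompson-style estimator: scale each reporting
    hospital's count by its survey weight (roughly, how many
    hospitals it represents) and sum to a national total."""
    return sum(c * w for c, w in zip(counts, weights))

# Hypothetical: three reporting hospitals' COVID-19 inpatient
# counts and their survey weights.
print(weighted_total([120, 45, 80], [2.0, 1.5, 3.0]))  # → 547.5
```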
This chapter addresses the challenges arising from the often unclear legal status of intra-state peace agreements in establishing their binding force, applicable law and relevant principles of interpretation, and considers how drafting techniques affect the implementation of such agreements. First, it maps out the means available to participants in a peace process to confer binding status on (the substance of) an agreement under either domestic or international law, including UN Security Council endorsement, domestic entrenchment or constitutional reform. Second, the chapter examines how (predominantly international) courts and tribunals have grappled with the task of determining the law applicable to peace agreements, as well as examining trends in the principles of interpretation applied by adjudicatory bodies. Finally, it turns to the effects of drafting techniques on the implementation of agreements from the perspectives of the constitutive and instrumental approaches, including weighing the merits of constructive ambiguity.
All patients and staff on the outbreak ward (case cluster), and randomly selected patients and staff on COVID-19 wards (positive control cluster) and non-COVID-19 wards (negative control cluster), underwent reverse-transcriptase polymerase chain reaction (RT-PCR) testing. Hand hygiene and personal protective equipment (PPE) compliance, detection of environmental SARS-CoV-2 RNA, patient behavior, and SARS-CoV-2 IgG antibody prevalence were assessed.
In total, 145 staff and 26 patients were exposed, resulting in 24 secondary cases. Of the secondary cases, 4 of 14 (29%) staff and 7 of 10 (70%) patients were asymptomatic or presymptomatic. There was no difference in mean cycle threshold between asymptomatic or presymptomatic versus symptomatic individuals. None of 32 randomly selected staff from the control wards tested positive. Environmental RNA detection levels were higher on the COVID-19 ward than on the negative control ward (OR, 19.98; 95% CI, 2.63–906.38; P < .001). RNA levels on the COVID-19 ward (where there were no outbreaks) and the outbreak ward were similar (OR, 2.38; P = .18). Mean monthly hand hygiene compliance, based on 20,146 observations (over the preceding year), was lower on the outbreak ward (P < .006). Compared to both control wards, the proportion of staff with detectable antibodies was higher on the outbreak ward (OR, 3.78; 95% CI, 1.01–14.25; P = .008).
Staff seroconversion was more likely during a short-term outbreak than from sustained duty on a COVID-19 ward. Environmental contamination and PPE use were similar on the outbreak and control wards. Patient noncompliance, decreased hand hygiene, and asymptomatic or presymptomatic transmission were more frequent on the outbreak ward.
Background: The CDC NHSN launched the Antimicrobial Use Option in 2011. The Antimicrobial Use Option allows users to implement risk-adjusted antimicrobial use benchmarking within and between facilities using the standardized antimicrobial administration ratio (SAAR) and to evaluate use over time. The SAAR can be used for public health surveillance and to guide an organization’s stewardship or quality improvement efforts. Methods: Antimicrobial Use Option enrollment grew through partner engagement, targeted education, and development of data benchmarking. We analyzed enrollment over time and discuss key drivers of participation. Results: Initial 2011 Antimicrobial Use Option enrollment efforts awarded grant funding to 4 health departments. These health departments partnered with hospitals, which encouraged vendors to build infrastructure for electronic antimicrobial use reporting. CDC supported vendors through outreach and education. In 2012, with CDC support, the Veterans’ Affairs (VA) Informatics, Decision-Enhancement, and Analytic Sciences Center and partners began implementation of Antimicrobial Use Option reporting and validation of submitted data. These early efforts led to enrollment of 64 facilities by 2014 (Fig. 1). As awareness of the Antimicrobial Use Option grew, we focused on facility engagement and development of benchmark metrics. A second round of grant funding in 2015 supported submission to the Antimicrobial Use Option from additional facilities by funding a vendor, a healthcare system, and an antimicrobial stewardship network. In 2015, CMS recognized the Antimicrobial Use Option as a choice for public health registry reporting under Meaningful Use Stage 3, resulting in an increase in participating hospitals. Antimicrobial Use Option enrollment increased in 2015 (n = 120), coinciding with national prioritization of antimicrobial stewardship. In 2016, the SAAR was released in NHSN.
We leveraged the SAAR to encourage participation from additional facilities and began quarterly calls to encourage continued participation from existing users. In 2016, the Department of Defense began submitting data to the Antimicrobial Use Option, resulting in 207 facilities enrolled in 2016, which grew to 616 in 2017. As of November 2019, 12 vendors self-report submission capabilities and 1,470 facilities, of ~6,800 active NHSN participants, are enrolled in the Antimicrobial Use Option. Two states have passed requirements regulating Antimicrobial Use Option reporting, with Tennessee’s requirement going into effect in 2021. Conclusions: The Antimicrobial Use Option offers evidence that collaboration with partners and leveraging of benchmarking metrics available in a national surveillance system can lead to increased voluntary participation in surveillance of high-priority public health data. Moving forward, we will continue expanding analytic capabilities and partner engagement.
Background: Clostridioides difficile infection (CDI) is one of the most common laboratory-identified (LabID) healthcare-associated events reported to the National Healthcare Safety Network (NHSN). CDI prevention remains a national priority, and efforts to reduce infection burden and improve antibiotic stewardship continue to expand across the healthcare spectrum. Beginning in 2013, the Centers for Medicare and Medicaid Services (CMS) required acute-care hospitals participating in CMS’ Inpatient Quality Reporting program to report CDI LabID data to NHSN and, in 2015, extended this reporting requirement to emergency departments (ED) and 24-hour observation units. To assess national progress, we evaluated changes in hospital-onset CDI (HO-CDI) incidence during 2010–2018. Methods: Cases of HO-CDI were reported to NHSN by hospitals using the NHSN’s LabID criteria. Generalized linear mixed-effects modeling was used to assess trends of HO-CDI by treating the hospital as a random intercept to account for the correlation of the repeated responses over time. The data were summarized at the quarterly level, the main effect was time, and the covariates of interest were the following: CDI test type, inpatient community-onset (CO) infection rate, hospital type, average length of stay, medical school affiliation, number of beds, number of ICU beds, number of infection control professionals, presence of an ED or observation unit, and an indicator for 2015 to account for CDI protocol changes that required hospitals to conduct surveillance in both inpatient and ED or observation unit settings. Results: During 2010–2013, the number of hospitals reporting CDI increased and then stabilized after 2013 (Table 1). Crude HO-CDI rates decreased over time, except for an increase in 2015 followed by a steeper reduction thereafter (Table 2). During 2010–2014, the adjusted quarterly rate of change was −0.45% (95% CI, −0.57% to −0.33%; P < .0001).
The rate of reduction in 2010–2014 was smaller than that of 2015–2018 (−2.82%; 95% CI, −3.10% to −2.54%; P < .0001). Compared to 2014, the adjusted rate in 2015 increased by 79.14% (95% CI, 72.42%–86.11%; P < .0001). Conclusions: The number of hospitals reporting CDI LabID data grew substantially in 2013 as a result of the CMS reporting requirement. Adjusted HO-CDI rates decreased over time, with an increase in 2015 and a rapid decrease thereafter. The increase in 2015 may be explained by changes in the NHSN CDI surveillance protocol and better test-type classification in later years. Overall decreases in HO-CDI rates may be influenced by prevention strategies.
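The adjusted quarterly rates of change reported above can be compounded to annual equivalents. A small sketch of that arithmetic (simple compounding over 4 quarters, not the mixed-effects model itself):

```python
def compound(rate_per_period: float, periods: int) -> float:
    """Compound a per-period fractional change over several periods."""
    return (1 + rate_per_period) ** periods - 1

# Compounding the reported adjusted quarterly changes over 4 quarters:
print(round(compound(-0.0045, 4) * 100, 2))  # 2010-2014: about -1.79% per year
print(round(compound(-0.0282, 4) * 100, 2))  # 2015-2018: about -10.81% per year
```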
Background: The Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) has included surveillance of laboratory-identified (LabID) methicillin-resistant Staphylococcus aureus (MRSA) bacteremia events since 2009. In 2013, the Centers for Medicare & Medicaid Services (CMS) began requiring acute-care hospitals (ACHs) that participate in the CMS Inpatient Quality Reporting program to report MRSA LabID events to the NHSN and, in 2015, ACHs were required to report MRSA LabID events from emergency departments (EDs) and/or 24-hour observation locations. Prior studies observed a decline in hospital-onset MRSA (HO-MRSA) rates over shorter periods or in other surveillance systems. In this analysis, we review the national reporting trend for HO-MRSA bacteremia LabID events, 2010–2018. Methods: This analysis was limited to MRSA bacteremia LabID event data reported by ACHs that follow NHSN surveillance protocols. The data were restricted to events reported for overall inpatient facility-wide and, if applicable, EDs and 24-hour observation locations. MRSA events were classified as HO (collected >3 days after admission) or inpatient or outpatient community onset (CO, collected ≤3 days after admission). An interrupted time series random-effects generalized linear model was used to examine the relationship between HO-MRSA incidence rates (per 1,000 patient days) and time (year) while controlling for potential risk factors as fixed effects. The following potential risk factors were evaluated: facility’s annual survey data (facility type, medical affiliation, length of facility stay, number of beds, and number of intensive care unit beds) and quarterly summary data (inpatient and outpatient CO prevalence rates). Results: The number of reporting ACHs increased during this period, from 473 in 2010 to 3,651 in 2018.
The crude HO-MRSA incidence rates (per 1,000 patient days) have declined over time, from a high of 0.067 in 2011 to 0.052 in 2018 (Table 1). Compared to 2014, the adjusted annual incidence rate increased in 2015 by 16.38% (95% confidence interval [CI], 10.26%–22.84%; P < .0001). After controlling for all significant risk factors, the estimated annual HO-MRSA incidence rates declined by 5.98% (95% CI, 5.17%–6.78%; P < .0001) (Table 2). Conclusions: HO-MRSA bacteremia incidence rates have decreased over the past 9 years, despite a slight increase in 2015. This national trend analysis reviewed a longer period while analyzing potential risk factors. The decline in HO-MRSA incidence rates has been gradual; however, given the current trend, the Healthy People 2020 objectives are not likely to be met. This analysis suggests the need for hospitals to continue and/or enhance HO-MRSA infection prevention efforts to reduce rates further.
Background: The NHSN is the nation’s largest surveillance system for healthcare-associated infections. Since 2011, acute-care hospitals (ACHs) have been required to report intensive care unit (ICU) central-line–associated bloodstream infections (CLABSIs) to the NHSN pursuant to CMS requirements. In 2015, this requirement was extended to general medical, surgical, and medical-surgical wards. Also in 2015, the NHSN implemented a repeat infection timeframe (RIT) that required repeat CLABSIs, in the same patient and admission, to be excluded if onset was within 14 days. This analysis is the first at the national level to describe repeat CLABSIs. Methods: Index CLABSIs reported in ACH ICUs and select wards during 2015–2018 were included, in addition to repeat CLABSIs occurring at any location during the same period. CLABSIs were stratified into 2 groups: single and repeat CLABSIs. The repeat CLABSI group included the index CLABSI and subsequent CLABSI(s) reported for the same patient. Up to 5 CLABSIs were included for a single patient. Pathogen analyses were limited to the first pathogen reported for each CLABSI, which is considered to be the most important cause of the event. Likelihood ratio χ2 tests were used to determine differences in proportions. Results: Of the 70,214 CLABSIs reported, 5,983 (8.5%) were repeat CLABSIs. Of 3,264 nonindex CLABSIs, 425 (13%) were identified in non-ICU or non-select ward locations. Staphylococcus aureus was the most common pathogen in both the single and repeat CLABSI groups (14.2% and 12%, respectively) (Fig. 1). Compared to all other pathogens, CLABSIs reported with Candida spp were less likely in a repeat CLABSI event than in a single CLABSI event (P < .0001). Insertion-related organisms were more likely to be associated with single CLABSIs than repeat CLABSIs (P < .0001) (Fig. 2). Alternatively, Enterococcus spp or Klebsiella pneumoniae and K. oxytoca were more likely to be associated with repeat CLABSIs than single CLABSIs (P < .0001).
Conclusions: This analysis highlights differences in the aggregate pathogen distributions comparing single versus repeat CLABSIs. Assessing the pathogens associated with repeat CLABSIs may offer another way to assess the success of CLABSI prevention efforts (eg, clean insertion practices). Pathogens such as Enterococcus spp and Klebsiella spp demonstrate a greater association with repeat CLABSIs. Thus, instituting prevention efforts focused on these organisms may warrant greater attention and could impact the likelihood of repeat CLABSIs. Additional analysis of patient-specific pathogens identified in the repeat CLABSI group may yield further clarification.
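The 14-day repeat infection timeframe (RIT) described in the Background can be sketched as a simple date-based exclusion rule; the patient event dates below are hypothetical:

```python
from datetime import date

def counted_events(event_dates, rit_days=14):
    """Apply a repeat-infection-timeframe rule to one patient's
    CLABSI events: an event within `rit_days` of the previously
    counted event is excluded; later events count as repeats."""
    counted = []
    for d in sorted(event_dates):
        if not counted or (d - counted[-1]).days > rit_days:
            counted.append(d)
    return counted

# Hypothetical patient: day 0 (index event), day 10 (excluded by
# the 14-day RIT), day 30 (counted as a repeat CLABSI)
events = [date(2017, 1, 1), date(2017, 1, 11), date(2017, 1, 31)]
print(len(counted_events(events)))  # → 2
```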