A 1901 report by the Smithsonian Custodian of Paleozoic Plants noted that the nonbiomineralized taxa Buthotrephis divaricata White, 1901, B. newlini White, 1901, and B. lesquereuxi Grote and Pitt, 1876, from the upper Silurian of the Great Lakes area, shared key characteristics with the extant green macroalga Codium. A detailed reexamination of these Codium-like taxa and similar forms from the lower Silurian of Ontario, New York, and Michigan, including newly collected material of Thalassocystis striata Taggart and Parker, 1976, aided by scanning electron microscopy and stable carbon isotope analysis, provides new data in support of an algal affinity. Crucially, as with Codium, the originally cylindrical axes of all of these taxa consist of a complex internal array of tubes divided into distinct medullary and cortical regions, the medullary tubes being arranged in a manner similar to those of living Pseudocodium. In view of these findings, the three study taxa originally assigned to Buthotrephis, together with Chondrites verus Ruedemann, 1925, are transferred to the new algal taxon Inocladus new genus, thereby establishing Inocladus lesquereuxi new combination, Inocladus newlini new comb., Inocladus divaricata new comb., and Inocladus verus new comb. Morphological and paleoecological data point to a phylogenetic affinity for Inocladus n. gen. and Thalassocystis within the Codium-bearing green algal order Bryopsidales, but perhaps nested within an extinct lineage. Collectively, this material fits within a large-scale pattern of major macroalgal morphological diversification initiated in concert with the Great Ordovician Biodiversification Event and apparently driven by a marked escalation in grazing pressure.
Seeking to reach the unbanked, the US Postal Savings System provided a federally insured savings alternative to traditional banks. Using novel data sets on postal deposits, demographic characteristics, and banks, we study how and by whom the system was used. We find the program was initially used by nonfarming immigrant populations for short-term saving, then as a safe haven during the Great Depression, and finally as a long-term investment vehicle for the wealthy during the 1940s. Postal Savings was only a partial substitute for traditional banks: locations with banks often still made heavy use of Postal Savings.
Antibiotic prescribing practices across the Veterans’ Health Administration (VA) shifted significantly during the coronavirus disease 2019 (COVID-19) pandemic. During January through May of 2015–2019, antibiotic use decreased from 638 to 602 days of therapy (DOT) per 1,000 days present (DP), whereas in the corresponding months of 2020, antibiotic utilization rose to 628 DOT per 1,000 DP.
Background: Clostridioides difficile infection (CDI) is a major contributor to morbidity and mortality in patients with hematologic malignancy. Due to both immunosuppression and frequent antibiotic exposures, up to one-third of inpatients receiving chemotherapy or stem-cell transplant develop CDI. Transmission of C. difficile in healthcare facilities occurs due to environmental surface contamination and hand carriage by healthcare workers from colonized and infected patients. We investigated the effectiveness of enhanced room cleaning in collaboration with environmental services (EVS) staff to prevent CDI transmission and infection.
Methods: From April 1, 2018, to September 30, 2018, a multimodal enhanced cleaning intervention was implemented on 2 oncology units at the Hospital of the University of Pennsylvania. This intervention included real-time feedback to EVS staff following ATP bioluminescence monitoring. Additionally, all rooms on the intervention units underwent UV disinfection after terminal cleaning. We performed a system-level cohort study, comparing rates of CDI on the 2 study units to historical controls and 2 concurrent control units. Historical and concurrent control units received UV disinfection only for rooms whose prior occupants had MRSA or CDI. All units during the intervention period received education on the importance of environmental cleaning for infection prevention. Mixed-effects Poisson regression was used to adjust for system-level confounders. Results: A median of 1.34 CDI cases per 1,000 patient days (IQR, 1.20–3.62) occurred during the 12-month baseline period. There was a trend toward a reduced rate of CDI across all units during the intervention period (median, 1.19; IQR, 0.00–2.47; P = .13) compared with all units during the historical period. Using mixed-effects Poisson regression, accounting for the random effects of study units, the intervention was associated with an incidence rate ratio for C. difficile of 0.72 compared to control units (95% CI, 0.53–0.97; P = .03). Average room turnaround time (TAT) increased across all units during the study period, from 78 minutes (IQR, 74–81) to 92 minutes (IQR, 85–96; P < .001). Within the intervention period, TAT was higher on intervention units (median, 94 minutes; IQR, 92–98) than on concurrent control units (median, 85; IQR, 80–92; P = .005). Conclusions: Enhanced environmental cleaning, including UV disinfection of all patient rooms and ATP bioluminescence monitoring with real-time feedback, was associated with a reduction in the incidence of CDI.
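As a rough sketch of the modeling step described above, the following fits a clustered Poisson model to hypothetical unit-level counts and recovers an incidence rate ratio with its 95% CI. All data and variable names are invented; because statsmodels offers no frequentist mixed-effects Poisson model, a GEE Poisson model with exchangeable correlation by unit stands in here for the random-effects structure the study used.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical unit-period counts: CDI cases, patient-days at risk,
# an intervention indicator, and the unit identifier.
df = pd.DataFrame({
    "unit":         ["A", "A", "B", "B", "C", "C", "D", "D"],
    "intervention": [0, 1, 0, 1, 0, 1, 0, 1],
    "cdi_cases":    [9, 5, 11, 8, 7, 6, 10, 9],
    "patient_days": [4100, 4000, 5200, 5100, 3900, 4000, 4800, 4700],
})

# Poisson model with a log(patient-days) offset so coefficients are
# rate ratios; the exchangeable GEE structure absorbs within-unit
# correlation (a common stand-in for unit-level random effects).
model = smf.gee(
    "cdi_cases ~ intervention",
    groups="unit",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
    cov_struct=sm.cov_struct.Exchangeable(),
)
res = model.fit()

irr = np.exp(res.params["intervention"])      # incidence rate ratio
ci = np.exp(res.conf_int().loc["intervention"])
print(f"IRR = {irr:.2f} (95% CI, {ci[0]:.2f}-{ci[1]:.2f})")
```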
Background: Fluoroquinolones (FQs) are among the most commonly prescribed antibiotic classes in the United States. In recent years, their widespread use has come under heightened scrutiny due to potential adverse drug reactions, including mental health side effects, serious blood sugar disturbances, and Food and Drug Administration (FDA) black-box warnings for tendinopathy, aortic aneurysm, and dissection. These warnings prompted the Department of Veterans Affairs Pharmacy Benefits Management Service to perform a nationwide FQ utilization review, which identified our facility for potential overuse of FQs in the outpatient setting: 82.2 prescriptions per 1,000 unique patients compared to an average of 48 prescriptions per 1,000 unique patients across all VHA facilities. We then embarked on an FQ medication use evaluation (MUE). Objective: To determine the appropriateness of FQ prescribing practices in the outpatient setting. Methods: The study setting was a 399-bed tertiary-care Veterans Hospital with >250 affiliated outpatient clinics in Richmond, Virginia. A retrospective chart review was conducted on a convenience sample of consecutive patients prescribed an FQ in each quarter between April 1, 2018, and March 31, 2019. Chart review included patient demographics, location, FQ used, dose, indication, appropriateness, prescriber, and documentation of patient counseling on FDA black-box warnings. Appropriate treatment was defined by national and local antimicrobial therapy guidelines. Results: In total, 265 patients were included in the study. Among them, 233 patients (88%) were men, and the mean age was 68 years. Overall, 127 patients (48%) were prescribed FQs inappropriately. Primary care clinics and the emergency department (ED) had the highest frequency of inappropriate FQ prescriptions (Fig. 1). Moreover, 92 patients (35%) were prescribed FQs for surgical prophylaxis prior to urological procedures. FQs were most commonly inappropriately prescribed for urinary tract infection (UTI; n = 74, 84%) and upper respiratory tract infection (URI; n = 27, 84%) (Fig. 2). Documented counseling on FDA black-box warnings occurred in 82 cases (31%). Conclusions: In our MUE, outpatient prescribing of FQs was inappropriate nearly 50% of the time. The most commonly documented indications for inappropriate FQ use were UTI and URI. Inappropriate prescriptions most commonly originated from primary care and the emergency department. Urology had the highest volume of FQ prescriptions, which were mostly appropriate surgical prophylaxis based on indication (though an alternative agent would be preferred based on local resistance rates). Documentation of patient counseling on FDA black-box warnings for FQs was uncommon.
Background: Updated IDSA-SHEA guidelines recommend different diagnostic approaches to C. difficile depending on whether there are pre-agreed institutional criteria for patient stool submission. If stool submission criteria are in place, nucleic acid amplification testing (NAAT) alone may be used. If not, a multistep algorithm is suggested, incorporating various combinations of toxin enzyme immunoassay (EIA), glutamate dehydrogenase (GDH), and NAAT, with discordant results adjudicated by NAAT. At our institution, we developed a multistep algorithm leading with NAAT, with reflex to EIA for toxin testing if NAAT is positive. This algorithm resulted in a significant proportion of patients with discordant results (NAAT positive and toxin EIA negative) whom some experts have categorized as possible carriers or C. difficile colonized. In this study, we describe the impact of a multistep algorithm on hospital-onset, community-onset, and healthcare-facility–associated C. difficile infection (HO-CDI, CO-CDI, and HCFA-CDI, respectively) rates and the management of possible carriers. Methods: The study setting was a 399-bed, tertiary-care VA Medical Center in Richmond, Virginia. A retrospective chart review was conducted. The multistep C. difficile testing algorithm was implemented June 4, 2019 (Fig. 1). C. difficile testing results and possible carriers were reviewed for the 5 months before and 4 months after implementation (January 2019 to September 2019). Results: In total, 587 NAATs were performed in the inpatient and outpatient settings (mean, 58.7 per month). Overall, 123 NAATs (21%) were positive: 59 in the preintervention period and 63 in the postintervention period. In the postintervention period, 23 positive NAATs (26%) had a positive toxin EIA. Based on LabID events, the mean rate of HO+CO+HCFA CDI cases per 10,000 bed days of care (BDOC) decreased significantly, from 9.49 in the preintervention period to 1.15 in the postintervention period (P = .019) (Fig. 2). Also, 9 of the possible carriers (22%) were treated for CDI based on high clinical suspicion, and 6 of the possible carriers (14%) had a previous history of CDI. Of these, 5 (83%) were treated for CDI. In addition, 1 patient (2%) converted from possible carrier to positive toxin EIA within 14 days. The infectious diseases team was consulted for 11 possible carriers (27%). Conclusions: Implementation of a 2-step C. difficile algorithm leading with NAAT was associated with a lower rate of HO+CO+HCFA CDI per 10,000 BDOC. A considerable proportion (22%) of possible carriers were treated for CDI but did not count as LabID events. Only 2% of the possible carriers in our study converted to a positive toxin EIA.
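A minimal sketch of the reflex logic described above; the function and label names are ours, not the institution's:

```python
from enum import Enum

class StoolResult(Enum):
    TOXIN_POSITIVE_CDI = "NAAT+ / toxin EIA+"
    POSSIBLE_CARRIER = "NAAT+ / toxin EIA- (possible carrier or colonized)"
    NEGATIVE = "NAAT-"

def classify_stool_test(naat_positive: bool,
                        toxin_eia_positive: bool | None = None) -> StoolResult:
    """Two-step algorithm leading with NAAT and reflexing to toxin EIA."""
    if not naat_positive:
        return StoolResult.NEGATIVE           # testing stops; no reflex EIA
    if toxin_eia_positive:
        return StoolResult.TOXIN_POSITIVE_CDI
    return StoolResult.POSSIBLE_CARRIER       # discordant NAAT+/EIA- result

# Example: a positive NAAT with a negative reflex toxin EIA.
print(classify_stool_test(True, False).value)
```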
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NHs and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member-quarter), hospital bed days per member-quarter, and expenditures per member-quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident. Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
Funding: None
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
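The segmented regression in the SHIELD OC study above can be illustrated with a small interrupted time-series sketch. All rates below are invented placeholders; the log-scale coding lets exp(coefficient) − 1 read as a percent change per quarter, matching how the results are reported.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical quarterly hospitalization rates per member-quarter for
# one NH cohort, 2015 Q4 through 2019 Q2 (15 quarters, values invented).
quarters = np.arange(15)
df = pd.DataFrame({
    "quarter": quarters,                                      # baseline trend
    "post": (quarters >= 6).astype(int),                      # level change at 2017 Q2
    "time_after": np.where(quarters >= 6, quarters - 5, 0),   # slope change
    "rate": [0.071, 0.069, 0.073, 0.070, 0.072, 0.071, 0.068,
             0.065, 0.061, 0.059, 0.056, 0.053, 0.050, 0.047, 0.043],
})

# Segmented regression on the log scale: 'quarter' carries the baseline
# trend, 'post' the level change, 'time_after' the intervention slope.
fit = smf.ols("np.log(rate) ~ quarter + post + time_after", data=df).fit()
slope_change = np.exp(fit.params["time_after"]) - 1
print(f"intervention slope change: {slope_change:+.1%} per quarter")
```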
Background: Peritoneal dialysis is a type of dialysis performed by patients in their homes; patients receive training from dialysis clinic staff. Peritonitis is a serious complication of peritoneal dialysis, most commonly caused by gram-positive organisms. During March–April 2019, a dialysis provider organization transitioned ~400 patients to a different manufacturer of peritoneal dialysis equipment and supplies (from product A to product B). Shortly thereafter, patients experienced an increase in peritonitis episodes, caused predominantly by gram-negative organisms. In May 2019, we initiated an investigation to determine the source. Methods: We conducted case finding, reviewed medical records, observed peritoneal dialysis procedures and trainings, and performed patient home visits and interviews. A 1:1 matched case–control study was performed in 1 state. A case had ≥2 of the following: (1) positive peritoneal fluid culture, (2) high peritoneal fluid white cell count with ≥50% polymorphonuclear cells, or (3) cloudy peritoneal fluid and/or abdominal pain. Controls were matched to cases by week of clinic visit. Conditional logistic regression was used to estimate univariate matched odds ratios (mORs) and 95% confidence intervals (CIs). We conducted microbiological testing of peritoneal dialysis fluid bags to rule out product contamination. Results: During March–September 2019, we identified 157 cases of peritonitis across 15 clinics in 2 states (attack rate, ≈39%). Staphylococcus spp (14%), Serratia spp (12%), and Klebsiella spp (6.3%) were the most common pathogens. Steps to perform peritoneal dialysis using product B differed from product A in several key areas; however, no common errors in practice were identified to explain the outbreak. Patient training on transitioning products was not standardized. Outcomes of the 73 cases in the case–control study included hospitalization (77%), peritoneal dialysis failure (40%), and death (7%). The median duration of training prior to product transition was 1 day for both cases and controls (P = .86). Transitioning to product B (mOR, 18.00; 95% CI, 2.40–134.83), using product B (mOR, 18.26; 95% CI, 3.86–∞), drain-line reuse (mOR, 4.67; 95% CI, 1.34–16.24), and performing daytime exchanges (mOR, 3.63; 95% CI, 1.71–8.45) were associated with peritonitis. After several interventions, including transition of patients back to product A (Fig. 1), overall cases declined. Sterility testing of samples from 23 unopened product B peritoneal dialysis solution bags showed no contamination. Conclusions: Multiple factors may have contributed to this large outbreak, including a rapid transition in peritoneal dialysis products and potentially inadequate patient training. Efforts are needed to identify and incorporate best training practices, and product advances are desired to improve the safety of patient transitions between different types of peritoneal dialysis equipment.
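To make the matched analysis concrete, here is a minimal sketch of conditional logistic regression on hypothetical 1:1 matched data; all values are invented, and statsmodels' ConditionalLogit stands in for whatever software was actually used.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical 1:1 matched pairs: each stratum holds one case (case=1)
# and its matched control; 'exposed' could stand for drain-line reuse.
df = pd.DataFrame({
    "stratum": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8],
    "case":    [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    "exposed": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1],
})

# Conditional logistic regression conditions out the per-stratum
# nuisance intercepts, leaving the matched odds ratio (mOR).
model = ConditionalLogit(df["case"], df[["exposed"]], groups=df["stratum"])
res = model.fit()

mor = np.exp(res.params["exposed"])
ci = np.exp(res.conf_int().loc["exposed"])
print(f"mOR = {mor:.2f} (95% CI, {ci[0]:.2f}-{ci[1]:.2f})")
```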
Background: Assessing antimicrobial use (AU) appropriateness is a cornerstone of antimicrobial stewardship, largely accomplished through time-intensive manual chart review of specific agents or diagnoses. Prior efforts have focused on assessing the appropriateness of an entire treatment course. We developed an electronic measure to assess the appropriateness of each day of inpatient AU by leveraging electronic health record data. Methods: We extracted contextual data, including risk factors for resistant organisms, allergies, constitutional signs and symptoms from diagnostic and procedural codes, and microbiological findings, from the electronic health records of patients in Veterans’ Health Administration inpatient wards reporting data to the National Healthcare Safety Network (NHSN) AU option during 2017–2018. Only the antibacterial categories shown in Figure 1 were included. Respiratory, urinary tract, skin and soft-tissue, and other infection categories were defined and applied to each hospital day. Algorithm rules were constructed to evaluate AU based on the clinical context (eg, in the ICU, during empiric therapy, drug–pathogen match, recommended drugs, and duration). Rules were drawn from the available literature, discussed with experts, and then refined empirically. Generally, the rules allowed for use of first-line agents unless risk factors or contraindications were identified. AU was categorized as appropriate, inappropriate, or indeterminate for each day, then aggregated into an overall measure of facility-level AU appropriateness. A validation set of 20 charts was randomly selected for manual review. Results: The facility distributions of appropriate, inappropriate, and indeterminate AU for 4 of the adult 2017 baseline NHSN Standardized Antimicrobial Administration Ratio (SAAR) categories are shown in Figure 1. The median facility-level inappropriateness across all SAAR categories was 37.2% (IQR, 29.4%–52.5%). The median facility-level indeterminate AU across all SAAR categories was 14.4% (IQR, 9.1%–21.2%). Chart review of 20 admissions showed agreement with algorithm appropriateness and inappropriateness determinations in 95.4% of 240 antibacterial days.
Conclusions: We developed a comprehensive, flexible electronic tool to evaluate AU appropriateness for combinations of setting, antibacterial agent, syndrome, or time frame of interest (eg, empiric, definitive, or excess duration). Application of our algorithm to 2 years of VA acute-care data suggests substantial interfacility variability; the highest rates of inappropriateness were for anti-MRSA therapy. Our preliminary chart review demonstrated agreement between electronic and manual review in >95% of antimicrobial days. This approach may be useful for identifying potential stewardship targets, in the development of decision support systems, and in conjunction with other metrics to track AU over time.
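A minimal sketch of the day-level rule idea under stated assumptions: the drug lists, duration limits, and field names below are illustrative placeholders, not the rules actually used in the VA tool.

```python
from dataclasses import dataclass
from typing import Optional

# One antibacterial day with the kind of contextual features the
# algorithm weighs (all names and thresholds here are illustrative).
@dataclass
class AntibacterialDay:
    drug: str
    day_of_therapy: int
    syndrome: str                             # "respiratory", "urinary", ...
    resistant_risk: bool = False              # risk factors for resistant organisms
    pathogen_covered: Optional[bool] = None   # None until cultures finalize

FIRST_LINE = {
    "respiratory": {"ampicillin-sulbactam", "ceftriaxone"},
    "urinary": {"ceftriaxone", "nitrofurantoin"},
}
MAX_DURATION = {"respiratory": 7, "urinary": 7}   # days

def classify_day(day: AntibacterialDay) -> str:
    """Label a single day of therapy, mirroring the day-level design."""
    limit = MAX_DURATION.get(day.syndrome)
    if limit is not None and day.day_of_therapy > limit:
        return "inappropriate"       # excess duration
    if day.pathogen_covered is False:
        return "inappropriate"       # drug-pathogen mismatch
    if day.syndrome not in FIRST_LINE:
        return "indeterminate"       # syndrome could not be classified
    if day.drug in FIRST_LINE[day.syndrome] or day.resistant_risk:
        return "appropriate"         # first-line agent, or escalation justified
    return "indeterminate"

# Example: day 9 of ceftriaxone for a respiratory syndrome -> excess duration.
print(classify_day(AntibacterialDay("ceftriaxone", 9, "respiratory")))
```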
Objective:
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predict poor outcomes.
Design:
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
Methods:
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or an SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
Results:
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
Conclusions:
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
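A minimal sketch of this evaluation pipeline under stated assumptions: the leukocytosis cutoff (WBC ≥15 × 10^3 cells/µL) is our assumption, the data are simulated placeholders, and statsmodels/sklearn stand in for whatever software the authors used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical episode-level data; 'wbc' in 1,000 cells/uL, 'scr' in mg/dL.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "wbc": rng.normal(11, 4, n).clip(2, 40),
    "scr": rng.normal(1.1, 0.5, n).clip(0.4, 6.0),
    "poor_outcome": rng.integers(0, 2, n),
})

# 2018-style criteria: leukocytosis (assumed WBC >= 15) or SCr >= 1.5 mg/dL.
df["severe_2018"] = ((df["wbc"] >= 15) | (df["scr"] >= 1.5)).astype(int)

# Poor outcomes modeled as a function of the criteria, as in the study.
fit = smf.logit("poor_outcome ~ severe_2018", data=df).fit(disp=False)

tn, fp, fn, tp = confusion_matrix(df["poor_outcome"], df["severe_2018"]).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}")
print(f"NPV = {tn / (tn + fn):.2f}")
print(f"AUC = {roc_auc_score(df['poor_outcome'], fit.predict(df)):.2f}")
```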
We tracked the relative integration and differentiation among life history traits over the period spanning AD 1800–1999 in the Britannic and Gallic biocultural groups. We found that Britannic populations tended toward greater strategic differentiation, whereas Gallic populations tended toward greater strategic integration. The dynamics of between-group competition between these two erstwhile rival biocultural groups were hypothesized to drive these processes. We constructed a latent factor specifically intended to measure between-group competition and residualized it for the logarithmic effects of time, as sketched below. We found a significantly asymmetrical impact of between-group competition: the between-group competition factor appeared to drive the diachronic integration in Gallic populations but had no corresponding significant influence on the parallel process of diachronic differentiation in Britannic populations. This suggests that the latter process was attributable to alternative, unmeasured causes, such as the resource abundance consequent to territorial expansion rather than contraction.
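As a sketch of the residualization step only: regress yearly factor scores on log elapsed time and keep the residuals, which are uncorrelated with the log-time trend by construction. The data and the name bgc_factor are invented for illustration; the original latent factor was estimated from multiple indicators.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical yearly scores for a between-group-competition factor,
# AD 1800-1999; the log trend plus noise is invented for illustration.
years = np.arange(1800, 2000)
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "year": years,
    "bgc_factor": 0.4 * np.log(years - 1799) + rng.normal(0, 0.2, years.size),
})

# Residualize for the logarithmic effect of time: regress the factor on
# log(elapsed years) and keep the residuals for downstream analysis.
df["log_time"] = np.log(df["year"] - df["year"].min() + 1)
fit = smf.ols("bgc_factor ~ log_time", data=df).fit()
df["bgc_residualized"] = fit.resid
```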