The incidence of infections from extended-spectrum β-lactamase (ESBL)–producing Enterobacterales (ESBL-E) is increasing in the United States. We describe the epidemiology of ESBL-E at 5 Emerging Infections Program (EIP) sites.
Methods
During October–December 2017, we piloted active laboratory- and population-based (New York, New Mexico, Tennessee) or sentinel (Colorado, Georgia) ESBL-E surveillance. An incident case was defined as the first isolation in a 30-day period, from a resident of the surveillance area, of Escherichia coli or Klebsiella pneumoniae/oxytoca from normally sterile body sites or urine that was resistant to ≥1 extended-spectrum cephalosporin and nonresistant to all carbapenems tested at a clinical laboratory. Demographic and clinical data were obtained from medical records. The Centers for Disease Control and Prevention (CDC) performed reference antimicrobial susceptibility testing and whole-genome sequencing on a convenience sample of case isolates.
Results
We identified 884 incident cases. The estimated annual incidence in sites conducting population-based surveillance was 199.7 per 100,000 population. Overall, 800 isolates (96%) were from urine, and 790 (89%) were E. coli. Also, 393 cases (47%) were community-associated. Among 136 isolates (15%) tested at the CDC, 122 (90%) met the surveillance definition phenotype; 114 (93%) of 122 were shown to be ESBL producers by clavulanate testing. In total, 111 (97%) of confirmed ESBL producers harbored a blaCTX-M gene. Among ESBL-producing E. coli isolates, 52 (54%) were ST131; 44% of these cases were community associated.
Conclusions
The burden of ESBL-E was high across surveillance sites, with nearly half of cases acquired in the community. EIP has implemented ongoing ESBL-E surveillance to inform prevention efforts, particularly in the community, and to watch for the emergence of new ESBL-E strains.
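As a side note on the arithmetic, an annual incidence estimate like the one reported above is a simple rescaling of a one-quarter case count to a full year per 100,000 population. A minimal sketch with hypothetical numbers (not the EIP surveillance data):

```python
# Annualize a one-quarter case count and express it per 100,000 population.
# All numbers here are hypothetical, for illustration only.
def annual_incidence_per_100k(cases_in_quarter: int, population: int) -> float:
    annual_cases = cases_in_quarter * 4  # scale a single quarter to a full year
    return annual_cases * 100_000 / population

rate = annual_incidence_per_100k(cases_in_quarter=500, population=1_000_000)
# 2,000 annualized cases over 1,000,000 residents -> 200.0 per 100,000
```

Note that annualizing a single quarter assumes no seasonality in case counts, which surveillance programs typically verify with year-round data.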
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
To understand how the built environment can affect safety and efficiency outcomes during doffing of personal protective equipment (PPE) in the context of coronavirus disease 2019 (COVID-19) patient care.
Study design:
We conducted (1) field observations and surveys administered to healthcare workers (HCWs) performing PPE doffing, (2) focus groups with HCWs and infection prevention experts, and (3) a design charrette with healthcare design experts.
Settings:
This study was conducted in 4 inpatient units treating patients with COVID-19, in 3 hospitals of a single healthcare system.
Participants:
The study included 24 nurses, 2 physicians, 1 respiratory therapist, and 2 infection preventionists.
Results:
The doffing task sequence and the layout of doffing spaces varied considerably across sites, with field observations showing most doffing tasks occurring around the patient room door and PPE support stations. Behaviors perceived as most risky included touching contaminated items and inadequate hand hygiene. Doffing space layout and types of PPE storage and work surfaces were often associated with inadequate cleaning and improper storage of PPE. Focus groups and the design charrette provided insights into how designs that afford standardization, accessibility, and flexibility can support PPE doffing safety and efficiency in this context.
Conclusions:
There is a need to define, organize, and standardize PPE doffing spaces in healthcare settings and to understand the environmental implications of COVID-19–specific issues such as supply shortages and staff workload. Low-effort, low-cost adaptations of the layout and design of PPE doffing spaces may improve HCW safety and efficiency in existing healthcare facilities.
We performed an epidemiological investigation and genome sequencing of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) to define the source and scope of an outbreak in a cluster of hospitalized patients. Lack of appropriate respiratory hygiene led to SARS-CoV-2 transmission to patients and healthcare workers during a single hemodialysis session, highlighting the importance of infection prevention precautions.
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
Racial identification is a critical factor in understanding a multitude of important outcomes in many fields. However, inferring an individual’s race from ecological data is prone to bias and error. This process was only recently improved via Bayesian improved surname geocoding (BISG). With surname and geographic-based demographic data, it is possible to more accurately estimate individual racial identification than ever before. However, the level of geography used in this process varies widely. Whereas some existing work makes use of geocoding to place individuals in precise census blocks, a substantial portion either skips geocoding altogether or relies on estimation using surname or county-level analyses. Presently, the trade-offs of such variation are unknown. In this letter, we quantify those trade-offs through a validation of BISG on Georgia’s voter file using both geocoded and nongeocoded processes and introduce a new level of geography—ZIP codes—to this method. We find that when estimating the racial identification of White and Black voters, nongeocoded ZIP code-based estimates are acceptable alternatives. However, census blocks provide the most accurate estimations when imputing racial identification for Asian and Hispanic voters. Our results document the most efficient means to sequentially conduct BISG analysis to maximize racial identification estimation while simultaneously minimizing data missingness and bias.
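The core BISG update described above combines a surname-based prior with geography-based likelihoods via Bayes' rule. A minimal sketch of that single update step, using toy probability tables (hypothetical values, not actual census or voter-file data):

```python
# Minimal sketch of the BISG posterior update (toy numbers, not census data).
# P(race | surname, geography) is proportional to
# P(race | surname) * P(geography | race), normalized over race categories.
def bisg_posterior(p_race_given_surname: dict, p_geo_given_race: dict) -> dict:
    """Combine surname-based prior with geographic likelihood via Bayes' rule."""
    unnorm = {race: p_race_given_surname[race] * p_geo_given_race[race]
              for race in p_race_given_surname}
    total = sum(unnorm.values())
    return {race: v / total for race, v in unnorm.items()}

# Hypothetical inputs: a surname that skews White nationally, observed in a
# census block (or ZIP code) whose residents are predominantly Black.
p_surname = {"white": 0.70, "black": 0.20, "hispanic": 0.06, "asian": 0.04}
p_geo = {"white": 0.10, "black": 0.80, "hispanic": 0.05, "asian": 0.05}

posterior = bisg_posterior(p_surname, p_geo)
```

The sketch shows why the geographic level matters: the finer the geography (block vs. ZIP code vs. county), the more informative `p_geo` is, and the more the posterior can move away from the surname-only prior.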
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to $\sim\!5$ yr. In this paper, we present the survey description, observation strategy, and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of $\sim\!162$ h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of $0.24\ \mathrm{mJy\ beam}^{-1}$ and angular resolution of $12$–$20$ arcseconds. There are 113 fields, each of which was observed for a 12-min integration time, with between 5 and 13 repeats and cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5,131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1,646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162, and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies, and the other six have no multi-wavelength counterparts and are yet to be identified.
Understanding the cognitive determinants of healthcare worker (HCW) behavior is important for improving the use of infection prevention and control (IPC) practices. Given a patient requiring only standard precautions, we examined the dimensions along which different populations of HCWs cognitively organize patient care tasks (ie, their mental models).
Design:
HCWs read a description of a patient and then rated the similarities of 25 patient care tasks from an infection prevention perspective. Using multidimensional scaling, we identified the dimensions (ie, characteristics of tasks) underlying these ratings and the salience of each dimension to HCWs.
Setting:
Adult inpatient hospitals across an academic hospital network.
Participants:
In total, 40 HCWs, comprising infection preventionists and nurses from intensive care units, emergency departments, and medical-surgical floors, rated the similarity of tasks. To identify the meaning of each dimension, another 6 nurses rated each task in terms of specific characteristics of tasks.
Results:
Each HCW population perceived patient care tasks to vary along 3 common dimensions; most salient was the perceived magnitude of infection risk to the patient in a task, followed by the perceived dirtiness and risk of HCW exposure to body fluids, and lastly, the relative importance of a task for preventing versus controlling an infection in a patient.
Conclusions:
For a patient requiring only standard precautions, different populations of HCWs have similar mental models of how various patient care tasks relate to IPC. Techniques for eliciting mental models open new avenues for understanding and ultimately modifying the cognitive determinants of IPC behaviors.
Prior to the recent release of appropriate use criteria for imaging valvulopathies in children, follow-up of valvular lesions, including isolated bicuspid aortic valve, was not standardised. We describe current follow-up, treatment, and intervention strategies for isolated bicuspid aortic valve with varying degrees of stenosis, regurgitation, and dilation in children up to 18 years old and compare them with newly released appropriate use criteria.
Methods:
An online survey was sent to members of the American Academy of Pediatrics Section on Cardiology and Cardiac Surgery and PediHeartNet.
Results:
In total, 106 responses with interpretable data were received. For asymptomatic patients with isolated bicuspid aortic valve without stenosis, regurgitation, or dilation, follow-up intervals increased from 7 ± 4 months in the newborn period to 28 ± 14 months at 18 years of age. Respondents recommended more frequent follow-up for younger patients and those with greater disease severity. More than 80% of respondents treat aortic regurgitation or aortic dilation in the setting of bicuspid aortic valve medically. In general, intervention was recommended once stenosis or regurgitation became severe (stenosis of >4 m/s; regurgitation with LV Z score 4) regardless of age, but it was not routinely recommended for younger children (newborn to age 6 years) with severe dilation. Exercise was restricted at a 38 ± 11 mmHg echocardiographic mean gradient.
Conclusions:
Current follow-up, treatment, and intervention strategies for isolated bicuspid aortic valve deviate from appropriate use criteria. Differences between the two highlight the need to better delineate the disease course, clarify recommendations for care, and encourage wider adoption of guidelines.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Energy deficit is common during prolonged periods of strenuous physical activity and limited sleep, but the extent to which appetite suppression contributes is unclear. The aim of this randomised crossover study was to determine the effects of energy balance on appetite and physiological mediators of appetite during a 72-h period of high physical activity energy expenditure (about 9·6 MJ/d (2300 kcal/d)) and limited sleep designed to simulate military operations (SUSOPS). Ten men consumed an energy-balanced diet while sedentary for 1 d (REST) followed by energy-balanced (BAL) and energy-deficient (DEF) controlled diets during SUSOPS. Appetite ratings, gastric emptying time (GET) and appetite-mediating hormone concentrations were measured. Energy balance was positive during BAL (18 (sd 20) %) and negative during DEF (−43 (sd 9) %). Relative to REST, hunger, desire to eat and prospective consumption ratings were all higher during DEF (26 (sd 40) %, 56 (sd 71) %, 28 (sd 34) %, respectively) and lower during BAL (−55 (sd 25) %, −52 (sd 27) %, −54 (sd 21) %, respectively; P-condition < 0·05). Fullness ratings did not differ from REST during DEF, but were 65 (sd 61) % higher during BAL (P-condition < 0·05). Regression analyses predicted hunger and prospective consumption would be reduced and fullness increased if energy balance was maintained during SUSOPS, and energy deficits of ≥25 % would be required to elicit increases in appetite. Between-condition differences in GET and appetite-mediating hormones identified slowed gastric emptying, increased anorexigenic hormone concentrations and decreased fasting acylated ghrelin concentrations as potential mechanisms of appetite suppression. Findings suggest that physiological responses that suppress appetite may deter energy balance from being achieved during prolonged periods of strenuous activity and limited sleep.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates were tested on either a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). 
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
To assess the clarity and efficacy of the World Health Organization (WHO) hand-rub diagram, develop a modified version, and compare the 2 diagrams.
Design:
Randomized group design preceded by controlled observation and iterative product redesigns.
Setting:
The Cognitive Ergonomics Lab in the School of Psychology at the Georgia Institute of Technology.
Participants:
We included participants who were unfamiliar with the WHO hand-rub diagram (convenience sampling) to ensure that performance was based on the diagram and not, for example, on prior experience.
Methods:
We iterated through the steps of a human factors design procedure: (1) participants simulated hand hygiene using ultraviolet (UV)-absorbent lotion and a hand-rub technique diagram (ie, the WHO version or a redesign); (2) coverage, confusion judgments, and behavioral videos informed potentially improved diagrams; and (3) the redesigned diagrams were compared with the WHO version in a randomized group design. Coverage was assessed across 72 hand areas from multiple UV photographs.
Results:
The WHO diagram led to multiple omissions in hand-surface coverage, including inadequate coverage by up to 75% of participants for the ulnar edge. The redesigns improved coverage significantly overall and often substantially.
Conclusions:
Human factors modification to the WHO diagram reduced inadequate coverage for naïve users. Implementation of an improved diagram should help in the prevention of healthcare-associated infections.
The concentration of radiocarbon (14C) differs between ocean and atmosphere. Radiocarbon determinations from samples which obtained their 14C in the marine environment therefore need a marine-specific calibration curve and cannot be calibrated directly against the atmospheric-based IntCal20 curve. This paper presents Marine20, an update to the internationally agreed marine radiocarbon age calibration curve that provides a non-polar global-average marine record of radiocarbon from 0–55 cal kBP and serves as a baseline for regional oceanic variation. Marine20 is intended for calibration of marine radiocarbon samples from non-polar regions; it is not suitable for calibration in polar regions where variability in sea ice extent, ocean upwelling and air-sea gas exchange may have caused larger changes to concentrations of marine radiocarbon. The Marine20 curve is based upon 500 simulations with an ocean/atmosphere/biosphere box-model of the global carbon cycle that has been forced by posterior realizations of our Northern Hemispheric atmospheric IntCal20 14C curve and reconstructed changes in CO2 obtained from ice core data. These forcings enable us to incorporate carbon cycle dynamics and temporal changes in the atmospheric 14C level. The box-model simulations of the global-average marine radiocarbon reservoir age are similar to those of a more complex three-dimensional ocean general circulation model. However, the simplicity and speed of the box model allow us to use a Monte Carlo approach to rigorously propagate the uncertainty in both the historic concentration of atmospheric 14C and other key parameters of the carbon cycle through to our final Marine20 calibration curve. This robust propagation of uncertainty is fundamental to providing reliable precision for the radiocarbon age calibration of marine-based samples.
We make a first step towards deconvolving the contributions of different processes to the total uncertainty, discuss the main differences between Marine20 and the previous age calibration curve Marine13, and identify the limitations of our approach together with key areas for further work. The updated values for ΔR, the regional marine radiocarbon reservoir age corrections required to calibrate against Marine20, can be found in the database at http://calib.org/marine/.
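The Monte Carlo propagation described above can be illustrated schematically: sample the uncertain atmospheric forcing, push each realization through a forward model, and summarize the spread of the outputs. The sketch below uses a drastically simplified stand-in response function with hypothetical numbers; it is not the Marine20 box model.

```python
# Toy Monte Carlo propagation of forcing uncertainty through a stand-in
# "box model". The response function and all numbers are hypothetical.
import random
import statistics

def box_model_response(atm_c14: float, reservoir_offset: float = 50.0) -> float:
    # Stand-in for the ocean box model: the marine value is depleted relative
    # to the atmospheric forcing by a fixed reservoir offset (not physical).
    return atm_c14 - reservoir_offset

random.seed(0)
# 500 realizations of an uncertain atmospheric forcing (mean 100, sd 5).
forcing_draws = [random.gauss(100.0, 5.0) for _ in range(500)]
marine_outputs = [box_model_response(d) for d in forcing_draws]

mean_marine = statistics.mean(marine_outputs)
sd_marine = statistics.stdev(marine_outputs)  # propagated input uncertainty
```

Because the toy response is linear, the input spread passes through unchanged; the value of the Monte Carlo approach in the real curve construction is that it propagates uncertainty correctly even when the model response is nonlinear.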
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
An inviscid flow model is presented to gain a basic understanding of the reflection of a swept oblique shock from a planar wall. The analytical model is constructed to describe the fundamental influence of sweep on this shock configuration, which has been commonly studied as an unswept non-dimensional shock boundary layer interaction (SBLI). Transformation of model parameters into a plane perpendicular to the sweep angle reduces the resultant flow to a two-parameter system. An equivalency between this configuration and others commonly assessed is presented with advisory notes on the definition of effective coordinate systems. Inviscid shock detachment has been associated with the onset of quasi-conical SBLI spanwise development (see Settles & Teng, AIAA J., vol. 22 (2), 1984, pp. 194–200). Its occurrence for this SBLI configuration is determined for a range of conditions and compared to experimental observations of swept SBLIs claiming cylindrical/conical similarity scalings. Finally, the influence of a zero-mass-flux plane associated with typical experimental and numerical analyses is presented with an accompanying model for the shock structure. While this paper serves as a useful resource when designing swept impinging oblique SBLI studies, it also provides a vital benchmark for this complex configuration and helps to unify various SBLI configurations that are often analysed in isolation.