Objective: To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Design: Observational cohort study and simulations of pathogen transfer.
Setting: A Veterans Affairs hospital.
Participants: Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Methods: Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients' socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess the plausibility of transfer from contaminated floors to high-touch surfaces and the effectiveness of wearing slippers in reducing transfer.
Results: Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Conclusions: Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk for transfer of pathogens from floors to hands and high-touch surfaces.
Background: Critical shortages of personal protective equipment, especially N95 respirators, during the coronavirus disease 2019 (COVID-19) pandemic continue to be a source of concern. Novel methods of N95 filtering facepiece respirator decontamination that can be scaled up for in-hospital use can help address this concern and keep healthcare workers (HCWs) safe.
Methods: A multidisciplinary pragmatic study was conducted to evaluate the use of an ultrasonic room high-level disinfection system (HLDS) that generates aerosolized peracetic acid (PAA) and hydrogen peroxide for decontamination of large numbers of N95 respirators. A cycle duration that consistently achieved disinfection of N95 respirators (defined as ≥6 log10 reductions in bacteriophage MS2 and Geobacillus stearothermophilus spores inoculated onto respirators) was identified. The treated masks were assessed for changes to their hydrophobicity, material structure, strap elasticity, and filtration efficiency. PAA and hydrogen peroxide off-gassing from treated masks were also assessed.
Results: The PAA room HLDS was effective for disinfection of bacteriophage MS2 and G. stearothermophilus spores on respirators in a 2,447 cubic-foot (69.6 cubic-meter) room with an aerosol deployment time of 16 minutes and a dwell time of 32 minutes. The total cycle time was 1 hour and 16 minutes. After 5 treatment cycles, no adverse effects were detected on filtration efficiency, structural integrity, or strap elasticity. There was no detectable off-gassing of PAA and hydrogen peroxide from the treated masks at 20 and 60 minutes after the disinfection cycle, respectively.
Conclusions: The PAA room disinfection system provides a rapidly scalable solution for in-hospital decontamination of large numbers of N95 respirators during the COVID-19 pandemic.
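The disinfection criterion in the study above (a ≥6-log10 reduction in recovered organisms) is simple arithmetic on viable counts. A minimal sketch follows; the detection-limit handling is an assumption for illustration, not the study's protocol:

```python
import math

def log10_reduction(inoculated: float, recovered: float) -> float:
    """Log10 reduction between inoculated and recovered viable counts.

    Zero recovery is replaced by an assumed detection limit of 1 unit,
    so the returned value is then a lower bound on the true reduction.
    """
    return math.log10(inoculated / max(recovered, 1.0))

def meets_criterion(inoculated: float, recovered: float,
                    threshold: float = 6.0) -> bool:
    """True if the reduction meets the >=6-log10 disinfection criterion."""
    return log10_reduction(inoculated, recovered) >= threshold
```

For example, recovering 5 PFU from a 10^7 PFU inoculum is a 6.3-log10 reduction and passes the criterion, while recovering 10 CFU from a 10^6 CFU inoculum is only a 5-log10 reduction and fails.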
Background: Reduction in the use of fluoroquinolone antibiotics has been associated with reductions in Clostridioides difficile infections (CDIs) due to fluoroquinolone-resistant strains.
Objective: To determine whether facility-level fluoroquinolone use predicts healthcare facility-associated (HCFA) CDI due to fluoroquinolone-resistant 027 strains.
Methods: Using a nationwide cohort of hospitalized patients in the Veterans Affairs Healthcare System, we identified hospitals that categorized >80% of CDI cases as positive or negative for the 027 strain for at least one quarter during fiscal years 2011–2018. Within these facilities, we used visual summaries and multilevel logistic regression models to assess the association between facility-level fluoroquinolone use and rates of HCFA-CDI due to 027 strains, controlling for time and facility complexity level, and adjusting for correlated outcomes within facilities.
Results: Between 2011 and 2018, 55 hospitals met criteria for reporting 027 results, including a total of 5,091 HCFA-CDI cases, with 1,017 infections (20.0%) due to 027 strains. Across these facilities, the use of fluoroquinolones decreased by 52% from 2011 to 2018, with concurrent reductions in the overall HCFA-CDI rate and the proportion of HCFA-CDI cases due to the 027 strain of 13% and 55%, respectively. A multilevel logistic model demonstrated a significant effect of facility-level fluoroquinolone use on the proportion of infections in the facility due to the 027 strain, most noticeably in low-complexity facilities.
Conclusions: Our findings provide support for interventions to reduce the use of fluoroquinolones as a control measure for CDI, particularly in settings where fluoroquinolone use is high and fluoroquinolone-resistant strains are common causes of infection.
On coronavirus disease 2019 (COVID-19) wards, severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) nucleic acid was frequently detected on high-touch surfaces, floors, and socks inside patient rooms. Contamination of floors and shoes was common outside patient rooms on the COVID-19 wards but decreased after improvements in floor cleaning and disinfection were implemented.
Background: Bipolar disorder (BPD) and alcoholism are strongly comorbid, and both have significant genetic influences, but no consistent genetic vulnerability has been found. We aimed to find bipolar-alcoholism vulnerability genes.
Methods: We conducted a genome-wide association study (GWAS) of 510 patients with bipolar disorder (BPD), 143 of whom met Research Diagnostic Criteria (RDC) for alcoholism, and 506 ancestrally matched supernormal controls. We genotyped 372K genetic markers on an Affymetrix 500K array. We performed chi-square analysis of allelic association using PLINK, and permutation testing for gene-wise association of genes previously associated with alcoholism-related phenotypes using COMBASSOC.
Results: No marker met genome-wide significance. Gene-wise analyses identified markers clustering near genes already implicated in alcoholism but not associated in non-alcoholic BPD: cadherin-11 (CDH11, p = 6 × 10^-4), exportin 7 (XPO7), neuromedin-U receptor 2 (NMUR2), collagen type XI alpha-2 (COL11A2), and semaphorin-5A (SEMA5A).
Conclusions: These genes replicate prior genetic reports implicating "connectivity" (adhesion, migration, and neuronal signalling) genes in addictions and comorbid BPD. Connectivity genes regulate neuronal connections during development and play roles in later neuroadaptive and mnemonic processes. These processes may influence addiction vulnerability, as seen clinically in denial, cognitive impairment, and repetitive substance misuse and relapse behaviour. We propose that we have identified (i) genes increasing susceptibility to alcoholism that could be unmasked or released by the presence of bipolar affective disorder; and (ii) genes increasing susceptibility to affective disorder that also predispose to secondary alcoholism. We were limited by small sample size; larger studies are needed.
We present an overview of Guitarra, a simulator for the Near Infrared Camera that creates scenes from catalogues of mock or real sources, using current best estimates of the instrument characteristics and of the on-sky pattern of the observations.
Introduction: Intravenous insertion (IVI) is identified by children as extremely painful, and the resultant distress can have lasting negative consequences. There is an urgent need to effectively manage such procedures. Our primary objective was to compare the pain and distress of IVI with the addition of humanoid robot-based distraction to standard care, versus standard care alone. Methods: This two-armed randomized controlled trial (RCT) was conducted from April 2017 to May 2018 at the Stollery Children's Hospital emergency department (ED). Children aged 6 to 11 years who required IVI were included. Exclusion criteria included hearing or visual impairments, neurocognitive delays, sensory impairment to pain, previous enrolment, and exclusion at the discretion of the ED clinical staff. Primary outcomes were measured using the Observational Scale of Behavioural Distress-Revised (OSBD-R) (distress) and the Faces Pain Scale-Revised (FPS-R) (pain). A total of 426 pediatric patients were screened and 340 were excluded. Results: We recruited 86 children, of whom 55% (47/86) were male; 9% (7/82) were premature at birth; 82% (67/82) had a previous ED visit; 30% (25/82) required previous hospitalization; 78% (64/82) had previous IV placement; and 96% (78/81) received topical anesthesia. The mean total OSBD-R score was 1.49 ± 2.36 (standard care) compared to 0.78 ± 1.32 (robot group) (p = 0.047). The median FPS-R during the IV procedure was 4 (IQR 2–6) in the standard care group, compared to 2 (IQR 0–4) with the addition of humanoid robot-based distraction (p = 0.10). Change in parental state anxiety pre-procedure versus post-procedure was not significantly different between groups (p = 0.49). Parental satisfaction with the IV start was 93% (39/42) in the robot arm compared to 74% (29/39) in the standard care arm (p = 0.03).
Parents were also more satisfied with the management of their child's pain in the robot group (95% very satisfied) compared with standard care (72% very satisfied) (p = 0.002). Conclusion: A statistically significant reduction in distress was observed with the addition of robot-based distraction to standard care. Humanoid robot-based distraction therapy reduces distress and, to a lesser extent, pain in children undergoing IVI in the ED. Further trials are required to confirm utility in other age groups and settings.
Background: Multiple studies have demonstrated that daily chlorhexidine gluconate (CHG) bathing is associated with a significant reduction in infections caused by gram-positive pathogens. However, there are limited data on the effectiveness of daily CHG bathing on gram-negative infections. The aim of this study was to determine whether daily CHG bathing is effective in reducing the rate of gram-negative infections in adult intensive care unit (ICU) patients.
Methods: We searched MEDLINE and 3 other databases for original studies comparing daily bathing with and without CHG. Two investigators independently extracted data on baseline characteristics, study design, form and concentration of CHG, incidence, and outcomes related to gram-negative infections. Data were combined using a random-effects model, and pooled relative risks (RRs) and 95% confidence intervals (CIs) were derived.
Results: In total, 15 studies (n = 34,895 patients) met inclusion criteria. Daily CHG bathing was not significantly associated with a lower risk of gram-negative infections compared with controls (RR, 0.89; 95% CI, 0.73–1.08; P = .24). Subgroup analysis demonstrated that daily CHG bathing was not effective for reducing the risk of gram-negative infections caused by Acinetobacter, Escherichia coli, Klebsiella, Enterobacter, or Pseudomonas spp.
Conclusions: The use of daily CHG bathing was not associated with a lower risk of gram-negative infections. Further, better-designed trials with adequate power and with gram-negative infections as the primary end point are needed.
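The pooled RR and CI reported above come from a random-effects model. A minimal DerSimonian-Laird sketch follows, with made-up study inputs rather than the review's actual data:

```python
import math

def dersimonian_laird(log_rrs, variances):
    """Pool study-level log relative risks with DerSimonian-Laird
    random-effects weights; returns (pooled RR, 95% CI low, 95% CI high)."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))
```

Studies enter as log relative risks with their variances; the between-study variance tau² is added to each study's variance before weighting, which widens the confidence interval relative to a fixed-effect pool when heterogeneity is present.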
Current methods of control recruitment for case-control studies can be slow (a particular issue for outbreak investigations), resource-intensive, and subject to a range of biases. Commercial market panels are a potential source of rapidly recruited controls. Our study evaluated food exposure data from these panel controls, compared with an established reference dataset. Market panel data were collected from two companies using retrospective internet-based surveys; these were compared with reference data from the National Diet and Nutrition Survey (NDNS). We used logistic regression to calculate adjusted odds ratios to compare exposure to each of the 71 food items between the market panel and NDNS participants. We compared 2103 panel controls with 2696 reference participants. Adjusted for socio-demographic factors, exposure to 90% of foods differed significantly between the panels and the reference data. However, these differences were likely to be of limited practical importance for 89% of Panel A foods and 79% of Panel B foods. Market panel food exposures were comparable with reference data for common food exposures but more likely to be different for uncommon exposures. This approach should be considered for outbreak investigation, in conjunction with other considerations such as population at risk, timeliness of response and study resources.
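The exposure comparison above rests on an odds ratio for each food item. A minimal unadjusted sketch with a Woolf confidence interval follows; the study's estimates were additionally adjusted for socio-demographic factors via logistic regression, and the counts here are made up:

```python
import math

def odds_ratio_ci(exp_a: int, unexp_a: int, exp_b: int, unexp_b: int):
    """Unadjusted odds ratio for exposure in group A versus group B,
    with a Woolf (log-normal approximation) 95% confidence interval."""
    or_ = (exp_a * unexp_b) / (unexp_a * exp_b)
    se = math.sqrt(1/exp_a + 1/unexp_a + 1/exp_b + 1/unexp_b)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))
```

If 50 of 100 panel controls but only 25 of 100 reference participants report a given exposure, the odds ratio is 3.0 with a 95% CI of roughly 1.65 to 5.46.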
Objective: To test the hypothesis that long-term care facility (LTCF) residents with Clostridium difficile infection (CDI) or asymptomatic carriage of toxigenic strains are an important source of transmission in the LTCF and in the hospital during acute-care admissions.
Design: A 6-month cohort study with identification of transmission events was conducted based on tracking of patient movement combined with restriction endonuclease analysis (REA) and whole-genome sequencing (WGS).
Setting: A Veterans Affairs hospital and an affiliated LTCF.
Participants: The study included 29 LTCF residents identified as asymptomatic carriers of toxigenic C. difficile based on every-other-week perirectal screening, and 37 healthcare facility-associated CDI cases (ie, diagnosis >3 days after admission or within 4 weeks of discharge to the community), including 26 hospital-associated and 11 LTCF-associated cases.
Results: Of the 37 CDI cases, 7 (18.9%) were linked to LTCF residents with LTCF-associated CDI or asymptomatic carriage, including 3 of 26 hospital-associated CDI cases (11.5%) and 4 of 11 LTCF-associated cases (36.4%). Of the 7 transmissions linked to LTCF residents, 5 (71.4%) were linked to asymptomatic carriers versus 2 (28.6%) to CDI cases, and all involved transmission of epidemic BI/NAP1/027 strains. No incident hospital-associated CDI cases were linked to other hospital-associated CDI cases.
Conclusions: Our findings suggest that LTCF residents with asymptomatic carriage of C. difficile or CDI contribute to transmission both in the LTCF and in the affiliated hospital during acute-care admissions. Greater emphasis on infection control measures and antimicrobial stewardship in LTCFs is needed, and these efforts should focus on LTCF residents during hospital admissions.
Introduction: Intravenous (IV) cannulation is commonly performed in emergency departments (ED), often causing substantial pain and distress. Distraction has been shown to reduce child-reported pain, but there is currently little published about the effects of using iPad technology as a distraction tool. Our primary objective was to compare the reduction of pain and distress using iPad distraction (games, movies, books of the child's choice) in addition to standard care, versus standard care alone. Methods: This randomized clinical trial, conducted at the Stollery Children's Hospital ED, recruited children aged 6 to 11 years requiring IV cannulation. Study arm assignment was performed using REDCap's randomization feature. Due to the nature of the intervention, blinding was not possible for the children, parents, or research and ED staff, but the data analyst was blinded to intervention assignment until completion of analysis. Pain, distress, and parental anxiety were measured using the Faces Pain Scale-Revised, the Observational Scale of Behavioural Distress-Revised, and the State-Trait Anxiety Inventory, respectively. The pain scores and observed behavioural distress scores were compared using the Mann-Whitney U test. Other co-variates were analyzed using linear regression. Results: A total of 85 children were enrolled, with 42 receiving iPad distraction and 43 standard care, of whom 40 (95%) and 35 (81%) received topical anesthesia, respectively (p=0.09). There were 40 girls (47.1%), with a mean age of 8.32 ± 1.61 years. The pain scores during IV cannulation (p=0.35) and the change in pain score during the procedure compared to baseline (p=0.79) were not significantly different between the groups, nor were the observed distress scores during IV cannulation (p=0.09) or the change in observed distress during the procedure compared to baseline (p=0.44).
A regression analysis showed children in both groups had greater total behavioural distress if it was their first ED visit (p=0.01), if they had prior hospitalization experience (p=0.04), or if they were admitted to hospital during this visit (p=0.007). A previous ED visit, however, was predictive of a greater increase in parental anxiety from baseline (p=0.02). When parents were asked whether they would use the same methods to manage pain for their child, parents in the iPad group were more likely to say yes than parents in the standard care group (p=0.03). Conclusion: iPad distraction during IV cannulation in school-aged children was not found to decrease pain or distress more than standard care alone, but parents preferred its use. The effects of iPad distraction may have been overshadowed by a potent topical anesthetic effect. Future directions include exploring iPad distraction for other age groups and studying novel technologies such as virtual reality and interactive humanoid robots.
Implementation of an antimicrobial stewardship program bundle for urinary tract infections among 92 patients led to a higher rate of discontinuation of therapy for asymptomatic bacteriuria (52.4% vs 12.5%; P =.004), more appropriate durations of therapy (88.7% vs 63.6%; P =.001), and significantly higher overall bundle compliance (75% vs 38.2%; P < .001).
The nutrient choline is necessary for membrane synthesis and methyl donation, with increased requirements during lactation. The majority of immune development occurs postnatally, but the importance of choline supply for immune development during this critical period is unknown. The objective of this study was to determine the importance of maternal choline supply during suckling for immune function in rodent offspring. At parturition, Sprague–Dawley dams were randomised to either a choline-devoid (ChD; n 7) or choline-sufficient (ChS, 1 g/kg choline; n 10) diet with their offspring euthanised at 3 weeks of age. In a second experiment, offspring were weaned to a ChS diet until 10 weeks of age (ChD-ChS, n 5 and ChS-ChS, n 9). Splenocytes were isolated, and parameters of immune function were measured. The ChD offspring received less choline in breast milk and had lower final body and organ weight compared with ChS offspring (P<0·05), but this effect disappeared by week 10 with choline supplementation from weaning. ChD offspring had a higher proportion of T cells expressing activation markers (CD71 or CD28) and a lower proportion of total B cells (CD45RA+) and responded less to T cell stimulation (lower stimulation index and less IFN-γ production) ex vivo (P<0·05). ChD-ChS offspring had a lower proportion of total and activated CD4+ T cells, and produced less IL-6 after mitogen stimulation compared with cells from ChS-ChS (P<0·05). Our study suggests that choline is required in the suckling diet to facilitate immune development, and choline deprivation during this critical period has lasting effects on T cell function later in life.
High-temperature X-ray diffraction with concurrent gas chromatography (GC) was used to study cobalt disulfide cathode pellets disassembled from thermal batteries. When CoS2 cathode materials were analyzed in an air environment, oxidation of the K(Br, Cl) salt phase in the cathode led to the formation of K2SO4 that subsequently reacted with the pyrite-type CoS2 phase leading to cathode decomposition between ~260 and 450 °C. Independent thermal analysis experiments, i.e. simultaneous thermogravimetric analysis/differential scanning calorimetry/mass spectrometry (MS), augmented the diffraction results and support the overall picture of CoS2 decomposition. Both gas analysis measurements (i.e. GC and MS) from the independent experiments confirmed the formation of SO2 off-gas species during breakdown of the CoS2. In contrast, characterization of the same cathode material under inert conditions showed the presence of CoS2 throughout the entire temperature range of analysis.
We present the first experimentally determined oscillator strengths for the Pb II transitions at 1203.6 Å and 1433.9 Å, obtained from lifetime measurements made using beam-foil techniques. We also present new detections of these lines in the interstellar medium from an analysis of archival spectra acquired by the Space Telescope Imaging Spectrograph onboard the Hubble Space Telescope. Our observations of the Pb II λ1203 line represent the first detection of this transition in interstellar gas. Our experimental f-values for the Pb II λ1203 and λ1433 transitions are consistent with recent theoretical results, including our own relativistic calculations, but are significantly smaller than previous values based on older calculations. Our new f-value for Pb II λ1433 (0.321 ± 0.034) yields an increase in the interstellar abundance of Pb of 0.43 dex over estimates based on the f-value listed by Morton. With our revised f-values, and with our new detections of Pb II λ1203 and λ1433, we find that the depletion of Pb onto interstellar grains is not nearly as severe as previously thought, and is very similar to the depletions seen for elements such as Zn and Sn, which have similar condensation temperatures.
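A quick consistency check of the numbers above: a line-derived column density, and hence the inferred abundance, scales as 1/f, so a 0.43 dex abundance increase implies the older listed f-value was 10^0.43 times larger than the new measurement. This back-calculation is ours, for illustration only:

```python
import math

# Abundance inferred from an absorption line scales as 1/f, so revising
# the oscillator strength downward raises the abundance by
# log10(f_old / f_new) dex.
f_new = 0.321                     # revised experimental f-value, Pb II 1433 A
delta_dex = 0.43                  # stated increase in the interstellar Pb abundance
f_old_implied = f_new * 10 ** delta_dex   # f-value implied by the older listing
```

This gives an implied older f-value of about 0.86, consistent with the stated direction of the revision (the new experimental value is significantly smaller).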
Previously published guidelines are available that provide comprehensive recommendations for detecting and preventing healthcare-associated infections (HAIs). The intent of this document is to highlight practical recommendations in a concise format designed to assist acute care hospitals in implementing and prioritizing their Clostridium difficile infection (CDI) prevention efforts. This document updates “Strategies to Prevent Clostridium difficile Infections in Acute Care Hospitals,” published in 2008. This expert guidance document is sponsored by the Society for Healthcare Epidemiology of America (SHEA) and is the product of a collaborative effort led by SHEA, the Infectious Diseases Society of America (IDSA), the American Hospital Association (AHA), the Association for Professionals in Infection Control and Epidemiology (APIC), and The Joint Commission, with major contributions from representatives of a number of organizations and societies with content expertise. The list of endorsing and supporting organizations is presented in the introduction to the 2014 updates.
Fisheries bycatch threatens populations of marine megafauna such as marine mammals, turtles, seabirds, sharks and rays, but fisheries impacts on non-target populations are often difficult to assess due to factors such as data limitation, poorly defined management objectives and lack of quantitative bycatch reduction targets. Limit reference points can be used to address these issues and thereby facilitate adoption and implementation of mitigation efforts. Reference points based on catch data and life history analysis can identify sustainability limits for bycatch with respect to defined population goals even when data are quite limited. This can expedite assessments for large numbers of species and enable prioritization of management actions based on mitigation urgency and efficacy. This paper reviews limit reference point estimators for marine megafauna bycatch, with the aim of highlighting their utility in fisheries management and promoting best practices for use. Different estimators share a common basic structure that can be flexibly applied to different contexts depending on species life history and available data types. Information on demographic vital rates and abundance is required; of these, abundance is the most data-dependent and thus most limiting factor for application. There are different approaches for handling management risk stemming from uncertainty in reference point and bycatch estimates. Risk tolerance can be incorporated explicitly into the reference point estimator itself, or probability distributions may be used to describe uncertainties in bycatch and reference point estimates, and risk tolerance may guide how those are factored into the management process. Either approach requires simulation-based performance testing such as management strategy evaluation to ensure that management objectives can be achieved. Factoring potential sources of bias into such evaluations is critical. 
The paper also reviews the technical, operational, and political challenges to widespread application of reference points for management of marine megafauna bycatch, while emphasizing the importance of developing assessment frameworks that can facilitate sustainable fishing practices.
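As a concrete instance of the common basic structure these estimators share, the Potential Biological Removal (PBR) formula used for US marine mammal stocks combines a conservative abundance estimate, maximum productivity, and a risk-tolerance factor. A sketch with illustrative inputs, not values from any real assessment:

```python
def potential_biological_removal(n_min: float, r_max: float,
                                 recovery_factor: float) -> float:
    """PBR = N_min * (R_max / 2) * F_r.

    n_min: minimum population size estimate (conventionally a lower
           percentile of the abundance estimate, reflecting uncertainty)
    r_max: maximum theoretical net productivity rate
    recovery_factor: F_r in (0.1, 1.0], encoding management risk tolerance
    """
    return n_min * (r_max / 2.0) * recovery_factor

# Illustrative values only:
limit = potential_biological_removal(n_min=10_000, r_max=0.04,
                                     recovery_factor=0.5)
# 10_000 * 0.02 * 0.5 = 100 animals per year
```

Note how the three inputs map onto the themes in the review: n_min carries the abundance-data dependence, r_max the life-history information, and the recovery factor embeds risk tolerance directly in the estimator.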