Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina (NC) state health department since 1992. State-coordinated review of HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus (MRSA) bacteremia and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements have significantly lower MRSA and CDI rates.
All US states.
Observational longitudinal study.
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were MRSA standardized infection ratio (SIR) and CDI SIR. The key explanatory variable was the percentage of hospitals that meet the Core Elements in each state. We estimated state and time fixed-effects models with time-variant controls, and we weighted our analyses for the number of hospitals in the state.
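The state-and-time fixed-effects design described above can be sketched as a weighted regression with state and year dummies. All data in this sketch are simulated, not from the study; the true effect is set to −0.003 so the recovered coefficient can be checked:

```python
import numpy as np

# Illustrative sketch (simulated data, not the study's) of a weighted
# state-and-year fixed-effects regression: an SIR outcome on ASP
# compliance, with state and year dummies, weighted by hospitals per state.
rng = np.random.default_rng(0)
n_states, years = 5, [2014, 2015, 2016]
state = np.repeat(np.arange(n_states), len(years))
year = np.tile(np.array(years), n_states)
n = len(state)

compliance = rng.uniform(20, 90, n)                      # % hospitals meeting Core Elements
weights = rng.integers(10, 120, n).astype(float)         # hospitals in each state
sir = 1.0 - 0.003 * compliance + rng.normal(0, 0.02, n)  # true effect: -0.003

# Design matrix: intercept, compliance, state dummies, year dummies
X = np.column_stack(
    [np.ones(n), compliance]
    + [(state == s).astype(float) for s in range(1, n_states)]
    + [(year == y).astype(float) for y in years[1:]]
)

# Weighted least squares via the sqrt-weight transform
w = np.sqrt(weights)
beta, *_ = np.linalg.lstsq(X * w[:, None], sir * w, rcond=None)
print(f"estimated compliance effect: {beta[1]:.4f}")  # close to -0.003
```

Weighting by hospital count simply scales each state-year row before ordinary least squares, which is equivalent to the analytic weights the study describes.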
The percentage of hospitals reporting compliance with the Core Elements increased in all states between 2014 and 2016. A 1% increase in reported antimicrobial stewardship program (ASP) compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably owing to the short study period and the variety of stewardship strategies that ASPs may encompass.
Parasitism can affect every aspect of wildlife ecology, from predator avoidance and competition for food to migrations and reproduction. In the wild, these ecological effects can have implications for host fitness and parasite dynamics. In contrast, domestic environments are typically characterised by high host densities, low host diversity, and veterinary interventions, and are not subject to processes like predation, competition, and migration. When wild and domesticated hosts interact via shared parasite populations, understanding and predicting the outcomes of parasite ecology and evolution for wildlife conservation and sustainable farming can be a challenge. We describe the ecology and evolution of ectoparasitic sea lice that are shared by farmed and wild salmon and the insights that experiments, fieldwork, and mathematical modelling have generated for theory and applied problems of host–parasite interactions over the course of a long-term study in Pacific Canada. The salmon–sea lice host–parasite system provides a rich case study to examine the ecological context of host–parasite interactions and to shed light on the principal challenges of parasite management for wildlife health and conservation.
Night-migratory songbirds appear to sense the direction of the Earth's magnetic field via radical pair intermediates formed photochemically in cryptochrome flavoproteins contained in photoreceptor cells in their retinas. It is an open question whether this light-dependent mechanism could be sufficiently sensitive given the low-light levels experienced by nocturnal migrants. The scarcity of available photons results in significant uncertainty in the signal generated by the magnetoreceptors distributed around the retina. Here we use results from Information Theory to obtain a lower bound estimate of the precision with which a bird could orient itself using only geomagnetic cues. Our approach bypasses the current lack of knowledge about magnetic signal transduction and processing in vivo by computing the best-case compass precision under conditions where photons are in short supply. We use this method to assess the performance of three plausible cryptochrome-derived flavin-containing radical pairs as potential magnetoreceptors.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (Prace difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (Prace difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
To determine the feasibility and value of developing a regional antibiogram for community hospitals.
Multicenter retrospective analysis of antibiograms.
SETTING AND PARTICIPANTS
A total of 20 community hospitals in central and eastern North Carolina and south central Virginia participated in this study.
We combined antibiogram data from participating hospitals for 13 clinically relevant gram-negative pathogen–antibiotic combinations. From this combined antibiogram, we developed a regional antibiogram based on the mean susceptibilities of the combined data.
We combined a total of 69,778 bacterial isolates across 13 clinically relevant gram-negative pathogen–antibiotic combinations (median for each combination, 1100; range, 174–27,428). Across all pathogen–antibiotic combinations, 69% of local susceptibility rates fell within 1 SD of the regional mean susceptibility rate, and 97% of local susceptibilities fell within 2 SD of the regional mean susceptibility rate. No individual hospital had >1 pathogen–antibiotic combination with a local susceptibility rate >2 SD of the regional mean susceptibility rate. All hospitals’ local susceptibility rates were within 2 SD of the regional mean susceptibility rate for low-prevalence pathogens (<500 isolates cumulative for the region).
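The within-1-SD/within-2-SD check described above is straightforward to compute. A minimal sketch for one pathogen–antibiotic combination, with invented susceptibility rates (not the study's data):

```python
import numpy as np

# Hypothetical sketch of the outlier check described above: flag hospitals
# whose local susceptibility rate falls more than 2 SD from the regional
# mean. Rates below are invented for illustration.
local_rates = np.array([88.0, 90.0, 85.0, 91.0, 87.0, 76.0, 89.0])  # % susceptible, per hospital

regional_mean = local_rates.mean()
regional_sd = local_rates.std(ddof=1)
outliers = np.where(np.abs(local_rates - regional_mean) > 2 * regional_sd)[0]
print(f"regional mean {regional_mean:.1f}%, sd {regional_sd:.1f}%, "
      f"outlier hospitals: {outliers.tolist()}")
```

In this toy example only the hospital reporting 76% susceptibility falls outside 2 SD, mirroring how the study identified discordant local rates.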
Small community hospitals frequently cannot develop an accurate antibiogram due to a paucity of local data. A regional antibiogram is likely to provide clinically useful information to community hospitals for low-prevalence pathogens.
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
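The effect of denominator choice on a use rate is simple arithmetic. A hypothetical sketch, with illustrative numbers in which days present run about one-third higher than patient days, as reported above:

```python
# Hypothetical arithmetic sketch of how denominator choice changes an
# antimicrobial use rate; numbers are illustrative, not from the study.
days_of_therapy = 450
patient_days = 1000
days_present = 1333          # ~1/3 higher than patient days

rate_patient_days = days_of_therapy / patient_days * 1000   # DOT per 1,000 patient days
rate_days_present = days_of_therapy / days_present * 1000   # DOT per 1,000 days present
print(round(rate_patient_days), round(rate_days_present))
```

The same antimicrobial consumption yields a substantially lower rate when the larger denominator is used, which is why cross-hospital comparisons must hold the metric fixed.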
To summarize and discuss the logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and the lessons learned that are pertinent to future use of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial.
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States.
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed ultraviolet (UV) disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
We performed a spatial-temporal analysis to assess household risk factors for Ebola virus disease (Ebola) in a remote, severely affected village. We defined a household as a family's shared living space and a case-household as a household with at least one resident who became a suspect, probable, or confirmed Ebola case from 1 August 2014 to 10 October 2014. We used Geographic Information System (GIS) software to calculate inter-household distances, performed space-time cluster analyses, and developed Generalized Estimating Equations (GEE). Village X consisted of 64 households; 42% of households became case-households over the observation period. Two significant space-time clusters occurred among households in the village; temporal effects outweighed spatial effects. GEE demonstrated that the odds of becoming a case-household increased by 4·0% for each additional person per household (P < 0·02) and 2·6% per day (P < 0·07). An increasing number of persons per household, and to a lesser extent, the passage of time after onset of the outbreak were risk factors for household Ebola acquisition, emphasizing the importance of prompt public health interventions that prioritize the most populated households. Using GIS with GEE can reveal complex spatial-temporal risk factors, which can inform prioritization of response activities in future outbreaks.
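The percent changes in odds quoted above come directly from exponentiated GEE coefficients. A hypothetical sketch of that conversion, with the coefficient back-derived for illustration rather than taken from the fitted model:

```python
import math

# Hypothetical sketch converting a GEE log-odds coefficient into the
# percent change in odds quoted above (+4.0% per additional household
# member). The coefficient is back-derived here for illustration.
beta_per_person = math.log(1.040)          # log odds ratio per additional person

odds_ratio = math.exp(beta_per_person)
pct_increase = (odds_ratio - 1.0) * 100.0
print(f"{pct_increase:.1f}% higher odds per additional household member")
```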
Depression and obesity are highly prevalent, have major impacts on public health and frequently co-occur. Recently, we reported that having depression moderates the effect of the FTO gene, suggesting its involvement in the association between depression and obesity.
To confirm these findings by investigating the FTO polymorphism rs9939609 in new cohorts, and subsequently in a meta-analysis.
The sample consists of 6902 individuals with depression and 6799 controls from three replication cohorts and two original discovery cohorts. Linear regression models were performed to test for association between rs9939609 and body mass index (BMI), and for the interaction between rs9939609 and depression status for an effect on BMI. Fixed and random effects meta-analyses were performed using METASOFT.
In the replication cohorts, we observed a significant interaction between FTO, BMI and depression with fixed-effects meta-analysis (β = 0.12, P = 2.7 × 10⁻⁴) and with the Han/Eskin random-effects method (P = 1.4 × 10⁻⁷), but not with traditional random effects (β = 0.1, P = 0.35). When combined with the discovery cohorts, random-effects meta-analysis also supported the interaction (β = 0.12, P = 0.027), which was highly significant under the Han/Eskin model (P = 6.9 × 10⁻⁸). On average, carriers of the risk allele who have depression have a 2.2% higher BMI per risk allele, over and above the main effect of FTO.
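Inverse-variance fixed-effects pooling, the method named above, can be sketched in a few lines. The per-cohort interaction betas and standard errors below are invented for illustration, not the study's estimates:

```python
import numpy as np

# Hypothetical sketch of an inverse-variance fixed-effects meta-analysis:
# each cohort's estimate is weighted by the inverse of its variance.
# Betas and standard errors are invented for illustration.
betas = np.array([0.14, 0.09, 0.13])   # per-cohort interaction estimates
ses = np.array([0.05, 0.06, 0.04])     # per-cohort standard errors

weights = 1.0 / ses**2
pooled_beta = float(np.sum(weights * betas) / np.sum(weights))
pooled_se = float(np.sqrt(1.0 / np.sum(weights)))
print(round(pooled_beta, 3), round(pooled_se, 3))
```

The pooled estimate is pulled toward the most precise cohorts; random-effects methods such as Han/Eskin instead allow the true effect to vary across cohorts.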
This meta-analysis provides additional support for a significant interaction between FTO, depression and BMI, indicating that depression increases the effect of FTO on BMI. The findings provide a useful starting point in understanding the biological mechanism involved in the association between obesity and depression.
Cognitive deficits in schizophrenia have major functional impacts. Modafinil is a cognitive enhancer whose effect in healthy volunteers is well-described, but whose effects on the cognitive deficits of schizophrenia appear to be inconsistent. Two possible reasons for this are that cognitive test batteries vary in their sensitivity, or that the phase of illness may be important, with patients early in their illness responding better.
A double-blind, randomised, placebo-controlled single-dose crossover study of modafinil 200 mg examined this with two cognitive batteries [MATRICS Consensus Cognitive Battery (MCCB) and Cambridge Neuropsychological Test Automated Battery (CANTAB)] in 46 participants with under 3 years’ duration of DSM-IV schizophrenia, on stable antipsychotic medication. In parallel, the same design was used in 28 age-, sex-, and education-matched healthy volunteers. Uncorrected p values were calculated using mixed effects models.
In patients, modafinil significantly improved CANTAB Paired Associate Learning, non-significantly improved efficiency and significantly slowed performance of the CANTAB Stockings of Cambridge spatial planning task. There was no significant effect on any MCCB domain. In healthy volunteers, modafinil significantly increased CANTAB Rapid Visual Processing, Intra-Extra Dimensional Set Shifting and verbal recall accuracy, and MCCB social cognition performance. The only significant differences between groups were in MCCB visual learning.
As in earlier chronic schizophrenia studies, modafinil failed to produce changes in cognition in early psychosis as measured by MCCB. CANTAB proved more sensitive to the effects of modafinil in participants with early schizophrenia and in healthy volunteers. This confirms the importance of selecting the appropriate test battery in treatment studies of cognition in schizophrenia.
To evaluate the impact of multidrug-resistant gram-negative rod (MDR-GNR) infections on mortality and healthcare resource utilization in community hospitals.
Two matched case-control analyses.
Six community hospitals participating in the Duke Infection Control Outreach Network from January 1, 2010, through December 31, 2012.
Adult patients admitted to study hospitals during the study period.
Patients with MDR-GNR bloodstream and urinary tract infections were compared with 2 groups: (1) patients with infections due to non-MDR-GNR and (2) control patients representative of the nonpsychiatric, non-obstetric hospitalized population. Four outcomes were assessed: mortality, direct cost of hospitalization, length of stay, and 30-day readmission rates. Multivariable regression models were created to estimate the effect of MDR status on each outcome measure.
No mortality difference was seen in either analysis. Patients with MDR-GNR infections had 2.03 times higher odds of 30-day readmission than patients with non-MDR-GNR infections (95% CI, 1.04–3.97; P=.04). There was no difference in hospital direct costs between patients with MDR-GNR infections and patients with non-MDR-GNR infections. Compared with control patients, however, hospitalizations for patients with MDR-GNR infections cost $5,320.03 more (95% CI, $2,366.02–$8,274.05; P<.001) and lasted 3.40 days longer (95% CI, 1.41–5.40; P<.001).
Our study provides novel data regarding the clinical and financial impact of MDR gram-negative bacterial infections in community hospitals. There was no difference in mortality between patients with MDR-GNR infections and patients with non-MDR-GNR infections or control patients.
Deficits in gamma aminobutyric acid (GABA) neuron-related markers, including the GABA-synthesizing enzyme GAD67, the calcium-binding protein parvalbumin, the neuropeptide somatostatin, and the transcription factor Lhx6, are most pronounced in a subset of schizophrenia subjects identified as having a ‘low GABA marker’ (LGM) molecular phenotype. Furthermore, schizophrenia shares degrees of genetic liability, clinical features and cortical circuitry abnormalities with schizoaffective disorder and bipolar disorder. Therefore, we determined the extent to which a similar LGM molecular phenotype may also exist in subjects with these disorders.
Transcript levels for GAD67, parvalbumin, somatostatin, and Lhx6 were quantified using quantitative PCR in prefrontal cortex area 9 of 184 subjects with a diagnosis of schizophrenia (n = 39), schizoaffective disorder (n = 23) or bipolar disorder (n = 35), or with a confirmed absence of any psychiatric diagnoses (n = 87). A blinded clustering approach was employed to determine the presence of a LGM molecular phenotype across all subjects.
Approximately 49% of the subjects with schizophrenia, 48% of the subjects with schizoaffective disorder, and 29% of the subjects with bipolar disorder, but only 5% of unaffected subjects, clustered in the cortical LGM molecular phenotype.
These findings support characterizing psychotic and bipolar disorders by cortical molecular phenotype, which may help inform more pathophysiologically based and personalized treatments.
To assess the impact of an emergency intensive care unit (EICU) established concomitantly with a freestanding emergency department (ED) during the aftermath of Hurricane Sandy.
We retrospectively reviewed records of all patients in Bellevue’s EICU from the freestanding ED opening (December 10, 2012) until hospital inpatient reopening (February 7, 2013). Temporal and clinical data, disposition upon EICU arrival, and ultimate disposition were evaluated.
Two hundred twenty-seven patients utilized the EICU, representing approximately 1.8% of freestanding ED patients. Ambulance arrival occurred in 31.6% of all EICU patients. Median length of stay was 11.55 hours; this was significantly longer for patients requiring airborne isolation (25.60 versus 11.37 hours, P<0.0001 by Wilcoxon rank sum test). After stabilization and treatment, 39% of EICU patients had an improvement in their disposition status (P<0.0001 by Wilcoxon signed rank test); upon interhospital transfer, the absolute proportion of patients requiring intensive care unit (ICU) and step-down unit (SDU) resources decreased from 37.8% to 27.1% and from 22.2% to 2.7%, respectively.
An EICU attached to a freestanding ED achieved significant reductions in resource-intensive medical care. Flexible, adaptable care systems should be explored for implementation in disaster response. (Disaster Med Public Health Preparedness. 2016;10:496–502)
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends of SSI prevalence rates over time for methicillin-resistant Staphylococcus aureus (MRSA) and other common pathogens.
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
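As a hypothetical sanity check, the crude ratio of the 2012 and 2008 rates quoted above lands near the model-based estimate; the study's PRR of 0.90 comes from log-binomial regression, not from this simple ratio:

```python
# Hypothetical sanity check: crude ratio of the 2012 and 2008 SSI rates
# quoted above, which lands near (but not exactly at) the model-based
# PRR of 0.90 from the log-binomial regression.
rate_2008 = 0.76   # SSIs per 100 procedures
rate_2012 = 0.69
crude_prr = rate_2012 / rate_2008
print(round(crude_prr, 2))
```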
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.