Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina state health department since 1992. State-coordinated review of HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
Measuring diet choice in grazing animals is challenging, complicating the assessment of feed efficiency in pasture-based systems. Furthermore, animals may modify their intake of a forage species depending on its nutritive value and on their own physiological status. Various fecal markers have been used to estimate feed intake in grazing animals; plant-wax markers such as n-alkanes (ALK) and long-chain alcohols may provide reliable estimates of both dietary choices and intakes, but their use in beef cattle has been relatively limited. The present study was designed to test the reliability of the ALK technique for estimating diet choices in beef heifers. Twenty-two Angus-cross heifers were evaluated at both post-weaning and yearling age. At each age, they were offered both red clover and fescue hay as cubes. Following 3-week acclimation periods, intake of each forage species was measured daily for 10 days. During the final 5 days, fecal grab samples were collected twice daily. Fecal ALK concentrations were adjusted using recovery fractions compiled from the literature. Diet composition was estimated using two statistical methods. Post-weaning, dietary choices were reliably estimated, with low residual error, regardless of the statistical approach adopted. The slope of the regression of observed on estimated red clover proportion ranged from 0.85±0.08 (fecal samples collected in the p.m.) to 1.01±0.09 (daily proportions averaged across samplings). At yearling age, however, the estimates were less reliable: there was a tendency to overestimate the red clover proportion in diets of heifers preferring fescue, and vice versa, owing to greater variability in fecal ALK concentrations in the yearling heifers. Overall, the ALK technique provided a reliable tool for estimating diet choice in animals fed a simple forage diet. Although further refinements in the application of this methodology are needed, plant-wax markers offer opportunities for evaluating diet composition in cattle grazing systems.
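The least-squares step behind such diet-composition estimates is straightforward to illustrate. Below is a minimal Python sketch of one common formulation; all alkane concentrations and recovery fractions are hypothetical illustrative values, not the study's data or code.

```python
# Sketch: least-squares diet-composition estimation from fecal n-alkane
# profiles. All concentrations and recovery fractions are hypothetical.
import numpy as np
from scipy.optimize import nnls

# Rows: n-alkanes (e.g., C27, C29, C31, C33); columns: red clover, fescue.
diet_alkanes = np.array([
    [45.0, 12.0],
    [180.0, 95.0],
    [220.0, 310.0],
    [60.0, 150.0],
])
recovery = np.array([0.75, 0.82, 0.88, 0.92])   # fecal recovery fractions
fecal = np.array([55.0, 210.0, 420.0, 160.0])   # observed fecal concentrations

# Adjust fecal concentrations for incomplete recovery, then solve a
# non-negative least-squares problem for the dietary proportions.
adjusted_fecal = fecal / recovery
coef, _ = nnls(diet_alkanes, adjusted_fecal)
proportions = coef / coef.sum()                  # normalise to sum to 1
print(f"estimated red clover proportion: {proportions[0]:.2f}")
```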
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings on the association between serum 25-hydroxyvitamin D (25(OH)D) and pulmonary function. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
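The pooling step described here ("results were combined using fixed-effects meta-analysis") is an inverse-variance weighted average of the per-cohort slopes. A minimal sketch with invented per-cohort estimates:

```python
# Sketch: fixed-effects (inverse-variance) meta-analysis of per-cohort slopes.
# The betas and standard errors below are hypothetical, not cohort results.
import numpy as np

# Per-cohort slope of FEV1 (ml) per 1 nmol/l higher 25(OH)D, with SEs.
betas = np.array([1.2, 0.9, 1.4, 1.0])
ses = np.array([0.3, 0.2, 0.5, 0.25])

weights = 1.0 / ses**2                                 # inverse-variance weights
beta_pooled = np.sum(weights * betas) / np.sum(weights)
se_pooled = np.sqrt(1.0 / np.sum(weights))
lo, hi = beta_pooled - 1.96 * se_pooled, beta_pooled + 1.96 * se_pooled
print(f"pooled beta = {beta_pooled:.2f} ml (95% CI {lo:.2f}, {hi:.2f})")
```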
To determine the feasibility and value of developing a regional antibiogram for community hospitals.
Multicenter retrospective analysis of antibiograms.
SETTING AND PARTICIPANTS
A total of 20 community hospitals in central and eastern North Carolina and south central Virginia participated in this study.
We combined antibiogram data from participating hospitals for 13 clinically relevant gram-negative pathogen–antibiotic combinations. From this combined antibiogram, we developed a regional antibiogram based on the mean susceptibilities of the combined data.
We combined a total of 69,778 bacterial isolates across 13 clinically relevant gram-negative pathogen–antibiotic combinations (median per combination, 1,100; range, 174–27,428). Across all pathogen–antibiotic combinations, 69% of local susceptibility rates fell within 1 SD of the regional mean susceptibility rate, and 97% fell within 2 SD. No individual hospital had more than 1 pathogen–antibiotic combination with a local susceptibility rate more than 2 SD from the regional mean. All hospitals’ local susceptibility rates were within 2 SD of the regional mean for low-prevalence pathogens (<500 isolates cumulatively for the region).
Small community hospitals frequently cannot develop an accurate antibiogram due to a paucity of local data. A regional antibiogram is likely to provide clinically useful information to community hospitals for low-prevalence pathogens.
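The core comparison in this analysis, each hospital's local susceptibility rate against the regional mean and SD, can be sketched as follows. The hospital names and rates are invented, and the sketch shows only the SD comparison, not the full weighting of the combined antibiogram.

```python
# Sketch: flag hospitals whose local susceptibility rate falls more than
# 2 SD from the regional mean for one pathogen-antibiotic combination.
# Hospitals and rates below are hypothetical.
import statistics

local_rates = {"Hospital A": 78.0, "Hospital B": 82.0, "Hospital C": 74.0,
               "Hospital D": 80.0, "Hospital E": 79.0, "Hospital F": 81.0,
               "Hospital G": 77.0, "Hospital H": 52.0}

regional_mean = statistics.mean(local_rates.values())
regional_sd = statistics.stdev(local_rates.values())

for hospital, rate in sorted(local_rates.items()):
    z = (rate - regional_mean) / regional_sd
    status = "outside" if abs(z) > 2 else "within"
    print(f"{hospital}: {rate:.1f}% susceptible, {status} 2 SD of "
          f"regional mean {regional_mean:.1f}%")
```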
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
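As a concrete illustration of why the two denominators diverge, especially for short stays, consider this small sketch; the stays and therapy days are invented.

```python
# Sketch: how the choice of denominator changes an antimicrobial use rate.
# Patient days count each night in hospital (midnight-census style); days
# present count every calendar day with any presence on the unit.
from datetime import date

# (admission, discharge) for a handful of short stays on one unit.
stays = [(date(2024, 1, 1), date(2024, 1, 2)),
         (date(2024, 1, 1), date(2024, 1, 4)),
         (date(2024, 1, 3), date(2024, 1, 3))]  # same-day stay

patient_days = sum((d - a).days for a, d in stays)
days_present = sum((d - a).days + 1 for a, d in stays)

days_of_therapy = 9  # antimicrobial days of therapy (DOT) on the unit

for name, denom in [("patient days", patient_days), ("days present", days_present)]:
    print(f"DOT per 1000 {name}: {1000 * days_of_therapy / denom:.0f} (denom={denom})")
```

Note how the same-day stay contributes zero patient days but one day present, which is why units with short lengths of stay show the largest gap between the two rates.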
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV devices poses unique challenges, including time pressure from bed control personnel, the need to identify eligible rooms efficiently, negative perceptions from nurse managers, and high discharge volume. In the course of the BETR Disinfection Study, we used several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication among EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking compliance and providing feedback. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared with the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
We performed a spatial-temporal analysis to assess household risk factors for Ebola virus disease (Ebola) in a remote, severely affected village. We defined a household as a family's shared living space and a case-household as a household with at least one resident who became a suspect, probable, or confirmed Ebola case from 1 August 2014 to 10 October 2014. We used Geographic Information System (GIS) software to calculate inter-household distances, performed space-time cluster analyses, and fitted generalized estimating equation (GEE) models. Village X consisted of 64 households; 42% of households became case-households over the observation period. Two significant space-time clusters occurred among households in the village; temporal effects outweighed spatial effects. The GEE analysis demonstrated that the odds of becoming a case-household increased by 4·0% for each additional person per household (P < 0·02) and by 2·6% per day (P < 0·07). An increasing number of persons per household and, to a lesser extent, the passage of time after onset of the outbreak were risk factors for household Ebola acquisition, emphasizing the importance of prompt public health interventions that prioritize the most populated households. Using GIS with GEE can reveal complex spatial-temporal risk factors, which can inform prioritization of response activities in future outbreaks.
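The GEE analysis described here has a simple structure: repeated household-level observations over time with within-household correlation. A minimal sketch using statsmodels might look like the following; the data frame is synthetic and the column names are assumptions, not the study's.

```python
# Sketch: logistic GEE of household case status on household size and time,
# with an exchangeable within-household correlation structure. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_households, n_days = 64, 10
df = pd.DataFrame({
    "household": np.repeat(np.arange(n_households), n_days),
    "day": np.tile(np.arange(n_days), n_households),
    "n_residents": np.repeat(rng.integers(1, 12, n_households), n_days),
})
logit = -3 + 0.04 * df["n_residents"] + 0.03 * df["day"]
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.gee("case ~ n_residents + day", groups="household", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()
# Exponentiated coefficients give odds ratios per extra resident and per day.
print(np.exp(result.params))
```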
Cognitive deficits in schizophrenia have major functional impacts. Modafinil is a cognitive enhancer whose effect in healthy volunteers is well-described, but whose effects on the cognitive deficits of schizophrenia appear to be inconsistent. Two possible reasons for this are that cognitive test batteries vary in their sensitivity, or that the phase of illness may be important, with patients early in their illness responding better.
A double-blind, randomised, placebo-controlled, single-dose crossover study of modafinil 200 mg examined these possibilities with two cognitive batteries [MATRICS Consensus Cognitive Battery (MCCB) and Cambridge Neuropsychological Test Automated Battery (CANTAB)] in 46 participants with under 3 years’ duration of DSM-IV schizophrenia, on stable antipsychotic medication. In parallel, the same design was used in 28 age-, sex-, and education-matched healthy volunteers. Uncorrected p values were calculated using mixed effects models.
In patients, modafinil significantly improved CANTAB Paired Associate Learning, non-significantly improved efficiency, and significantly slowed performance on the CANTAB Stockings of Cambridge spatial planning task. There was no significant effect on any MCCB domain. In healthy volunteers, modafinil significantly improved CANTAB Rapid Visual Processing, Intra-Extra Dimensional Set Shifting, and verbal recall accuracy, as well as MCCB social cognition performance. The only significant differences between groups were in MCCB visual learning.
As in earlier chronic schizophrenia studies, modafinil failed to produce changes in cognition in early psychosis as measured by MCCB. CANTAB proved more sensitive to the effects of modafinil in participants with early schizophrenia and in healthy volunteers. This confirms the importance of selecting the appropriate test battery in treatment studies of cognition in schizophrenia.
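For reference, the mixed-effects analysis of a single-dose crossover like this one can be sketched with fixed effects for drug and period and a random intercept per participant. The data below are simulated, with drug order counterbalanced across participants; none of it is the study's data.

```python
# Sketch: mixed-effects model for a two-period crossover with a random
# intercept per participant. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 46
period = np.tile([1, 2], n)
drug_first = np.repeat(rng.integers(0, 2, n), 2)     # counterbalanced order
drug = np.where(drug_first == 1, 2 - period, period - 1)  # 1 = modafinil
df = pd.DataFrame({"subject": np.repeat(np.arange(n), 2),
                   "period": period, "drug": drug})
subject_effect = np.repeat(rng.normal(0, 1, n), 2)   # between-subject variation
df["score"] = 10 + 0.5 * df["drug"] + subject_effect + rng.normal(0, 1, 2 * n)

model = smf.mixedlm("score ~ drug + period", df, groups=df["subject"])
print(model.fit().summary())
```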
To evaluate the impact of multidrug-resistant gram-negative rod (MDR-GNR) infections on mortality and healthcare resource utilization in community hospitals.
Two matched case-control analyses.
Six community hospitals participating in the Duke Infection Control Outreach Network from January 1, 2010, through December 31, 2012.
Adult patients admitted to study hospitals during the study period.
Patients with MDR-GNR bloodstream and urinary tract infections were compared with 2 groups: (1) patients with infections due to non-MDR-GNRs and (2) control patients representative of the nonpsychiatric, nonobstetric hospitalized population. Four outcomes were assessed: mortality, direct cost of hospitalization, length of stay, and 30-day readmission rates. Multivariable regression models were created to estimate the effect of MDR status on each outcome measure.
No mortality difference was seen in either analysis. Patients with MDR-GNR infections had 2.03 times the odds of 30-day readmission compared with patients with non-MDR-GNR infections (95% CI, 1.04–3.97; P=.04). There was no difference in hospital direct costs between patients with MDR-GNR infections and patients with non-MDR-GNR infections. Hospitalizations for patients with MDR-GNR infections cost $5,320.03 more (95% CI, $2,366.02–$8,274.05; P<.001) and lasted 3.40 days longer (95% CI, 1.41–5.40; P<.001) than hospitalizations for control patients.
Our study provides novel data regarding the clinical and financial impact of MDR gram-negative bacterial infections in community hospitals. There was no difference in mortality between patients with MDR-GNR infections and patients with non-MDR-GNR infections or control patients.
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends of SSI prevalence rates over time for methicillin-resistant Staphylococcus aureus (MRSA) and other common pathogens.
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.
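The log-binomial trend model used in this analysis differs from ordinary logistic regression only in its link function, so exponentiated coefficients are prevalence rate ratios rather than odds ratios. A minimal sketch on simulated data:

```python
# Sketch: log-binomial regression of SSI risk on calendar year, yielding a
# prevalence rate ratio (PRR) per year. Data are simulated, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({"year": rng.integers(0, 5, 50_000)})  # 0 = 2008 ... 4 = 2012
df["ssi"] = rng.binomial(1, 0.0076 * 0.9 ** df["year"])  # ~0.76% falling ~10%/yr

# Binomial family with a log (not logit) link gives risk/prevalence ratios.
model = smf.glm("ssi ~ year", data=df,
                family=sm.families.Binomial(link=sm.families.links.Log()))
result = model.fit()
print(np.exp(result.params["year"]))  # PRR per year
```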
To determine whether daily chlorhexidine gluconate (CHG) bathing of intensive care unit (ICU) patients leads to a decrease in hospital-acquired infections (HAIs), particularly infections caused by methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE).
Interrupted time series analysis.
The study included 33 community hospitals participating in the Duke Infection Control Outreach Network from January 2008 through December 2013.
All ICU patients at study hospitals during the study period.
Of the 33 hospitals, 17 hospitals implemented CHG bathing during the study period, and 16 hospitals that did not perform CHG bathing served as controls. Primary pre-specified outcomes included ICU central-line–associated bloodstream infections (CLABSIs), primary bloodstream infections (BSI), ventilator-associated pneumonia (VAP), and catheter-associated urinary tract infections (CAUTIs). MRSA and VRE HAIs were also evaluated.
CHG bathing was associated with a significant downward trend in incidence rates of ICU CLABSI (incidence rate ratio [IRR], 0.96; 95% confidence interval [CI], 0.93–0.99), ICU primary BSI (IRR, 0.96; 95% CI, 0.94–0.99), VRE CLABSIs (IRR, 0.97; 95% CI, 0.97–0.98), and all combined VRE infections (IRR, 0.96; 95% CI, 0.93–1.00). No significant trend in MRSA infection incidence rates was identified before or after the implementation of CHG bathing.
In this multicenter, real-world analysis of the impact of CHG bathing, hospitals that implemented CHG bathing attained a decrease in ICU CLABSIs, ICU primary BSIs, and VRE CLABSIs. CHG bathing did not affect rates of specific or overall infections due to MRSA. Our findings support daily CHG bathing of ICU patients.
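An interrupted time series of this kind is commonly fit as a segmented Poisson regression with an exposure offset, so coefficients exponentiate to incidence rate ratios. The sketch below is one such formulation; the counts, denominators, and change point are all simulated, not the study's data.

```python
# Sketch: segmented (interrupted time series) Poisson model for monthly ICU
# CLABSI counts with a central-line-day offset. All values are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
month = np.arange(72)                        # Jan 2008 - Dec 2013
post = (month >= 36).astype(int)             # CHG bathing begins at month 36
months_post = np.where(post == 1, month - 36, 0)
line_days = rng.integers(800, 1200, 72)      # central-line days per month
true_rate = np.exp(-6.0 + 0.001 * month - 0.004 * months_post)
df = pd.DataFrame({"clabsi": rng.poisson(true_rate * line_days),
                   "month": month, "post": post, "months_post": months_post})

model = smf.glm("clabsi ~ month + post + months_post", data=df,
                family=sm.families.Poisson(), offset=np.log(line_days))
print(np.exp(model.fit().params["months_post"]))  # IRR trend change after CHG
```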
Major depressive disorder (MDD) is a common and disabling condition with well-established heritability and environmental risk factors. Gene–environment interaction studies in MDD have typically investigated candidate genes, though the disorder is known to be highly polygenic. This study aims to test for interaction between polygenic risk and stressful life events (SLEs) or childhood trauma (CT) in the aetiology of MDD.
The RADIANT UK sample consists of 1605 MDD cases and 1064 controls with SLE data, and a subset of 240 cases and 272 controls with CT data. Polygenic risk scores (PRS) were constructed using results from a mega-analysis on MDD by the Psychiatric Genomics Consortium. PRS and environmental factors were tested for association with case/control status and for interaction between them.
PRS significantly predicted depression, explaining 1.1% of variance in phenotype (p = 1.9 × 10⁻⁶). SLEs and CT were also associated with MDD status (p = 2.19 × 10⁻⁴ and p = 5.12 × 10⁻²⁰, respectively). No interactions were found between PRS and SLEs. A significant PRS × CT interaction was found (p = 0.002), but it showed an inverse association with MDD status: cases who had experienced more severe CT tended to have lower PRS than other cases or controls. This relationship between PRS and CT was not observed in independent replication samples.
CT is a strong risk factor for MDD but may have greater effect in individuals with lower genetic liability for the disorder. Including environmental risk along with genetics is important in studying the aetiology of MDD and PRS provide a useful approach to investigating gene–environment interactions in complex traits.
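The interaction test described above reduces to a product term in a logistic model. A minimal sketch on simulated data (not RADIANT UK values; the column names and effect sizes are assumptions):

```python
# Sketch: testing a polygenic risk score x childhood trauma interaction with
# logistic regression. Data are simulated; a negative prs:ct coefficient
# would mirror the inverse interaction reported in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({"prs": rng.normal(0, 1, n),      # standardised PRS
                   "ct": rng.integers(0, 2, n)})    # childhood trauma (0/1)
logit = -0.5 + 0.3 * df["prs"] + 1.0 * df["ct"] - 0.4 * df["prs"] * df["ct"]
df["mdd"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

result = smf.logit("mdd ~ prs * ct", data=df).fit(disp=False)
print(result.params["prs:ct"], result.pvalues["prs:ct"])
```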
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
Retrospective cohort study
A total of 43 community hospitals located in the southeastern United States.
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008 and December 31, 2012.
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age.
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentile (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about the associations of negative cognition, metacognitive beliefs, and negative emotions with paranoid ideation and with the belief that persecution is deserved (deservedness).
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
Our sample of at-risk mental state participants was less paranoid than the psychiatric in-patients but reported higher levels of ‘bad me’ deservedness. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively related to deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.