Introduction: Overcrowding in the Emergency Department (ED) results in delays in care and increased patient morbidity and mortality. Innovative departmental approaches have the potential to make patient flow through the ED more efficient and reduce overcrowding by improving patient throughput. The Calgary zone ED recently piloted a new physician role, the Emergency Physician Lead (EPL), a senior physician working closely with the charge nurse and consulting services to provide physician leadership, and to troubleshoot flow issues and safety breaches such as EMS offload delays and long emergency inpatient (EIP) stays. The objective of this study was to evaluate the efficacy of the EPL by determining its effect on key metrics of patient flow, and by identifying which specific EPL interventions were most effective at improving patient throughput. Methods: A retrospective cohort design was used to compare Foothills Medical Centre (FMC) ED patients seen by the EPL from March–June 2019 (n = 1343 patients) with a control group from the same period in 2018 (n = 5530). An EMR search was used to collect patient data and generate descriptive statistics, which were compared between groups by Mann-Whitney U-test. Patient handover notes left by the EPL were also collected and analyzed by two independent assessors to develop a list of actions taken by the EPL. Each patient was then coded based on the actions in the handover note, and means for each coded group were compared to control to find correlations between action and changes in key flow metrics. Results: Patients whose care involved the EPL had a 40% shorter average ED length of stay (ELOS) compared to control (515 vs 865 min, p < 0.001). The EPL was especially effective for patients with ELOS above the 90th percentile, with a 58% relative reduction.
EPL patients also had lower average times from first contact with the department to first order being placed (79 vs 143 min, p < 0.001), and spent less time as EIPs after being admitted (390 vs 515 min, p < 0.001). EPL actions aimed at early ordering of investigations or early management showed the largest relative reductions in ELOS, followed by actions related to resolving issues with consulting services (56% and 48% respectively, p < 0.001). Conclusion: The EPL role appears to be associated with improvements in several key metrics of patient flow. Specific EPL actions were correlated with marked decreases in length of stay. The EPL may be an effective strategy to improve patient throughput and combat ED overcrowding.
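The between-group comparison described above can be sketched with SciPy's Mann-Whitney U test. The group sizes and approximate means below echo the abstract, but the gamma-distributed stay times are synthetic, so this is an illustration of the method rather than a reproduction of the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)

# Synthetic ED length-of-stay samples in minutes; group sizes and means
# mirror the abstract (EPL n = 1343, ~515 min; control n = 5530, ~865 min)
epl = rng.gamma(shape=4.0, scale=515 / 4.0, size=1343)
control = rng.gamma(shape=4.0, scale=865 / 4.0, size=5530)

stat, p = mannwhitneyu(epl, control, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.2e}")
```

With samples this large and means this far apart, the test rejects at any conventional level, consistent with the reported p < 0.001.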
Clostridioides difficile infection (CDI) can be prevented through infection prevention practices and antibiotic stewardship. Diagnostic stewardship (ie, strategies to improve use of microbiological testing) can also improve antibiotic use. However, little is known about the use of such practices in US hospitals, especially after multidisciplinary stewardship programs became a requirement for US hospital accreditation in 2017. Thus, we surveyed US hospitals to assess antibiotic stewardship program composition, practices related to CDI, and diagnostic stewardship.
Surveys were mailed to infection preventionists at 900 randomly sampled US hospitals between May and October 2017. Hospitals were surveyed on antibiotic stewardship programs; CDI prevention, treatment, and testing practices; and diagnostic stewardship strategies. Responses were compared by hospital bed size using weighted logistic regression.
Overall, 528 surveys were completed (59% response rate). Almost all (95%) responding hospitals had an antibiotic stewardship program. Smaller hospitals were less likely to have stewardship team members with infectious diseases (ID) training, and only 41% of hospitals met The Joint Commission accreditation standards for multidisciplinary teams. Guideline-recommended CDI prevention practices were common. Smaller hospitals were less likely to use high-tech disinfection devices, fecal microbiota transplantation, or diagnostic stewardship strategies.
Following changes in accreditation standards, nearly all US hospitals now have an antibiotic stewardship program. However, many hospitals, especially smaller hospitals, appear to struggle with access to ID expertise and with deploying diagnostic stewardship strategies. CDI prevention could be enhanced through diagnostic stewardship and by emphasizing the role of non–ID-trained pharmacists and clinicians in antibiotic stewardship.
There are no estimates of the heritability of phenotypic udder traits in suckler sheep, which produce meat lambs, or of whether these traits are associated with resilience to mastitis. Mastitis is a common disease which damages the mammary gland and reduces productivity. The aims of this study were to investigate the feasibility of collecting udder phenotypes, their heritability and their association with mastitis in suckler ewes. Udder and teat conformation, teat lesions, intramammary masses (IMM) and litter size were recorded from 10 Texel flocks in Great Britain between 2012 and 2014; 968 records were collected. Pedigree data were obtained from an online pedigree recording system. Univariate quantitative genetic parameters were estimated using animal and sire models. Linear mixed models were used to analyse continuous traits and generalised linear mixed models were used to analyse binary traits. Continuous traits had higher heritabilities than binary traits, with teat placement and teat length heritability (h2) highest at 0.35 (SD 0.04) and 0.42 (SD 0.04), respectively. Udder width, drop and separation heritabilities were lower and varied with udder volume. The heritabilities of IMM and teat lesions (sire model) were 0.18 (SD 0.12) and 0.17 (SD 0.11), respectively. All heritabilities were sufficiently high for inclusion in a selection programme to increase resilience to mastitis in the population of Texel sheep. Further studies are required to investigate genetic relationships between traits, to determine whether udder traits predict IMM, and to assess the potential benefits of including these traits in a selection programme to increase resilience to chronic mastitis.
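The heritability estimates above come from animal and sire models fitted to the pedigree data. The variance-component arithmetic behind such estimates can be sketched with the standard textbook formulas; this is a simplification of the mixed-model estimation actually used, and the variance components below are hypothetical values chosen to land near the reported figures.

```python
def h2_animal(var_additive: float, var_residual: float) -> float:
    # Animal model: heritability is the additive genetic fraction
    # of total phenotypic variance.
    return var_additive / (var_additive + var_residual)

def h2_sire(var_sire: float, var_residual: float) -> float:
    # Sire model: sire variance captures 1/4 of the additive genetic
    # variance (paternal half-sibs share 1/4 of their genes identical
    # by descent), hence the factor of 4.
    return 4 * var_sire / (var_sire + var_residual)

# Hypothetical components chosen to land near the reported values
print(round(h2_animal(0.35, 0.65), 2))   # teat placement, ~0.35
print(round(h2_sire(0.045, 0.955), 2))   # IMM (sire model), ~0.18
```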
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of a RI threshold of ⩾100 days minimised time-to-detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
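The daily evaluation step — comparing scan signals against the embedded outbreaks — reduces to a confusion matrix over days. The sketch below uses made-up recurrence intervals and a hypothetical outbreak window; the actual cluster detection uses the space-time permutation scan statistic (e.g. in SaTScan), which this simplification does not attempt to reproduce.

```python
def daily_metrics(recurrence_intervals, outbreak_days, threshold):
    """Classify each day's signal (RI >= threshold) against the truth."""
    signals = [ri >= threshold for ri in recurrence_intervals]
    tp = sum(s and o for s, o in zip(signals, outbreak_days))
    fp = sum(s and not o for s, o in zip(signals, outbreak_days))
    fn = sum(not s and o for s, o in zip(signals, outbreak_days))
    tn = sum(not s and not o for s, o in zip(signals, outbreak_days))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if tp + fp else float("nan"),
        "npv": tn / (tn + fn),
    }

# Hypothetical week: RIs grow as a simulated outbreak (days 3-7) accrues cases
ris = [5, 25, 40, 150, 400, 900, 2000]
truth = [False, False, True, True, True, True, True]
print(daily_metrics(ris, truth, threshold=20))   # more sensitive
print(daily_metrics(ris, truth, threshold=100))  # more specific
```

Raising the RI threshold trades sensitivity and negative predictive value for specificity and positive predictive value, which is the pattern the study reports.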
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
To integrate electronic clinical decision support tools into clinical practice and to evaluate the impact on indwelling urinary catheter (IUC) use and catheter-associated urinary tract infections (CAUTIs).
Design, Setting, and Participants
This 4-phase observational study included all inpatients at a multicampus, academic medical center between 2011 and 2015.
Phase 1 comprised best practices training and standardization of electronic documentation. Phase 2 comprised real-time electronic tracking of IUC duration. In phase 3, a triggered alert reminded clinicians of IUC duration. In phase 4, a new IUC order (1) introduced automated order expiration and (2) required consideration of alternatives and selection of an appropriate indication.
Overall, 2,121 CAUTIs, 179,070 new catheters, 643,055 catheter days, and 2,186 reinsertions occurred in 3·85 million hospitalized patient days during the study period. The CAUTI rate per 10,000 patient days decreased incrementally in each phase from 9·06 in phase 1 to 1·65 in phase 4 (relative risk [RR], 0·182; 95% confidence interval [CI], 0·153–0·216; P<·001). New catheters per 1,000 patient days declined from 53·4 in phase 1 to 39·5 in phase 4 (RR, 0·740; 95% CI, 0·730; P<·001), and catheter days per 1,000 patient days decreased from 194·5 in phase 1 to 140·7 in phase 4 (RR, 0·723; 95% CI, 0·719–0·728; P<·001). The reinsertion rate declined from 3·66% in phase 1 to 3·25% in phase 4 (RR, 0·894; 95% CI, 0·834–0·959; P=·0017).
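The comparisons above are rate ratios with log-scale confidence intervals. A minimal sketch of that arithmetic, assuming a standard Poisson approximation; the per-phase event counts are not given in the abstract, so the helper is demonstrated only on the reported rates.

```python
import math

def rate_ratio(events_1, time_1, events_0, time_0, z=1.96):
    """Rate ratio with a log-scale (Poisson) 95% confidence interval."""
    rr = (events_1 / time_1) / (events_0 / time_0)
    se = math.sqrt(1 / events_1 + 1 / events_0)  # SE of log(rr)
    return (rr,
            math.exp(math.log(rr) - z * se),
            math.exp(math.log(rr) + z * se))

# The ratio of the reported phase-4 and phase-1 CAUTI rates (per 10,000
# patient days) reproduces the published relative risk:
print(round(1.65 / 9.06, 3))  # 0.182, matching RR = 0.182
```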
The phased introduction of decision support tools was associated with progressive declines in new catheters, total catheter days, and CAUTIs. Clinical decision support tools offer a viable and scalable intervention to target hospital-wide IUC use and hold promise for other quality improvement initiatives.
Collaborative programs have helped reduce catheter-associated urinary tract infection (CAUTI) rates in community-based nursing homes. We assessed whether collaborative participation produced similar benefits among Veterans Health Administration (VHA) nursing homes, which are part of an integrated system.
This study included 63 VHA nursing homes enrolled in the “AHRQ Safety Program for Long-Term Care,” which focused on practices to reduce CAUTI.
Changes in CAUTI rates, catheter utilization, and urine culture orders were assessed from June 2015 through May 2016. Multilevel mixed-effects negative binomial regression was used to derive incidence rate ratios (IRRs) representing changes over the 12-month program period.
There was no significant change in CAUTI among VHA sites, with a CAUTI rate of 2.26 per 1,000 catheter days at month 1 and a rate of 3.19 at month 12 (incidence rate ratio [IRR], 0.99; 95% confidence interval [CI], 0.67–1.44). Results were similar for catheter utilization rates, which were 11.02% at month 1 and 11.30% at month 12 (IRR, 1.02; 95% CI, 0.95–1.09). The numbers of urine cultures per 1,000 residents were 5.27 in month 1 and 5.31 in month 12 (IRR, 0.93; 95% CI, 0.82–1.05).
No changes in CAUTI rates, catheter use, or urine culture orders were found during the program period. One potential reason was the relatively low baseline CAUTI rate, as compared with a cohort of community-based nursing homes. This low baseline rate is likely related to the VHA’s prior CAUTI prevention efforts. While broad-scale collaborative approaches may be effective in some settings, targeting higher-prevalence safety issues may be warranted at sites already engaged in extensive infection prevention efforts.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
The increased use of the MATRICS Consensus Cognitive Battery (MCCB) to investigate cognitive dysfunctions in schizophrenia fostered interest in its sensitivity in the context of family studies. As various measures of the same cognitive domains may have different power to distinguish between unaffected relatives of patients and controls, the relative sensitivity of MCCB tests for relative–control differences has to be established. We compared MCCB scores of 852 outpatients with schizophrenia (SCZ) with those of 342 unaffected relatives (REL) and a normative Italian sample of 774 healthy subjects (HCS). We examined familial aggregation of cognitive impairment by investigating within-family prediction of MCCB scores based on probands’ scores.
Multivariate analysis of variance was used to analyze group differences in adjusted MCCB scores. Weighted least-squares analysis was used to investigate whether probands’ MCCB scores predicted REL neurocognitive performance.
SCZ were significantly impaired on all MCCB domains. REL had intermediate scores between SCZ and HCS, showing a similar pattern of impairment, except for social cognition. Probands' scores significantly predicted REL MCCB scores on all domains except for visual learning.
In a large sample of stable patients with schizophrenia, living in the community, and in their unaffected relatives, MCCB demonstrated sensitivity to cognitive deficits in both groups. Our findings of significant within-family prediction of MCCB scores might reflect disease-related genetic or environmental factors.
Introduction: Many barriers exist to integrating smoking cessation into the delivery of lung cancer screening, including limited provider time and patient misconceptions.
Aims: To demonstrate that proactive outreach from a telephone counsellor outside of the patient's usual care team is feasible and acceptable to patients.
Methods: Smokers undergoing lung cancer screening were approached for a telephone counselling study. Patients agreeing to participate in the intervention (n = 27) received two telephone counselling sessions. A 30-day follow-up evaluation was conducted, which also included screening participants receiving usual care (n = 56).
Results/Findings: Most (89%) intervention participants reported being satisfied with the proactive calls, and 81% reported the sessions were helpful. Use of behavioural cessation support programs was four times higher in the intervention group (44%) than in the usual care group (11%; Relative Risk (RR) = 4.1; 95% CI: 1.7 to 9.9), and seven-day abstinence was double in the intervention group (19%) compared to the usual care group (7%; RR = 2.6; 95% CI: 0.8 to 8.9).
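Those relative risks can be reproduced from the reported percentages and group sizes. Assuming the underlying counts were 12/27 vs 6/56 for support-program use and 5/27 vs 4/56 for abstinence — a reconstruction, since the abstract gives only percentages — the standard log-scale confidence interval recovers the published intervals:

```python
import math

def risk_ratio(a, n1, b, n2, z=1.96):
    """Risk ratio for two proportions with a log-scale 95% CI."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(rr)
    return (rr,
            math.exp(math.log(rr) - z * se),
            math.exp(math.log(rr) + z * se))

print([round(x, 1) for x in risk_ratio(12, 27, 6, 56)])  # [4.1, 1.7, 9.9]
print([round(x, 1) for x in risk_ratio(5, 27, 4, 56)])   # [2.6, 0.8, 8.9]
```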
Conclusions: This practical telephone-based approach included risk messages clarifying the continued risks of smoking in the context of screening results. The findings suggest such messaging can boost utilisation of evidence-based tobacco treatment, increase self-efficacy, and potentially increase the likelihood of successful quitting.
Mycobacterium marinum, a bacterium found in freshwater and saltwater, can infect persons with direct exposure to fish or aquariums. During December 2013, the New York City Department of Health and Mental Hygiene learned of four suspected or confirmed M. marinum skin or soft tissue infections (SSTIs) among persons who purchased whole fish from Chinese markets. Ninety-eight case-patients with non-tuberculous mycobacteria (NTM) SSTIs were identified, with onset during June 2013–March 2014. Of these, 77 (79%) were female. The median age was 62 years (range 30–91). Whole genome sequencing of clinical isolates revealed two main clusters and marked genetic diversity. Environmental samples from distributors yielded NTM, though not M. marinum. We compared 56 case-patients with 185 control subjects who shopped in Chinese markets, frequency-matched by age group and sex. Risk factors for infection included skin injury to the finger or hand (odds ratio [OR]: 15·5; 95% confidence interval [CI]: 6·9–37·3), hand injury while preparing fish or seafood (OR 8·3; 95% CI 3·8–19·1), and purchasing tilapia (OR 3·6; 95% CI 1·1–13·9) or whiting (OR 2·7; 95% CI 1·1–6·6). A definitive environmental outbreak source was not identified.
The impact of healthcare system integration on infection prevention programs is unknown. Using catheter-associated urinary tract infection (CAUTI) prevention as an example, we hypothesize that US Department of Veterans Affairs (VA) nursing homes have a more robust infection prevention infrastructure due to integration and centralization compared with non–VA nursing homes.
VA and non-VA nursing homes participating in the AHRQ Safety Program for Long-Term Care collaborative.
Nursing homes provided baseline information about their infection prevention programs to assess strengths and gaps related to CAUTI prevention via a needs assessment questionnaire.
A total of 353 of 494 nursing homes from 41 states (71%; 47 VA and 306 non-VA facilities) responded. VA nursing homes reported more hours per week devoted to infection prevention-related activities (31 vs 12 hours; P<.001) and were more likely to have committees that reviewed healthcare-associated infections. Compared with non-VA facilities, a higher percentage of VA nursing homes reported tracking CAUTI rates (94% vs 66%; P<.001), sharing CAUTI data with leadership (94% vs 70%; P=.014) and with nursing personnel (85% vs 56%; P=.003). However, fewer VA nursing homes reported having policies for appropriate catheter use (64% vs 81%; P=.004) and catheter insertion (83% vs 94%; P=.004).
Among nursing homes participating in an AHRQ-funded collaborative, VA and non-VA nursing homes differed in their approach to CAUTI prevention. Best practices from both settings should be applied universally to create an optimal infection prevention program within emerging integrated healthcare systems.
Toxigenic strains of Vibrio cholerae serogroups O1 and O139 have caused cholera epidemics, but other serogroups – such as O75 or O141 – can also produce cholera toxin and cause severe watery diarrhoea similar to cholera. We describe 31 years of surveillance for toxigenic non-O1, non-O139 infections in the United States and map these infections to the state where the exposure probably originated. While serogroups O75 and O141 are closely related pathogens, they differ in how and where they infect people. Oysters were the main vehicle for O75 infection. The vehicles for O141 infection include oysters, clams, and freshwater in lakes and rivers. The patients infected with serogroup O75 who had food traceback information available ate raw oysters from Florida. Patients infected with O141 ate oysters from Florida and clams from New Jersey, and those who only reported being exposed to freshwater were exposed in Arizona, Michigan, Missouri, and Texas. Improving the safety of oysters, specifically, should help prevent future illnesses from these toxigenic strains and similar pathogenic Vibrio species. Post-harvest processing of raw oysters, such as individual quick freezing, heat-cool pasteurization, and high hydrostatic pressurization, should be considered.
Although mental disorders are significant predictors of educational attainment throughout the entire educational career, most research on mental disorders among students has focused on the primary and secondary school years.
The World Health Organization World Mental Health Surveys were used to examine the associations of mental disorders with college entry and attrition by comparing college students (n = 1572) and non-students in the same age range (18–22 years; n = 4178), including non-students who recently left college without graduating (n = 702) based on surveys in 21 countries (four low/lower-middle income, five upper-middle-income, one lower-middle or upper-middle at the times of two different surveys, and 11 high income). Lifetime and 12-month prevalence and age-of-onset of DSM-IV anxiety, mood, behavioral and substance disorders were assessed with the Composite International Diagnostic Interview (CIDI).
One-fifth (20.3%) of college students had 12-month DSM-IV/CIDI disorders; 83.1% of these cases had pre-matriculation onsets. Disorders with pre-matriculation onsets were more important than those with post-matriculation onsets in predicting subsequent college attrition, with substance disorders and, among women, major depression the most important such disorders. Only 16.4% of students with 12-month disorders received any 12-month healthcare treatment for their mental disorders.
Mental disorders are common among college students, mostly have onsets prior to college entry, are associated with college attrition (particularly disorders with pre-matriculation onsets), and typically go untreated. Detection and effective treatment of these disorders early in the college career might reduce attrition and improve educational and psychosocial functioning.
Post-traumatic stress disorder (PTSD) is associated with elevated risk for metabolic syndrome (MetS). However, the direction of this association is not yet established, as most prior studies employed cross-sectional designs. The primary goal of this study was to evaluate bidirectional associations between PTSD and MetS using a longitudinal design.
A total of 1355 male and female veterans of the conflicts in Iraq and Afghanistan underwent PTSD diagnostic assessments and their biometric profiles pertaining to MetS were extracted from the electronic medical record at two time points (spanning ~2.5 years, n = 971 at time 2).
The prevalence of MetS among veterans with PTSD was just under 40% at both time points and was significantly greater than that for veterans without PTSD; the prevalence of MetS among those with PTSD was also elevated relative to age-matched population estimates. Cross-lagged panel models revealed that PTSD severity predicted subsequent increases in MetS severity (β = 0.08, p = 0.002), after controlling for initial MetS severity, but MetS did not predict later PTSD symptoms. Logistic regression results suggested that for every 10 PTSD symptoms endorsed at time 1, the odds of a subsequent MetS diagnosis increased by 56%.
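The "56% per 10 symptoms" statement implies a per-symptom odds ratio and logistic coefficient, which follow from simple arithmetic. This is a back-of-envelope reading of the reported result, not a re-analysis of the study data:

```python
import math

or_per_10 = 1.56                   # reported: +56% odds per 10 symptoms
or_per_1 = or_per_10 ** (1 / 10)   # implied per-symptom odds ratio
beta = math.log(or_per_1)          # implied logistic coefficient per symptom
print(round(or_per_1, 3), round(beta, 4))  # 1.045 0.0445
```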
Results highlight the substantial cardiometabolic concerns of young veterans with PTSD and raise the possibility that PTSD may predispose individuals to accelerated aging, in part, manifested clinically as MetS. This demonstrates the need to identify those with PTSD at greatest risk for MetS and to develop interventions that improve both conditions.
To investigate the effectiveness of an online, interactive intervention, referred to as the Green Eating (GE) Project, to motivate university students to adopt GE behaviours.
The study was quasi-experimental and integrated into courses for credit/extra credit. Courses were randomly stratified into experimental or non-treatment control. The 5-week intervention consisted of four modules based on different GE topics. Participants completed the GE survey at baseline (experimental, n 241; control, n 367) and post (experimental, n 187; control, n 304). The GE survey has been previously validated and consists of Transtheoretical Model constructs including stage of change (SOC), decisional balance (DB: Pros and Cons) and self-efficacy (SE: School and Home) as well as behaviours for GE. Modules contained basic information regarding each topic and knowledge items to assess content learning.
The GE Project took place at a public university in the north-eastern USA.
Participants were full-time students between the ages of 18 and 24 years.
The GE Project was effective in significantly increasing GE behaviours, DB Pros, SE School and knowledge in experimental compared with control, but did not reduce DB Cons or increase SE Home. Experimental participants were also more likely to be in later SOC for GE at post testing.
The GE Project was effective in increasing GE behaviours in university students. Motivating consumers towards adopting GE could assist in potentially mitigating negative consequences of the food system on the environment. Future research could tailor the intervention to participant SOC to further increase the effects or design the modules for other participants.