Clostridioides difficile infection (CDI) can be prevented through infection prevention practices and antibiotic stewardship. Diagnostic stewardship (ie, strategies to improve use of microbiological testing) can also improve antibiotic use. However, little is known about the use of such practices in US hospitals, especially after multidisciplinary stewardship programs became a requirement for US hospital accreditation in 2017. Thus, we surveyed US hospitals to assess antibiotic stewardship program composition, practices related to CDI, and diagnostic stewardship.
Surveys were mailed to infection preventionists at 900 randomly sampled US hospitals between May and October 2017. Hospitals were surveyed on antibiotic stewardship programs; CDI prevention, treatment, and testing practices; and diagnostic stewardship strategies. Responses were compared by hospital bed size using weighted logistic regression.
Overall, 528 surveys were completed (59% response rate). Almost all (95%) responding hospitals had an antibiotic stewardship program. Smaller hospitals were less likely to have stewardship team members with infectious diseases (ID) training, and only 41% of hospitals met The Joint Commission accreditation standards for multidisciplinary teams. Guideline-recommended CDI prevention practices were common. Smaller hospitals were less likely to use high-tech disinfection devices, fecal microbiota transplantation, or diagnostic stewardship strategies.
Following changes in accreditation standards, nearly all US hospitals now have an antibiotic stewardship program. However, many hospitals, especially smaller hospitals, appear to struggle with access to ID expertise and with deploying diagnostic stewardship strategies. CDI prevention could be enhanced through diagnostic stewardship and by emphasizing the role of non–ID-trained pharmacists and clinicians in antibiotic stewardship.
There is increasing evidence to support integration of simulation into medical training; however, no national emergency medicine (EM) simulation curriculum exists. Using Delphi methodology, we aimed to identify and establish content validity for adult EM curricular content best suited for simulation-based training, to inform national postgraduate EM training.
A national panel of experts in EM simulation iteratively rated potential curricular topics on a 4-point scale to determine those best suited for simulation-based training. After each round, responses were analyzed. Topics scoring <2/4 were removed, and the remaining topics were resent to the panel for further rating until consensus was achieved, defined as Cronbach α ≥ 0.95. At the conclusion of the Delphi process, topics rated ≥3.5/4 were considered “core” curricular topics, while those rated ≥3.0 but <3.5 were considered “extended” curricular topics.
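The consensus rule described above can be made concrete with a short sketch. This is not the authors' code; it is a minimal illustration, assuming a ratings matrix with one score list per panellist, aligned by topic, and using the paper's stated thresholds (<2 removed, ≥3.5 core, ≥3.0 but <3.5 extended).

```python
from statistics import pvariance

def cronbach_alpha(ratings):
    """Cronbach's alpha across raters.

    ratings: list of per-rater score lists, one inner list per rater,
    aligned by topic (all the same length)."""
    k = len(ratings)
    rater_vars = sum(pvariance(r) for r in ratings)
    totals = [sum(col) for col in zip(*ratings)]  # per-topic total score
    return k / (k - 1) * (1 - rater_vars / pvariance(totals))

def classify_topics(mean_scores):
    """Apply the Delphi thresholds to per-topic mean ratings."""
    out = {}
    for topic, m in mean_scores.items():
        if m < 2:
            out[topic] = "removed"
        elif m >= 3.5:
            out[topic] = "core"
        elif m >= 3.0:
            out[topic] = "extended"
        else:
            out[topic] = "retained for next round"
    return out
```

In this sketch, rounds would repeat until `cronbach_alpha` reaches 0.95, after which `classify_topics` partitions the surviving topics.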
Forty-five experts from 13 Canadian centres participated. Two hundred eighty potential curricular topics, in 29 domains, were generated from a systematic literature review, relevant educational documents and Delphi panellists. Three rounds of surveys were completed before consensus was achieved, with response rates ranging from 93% to 100%. Twenty-eight topics, in eight domains, reached consensus as “core” curricular topics. Thirty-five additional topics, in 14 domains, reached consensus as “extended” curricular topics.
Delphi methodology allowed for achievement of expert consensus and content validation of EM curricular content best suited for simulation-based training. These results provide a foundation for improved integration of simulation into postgraduate EM training and can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.
We sought to address the prior limitations of symptom checker accuracy by analysing the diagnostic and triage performance of online symptom checkers on a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers relative to an ED physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients included 90 with HIV, 67 with hepatitis C and 11 with both HIV and hepatitis C. Five online symptom checkers were utilised for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which had triage capabilities. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%) and Listed at All (<45%). Significant variation existed between individual symptom checkers: some were more accurate at listing the diagnosis near the top of the differential, while others were more apt to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) had an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities are markedly inferior to physician diagnostic capabilities. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting symptom checkers with diagnostic algorithms that account for such complexity.
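The Top1/Top3/Top10/Listed-at-All metrics used above are straightforward to compute. The following is a hedged sketch, not the study's code: it assumes each case is a (physician diagnosis, ranked differential) pair and that string matching stands in for the clinical adjudication the study would actually require.

```python
def topk_accuracy(cases, k=None):
    """Fraction of cases whose physician diagnosis appears in the top k
    entries of the symptom checker's differential (k=None: listed at all).

    cases: list of (physician_dx, ranked_differential) pairs, where
    ranked_differential is ordered from most to least likely."""
    if not cases:
        return 0.0
    hits = 0
    for dx, differential in cases:
        ranked = differential if k is None else differential[:k]
        # case-insensitive exact match stands in for clinical adjudication
        if dx.lower() in (d.lower() for d in ranked):
            hits += 1
    return hits / len(cases)
```

For example, with hypothetical cases, `topk_accuracy(cases, 1)`, `topk_accuracy(cases, 3)` and `topk_accuracy(cases)` correspond to the Top1, Top3 and Listed-at-All figures reported in the abstract.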
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and subsequent symptom checker diagnosis could allow health officials a means to track illnesses in specific patient populations and geographic regions. In order to do this, accurate and reliable symptom checkers are warranted.
A proper subgraph of a connected linear graph is said to disconnect the graph if removing it leaves a disconnected graph. In this paper we characterize, in the following sense, the disconnecting subgraphs of a fixed connected graph. We define two distinct types of disconnecting subgraphs (isthmuses and articulators) which are minimal in the sense that no proper subgraph of either type can disconnect the graph. We then show that any disconnecting subgraph must contain either an isthmus or an articulator. We also define a set of subgraphs (called dense) which form a lattice. We show that the union of the minimal dense subgraphs contains all isthmuses and articulators. In terms of these subgraphs we investigate some of the consequences of assuming that a disconnecting subgraph must contain at least m points.
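The paper's central definition can be illustrated computationally. The sketch below is not from the paper; it is a minimal breadth-first-search test, restricted for simplicity to removing a set of points (vertices), of whether a removal disconnects a connected graph.

```python
from collections import deque

def is_connected(adj, removed=frozenset()):
    """BFS connectivity test on the graph `adj` (dict: vertex -> neighbour
    list, symmetric) after deleting the vertices in `removed`."""
    remaining = [v for v in adj if v not in removed]
    if not remaining:
        return True  # convention: the empty graph counts as connected
    seen = {remaining[0]}
    queue = deque(seen)
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in removed and w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(remaining)

def disconnects(adj, subset):
    """True if deleting `subset` (a proper subset of the vertices) from a
    connected graph leaves a disconnected graph."""
    return is_connected(adj) and not is_connected(adj, frozenset(subset))
```

On the path a–b–c, removing the middle vertex disconnects the graph while removing an endpoint does not, which is the distinction the minimal disconnecting subgraphs (isthmuses and articulators) formalize.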
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimised time-to-detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
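The per-day performance metrics described above reduce to a confusion matrix over analysis days. This is an illustrative sketch under assumed inputs (sets of days), not the authors' implementation: raising the RI threshold shrinks the set of signal days, which is what drives the sensitivity/PPV trade-off the abstract reports.

```python
def daily_detection_metrics(signal_days, outbreak_days, all_days):
    """Confusion-matrix metrics for daily outbreak-detection analyses.

    signal_days: set of days on which the scan statistic signalled
    outbreak_days: set of days on which the simulated outbreak was ongoing
    all_days: set of every day analysed."""
    tp = len(signal_days & outbreak_days)   # signalled during the outbreak
    fp = len(signal_days - outbreak_days)   # signalled with no outbreak
    fn = len(outbreak_days - signal_days)   # outbreak day, no signal
    tn = len(all_days - signal_days - outbreak_days)
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else None,
        "specificity": tn / (tn + fp) if tn + fp else None,
        "ppv": tp / (tp + fp) if tp + fp else None,
        "npv": tn / (tn + fn) if tn + fn else None,
    }
```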
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
During the 2009 influenza pandemic, a rapid assessment of disease severity was a challenge as a significant proportion of cases did not seek medical care; care-seeking behaviour changed and the proportion asymptomatic was unknown. A random-digit-dialling telephone survey was undertaken during the 2011/12 winter season in England and Wales to address the feasibility of answering these questions. A proportional quota sampling strategy was employed based on gender, age group, geographical location, employment status and level of education. Households were recruited pre-season and re-contacted immediately following peak seasonal influenza activity. The pre-peak survey was undertaken in October 2011 with 1061 individuals recruited and the post-peak telephone survey in March 2012. Eight hundred and thirty-four of the 1061 (78.6%) participants were successfully re-contacted. Their demographic characteristics compared well to national census data. In total, 8.4% of participants self-reported an influenza-like illness (ILI) in the previous 2 weeks, with 3.2% conforming to the World Health Organization (WHO) ILI case definition. In total, 29.6% of the cases reported consulting their general practitioner. Of the 1061 participants, 54.1% agreed to be re-contacted about providing biological samples. A population-based cohort was successfully recruited and followed up. Longitudinal survey methodology provides a practical tool to assess disease severity during future pandemics.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess, controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in people aged 65 years and over and 1942 (95% CI 1834–2052) deaths in 15–64-year-olds were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2). It is the largest estimated number of influenza-related deaths in England since before the 2008/09 season. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme in protecting the population, including the elderly, who are vulnerable to a severe outcome.
Introduction: Our emergency department (ED) sees a low volume of high-acuity pediatric cases. A needs assessment revealed that 68% of our emergency physicians (EPs) manage pediatric patients in fewer than 25% of their shifts. The same percentage of EPs, as well as of ED nurses, indicated they were uncomfortable managing a critically unwell neonate. Thus, an interprofessional curriculum focused on pediatric emergencies for ED staff was developed. In-situ simulation education was chosen as the most appropriate method to consolidate each didactic block of the curriculum and to uncover important system gaps. Methods: A needs assessment was conducted, and emerging themes informed the interprofessional education (IPE) curriculum objectives. A committee of experts in simulation, pediatric emergencies and nursing education designed a full-day, RCPSC-accredited, interprofessional in-situ simulation program. Results: A progressive, segmental strategy maximized learning outcomes. The initial phase (2 hrs) comprised an “early recognition of sepsis” seminar and 4 rotating skills stations (equipment familiarity, sedating the child, IV starts, and mixing IV medication). This deliberate, adaptive, customized practice was enhanced by expert facilitation at each station, directly engaging participants and providing real-time feedback. The second phase allowed interprofessional teams of MDs, RNs and physician assistants to apply knowledge gained from the didactic and skills stations to in-situ simulated emergencies. Each group participated in two pediatric emergency scenarios. Scenarios ran 20 minutes, followed by a 40-minute debrief. Each scenario had a trained debriefer and a content expert. The day concluded with a final debrief attended by all participants. Formalized checklists assessed participants’ knowledge translation during the simulation exercises. Participants assessed facilitators and evaluated the simulation day and curriculum via anonymous feedback forms.
Debriefing sessions were scribed, and knowledge gaps and system errors were recorded. Results were distributed to ED leaders, and responsibilities were assigned to key stakeholders to ensure accountability and correction of system errors. All participants reported the experience to be relevant and helpful to their learning, and all requested more frequent simulation days. System gaps identified included use of metric vs. imperial measurements, non-compatible laryngoscope equipment, and inadequate identification of team personnel. As a result, the above-mentioned equipment has been replaced, and we are developing resuscitation room ID stickers for all team roles. Conclusion: Simulation as a culmination to a didactic curriculum provides a safe environment to translate acquired knowledge, increasing ED staff comfort and familiarity with rare pediatric cases. Additionally, it is an excellent tool to reveal system gaps, allowing us to fill them and improve departmental functioning and safety.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
The increased use of the MATRICS Consensus Cognitive Battery (MCCB) to investigate cognitive dysfunctions in schizophrenia fostered interest in its sensitivity in the context of family studies. As various measures of the same cognitive domains may have different power to distinguish between unaffected relatives of patients and controls, the relative sensitivity of MCCB tests for relative–control differences has to be established. We compared MCCB scores of 852 outpatients with schizophrenia (SCZ) with those of 342 unaffected relatives (REL) and a normative Italian sample of 774 healthy subjects (HCS). We examined familial aggregation of cognitive impairment by investigating within-family prediction of MCCB scores based on probands’ scores.
Multivariate analysis of variance was used to analyze group differences in adjusted MCCB scores. Weighted least-squares analysis was used to investigate whether probands’ MCCB scores predicted REL neurocognitive performance.
SCZ were significantly impaired on all MCCB domains. REL had intermediate scores between SCZ and HCS, showing a similar pattern of impairment, except for social cognition. Probands’ scores significantly predicted REL MCCB scores on all domains except visual learning.
In a large sample of stable patients with schizophrenia, living in the community, and in their unaffected relatives, MCCB demonstrated sensitivity to cognitive deficits in both groups. Our findings of significant within-family prediction of MCCB scores might reflect disease-related genetic or environmental factors.
In our attempt to investigate the basic active galactic nucleus (AGN) paradigm requiring a centrally located supermassive black hole (SMBH), a close to Keplerian accretion disk and a jet perpendicular to its plane, we have searched for radio continuum emission in galaxies with H2O megamasers in their disks. We observed 18 such galaxies with the Very Long Baseline Array in C band (5 GHz, ~2 mas resolution) and detected 5 galaxies at 8σ or higher levels. For those sources for which maser data are available, the positions of the masers and those of the 5 GHz radio continuum sources coincide within the uncertainties, and the radio continuum is perpendicular to the maser disk’s orientation within the position angle uncertainties.
We present the results from Australian Long Baseline Array (LBA) observations of the ground- and excited-state OH masers at high resolution towards the massive star-forming region G351.417+0.645 in 2012. We obtained the most accurate spatial gradient of magnetic fields at the ground-state transitions and verified the reliability of magnetic field strengths measured in previous lower-resolution observations. In comparison with previous LBA observations at 6.0 GHz in 2001, we identified several matched Zeeman pairs. We found that the OH maser features show no significant change in magnetic field strengths or directions and only small internal proper motions, implying quite stable physical conditions. Additionally, we found that the 1665- and 6035-MHz OH maser features reveal the same trend of magnetic field reversal. Moreover, we analyzed the physical conditions at different locations from the coincidence of different OH maser transitions, based on current OH maser models.
Adolescence and young adulthood carry risk for suicidal thoughts and behaviours (STB). An increasing subpopulation of young people consists of college students. STB prevalence estimates among college students vary widely, precluding a validated point of reference. In addition, little is known on predictors for between-study heterogeneity in STB prevalence.
A systematic literature search identified 36 college student samples that were assessed for STB outcomes, representing a total of 634 662 students [median sample size = 2082 (IQR 353–5200); median response rate = 74% (IQR 37–89%)]. We used random-effects meta-analyses to obtain pooled STB prevalence estimates, and multivariate meta-regression models to identify predictors of between-study heterogeneity.
Pooled prevalence estimates of lifetime suicidal ideation, plans, and attempts were 22.3% [95% confidence interval (CI) 19.5–25.3%], 6.1% (95% CI 4.8–7.7%), and 3.2% (95% CI 2.2–4.5%), respectively. For 12-month prevalence, this was 10.6% (95% CI 9.1–12.3%), 3.0% (95% CI 2.1–4.0%), and 1.2% (95% CI 0.8–1.6%), respectively. Measures of heterogeneity were high for all outcomes (I² = 93.2–99.9%), indicating substantial between-study heterogeneity not due to sampling error. Pooled estimates were generally higher for females, as compared with males (risk ratios in the range 1.12–1.67). Higher STB estimates were also found in samples with lower response rates, when using broad definitions of suicidality, and in samples from Asia.
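The random-effects pooling and I² heterogeneity statistic reported above follow a standard recipe. The sketch below is an illustration of the DerSimonian–Laird method under assumed inputs (per-study effect sizes with their variances, normally logit-transformed prevalences), not the meta-analysis code used in the study.

```python
from math import sqrt

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling.

    Returns (pooled estimate, 95% CI, I^2 in percent). For prevalence
    data the effects would usually be logit-transformed first."""
    w = [1 / v for v in variances]           # fixed-effect weights
    k = len(effects)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0  # between-study var
    w_star = [1 / (v + tau2) for v in variances]          # RE weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

An I² near 100%, as in the abstract, means nearly all observed variation in study estimates reflects genuine between-study differences rather than sampling error, which motivates the meta-regression on study-level predictors.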
Based on the currently available evidence, STB seem to be common among college students. Future studies should: (1) incorporate refusal conversion strategies to obtain adequate response rates, and (2) use more fine-grained measures to assess suicidal ideation.
The Infrared Astronomical Satellite (IRAS) has detected many galaxies in the infrared (IR), most of which have fairly steep 25 μm to 60 μm spectra. Many quasars and active galaxies exhibit a significantly flatter spectrum in the infrared. Several studies (e.g. de Grijp et al. 1985) used this characteristic to select a subsample of “warm” objects from the IRAS Point Source Catalog (1985).
Six radio telescopes were operated as the first southern hemisphere VLBI array in April and May 1982. Observations were made at 2.3 and 8.4 GHz. This array produced VLBI images of 28 southern hemisphere radio sources, high-accuracy VLBI geodesy between southern hemisphere sites, and subarcsecond radio astrometry of celestial sources south of declination −45 degrees. This paper discusses only the astrophysical aspects of the experiment.
Harvest weed seed control (HWSC) is an alternative, non-chemical approach to weed management that targets escaped weed seeds at the time of crop harvest. Relatively little is known about how these methods will perform on species in the US. Two of the most prominent weeds in soybean production in the midsouthern US are Palmer amaranth and barnyardgrass. Typically, by the time crop harvest occurs, the weed seed has either already shattered or is taken into the combine and may be redistributed into the soil seedbank. This causes further weed seed spread and may add herbicide-resistant seeds to the seedbank. There is little research on how much seed is retained on different weed species at or beyond harvest time. Thus, the objective of this study was to determine the percentage of total Palmer amaranth and barnyardgrass seed production that was retained on the plant during delayed soybean harvest. Retained seed over time was similar between 2015 and 2016 for barnyardgrass but differed significantly between years for Palmer amaranth. Palmer amaranth and barnyardgrass retained 98 and 41% of their seed at soybean maturity and 95 and 32% of their seed one month after soybean maturity, respectively. Thus, this research indicates that if there are escaped Palmer amaranth plants and soybean is harvested in a timely manner, most seed will enter the combine, offering potential for capture or destruction of these seeds using HWSC tactics. While there would be some benefit to using HWSC for barnyardgrass, its utility in mitigating herbicide resistance would be less pronounced than for Palmer amaranth because of reduced seed retention and earlier seed shatter.
Measures of social cognition are increasingly being applied to psychopathology, including studies of schizophrenia and other psychotic disorders. Tests of social cognition present unique challenges for international adaptations. The Mayer–Salovey–Caruso Emotional Intelligence Test, Managing Emotions Branch (MSCEIT-ME) is a commonly used social cognition test that involves the evaluation of social scenarios presented in vignettes.
This paper presents evaluations of translations of this test in six different languages based on representative samples from the relevant countries. The goal was to identify items from the MSCEIT-ME that show different response patterns across countries using indices of discrepancy and content validity criteria. An international version of the MSCEIT-ME scoring was developed that excludes items that showed undesirable properties across countries.
We then confirmed that this new version had better performance (i.e. less discrepancy across regions) in international samples than the version based on the original norms. Additionally, it provides scores that are comparable to ratings based on local norms.
This paper shows that it is possible to adapt complex social cognitive tasks so they can provide valid data across different cultural contexts.
Placental transport of vitamin D and other nutrients (e.g. amino acids, fats and glucose) to the fetus is sensitive to maternal and fetal nutritional cues. We studied the effect of maternal calorific restriction on fetal vitamin D status and the placental expression of genes for nutrient transport [aromatic T-type amino acid transporter-1 (TAT-1); triglyceride hydrolase/lipoprotein uptake facilitator lipoprotein lipase (LPL)] and vitamin D homeostasis [CYP27B1; vitamin D receptor (VDR)], and their association with markers of fetal cardiovascular function and skeletal muscle growth. Pregnant sheep received 100% total metabolizable energy (ME) requirements (control), 40% total ME requirements peri-implantation [PI40, 1–31 days of gestation (dGA)] or 50% total ME requirements in late gestation (L, 104–127 dGA). Fetal, but not maternal, plasma 25-hydroxy-vitamin D (25OHD) concentration was lower in PI40 and L maternal undernutrition groups (P<0.01) compared with the control group at 0.86 gestation. PI40 group placental CYP27B1 messenger RNA (mRNA) levels were increased (P<0.05) compared with the control group. Across all groups, higher fetal plasma 25OHD concentration was associated with higher skeletal muscle myofibre and capillary density (P<0.05). In the placenta, higher VDR mRNA levels were associated with higher TAT-1 (P<0.05) and LPL (P<0.01) mRNA levels. In the PI40 maternal undernutrition group only, reduced fetal plasma 25OHD concentration may be mediated in part by altered placental CYP27B1. The association between placental mRNA levels of VDR and nutrient transport genes suggests a way in which the placenta may integrate nutritional cues in the face of maternal dietary challenges and alter fetal physiology.