This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
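The moderation effects reported above correspond to interaction terms in a longitudinal regression. A minimal sketch of that kind of model, assuming hypothetical variable and file names rather than the study's actual data or code:

```python
# Minimal sketch of a moderation analysis: does baseline episodic memory
# weaken, and education strengthen, the effect of gray matter change on
# cognitive decline? Variable and file names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical file: one row per participant

# Cognitive change regressed on main effects plus interaction terms;
# a significant interaction indicates moderation of the gray matter effect.
model = smf.ols(
    "cognitive_change ~ gm_change * memory_baseline"
    " + gm_change * education + age + sex",
    data=df,
).fit()
print(model.summary())
```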
Understanding risk factors for death from Covid-19 is key to providing good-quality clinical care. We assessed the presenting characteristics of the ‘first wave’ of patients with Covid-19 at the Royal Oldham Hospital, UK, and undertook logistic regression modelling to investigate factors associated with death. Of 470 patients admitted, 169 (36%) died. The median age was 71 years (interquartile range 57–82), and 255 (54.3%) were men. The most common comorbidities were hypertension (n = 218, 46.4%), diabetes (n = 143, 30.4%) and chronic neurological disease (n = 123, 26.1%). The most frequent complications were acute kidney injury (AKI) (n = 157, 33.4%) and myocardial injury (n = 21, 4.5%). Forty-three (9.1%) patients required intubation and ventilation, and 39 (8.3%) received non-invasive ventilation. Independent risk factors for death were increasing age (odds ratio (OR) per 10-year increase above 40 years 1.87, 95% confidence interval (CI) 1.57–2.27), hypertension (OR 1.72, 95% CI 1.10–2.70), cancer (OR 2.20, 95% CI 1.27–3.81), platelets <150 × 10³/μl (OR 1.93, 95% CI 1.13–3.30), C-reactive protein ≥100 μg/ml (OR 1.68, 95% CI 1.05–2.68), >50% chest radiograph infiltrates (OR 2.09, 95% CI 1.16–3.77) and AKI (OR 2.60, 95% CI 1.64–4.13). There was no independent association between death and gender, ethnicity, deprivation level, fever, SpO2/FiO2, lymphopenia or other comorbidities. These findings will inform clinical and shared decision making, including the use of respiratory support and therapeutic agents.
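The modelling described above is a standard logistic regression, in which odds ratios and confidence intervals come from exponentiating the fitted coefficients. A minimal sketch, with column names that are assumptions rather than the study's own:

```python
# Sketch of logistic regression for in-hospital death, reporting odds
# ratios with 95% confidence intervals. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("admissions.csv")  # hypothetical extract of the cohort

fit = smf.logit(
    "died ~ age_decades_over_40 + hypertension + cancer"
    " + low_platelets + crp_over_100 + infiltrates_over_50 + aki",
    data=df,
).fit()

# Exponentiate the log-odds coefficients to obtain ORs and their CIs.
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```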
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, which will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, which lies outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity increases the expected detection rate for post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Background: Healthcare services are increasingly shifting from inpatient to outpatient settings. Outpatient settings such as emergency departments (EDs), oncology clinics, dialysis clinics, and day surgery often involve invasive procedures with the risk of acquiring healthcare-associated infections (HAIs). Although a leading cause of HAI, Clostridioides difficile infection (CDI) in outpatient settings has not been sufficiently described in Canada. The Canadian Nosocomial Infection Surveillance Program (CNISP) aims to describe the epidemiology, molecular characterization, and antimicrobial susceptibility of outpatient CDI across Canada. Methods: Epidemiologic data were collected from patients diagnosed with CDI at a network of 47 adult and pediatric CNISP hospitals. Patients presenting to an outpatient setting such as the ED or an outpatient clinic were considered outpatient CDI cases. Cases were considered healthcare-associated if the patient had had a healthcare intervention within the previous 4 weeks, and community-associated if there was no history of hospitalization within the previous 12 weeks. Clostridioides difficile isolates were submitted to the National Microbiology Laboratory for testing during an annual 2-month targeted surveillance period. National and regional rates of CDI were stratified by outpatient location. Results: Between January 1, 2015, and June 30, 2019, 2,691 cases of outpatient CDI were reported, and 348 isolates were available for testing. Most cases (1,475 of 2,691, 54.8%) were identified in outpatient clinics, and 72.8% (1,960 of 2,691) were classified as community-associated. CDI cases per 100,000 ED visits were highest in 2015, at 10.3, and decreased to 8.1 in 2018. Rates in outpatient clinics decreased from 3.5 in 2016 to 2.7 in 2018. Regionally, CDI rates in the ED declined in Central Canada and increased in the West after 2016. Rates in outpatient clinics were >2 times higher in the West than in other regions. RT027, associated with NAP1, was most common among ED patients (26 of 195, 13.3%), whereas RT106, associated with NAP11, was predominant in outpatient clinics (22 of 189, 11.6%). Overall, 10.4% of isolates were resistant to moxifloxacin, 0.5% to rifampin, and 24.2% to clindamycin. No resistance was observed to metronidazole, vancomycin, or tigecycline. Compared with CNISP inpatient CDI data, outpatients with CDI were younger (mean age, 51.8 ± 23.3 vs 64.2 ± 21.6 years; P < .001), more often female (56.4% vs 50.9%; P < .001), and more often treated with metronidazole (63.0% vs 56.1%; P < .001). Conclusions: For the first time, CDI cases identified in outpatient settings have been characterized in a Canadian context. Outpatient CDI rates are decreasing overall but vary by region. Predominant ribotypes vary by outpatient location. Outpatients with CDI are younger and more likely to be female than inpatients with CDI.
Disclosures: Susy Hota reports contract research for Finch Therapeutics.
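The case definitions in the abstract above amount to a simple classification rule. A minimal sketch of that rule; the function and argument names are illustrative, not CNISP's:

```python
from datetime import date, timedelta
from typing import Optional

def classify_outpatient_cdi(diagnosis: date,
                            last_intervention: Optional[date],
                            last_hospitalization: Optional[date]) -> str:
    """Apply the abstract's definitions: healthcare-associated if a
    healthcare intervention occurred within the previous 4 weeks;
    community-associated if there was no hospitalization within the
    previous 12 weeks; otherwise indeterminate. Names are illustrative."""
    if last_intervention is not None and \
            diagnosis - last_intervention <= timedelta(weeks=4):
        return "healthcare-associated"
    if last_hospitalization is None or \
            diagnosis - last_hospitalization > timedelta(weeks=12):
        return "community-associated"
    return "indeterminate"

# Example: no recent intervention, last hospitalization 20 weeks earlier.
print(classify_outpatient_cdi(date(2018, 6, 1), None, date(2018, 1, 12)))
# -> community-associated
```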
A survey of Veterans Affairs medical centers on control of carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing CRE (CP-CRE) demonstrated that most facilities use VA guidelines, but few regularly screen for CRE/CP-CRE colonization or communicate CRE/CP-CRE status at patient transfer. Most respondents were knowledgeable about CRE guidelines but cited a lack of adequate resources.
Invasive species are widely recognized as a major threat to global biodiversity and an important factor in global change. Species distribution models (SDMs) have been widely applied to determine the range that invasive species could potentially occupy, but most examples focus on predictive variables at a single spatial scale. In this study, we simultaneously considered a broad range of variables related to climate, topography, land cover, land use, and propagule pressure to predict which areas in the southeastern United States are most susceptible to invasion by 45 invasive terrestrial plant species. Using expert-verified occurrence points from EDDMapS, we modeled invasion susceptibility at 30-m resolution for each species using a maximum entropy (MaxEnt) modeling approach. We then analyzed how environmental predictors affected susceptibility to invasion at different spatial scales. Climatic and land-use variables, especially minimum temperature of the coldest month and distance to developed areas, were good predictors of landscape susceptibility to invasion. For most of the species tested, human-disturbed systems such as developed areas and barren lands were more prone to invasion than areas with minimal human interference. As expected, we found that landscape heterogeneity and the presence of corridors for propagule dispersal significantly increased landscape susceptibility to invasion for most species. However, we also found a number of species for which susceptibility to invasion increased in landscapes with large core areas and/or less-aggregated patches. These exceptions suggest that although we found the expected general patterns of susceptibility to invasion for most species, the influence of landscape composition and configuration on invasion risk is species specific.
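MaxEnt itself is typically run with dedicated software; as a rough stand-in for the presence/background workflow described above, a penalized logistic classifier on presence versus background points is a close relative. A sketch, with feature names and file layout assumed for illustration:

```python
# Rough stand-in for a presence/background SDM. True MaxEnt is usually run
# in dedicated software; a penalized logistic classifier on presence vs.
# random background points is a close relative. Names are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

presence = pd.read_csv("occurrences.csv")    # expert-verified points
background = pd.read_csv("background.csv")   # random landscape sample

features = ["min_temp_coldest_month", "dist_to_developed_m",
            "annual_precip_mm", "elevation_m"]
X = pd.concat([presence[features], background[features]])
y = np.r_[np.ones(len(presence)), np.zeros(len(background))]

model = make_pipeline(StandardScaler(), LogisticRegression(C=0.1))
model.fit(X, y)

# Relative suitability per 30-m cell (higher = more susceptible to invasion).
suitability = model.predict_proba(X)[:, 1]
```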
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
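A minimal sketch of the segmented negative binomial regression named above, on monthly counts with patient days as exposure; variable names are assumptions, and the interaction with the site indicator carries the intervention-versus-control contrast:

```python
# Sketch of a segmented (interrupted time series) negative binomial model:
# level and slope terms around the reflex-culture policy, with patient days
# as exposure. Column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

ts = pd.read_csv("monthly_cultures.csv")  # one row per site-month

fit = smf.glm(
    "cultures ~ month + post_policy + months_since_policy"
    " + intervention_site * post_policy",  # difference vs. control sites
    data=ts,
    family=sm.families.NegativeBinomial(),
    offset=np.log(ts["patient_days"]),
).fit()
print(fit.summary())
```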
The study included 6 acute-care hospitals within the Veterans Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = .8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
Stigma against mental illness and the mentally ill is well known. Stigma against psychiatrists and mental health professionals, however, is recognized but not widely discussed. Public attitudes, and also those of other professionals, affect recruitment into psychiatry and mental health services. The reasons for this discriminatory attitude are many and often not dissimilar to those held against mentally ill individuals. In this Guidance paper we present some of the factors affecting the image of psychiatry and psychiatrists as perceived by the public at large. We look at the portrayal of psychiatry and psychiatrists in the media and literature, which may affect attitudes. We also explore potential causes and explanations and propose some strategies for dealing with negative attitudes. Reducing negative attitudes will improve recruitment and retention in psychiatry. We recommend that national psychiatric societies and other stakeholders, including patients, their families and carers, have a major role to play in dealing with stigma, discrimination and prejudice against psychiatry and psychiatrists.
Stigma and social exclusion related to mental health are of substantial public health importance for Europe. As part of ROAMER (ROAdmap for MEntal health Research in Europe), we used systematic mapping techniques to describe the current state of research on stigma and social exclusion across Europe. Findings demonstrate growing interest in this field between 2007 and 2012. Most studies were descriptive (60%), focused on adults of working age (60%) and were performed in Northwest Europe, primarily in the UK (32%), Finland (8%), Sweden (8%) and Germany (7%). In terms of mental health characteristics, the largest proportion of studies investigated general mental health (20%), common mental disorders (16%), schizophrenia (16%) or depression (14%). There is a paucity of research looking at mechanisms to reduce stigma and promote social inclusion, or at factors that might promote resilience or protect against stigma and social exclusion across the life course. Evidence is also limited in relation to evaluations of interventions. Increasing incentives for cross-country research collaborations, especially with new EU member states, and for collaboration across European professional organizations and disciplines, could improve understanding of the range of underpinning social and cultural factors that promote inclusion or contribute to lower levels of stigma, especially during times of hardship.
We examined Clostridioides difficile infection (CDI) prevention practices and their relationship with hospital-onset healthcare facility-associated CDI rates (CDI rates) in Veterans Affairs (VA) acute-care facilities.
From January 2017 to February 2017, we conducted an electronic survey of CDI prevention practices and hospital characteristics in the VA. We linked survey data with CDI rate data for the period January 2015 to December 2016. We stratified facilities according to whether their overall CDI rate per 10,000 bed days of care was above or below the national VA mean CDI rate. We examined whether specific CDI prevention practices were associated with an increased risk of a CDI rate above the national VA mean CDI rate.
All 126 facilities responded (100% response rate). Since implementing CDI prevention practices in July 2012, 60 of 123 facilities (49%) reported a decrease in CDI rates; 22 of 123 facilities (18%) reported an increase, and 41 of 123 (33%) reported no change. Facilities reporting an increase in the CDI rate (vs those reporting a decrease) after implementing prevention practices were 2.54 times more likely to have CDI rates that were above the national mean CDI rate. Whether a facility’s CDI rates were above or below the national mean CDI rate was not associated with self-reported cleaning practices, duration of contact precautions, availability of private rooms, or certification of infection preventionists in infection prevention.
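For intuition, a figure like "2.54 times more likely" is a risk ratio from a 2 × 2 table. A sketch of the arithmetic with loudly hypothetical counts (not the survey's data), chosen only so the ratio lands near 2.54:

```python
# How a "2.54 times more likely" figure arises from a 2x2 table.
# The counts below are hypothetical, purely to illustrate the arithmetic.
increased_above = 14   # facilities reporting a CDI increase, above national mean
increased_total = 22   # all facilities reporting an increase
decreased_above = 15   # facilities reporting a decrease, above national mean
decreased_total = 60   # all facilities reporting a decrease

risk_increase = increased_above / increased_total   # ~0.64
risk_decrease = decreased_above / decreased_total   # 0.25
print(risk_increase / risk_decrease)                # risk ratio ~2.5
```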
We found considerable variation in CDI rates. We were unable to identify which particular CDI prevention practices (i.e., bundle components) were associated with lower CDI rates.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles and mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, hence affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
The last two years have seen widespread acceptance of the idea that the Milky Way halo was largely created in an early (8-10 Gyr ago) and massive (>10¹⁰ M⊙) merger. The roots of this idea pre-date the Gaia mission, but the exquisite proper motions available from Gaia have made the hypothesis irresistible. We trace the history of this idea, reviewing the series of papers that led to our current understanding.
We present new results on the Galactic bar/bulge transverse velocity structure using Gaia and the VISTA Variables in the Vía Láctea (VVV) survey. Gaia is complemented in high-extinction regions by the multi-epoch infrared VVV observations, for which derived relative proper motions can be tied to Gaia’s absolute frame. We extract kinematic maps (both 2D and 3D) of the Galactic bar/bulge, from which we measure the pattern speed of the bar using a novel technique. We also focus on the evidence for an X-shaped bulge from the kinematic maps.
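The transverse velocities above follow from proper motions and distances through the standard conversion, where the factor 4.74 is one astronomical unit per year expressed in km/s. A minimal sketch:

```python
# Standard conversion from proper motion to transverse velocity:
# v_t [km/s] = 4.74047 * mu [arcsec/yr] * d [pc]
#            = 4.74047 * mu [mas/yr]   * d [kpc]
# (4.74047 km/s is 1 AU per year.)
def transverse_velocity(mu_mas_per_yr: float, distance_kpc: float) -> float:
    return 4.74047 * mu_mas_per_yr * distance_kpc

# Example: a bulge star at ~8 kpc with a 6 mas/yr proper motion.
print(transverse_velocity(6.0, 8.0))  # ~228 km/s
```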
Introduction: Trauma is a common cause of mortality across all age groups and is projected to become the third greatest contributor to global disease burden. Recent studies have demonstrated that survival from traumatic cardiac arrest (TCA) is more favourable than once believed, and further research on this population is being encouraged. Currently, it is unclear whether existing databases, such as the National Ambulatory Care Reporting System (NACRS), which includes all emergency department visits, could be used to identify TCAs for population-based studies. We aimed to determine the accuracy of NACRS administrative codes in identifying TCA patients. Methods: This retrospective validation study used data acquired from NACRS and our institutional Patient Care System. We identified a number of International Classification of Diseases, tenth revision (ICD-10) diagnostic, procedural and cause-of-injury codes that we hypothesized would be consistent with TCA. NACRS was subsequently searched for patients meeting the diagnostic code criteria (January 1 - December 31, 2015). The inclusion criteria were an eligible ICD-10 diagnostic code, or a qualifying Canadian Classification of Health Interventions (CCI) procedure code and an eligible ICD-10 external cause of injury code. Electronic medical records for these patients were then reviewed to determine whether true TCAs had occurred. Results: Eighty-five patients met the inclusion criteria and one was excluded from analysis due to inaccessible health records, leaving 84 patients eligible for chart review. Overall, 55% (n = 46) of patients were found to have true TCA, 35% (n = 29) sustained a cardiac arrest of non-traumatic etiology and 11% (n = 9) were considered “unclear” (i.e., it could not be determined from the medical records whether a true TCA had occurred). We found that true TCA patients were most accurately identified using a combination of ICD-10-CA cardiac arrest and external cause of injury codes (positive predictive value 70.6%, 95% CI 46.9-86.7%). Conclusion: TCA patients were identified with moderate accuracy using the NACRS database. Further efforts to integrate specific data fields for TCA cases within existing population databases and trauma registries are necessary to facilitate future studies focused on this patient population.
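The quoted interval is consistent with a Wilson score interval on 12 true positives out of 17 flagged cases (12/17 = 70.6%); those counts are inferred from the reported figures rather than quoted from the paper. A sketch of the computation:

```python
# Positive predictive value with a Wilson score 95% interval. The counts
# are inferred from the reported figures (12/17 = 70.6%), not quoted.
from statsmodels.stats.proportion import proportion_confint

true_positives, flagged = 12, 17
ppv = true_positives / flagged
low, high = proportion_confint(true_positives, flagged, method="wilson")
print(f"PPV {ppv:.1%} (95% CI {low:.1%}-{high:.1%})")
# -> PPV 70.6% (95% CI 46.9%-86.7%)
```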
In the National Institutes of Health (NIH) Clinical Center, patients colonized or infected with vancomycin-resistant Enterococcus (VRE) are placed in contact isolation until they are deemed “decolonized,” defined as having 3 consecutive perirectal swabs negative for VRE. Some decolonized patients later develop recurrent growth of VRE from surveillance or clinical cultures (i.e., “recolonized”), although that finding may represent recrudescence or new acquisition of VRE. We describe the dynamics of VRE colonization and infection and their relationship to receipt of antibiotics.
In this retrospective cohort study of patients at the National Institutes of Health Clinical Center, baseline characteristics were collected via chart review. Antibiotic exposure and hospital days were calculated as proportions of VRE decolonized days. Using survival analysis, we assessed the relationship between antibiotic exposure and time to VRE recolonization in a subcohort analysis of 72 decolonized patients.
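A minimal sketch of this survival analysis using the lifelines package; column names are assumptions, and the exposure is dichotomized at the median as described above:

```python
# Sketch of the time-to-recolonization analysis: Cox regression of time
# from decolonization to VRE recolonization against antibiotic exposure,
# expressed as the proportion of decolonized days on antibiotics.
# Column and file names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("decolonized_patients.csv")  # one row per patient

# Dichotomize antibiotic exposure at the median, as in the abstract.
cohort["high_abx"] = (
    cohort["abx_days_prop"] > cohort["abx_days_prop"].median()
).astype(int)

cph = CoxPHFitter()
cph.fit(cohort[["days_to_recolonization", "recolonized", "high_abx"]],
        duration_col="days_to_recolonization",
        event_col="recolonized")
cph.print_summary()  # hazard ratio for high vs. low antibiotic exposure
```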
In total, 350 patients were either colonized or infected with VRE. Among polymerase chain reaction (PCR)-positive, culture (Cx)-negative (PCR+/Cx−) patients, PCR had a 39% positive predictive value for colonization. Colonization with VRE was significantly associated with VRE infection. Among the 72 patients who met decolonization criteria, 21 (29%) subsequently became recolonized. The rate of VRE recolonization was 4.3 times higher (P = .001) in patients whose proportion of antibiotic days was above the median, and 2.0 times higher (P = .22) in those whose proportion of antianaerobic antibiotic days was above the median.
Colonization is associated with clinical VRE infection and increased mortality. Despite negative perirectal cultures, re-exposure to antibiotics increases the risk of VRE recolonization.
Metal–insulator–metal (MIM) resonant absorbers comprise a conducting ground plane, a dielectric of thickness t, and thin, separated metal top-surface structures of dimension l. The fundamental resonance wavelength is predicted by an analytic standing-wave model based on t, l, and the dielectric refractive index spectrum. For the dielectrics SiO2, AlN, and TiO2, values of l of a few microns give fundamental resonances in the 8-12 μm long-wave infrared (LWIR) region. Agreement with theory is better for t/l exceeding 0.1. Harmonics at shorter wavelengths were already known, but we show that there are additional resonances in the far-infrared 20-50 μm wavelength range in MIM structures designed to have LWIR fundamental resonances. These new resonances are consistent with the model if far-IR dispersion features in the index spectrum are considered. LWIR fundamental absorptions are experimentally shown to be optimized at t/l ratios of 0.1 and 0.3 for SiO2- and AlN-based MIM absorbers, respectively, with TiO2-based MIM absorbers optimized at an intermediate ratio.
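For reference, the standing-wave model invoked above is usually written in the following form, a sketch in which m is the mode order and n_eff is an effective index set by the dielectric's dispersion (and weakly by t/l); t and l are the abstract's own symbols, while n_eff is the customary notation rather than one quoted from the paper:

```latex
% Standing-wave resonance condition for a MIM top structure of width l:
% resonances occur when an integer number of half-waves fits under it.
\lambda_m \approx \frac{2\, n_{\mathrm{eff}}(\lambda_m)\, l}{m},
\qquad m = 1, 2, 3, \ldots
```

On this reading, the reported far-IR resonances arise where dispersion features push n_eff(λ) high enough for the condition to be satisfied again at long wavelengths, consistent with the abstract's interpretation.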
Although most hospitals report very high levels of hand hygiene compliance (HHC), the accuracy of these overtly observed rates is questionable due to the Hawthorne effect and other sources of bias. In this study, we aimed (1) to compare HHC rates estimated using the standard audit method of overt observation by a known observer with those from a new audit method that employed a rapid (<15 minutes) “secret shopper” approach and (2) to pilot test a novel feedback tool.
Quality improvement project using a quasi-experimental stepped-wedge design.
This study was conducted in 5 acute-care hospitals (17 wards, 5 intensive care units) in the Midwestern United States.
Sites recruited a hand hygiene observer from outside the acute-care units to rapidly and covertly observe entry and exit HHC during the study period, October 2016–September 2017. After 3 months of observations, sites received a monthly feedback tool that communicated HHC information from the new audit method.
The absolute difference in HHC estimates between the standard and new audit methods was ~30%. No significant differences in HHC were detected between the baseline and feedback phases (OR, 0.92; 95% CI, 0.84–1.01), but the standard audit method had significantly higher estimates than the new audit method (OR, 9.83; 95% CI, 8.82–10.95).
HHC estimates obtained using the new audit method were substantially lower than estimates obtained using the standard audit method, suggesting that the rapid, secret-shopper method is less subject to bias. Providing feedback using HHC from the new audit method did not seem to impact HHC behaviors.
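For intuition on how an odds ratio near 9.8 can coexist with a ~30% absolute difference in compliance, a sketch with hypothetical counts (not the study's data):

```python
# How an odds ratio near 9.8 between audit methods can arise alongside a
# ~30% absolute difference in estimated HHC. Counts are hypothetical,
# chosen only to illustrate the arithmetic.
overt_compliant, overt_n = 950, 1000    # standard audit: 95% compliance
covert_compliant, covert_n = 660, 1000  # secret shopper: 66% compliance

odds_overt = overt_compliant / (overt_n - overt_compliant)      # 19.0
odds_covert = covert_compliant / (covert_n - covert_compliant)  # ~1.94
print(odds_overt / odds_covert)  # odds ratio ~9.8
```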