The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
Aims
To quantify the prevalence of depression and anxiety, and levels of mental well-being, before and during the COVID-19 pandemic, and to identify groups at risk of depression and/or anxiety during the pandemic.
Method
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale.
Results
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, at 24% (95% CI 23–26%) compared with a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
Conclusions
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. This is important for planning current mental health provision and for understanding the long-term impact beyond this pandemic.
A survey of acute-care hospitals found that rapid molecular diagnostic tests (RMDTs) have been widely adopted. Although many hospitals use their antimicrobial stewardship team and/or guidelines to help clinicians interpret results and optimize treatment, opportunities to more fully achieve the potential benefits of RMDTs remain.
An observational study was conducted to characterize high-touch surfaces in emergency departments and hemodialysis facilities. Certain surfaces were touched with much greater frequency than others. A small number of surfaces accounted for the majority of touch episodes. Prioritizing disinfection of these surfaces may reduce pathogen transmission within healthcare environments.
Background: Candida dubliniensis is a worldwide fungal opportunistic pathogen, closely related to C. albicans. Originally identified in patients infected with HIV in Dublin, Ireland, C. dubliniensis has emerged as a pathogen in other immunocompromised individuals, including patients receiving chemotherapy and transplant recipients. Pediatric epidemiological data for this organism are limited. Methods: We report a descriptive review of C. dubliniensis isolates recovered between January 2018 and June 2019 at a large tertiary-care pediatric institution in Columbus, Ohio. Results: C. dubliniensis was identified in 48 patients over the 18-month review period. In total, 67 positive cultures were collected from these patients, with the following distribution of sources: 44 sputum (66%), 11 bronchoalveolar lavage fluid (16%), 4 blood (6%), 3 wound (4%), 2 esophageal (3%), 2 peritoneal fluid (3%), and 1 vaginal (1%). Of the 48 patients in whom C. dubliniensis was identified, 35 (73%) were patients with cystic fibrosis. In addition, 8 patients (17%) were considered to have clinical infections and received antifungal therapy: 3 with pneumonia, 2 with esophagitis, 1 with peritonitis, 1 with a catheter-related bloodstream infection, and 1 with disseminated candidiasis. The remaining 40 patients (83%) were considered colonized. Conclusions: We report a descriptive 18-month series of clinical C. dubliniensis isolates recovered at a pediatric institution. Most isolates represented colonization in patients with cystic fibrosis. C. dubliniensis was a rare cause of invasive disease in our institution, with only 8 cases identified.
Background: In recent years, several rapid molecular diagnostic tests (RMDTs) for the diagnosis of infectious diseases, such as bloodstream infections (BSIs), have become available for clinical use. The extent to which RMDTs have been adopted and how the results of these tests have been incorporated into clinical care are currently unknown. Methods: We surveyed members of the Society for Healthcare Epidemiology of America Research Network to characterize the utilization of RMDTs in hospitals and antimicrobial stewardship program (ASP) involvement in result communication and interpretation. The survey was administered using Qualtrics software, and data were analyzed using Stata and Excel software. Results: Overall, 57 responses were received (response rate, 59%), and 72% were from academic hospitals; 50 hospitals (88%) used at least 1 RMDT for BSI (Fig. 1). The factors most commonly reported to have been important in the decision to adopt RMDT were improvements in antimicrobial usage (82%), clinical outcomes (74%), and laboratory efficiency (52%). Among the 7 hospitals that did not use RMDT for BSI, the most common reason was the cost of new technology. Of the 50 hospitals with RMDT for BSI, 54% provided written guidelines for optimization or de-escalation of antimicrobials based upon RMDT results. In 40 hospitals (80%), microbiology laboratories directly notified a healthcare worker of the RMDT results: 70% provided results to a physician, nurse practitioner, or physician assistant; 48% to the ASP team; and 33% to a nurse. Furthermore, 11 hospitals (22%) had neither guidelines nor ASP intervention. In addition, 24 hospitals (48%) reported performing postimplementation evaluation of RMDT impact. Reported findings included reduction in time to antibiotic de-escalation (75%), reduction in length of stay (25%), improved laboratory efficiency (20%), and reduction in mortality and overall costs (12%). Among the 47 hospitals with both RMDT and ASP, 79% reported that the ASP team routinely reviewed blood culture RMDT results, and 53.2% used clinical decision support software to do so. Finally, 53 hospitals (93%) used 1 or more RMDTs for non–bloodstream infections (Fig. 1). Fewer than half of hospitals provided written guidelines to assist clinicians in interpreting these RMDT results. Conclusions: RMDTs have been widely adopted by participating hospitals and are associated with positive self-reported clinical, logistic, and financial outcomes. However, nearly 1 in 4 hospitals did not have guidelines or ASP interventions to assist clinicians with optimization of antimicrobial prescribing based on RMDT results for BSI. Also, most hospitals did not have guidelines for RMDT results for non-BSI. These findings suggest that opportunities exist to further enhance the potential benefits of RMDTs.
Background: The healthcare environment can serve as a reservoir for many microorganisms and, in the absence of appropriate cleaning and disinfection, can contribute to pathogen transmission. Identification of high-touch surfaces (HTSs) in hospital patient rooms has allowed recognition of the surfaces that represent the greatest transmission risk and prioritization of cleaning and disinfection resources for infection prevention. HTSs in other healthcare settings, including high-volume and high-risk settings such as emergency departments (EDs) and hemodialysis facilities (HDFs), have not been well studied or defined. Methods: Observations were conducted in 2 EDs and 3 HDFs using structured observation tools. All touch episodes, defined as hand-to-surface contact regardless of hand hygiene and/or glove use, were recorded. Touches by healthcare personnel, patients, and visitors were included. Surfaces were classified as being allocated to individual patients or shared among multiple patients. The number of touch episodes per hour was calculated for each surface to rank surfaces by frequency of touch. Results: In total, 28 hours of observation (14 hours each in EDs and HDFs) were conducted, during which 1,976 touch episodes were observed across 62 surfaces. On average, more touch episodes were observed per hour in HDFs than in EDs (89 vs 52, respectively). The most frequently touched surfaces in EDs included stretcher rails, privacy curtains, visitor chair armrests and seats, and patient bedside tables, which together accounted for 68.8% of all touch episodes in EDs (Fig. 1). Frequently touched surfaces in HDFs included both shared and single-patient surfaces (27.8% and 72.2% of HDF touch episodes, respectively). The most frequently touched surfaces in HDFs were supply cart drawers, dialysis machine control panels and keyboards, handwashing faucet handles, bedside work tables, and bed rails or dialysis chair armrests, which accounted for 68.4% of all touch episodes recorded. Conclusions: To our knowledge, this is the first quantitative study to identify HTSs in EDs and HDFs. Our observations reveal that certain surfaces within these environments are subject to a substantially greater frequency of hand contact than others and that a relatively small number of surfaces account for most touch episodes. Notably, whereas HTSs in EDs were primarily single-patient surfaces, HTSs in HDFs included surfaces shared in the care of multiple patients, which may represent an even greater risk of patient-to-patient pathogen transmission than single-patient surfaces. The identification of HTSs in EDs and HDFs contributes to a better understanding of the risk of environment-related pathogen transmission in these settings and may allow prioritization and optimization of cleaning and disinfection resources within facilities.
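The per-surface ranking described in the Methods is simple arithmetic: touch episodes divided by observation time, plus a cumulative share to show how few surfaces account for most contacts. The sketch below illustrates that calculation; the surface names, counts, and observation hours are invented for illustration and are not the study's data.

```python
# Illustrative sketch only: surface names, counts, and hours are invented.
import pandas as pd

observation_hours = 14  # total observation time in one setting

touch_counts = pd.Series({
    "stretcher rails": 190,
    "privacy curtains": 150,
    "visitor chair armrests/seats": 120,
    "patient bedside table": 95,
    "door handle": 30,
})

# Rank surfaces by touch episodes per hour.
rate_per_hour = (touch_counts / observation_hours).sort_values(ascending=False)

# Cumulative share of all touch episodes, showing how a small number of
# surfaces can account for the majority of contacts.
cumulative_share = touch_counts.sort_values(ascending=False).cumsum() / touch_counts.sum()

print(rate_per_hour.round(1))
print(cumulative_share.round(2))
```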
Background: Pseudomonas aeruginosa is an important nosocomial pathogen associated with intrinsic and acquired resistance mechanisms to major classes of antibiotics. To better understand clinical risk factors for drug-resistant P. aeruginosa infection, decision-tree models for the prediction of fluoroquinolone- and carbapenem-resistant P. aeruginosa were constructed and compared to multivariable logistic regression models using performance characteristics. Methods: In total, 5,636 patients admitted to 4 hospitals within a New York City healthcare system from 2010 to 2016 with blood, respiratory, wound, or urine cultures growing P. aeruginosa were included in the analysis. Presence or absence of drug resistance was defined using the first culture of any source positive for P. aeruginosa during each hospitalization. To train and validate the prediction models, cases were randomly split (60:40) into training and validation datasets. Clinical decision-tree models for both fluoroquinolone and carbapenem resistance were built from the training dataset using 21 clinical variables of interest, and multivariable logistic regression models were built using the 16 clinical variables associated with resistance in bivariate analyses. Decision-tree models were optimized using k-fold cross-validation, and performance characteristics of the 4 models were compared. Results: From 2010 through 2016, the prevalence of fluoroquinolone and carbapenem resistance was 32% and 18%, respectively. For fluoroquinolone resistance, the logistic regression algorithm attained a positive predictive value (PPV) of 0.57 and a negative predictive value (NPV) of 0.73 (sensitivity, 0.27; specificity, 0.90), and the decision-tree algorithm attained a PPV of 0.65 and an NPV of 0.72 (sensitivity, 0.21; specificity, 0.95). For carbapenem resistance, the logistic regression algorithm attained a PPV of 0.53 and an NPV of 0.85 (sensitivity, 0.20; specificity, 0.96), and the decision-tree algorithm attained a PPV of 0.59 and an NPV of 0.84 (sensitivity, 0.22; specificity, 0.96). The decision-tree partitioning algorithm identified prior fluoroquinolone resistance, skilled nursing facility (SNF) stay, sex, and length of stay as the variables of greatest importance for fluoroquinolone resistance, compared with prior carbapenem resistance, age, and length of stay for carbapenem resistance. The highest-performing decision tree for fluoroquinolone resistance is illustrated in Fig. 1. Conclusions: Supervised machine-learning techniques may facilitate prediction of P. aeruginosa resistance and identification of risk factors driving resistance patterns in hospitalized patients. Such techniques may be applied to readily available clinical information from hospital electronic health records to aid clinical decision making.
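To make the modelling approach concrete (a 60:40 split, a cross-validated decision tree compared with logistic regression on PPV, NPV, sensitivity, and specificity), the following scikit-learn sketch shows one way such a comparison could be set up. The file name, outcome column, predictor set, and hyperparameter grid are assumptions for illustration, not the authors' code or data.

```python
# Illustrative sketch only: 'pa_cultures.csv' and the column 'fq_resistant'
# are hypothetical stand-ins for the study's dataset and outcome.
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# One row per hospitalization: binary resistance outcome plus clinical predictors
# (e.g., prior resistance, SNF stay, sex, length of stay).
df = pd.read_csv("pa_cultures.csv")
X, y = df.drop(columns=["fq_resistant"]), df["fq_resistant"]

# 60:40 split into training and validation sets, as described in the abstract.
X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.4, random_state=0, stratify=y)

# Decision tree tuned with k-fold cross-validation on the training data.
tree = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    {"max_depth": [3, 4, 5, 6], "min_samples_leaf": [10, 25, 50]},
    cv=5,
).fit(X_tr, y_tr)

# Multivariable logistic regression on the screened predictors.
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

def report(model, X, y):
    """PPV, NPV, sensitivity, and specificity on the validation set."""
    tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
    return {"PPV": tp / (tp + fp), "NPV": tn / (tn + fn),
            "sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

print("tree ", report(tree, X_va, y_va))
print("logit", report(logit, X_va, y_va))
```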
Antarctica's ice shelves modulate the grounded ice flow, and weakening of ice shelves due to climate forcing will decrease their ‘buttressing’ effect, causing a response in the grounded ice. While the processes governing ice-shelf weakening are complex, the response of the grounded ice sheet to reduced buttressing is also difficult to assess. The Antarctic BUttressing Model Intercomparison Project (ABUMIP) compares ice-sheet model responses to a decrease in buttressing by investigating the ‘end-member’ scenario of total and sustained loss of ice shelves. Although unrealistic, this scenario enables gauging the sensitivity of an ensemble of 15 ice-sheet models to a total loss of buttressing, hence exhibiting the full potential of marine ice-sheet instability. All models predict that this scenario leads to multi-metre (1–12 m) sea-level rise over 500 years from present day. West Antarctic ice sheet collapse alone leads to a 1.91–5.08 m sea-level rise due to the marine ice-sheet instability. Mass loss rates are a strong function of the sliding/friction law, with plastic laws causing further destabilization of the Aurora and Wilkes Subglacial Basins, East Antarctica. Improvements to marine ice-sheet models have greatly reduced the variability between modelled ice-sheet responses to extreme ice-shelf loss, e.g. compared with the SeaRISE assessments.
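For readers unfamiliar with the terminology, a common family of basal sliding/friction laws in ice-sheet models can be written as a power law; this generic form is given only for orientation and is not specific to any ABUMIP model: $$\tau_b = C\,|u_b|^{1/m - 1}\,u_b,$$ where $\tau_b$ is the basal shear stress, $u_b$ the sliding velocity, $C$ a friction coefficient and $m$ the sliding exponent. Taking $m = 1$ gives a linear-viscous law, $m \approx 3$ a Weertman-type law, and the plastic (Coulomb-like) limit $m \to \infty$ caps $\tau_b$ at a yield value independent of sliding speed, so accelerating ice meets no additional frictional resistance, which is consistent with the stronger destabilization reported above for plastic laws.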
Mindfulness meditation has become a common method for reducing stress, stress-related psychopathology and some physical symptoms. As mindfulness programs become ubiquitous, concerns have been raised about their unknown potential for harm. We estimate multiple indices of harm following Mindfulness-Based Stress Reduction (MBSR) on two primary outcomes: global psychological and physical symptoms. In secondary analyses, we estimate multiple indices of harm on anxiety and depressive symptoms, discomfort in interpersonal relations, paranoid ideation and psychoticism.
Methods
Intent-to-treat analyses with multiple imputations for missing data were used on pre- and post-test data from a large, observational dataset (n = 2155) of community health clinic MBSR classes and from MBSR (n = 156) and waitlist control (n = 118) participants in three randomized controlled trials conducted contemporaneously with the community classes, in the same city and by the same health clinic MBSR teachers. We estimate the change in symptoms, the proportion of participants with increased symptoms, the proportion reporting greater than a 35% increase in symptoms, and, for global psychological symptoms, clinically significant harm.
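To make the proportion-based harm indices concrete, the sketch below shows how they could be computed from pre/post symptom scores. The scores, column names, and per-participant percent-change calculation are illustrative assumptions, not the study's analysis code.

```python
# Illustrative sketch only: invented scores; not the study's analysis code.
import numpy as np
import pandas as pd

def harm_indices(pre: pd.Series, post: pd.Series) -> dict:
    """Indices of harm based on pre- to post-intervention symptom change."""
    change = post - pre
    pct_change = change / pre.replace(0, np.nan) * 100  # per-participant % change
    return {
        "mean_change": change.mean(),
        "prop_increased": (change > 0).mean(),
        "prop_increase_gt_35pct": (pct_change > 35).mean(),
    }

# Hypothetical global symptom scores for a handful of MBSR participants.
scores = pd.DataFrame({"pre": [1.2, 0.8, 1.5, 0.6, 1.0],
                       "post": [1.0, 0.9, 1.1, 0.9, 1.0]})
print(harm_indices(scores["pre"], scores["post"]))
```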
Results
We find no evidence that MBSR leads to higher rates of harm relative to waitlist control on any primary or secondary outcome. On many indices of harm across multiple outcomes, community MBSR was significantly preventative of harm.
Conclusions
Engagement in MBSR is not predictive of increased rates of harm relative to no treatment. Rather, MBSR may be protective against multiple indices of harm. Research characterizing the relatively small proportion of MBSR participants that experience harm remains important.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
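For readers unfamiliar with polygenic risk scores, a PRS is typically the weighted sum of an individual's effect-allele dosages, with weights taken from an external GWAS of the trait (here, for example, hematocrit, granulocyte count or C-reactive protein). The minimal sketch below uses invented variant IDs, weights, and genotypes; it is not the study's pipeline.

```python
# Illustrative sketch only: variant IDs, weights, and dosages are invented.
import pandas as pd

# GWAS summary-statistic weights (betas or log odds ratios) per variant.
weights = pd.Series({"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30})

# Genotype dosages (0, 1, or 2 copies of the effect allele) per individual.
dosages = pd.DataFrame(
    {"rs0001": [0, 1, 2], "rs0002": [2, 1, 0], "rs0003": [1, 1, 2]},
    index=["sample1", "sample2", "sample3"],
)

# PRS_i = sum_j beta_j * dosage_ij, standardized before association testing.
prs = dosages.mul(weights, axis=1).sum(axis=1)
prs_z = (prs - prs.mean()) / prs.std()
print(prs_z)
```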
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band (~60%), the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
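As context for the quoted system temperature (and not a formula taken from the paper), the standard single-dish radiometer equation relates $T_{\mathrm{sys}}$ to the achievable flux-density sensitivity: $$\Delta S_{\mathrm{rms}} \approx \frac{2 k_B T_{\mathrm{sys}}}{A_{\mathrm{eff}}\sqrt{n_p\,\Delta\nu\,\tau}},$$ where $k_B$ is Boltzmann's constant, $A_{\mathrm{eff}}$ the effective collecting area, $n_p$ the number of summed polarisations, $\Delta\nu$ the bandwidth and $\tau$ the integration time. Lowering $T_{\mathrm{sys}}$ to roughly 22 K, or widening the usable $\Delta\nu$, directly lowers the noise floor.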
n-6 Fatty acids have been shown to exert pro-adipogenic effects, whereas n-3 fatty acids work in opposition. The increasing intake of linoleic acid (LA; n-6) relative to α-linolenic acid (ALA; n-3) in Western diets has led to the hypothesis that consumption of this diet during pregnancy may be contributing to adverse offspring health. This study investigated the effects of feeding a maternal dietary LA:ALA ratio similar to that of the Western diet (9:1), compared with a proposed ‘ideal’ ratio (about 1:1·5), at two total fat levels (18 v. 36 % fat, w/w), on growth and lipogenic gene expression in the offspring. Female Wistar rats were assigned to one of the four experimental groups throughout gestation and lactation. Offspring were culled at 1 and 2 weeks of age for sample collection. Offspring of dams consuming a 36 % fat diet were approximately 20 % lighter than those exposed to an 18 % fat diet (P < 0·001). Liver weight at 1 week was approximately 13 % heavier, with increased glycogen (P < 0·05), in male, but not female, offspring exposed to high LA (P < 0·01). Hepatic expression of lipogenic genes suggested an increase in lipogenesis in male offspring exposed to a 36 % fat maternal diet and in female offspring exposed to a low-LA diet, via increases in the expression of fatty acid synthase and sterol regulatory element-binding protein. Sexually dimorphic responses to the altered maternal diet appeared to persist until 2 weeks of age. In conclusion, whilst maternal total fat content predominantly affected offspring growth, fatty acid ratio and total fat content had sexually dimorphic effects on offspring liver weight and composition.
Evidence suggests that sub-optimal maternal nutrition has implications for the developing offspring. We have previously shown that exposure to a low-protein diet during gestation was associated with upregulation of genes associated with cholesterol transport and packaging within the placenta. This study aimed to elucidate the effect of altering maternal dietary linoleic acid (LA; omega-6) to alpha-linolenic acid (ALA; omega-3) ratios, as well as total fat content, on placental expression of genes associated with cholesterol transport. The potential for maternal body mass index (BMI) to be associated with expression of these genes in human placental samples was also evaluated. Placentas were collected from 24 Wistar rats at 20-day gestation (term = 21–22-day gestation) that had been fed one of four diets containing varying fatty acid compositions during pregnancy, and from 62 women at the time of delivery. Expression of 14 placental genes associated with cholesterol packaging and transfer was assessed in rodent and human samples by quantitative real-time polymerase chain reaction. In rats, placental mRNA expression of ApoA2, ApoC2, Cubn, Fgg, Mttp and Ttr was significantly elevated (3–30 fold) in animals fed a high-LA (36% fat) diet, suggesting increased cholesterol transport across the placenta in this group. In women, maternal BMI was associated with fewer, inconsistent alterations in gene expression. In summary, sub-optimal maternal nutrition is associated with alterations in the expression of genes associated with cholesterol transport in a rat model. This may contribute to altered fetal development and potentially programme disease risk in later life. Further investigation of human placenta in response to specific dietary interventions is required.
Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
Method
This randomized trial investigated, as a secondary endpoint, the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and at weeks 4 (post-intervention), 27, and 52.
Results
The intervention group had a statistically significantly greater improvement than the control group at week 4 in the PSQI total score and in two of its components, sleep quality and daytime dysfunction. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant differences between groups in the PSQI total or component scores or in the ESS. At week 52, the intervention group had reduced its use of sleep medication relative to baseline compared with control patients (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported using less sleep medication.
Objective:
To sustainably improve cleaning of high-touch surfaces (HTSs) in acute-care hospitals using a multimodal approach to education, reduction of barriers to cleaning, and culture change for environmental services workers.
Setting:
The study was conducted in 2 academic acute-care hospitals, 2 community hospitals, and an academic pediatric and women’s hospital.
Participants:
Frontline environmental services workers.
Intervention:
A 5-module educational program, using principles of adult learning theory, was developed and presented to environmental services workers. Audience response system (ARS), videos, demonstrations, role playing, and graphics were used to illustrate concepts of and the rationale for infection prevention strategies. Topics included hand hygiene, isolation precautions, personal protective equipment (PPE), cleaning protocols, and strategies to overcome barriers. Program evaluation included ARS questions, written evaluations, and objective assessments of occupied patient room cleaning. Changes in hospital-onset C. difficile infection (CDI) and methicillin-resistant S. aureus (MRSA) bacteremia were evaluated.
Results:
On average, 357 environmental services workers participated in each module. Most rated the presentations as ‘excellent’ or ‘very good’ (93%) and agreed that they were useful (95%); after the program, participants reported being more comfortable donning/doffing PPE (91%) and performing hand hygiene (96%) and better understood the importance of disinfecting HTSs (96%). The frequency of cleaning individual HTSs in occupied rooms increased from 26% to 62% (P < .001) following the intervention. Improvement was sustained 1 year after the intervention (P < .001). A significant decrease in CDI was associated with the program.
Conclusion:
A novel program that addressed environmental services workers’ knowledge gaps, challenges, and barriers was well received and appeared to result in learning, behavior change, and sustained improvements in cleaning.
Many foreign aid donors brand development interventions. How do citizens in the donor country react to seeing this branding in action? We test the proposition that citizens will express higher levels of support for foreign aid when they see a branded foreign aid project relative to seeing the same project without branding. We present results from a survey-based laboratory experiment conducted in the United Kingdom in which subjects learned about a typical foreign aid project and received a randomized UK branding treatment. Our results suggest that the branding treatments increase the likelihood that donor-country respondents believe that aid recipients can identify the source of the foreign aid. Only among conservative respondents, however, does the evidence imply that branding increases support for foreign aid. “UK aid” branding strengthens the view among conservatives that aid dollars are well spent and increases support among this group for expanding foreign aid.
Stone was a critical resource for prehistoric hunter-gatherers. Archaeologists, therefore, have long argued that these groups would actively have sought out stone of ‘high quality’. Although defining quality can be a complicated endeavour, researchers in recent years have suggested that stone with fewer impurities would be preferred for tool production, as it can be worked and used in a more controllable way. The present study shows that prehistoric hunter-gatherers at the Holocene site of Welling, in Ohio, USA, continuously selected the ‘purest’ stone for over 9000 years.
Every winter, snowy landscapes are smoothed by snow deposition in calm (windless) conditions. In this study, we investigated how vertically falling snow attenuates topographic relief at horizontal scales less than or approximately equal to the snow depth (e.g., 0.1–10 m). In a set of three experiments under natural snowfall, we observed the particle-scale mechanisms by which smoothing is achieved, and we examined the cumulative effect at the snowpack scale. The experiments consisted of (a) tracking the trajectories of snowflakes at deposition using a strobe-light box, (b) allowing snow to fall through a narrow gap (40 mm) and examining snow accumulation above and below the gap, and (c) allowing snow to accumulate over a set of artificial surfaces. At the particle scale, we observed mechanisms enhancing (bouncing, rolling, ejection, breakage, creep, metamorphism) and retarding (interlocking, cohesion, adhesion, sintering) the rate of smoothing. The cumulative effect of these mechanisms is found to be driven by snowpack surface curvature, which introduces a directional bias in the lateral transport of snow particles. Our findings suggest that better quantification of the mechanisms behind smoothing by snow could provide insights into the evolution of snow depth variability and snow–vegetation interactions.
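One minimal way to express the curvature-driven smoothing described above, offered purely as an illustration rather than the authors' model, is a linear diffusion term added to the mean accumulation rate: $$\frac{\partial h}{\partial t} = a + D\,\nabla^{2} h,$$ where $h$ is the snow-surface height, $a$ the mean accumulation rate and $D$ an effective smoothing coefficient. Concave regions ($\nabla^{2} h > 0$) then accumulate faster than convex ones, so relief at horizontal scales comparable to the snow depth is preferentially attenuated.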