Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
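For readers unfamiliar with this design, a minimal sketch of such a model is shown below (Python/statsmodels); the data frame, column names, and values are hypothetical stand-ins, and the published analysis also adjusted for demographic, clinical, and hospital-level covariates.

```python
# Minimal sketch of a Poisson rate model with an exposure offset; the
# course-level data frame and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

courses = pd.DataFrame({
    "bsi_count":        [0, 1, 0, 2, 0, 1, 0, 1],         # incident BSIs per course
    "neutropenic_days": [14, 21, 10, 30, 12, 18, 9, 25],  # days at risk per course
    "cvad_type": ["TEC", "PICC", "TIC", "TEC", "PICC", "TIC", "TEC", "PICC"],
})

# log(days at risk) as an offset makes exponentiated coefficients rate ratios.
model = smf.poisson(
    "bsi_count ~ C(cvad_type, Treatment(reference='TEC'))",
    data=courses,
    offset=np.log(courses["neutropenic_days"]),
).fit(disp=0)

print(np.exp(model.params))  # IRRs for PICC and TIC relative to TEC
```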
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incidence rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared with TEC. Results were similar when MBI-BSI and non-MBI BSI were examined separately.
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may reflect a risk profile for BSI that is unique to patients with AML.
In the early modern period, extractive industries were the vanguard of European colonization in America. Whether involving the removal of minerals, flora, fauna, or other organic or inorganic materials, these ventures attracted enterprising Europeans hoping to profit from bringing natural resources out of newly accessible lands and into the expanding currents of international trade. Establishing a viable extractive industry – such as mining, logging, fishing, hunting animals, or collecting plants – proved a critical preliminary component of many settlement schemes by helping to generate the capital needed to underwrite their initial development and, ideally, by contributing to their ongoing productivity. Although the results were uneven, European powers, especially Spain, England, France, and the Netherlands, sought to leverage their subjects’ involvement in various American extractive industries to produce national wealth, claim sovereignty over new lands, justify the exploitation of subaltern populations, and lay the groundwork for more extensive imperial expansion. Whether as the foundations of new commodity frontiers or as the precursors to other forms of colonial development, extractive industries reshaped many regions in the Americas, often with dire outcomes for their Indigenous inhabitants and natural environments.
To assess the relationship between food insecurity, sleep quality, and days with mental and physical health issues among college students.
Design:
An online survey was administered. Food insecurity was assessed using the ten-item Adult Food Security Survey Module. Sleep was measured using the nineteen-item Pittsburgh Sleep Quality Index (PSQI). Mental health and physical health were measured using three items from the Healthy Days Core Module. Multivariate logistic regression was conducted to assess the relationship between food insecurity, sleep quality, and days with poor mental and physical health.
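As a rough sketch of this kind of model (not the study's code), the snippet below simulates survey-like data and fits one such logistic regression; the variable names, the single covariate, and the coding of poor sleep as a PSQI global score above 5 are illustrative assumptions.

```python
# Minimal sketch: logistic regression of poor sleep on food-insecurity status.
# Data are simulated; variable names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
food_insecure = rng.binomial(1, 0.43, n)             # ~43% prevalence, per Results
age = rng.integers(18, 30, n)
p = 1 / (1 + np.exp(-(-0.4 + 0.3 * food_insecure)))  # assumed log-odds shift
poor_sleep = rng.binomial(1, p)                      # 1 = PSQI global score > 5

df = pd.DataFrame({"poor_sleep": poor_sleep,
                   "food_insecure": food_insecure, "age": age})
fit = smf.logit("poor_sleep ~ food_insecure + age", data=df).fit(disp=0)
print(np.exp(fit.params["food_insecure"]))           # adjusted odds ratio
```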
Setting:
Twenty-two higher education institutions.
Participants:
College students (n 17 686) enrolled at one of twenty-two participating universities.
Results:
Compared with food-secure students, those classified as food insecure (43·4 %) had higher PSQI scores indicating poorer sleep quality (P < 0·0001) and reported more days with poor mental (P < 0·0001) and physical (P < 0·0001) health as well as days when mental and physical health prevented them from completing daily activities (P < 0·0001). Food-insecure students had higher adjusted odds of having poor sleep quality (adjusted OR (AOR): 1·13; 95 % CI 1·12, 1·14), days with poor physical health (AOR: 1·01; 95 % CI 1·01, 1·02), days with poor mental health (AOR: 1·03; 95 % CI 1·02, 1·03) and days when poor mental or physical health prevented them from completing daily activities (AOR: 1·03; 95 % CI 1·02, 1·04).
Conclusions:
College students report high rates of food insecurity, which is associated with poor mental and physical health as well as poor sleep quality. Multi-level policy changes and campus wellness programmes are needed to prevent food insecurity and improve student health-related outcomes.
ABSTRACT IMPACT: Partnering with academic offices to promote peer-mentoring in a virtual format is feasible, novel, and well-received across a major academic campus. Particularly during a pandemic, the success of this programmatic effort highlights the continued need for peer-to-peer support. OBJECTIVES/GOALS: To identify feasibility and key lessons learned from the planning and implementation of a virtual, interdisciplinary group peer-mentoring series, implemented broadly across an academic medical center in New York City. METHODS/STUDY POPULATION: ASPIRE! (Accountability & Safe-Space to Promote, Inspire, Recharge, & Empower one another!) is a group of seven interdisciplinary mid-career academic women faculty, who began collaborating as CTSA KL2 scholars. Our mission is to support interdisciplinary peer coaching for the advancement of gender and racial equity among academic faculty and leaders. We designed and implemented a series of virtual symposia focused on essential struggles for clinicians and investigators during the COVID-19 pandemic. Partnering with Columbia’s CTSA, Office for Women and Diverse Faculty, and Office for Research, we invited leaders in psychiatry/psychology, early childhood education, organization/team management, and academic advancement as keynote speakers and facilitated peer-mentoring virtual breakouts. RESULTS/ANTICIPATED RESULTS: These efforts resulted in the completion of four separate 1.5-hour symposia, each with keynote speakers, discussions with academic leaders, and 30-minute breakout peer-mentoring sessions. Session topics included Calibrating Expectations, Helping Families Thrive, Managing Remote Teams, and Faces and Phases of Stress. Enrollment ranged from 30 to 70 participants per session. Participants reported that (1) keynotes focused on actionable solutions stimulated the most productive conversations; (2) peers from different disciplines and career stages provided a range of actionable recommendations tested within local contexts; and (3) the greatest learning came from the peer-to-peer breakout group sessions. DISCUSSION/SIGNIFICANCE OF FINDINGS: Partnering with academic offices to promote interdisciplinary peer-mentoring in a virtual format is feasible, novel, and can be well-received across a major academic campus during the COVID-19 pandemic. The success of this programmatic effort highlights the continued need for expanded peer-to-peer support in academia.
To evaluate whether vanA rectal screening for vancomycin-resistant Enterococcus (VRE) predicts vancomycin resistance for patients with enterococcal bloodstream infection (BSI).
Design:
A retrospective cohort study.
Setting:
Large academic medical center.
Methods:
The predictive performance of a vanA rectal swab was evaluated in 161 critically ill adults who had an enterococcal BSI between January 1, 2007, and September 1, 2014, and a vanA rectal swab screening obtained within 14 days prior to the blood culture.
Results:
Of the patients meeting inclusion criteria, 83 (51.6%) were vanA swab positive. Rectal swab–positive patients were more likely to be younger, to be immunocompromised, to have an indwelling central vascular catheter, and to have a history of multidrug-resistant (MDR) bacteria. The vanA rectal swab had sensitivity and negative predictive values of 83.6% and 85.9%, respectively, and specificity and positive predictive values of 71.3% and 67.5%, respectively, for predicting a vancomycin-resistant enterococcal BSI in critically ill adults.
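These four metrics follow from a standard 2 × 2 table. In the sketch below, the cell counts are back-calculated from the percentages reported in this abstract (83 swab-positive patients of 161), so they should be read as approximate reconstructions rather than published counts.

```python
# Standard diagnostic-test calculations; counts reconstructed from the
# reported percentages and therefore approximate.
tp, fp = 56, 27   # swab positive: VRE BSI / vancomycin-susceptible BSI
fn, tn = 11, 67   # swab negative: VRE BSI / vancomycin-susceptible BSI

sensitivity = tp / (tp + fn)   # ~0.836
specificity = tn / (tn + fp)   # ~0.713
ppv = tp / (tp + fp)           # ~0.675
npv = tn / (tn + fn)           # ~0.859
print(f"Se={sensitivity:.1%} Sp={specificity:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```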
Conclusions:
VanA rectal swabs may be useful for antimicrobial stewardship at institutions with VRE screening already in place for infection control purposes. However, a higher PPV would be needed to justify implementing universal vanA screening of all ICU patients.
Background: Bloodstream infections (BSIs) due to methicillin-resistant Staphylococcus aureus (MRSA) are important causes of morbidity and mortality in hospitalized patients. Long-term national MRSA BSI surveillance establishes rates for internal and external comparison and provides insight into epidemiologic, molecular, and resistance trends. Here, we present and discuss national MRSA BSI incidence rates and trends over time in Canadian acute-care hospitals from 2008 to 2018. Methods: The Canadian Nosocomial Infection Surveillance Programme (CNISP) is a collaborative effort of the Association of Medical Microbiology and Infectious Disease Canada and the Public Health Agency of Canada. Since 1995, the CNISP has conducted hospital-based sentinel surveillance of MRSA BSIs. Data were collected using standardized definitions and forms from hospitals that participate in the CNISP (48 hospitals in 2008, rising to 62 hospitals in 2018). For each MRSA BSI identified, the medical record was reviewed for clinical and demographic information, and when possible, 1 blood-culture isolate per patient was submitted to a central laboratory for further molecular characterization and susceptibility testing. Results: From 2008 to 2013, MRSA BSI rates per 10,000 patient days were relatively stable (0.60–0.56). Since 2014, MRSA BSI rates have gradually increased, from 0.66 to 1.05 in 2018. Although healthcare-associated (HA) MRSA BSI has shown a minimal increase (0.40 in 2014 to 0.51 in 2018), community-acquired (CA) MRSA BSI has increased by 150%, from 0.20 in 2014 to 0.50 in 2018 (Fig. 1). Laboratory characterization revealed that the proportion of isolates identified as CMRSA 2 (USA 100) decreased each year, from 39% in 2015 to 28% in 2018, while CMRSA 10 (USA 300) increased from 41% to 47%. Susceptibility testing showed a decrease in clindamycin resistance, from 82% in 2013 to 41% in 2018. Conclusions: Over the last decade, ongoing prospective MRSA BSI surveillance has shown relatively stable HA-MRSA rates, while CA-MRSA BSI rates have risen substantially. The proportion of isolates most commonly associated with HA-MRSA BSI (CMRSA 2/USA 100) is decreasing, and, given that resistance trends are tied to the prevalence of specific epidemic types, a large decrease in clindamycin resistance has been observed. MRSA BSI surveillance has shown a changing pattern in the epidemiology and laboratory characterization of MRSA BSI. The addition of hospitals in later years that may have had higher rates of CA-MRSA BSI could be a confounding factor. Continued comprehensive national surveillance will provide valuable information to address the challenges of infection prevention and control of MRSA BSI in hospitals.
Cognitive deficits affect a significant proportion of patients with bipolar disorder (BD). Problems with sustained attention have been found independently of mood state, and the causes are unclear. We aimed to investigate whether physical parameters such as activity levels, sleep, and body mass index (BMI) may be contributing factors.
Methods
Forty-six patients with BD and 42 controls completed a battery of neuropsychological tests and wore a triaxial accelerometer that collected information on physical activity, sleep, and circadian rhythm over 21 days. Ex-Gaussian analyses were used to characterise reaction time distributions. We used hierarchical regression analyses to examine whether physical activity, BMI, circadian rhythm, and sleep predicted variance in performance on the cognitive tasks.
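As an illustration of the ex-Gaussian approach (a sketch with simulated data, not the study's code), SciPy's exponentially modified Gaussian can be fitted to a reaction time sample; mu and sigma describe the Gaussian component, while tau, the exponential tail, is the parameter commonly read as intra-individual variability.

```python
# Fit an ex-Gaussian to simulated reaction times. SciPy parameterizes the
# exponentially modified Gaussian as exponnorm with shape K = tau / sigma.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated RTs (seconds): Gaussian component plus an exponential tail.
rt = rng.normal(0.45, 0.05, 500) + rng.exponential(0.15, 500)

K, mu, sigma = stats.exponnorm.fit(rt)
tau = K * sigma   # size of the exponential tail
print(f"mu={mu:.3f}s sigma={sigma:.3f}s tau={tau:.3f}s")
```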
Results
Neither physical activity, BMI, nor circadian rhythm predicted significant variance on any of the cognitive tasks. However, the presence of a sleep abnormality significantly predicted higher intra-individual variability in the reaction time distributions on the Attention Network Task.
Conclusions
This study suggests that there is an association between sleep abnormalities and cognition in BD, with little or no relationship with physical activity, BMI, and circadian rhythm.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Design:
Cross-sectional survey.
Participants:
Hospital epidemiologist and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
Methods:
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
Results:
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting either HOB alone (22%) or HOB in addition to CLABSI (35%), and 34% favored CLABSI alone.
Conclusions:
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
The gas evolution of a typical dwarf spheroidal galaxy is investigated by means of 3D hydrodynamic simulations, taking into account the feedback of Type II and Type Ia supernovae, the outflow of an intermediate-mass black hole (IMBH), and a static cored dark matter potential. When the IMBH’s outflow is simulated in a homogeneous medium, a jet structure is created and a small fraction of the gas is pushed away from the galaxy. No jet structure can be seen, however, when the medium is disturbed by supernovae, but gas is still pushed away. In this case, the supernovae are the main driver of the gas removal. The interplay between stellar feedback and the IMBH’s outflow should therefore be taken into account.
As referrals to specialist palliative care (PC) grow in volume and diversity, an evidence-based triage method is needed to enable services to manage waiting lists in a transparent, efficient, and equitable manner. Discrete choice experiments (DCEs) have not to date been used among PC clinicians, but may serve as a rigorous and efficient method to explore and inform the complex decision-making involved in PC triage. This article presents the protocol for a novel application of an international DCE as part of a mixed-method research program, ultimately aiming to develop a clinical decision-making tool for PC triage.
Method
Five stages of protocol development were undertaken: (1) identification of attributes of interest; (2) creation and (3) execution of a pilot DCE; and (4) refinement and (5) planned execution of the final DCE.
Result
Six attributes of interest to PC triage were identified and included in a DCE that was piloted with 10 palliative care practitioners. The pilot was found to be feasible, with an acceptable cognitive burden, but refinements were made, including the creation of an additional attribute to allow independent analysis of concepts involved. Strategies for recruitment, data collection, analysis, and modeling were confirmed for the final planned DCE.
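Choice data from a DCE of this kind are often analysed with a conditional (McFadden) logit over choice tasks. The sketch below illustrates that general modelling strategy on simulated two-alternative tasks; the attributes are invented for the example and are not the six attributes identified in this study.

```python
# Conditional logit on simulated discrete-choice data; attributes hypothetical.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(7)
n_tasks, n_alts = 300, 2
df = pd.DataFrame({
    "task": np.repeat(np.arange(n_tasks), n_alts),
    "wait_days": rng.integers(1, 15, n_tasks * n_alts).astype(float),
    "severity": rng.integers(1, 4, n_tasks * n_alts).astype(float),
})
# Random utility with Gumbel noise; the higher-utility alternative is chosen.
u = -0.15 * df["wait_days"] + 0.8 * df["severity"] + rng.gumbel(size=len(df))
df["chosen"] = (u == u.groupby(df["task"]).transform("max")).astype(int)

fit = ConditionalLogit(df["chosen"], df[["wait_days", "severity"]],
                       groups=df["task"]).fit()
print(fit.params)  # estimated attribute weights (part-worth utilities)
```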
Significance of results
This DCE protocol serves as an example of how sophisticated DCE methodology can be applied to health services research in PC. Discussion of the key elements that improved the utility, integrity, and feasibility of the DCE provides valuable insights.
The hot ISM in early-type galaxies (ETGs) plays a crucial role in understanding their formation and evolution. The structural features of the hot gas identified by Chandra observations point to key evolutionary mechanisms (e.g., Kim et al. 2012). In our Chandra Galaxy Atlas (CGA) project, taking full advantage of Chandra’s capabilities, we systematically analyzed the archival Chandra data of 72 ETGs and produced uniform data products of the hot gas properties. The main data products include spatially resolved 2D spectral maps of the hot gas from individual galaxies. We emphasize that new features can be identified in the spectral maps that are not easily visible in the surface brightness maps. The high-level images can be viewed at the dedicated CGA website, and the CGA data products can be downloaded to compare with data at other wavelengths and to perform user-specific analyses. Utilizing our data products, we will further address focused science topics.
The Genetic and Environmental Contributions of Negative Valence Systems (NVS) to Internalizing Pathways study (also referred to as the Adolescent and Young Adult Twin Study) was designed to examine varying constructs of the NVS as they relate to the development of internalizing disorders from a genetically informed perspective. The goal of this study was to evaluate genetic and environmental contributions to potential psychiatric endophenotypes that contribute to internalizing psychopathology by studying adolescent and young adult twins longitudinally over a 2-year period. This report details the sample characteristics, study design, and methodology of the study. The first wave of data collection (i.e., time 1) is complete; the 2-year follow-up (i.e., time 2) is currently underway. A total of 430 twin pairs (N = 860 individual twins; 166 monozygotic pairs; 57.2% female) and 422 parents or legal guardians participated at time 1. Twin participants completed self-report surveys and participated in experimental paradigms to assess processes within the NVS. Additionally, parents completed surveys to report on themselves and their twin children. Findings from this study will help clarify the genetic and environmental influences on the NVS and their association with internalizing risk. The goal of this line of research is to develop methods for early detection of internalizing disorder risk.
OBJECTIVES/SPECIFIC AIMS: Existing Good Clinical Practice (GCP) training is geared primarily towards researchers conducting drug, device, or biologic clinical trials, and largely ignores the unique needs of researchers conducting social and behavioral clinical trials. The purpose of this project was to develop a comprehensive, relevant, interactive, and easy-to-administer GCP eLearning course for social and behavioral researchers. METHODS/STUDY POPULATION: As part of the ECRPTQ project funded by the National Center for Advancing Translational Sciences (NCATS), a Social and Behavioral Work Group of ~30 experienced social and behavioral investigators and study coordinators was formed to develop GCP training for social and behavioral researchers. Existing GCP training programs were reviewed to identify relevant content that should be included, as well as gaps specific to social and behavioral clinical trials where new content would need to be developed. In total, 9 specific modules (Introduction, Research Protocol, Roles and Responsibilities, Informed Consent Communication, Confidentiality/Privacy, Recruitment/Retention, Participant Safety/Adverse Event Reporting, Quality Control/Assurance, and Research Misconduct) were identified by the work group, and the content was mapped to competency domains defined by the ECRPTQ project as well as International Council for Harmonisation (ICH) GCP principles. Several investigators and study coordinators were identified as content experts for each module topic. Working with an instructional designer, these experts defined learning objectives and outlined content relevant to both study coordinators and investigators for inclusion in the modules. The curriculum was developed using Articulate Storyline, which is SCORM 1.2 compliant, making the course usable by the widest audience. The course was designed to be administered on laptop or desktop computers and is accessible to individuals with hearing or vision impairments. To maximize learning, instructional designers used creative treatments including narration to guide learners or offer tips; short video scenarios to introduce topics; interactive activities, such as drag-and-drop games and “click to learn more information”; knowledge checks with feedback; resources, including downloadable job aids; end-of-module quizzes; and documentation of course completion. The full curriculum takes 2–4 hours to complete, with individual modules taking about 30 minutes each. RESULTS/ANTICIPATED RESULTS: Pilot testing to evaluate the effectiveness of the eLearning course is underway at 5 sites: University of Michigan, Boston University, University of Rochester, University of Florida, and SUNY Buffalo. DISCUSSION/SIGNIFICANCE OF IMPACT: This eLearning course provides relevant, comprehensive GCP training specifically for social and behavioral researchers. Unlike existing GCP training that is geared towards drug and device researchers, this course includes scenarios and examples that are relevant to social and behavioral researchers. The engaging, interactive nature of this course is designed to improve learning and retention, resulting in improved job performance. In addition, the modules are designed for both investigators and clinical research coordinators, thus eliminating the need for different training modules for different study team members.
Ketamine has recently become an agent of interest as an acute treatment for severe depression and as the anaesthetic for electroconvulsive therapy (ECT). Subanaesthetic doses result in an acute reduction in depression severity, while evidence for this antidepressant effect at anaesthetic or adjuvant doses is equivocal. Recent systematic reviews call for high-quality evidence from further randomised controlled trials (RCTs).
Aims
To establish if ketamine as the anaesthetic for ECT results in fewer ECT treatments, improvements in depression severity ratings and less memory impairment than the standard anaesthetic.
Method
Double-blind, parallel-design RCT of intravenous ketamine (up to 2 mg/kg) with an active comparator, intravenous propofol (up to 2.5 mg/kg), as the anaesthetic for ECT in patients receiving ECT for major depression on an informal basis. (Trial registration: European Clinical Trials Database (EudraCT): 2011-000396-14; ClinicalTrials.gov: NCT01306760.)
Results
No significant differences were found on any outcome measure during, at the end of or 1 month following the ECT course.
Conclusions
Ketamine as an anaesthetic does not enhance the efficacy of ECT.
Individual placement and support (IPS) has been repeatedly demonstrated to be the most effective form of mental health vocational rehabilitation. Its no-discharge policy and fixed caseloads, however, make it expensive to provide.
Aims
To test whether introducing a time limit for IPS would significantly alter its clinical effectiveness and consequently its potential cost-effectiveness.
Method
Referrals to an IPS service were randomly allocated to either standard IPS or time-limited IPS (IPS-LITE). IPS-LITE participants were referred back to their mental health teams if still unemployed at 9 months or after 4 months of employment support. The primary outcome at 18 months was having worked for at least 1 day. Secondary outcomes comprised other vocational measures plus clinical and social functioning. The differential rates of discharge were used to calculate a notional increased capacity and to model potential rates and costs of employment.
Results
A total of 123 patients were randomised, and data were collected on 120 patients at 18 months. The two groups (IPS-LITE = 62 and IPS = 61) were well matched at baseline. Rates of employment were equal at 18 months (IPS-LITE = 24 (41%) and IPS = 27 (46%)), by which time 57 (97%) had been discharged from the IPS-LITE service and 16 (28%) from IPS. Only 11 patients (4 IPS-LITE and 7 IPS) obtained their first employment after 9 months. There were no significant differences in any other outcomes. IPS-LITE discharges generated a potential capacity increase of 46.5%, compared with 12.7% in IPS, which would translate into 35.8 returns to work in IPS-LITE compared with 30.6 in IPS over an 18-month period if the rates remained constant.
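The capacity modelling can be reproduced approximately from the rounded figures above; the small discrepancies with the published 35.8 and 30.6 reflect rounding in the reported percentages.

```python
# Back-of-envelope reconstruction of the capacity model from rounded inputs.
employed_lite, employed_ips = 24, 27                   # employed at 18 months
capacity_gain_lite, capacity_gain_ips = 0.465, 0.127   # from discharge rates

# If released places are refilled and employment rates stay constant:
returns_lite = employed_lite * (1 + capacity_gain_lite)
returns_ips = employed_ips * (1 + capacity_gain_ips)
print(f"IPS-LITE ~{returns_lite:.1f} vs IPS ~{returns_ips:.1f} returns to work")
# -> ~35.2 vs ~30.4 (reported: 35.8 and 30.6)
```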
Conclusions
IPS-LITE is as effective as IPS, and only minimal extra employment is gained by persisting beyond 9 months. If the released capacity is utilised with similar outcomes, IPS-LITE would increase the numbers gaining employment within 18 months by 17% compared with IPS, and this difference would grow with prolonged follow-up. IPS-LITE may be more cost-effective and should be actively considered as an alternative within public services.
Silver Lake is the modern terminal playa of the Mojave River in southern California (USA). As a result, it is well located to record influences both from the winter-precipitation-dominated San Bernardino Mountains, the source of the Mojave River, and from the late summer to early fall North American monsoon. Here, we present physical, chemical, and biological data from a new radiocarbon-dated, 8.2 m sediment core taken from Silver Lake that spans the period from 14.8 cal ka BP to the present. Texturally, the core varies between sandy clay, clayey sand, and sand-silt-clay, often with abrupt sedimentological transitions. These grain-size changes are used to divide the core into six lake-status intervals over the past 14.8 cal ka BP. Notable intervals include a dry Younger Dryas chronozone, a wet early Holocene terminating 7.8–7.4 cal ka BP, a distinct mid-Holocene arid interval, and a late Holocene return to ephemeral lake conditions. A comparison to potential climatic forcings implicates a combination of changing summer and winter insolation and tropical and North Pacific sea-surface temperature dynamics as the primary drivers of Holocene climate in the central Mojave Desert.
Hospital Ebola preparation is underway in the United States and other countries; however, the best approach and resources involved are unknown.
OBJECTIVE
To examine costs and challenges associated with hospital Ebola preparation by means of a survey of Society for Healthcare Epidemiology of America (SHEA) members.
DESIGN
Electronic survey of infection prevention experts.
RESULTS
A total of 257 members completed the survey (221 US, 36 international), representing institutions in 41 US states, the District of Columbia, and 18 countries. The 221 US respondents represented 158 (43.1%) of 367 major medical centers that have SHEA members and included 21 (60%) of 35 institutions recently designated by the US Centers for Disease Control and Prevention as Ebola virus disease treatment centers. From October 13 through October 19, 2014, Ebola preparations consumed 80% of hospital epidemiology time, and only 30% of routine infection prevention activities were completed. Routine care was delayed in 27% of hospitals evaluating patients for Ebola.
LIMITATIONS
Convenience sample of SHEA members with a moderate response rate.
CONCLUSIONS
Hospital Ebola preparations required extraordinary resources, which were diverted from routine infection prevention activities. Patients being evaluated for Ebola faced delays and potential limitations in management of other diseases that are more common in travelers returning from West Africa.