The current study examined the types of adjustments used to support students with special educational needs in mainstream classrooms and how schools monitored the effectiveness of the adjustments they used. A range of stakeholders were interviewed in 22 mainstream schools across New South Wales, Australia, and the interviews were analysed for key themes. Some schools had a narrow focus on a few key areas, with teaching assistants being the most commonly reported adjustment. Few schools used formal formative monitoring to evaluate the effectiveness of adjustments. Options for improvement that schools could consider include examining the breadth of adjustments, establishing clear measurable goals, considering alternative strategies for the use of teaching assistants, and ensuring adjustments are monitored.
The thickness of a supraglacial debris layer is critical to the magnitude and time frame of glacier melt. Field-based, short pulse, ground-penetrating radar (GPR) has successfully measured debris thickness during a glacier's melt season, when there is a strong return from the ice–debris interface, but profiling with GPR in the absence of a highly reflective ice interface has not been explored. We investigated the performance of 960 MHz signals over 2 km of transects on Changri Nup Glacier, Nepal, during the post-monsoon. We also performed laboratory experiments to interpret the field data and investigate electromagnetic wave propagation into dry rocky debris. Laboratory tests confirmed wave penetration into the glacier ice and suggest that the ice–debris interface return was missing in field data because of a weak dielectric contrast between solid ice and porous dry debris. We developed a new method to estimate debris thicknesses by applying a statistical approach to volumetric backscatter, and our backscatter-based thickness retrievals gave reasonable agreement with debris depths measured manually in the field (10–40 cm). We conclude that, when melt season profiling is not an option, a remote system near 1 GHz could allow dry debris thickness to be estimated based on volumetric backscatter.
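For context, the conventional melt-season GPR approach (the one that does require a clear interface return, unlike the backscatter method described above) converts two-way travel time to depth using the wave speed in the debris. A minimal sketch, with hypothetical permittivity and travel-time values:

```python
# Standard GPR depth conversion: depth = c * t / (2 * sqrt(eps_r)).
# Applicable only when a clear ice-debris interface return exists;
# values below are hypothetical, not from the study.

C = 3.0e8  # speed of light in vacuum, m/s

def debris_depth(two_way_time_ns: float, eps_r: float) -> float:
    """Debris depth (m) from two-way travel time (ns) and relative permittivity."""
    v = C / eps_r ** 0.5               # wave speed in the debris layer, m/s
    return v * (two_way_time_ns * 1e-9) / 2.0

# Example: 5 ns two-way time in dry rocky debris with eps_r ~ 4
# gives v = 1.5e8 m/s and a depth of 0.375 m.
depth_m = debris_depth(5.0, 4.0)
```

The weak dielectric contrast reported in the abstract means this interface pick is unavailable in dry post-monsoon conditions, which is precisely why the authors turned to volumetric backscatter statistics instead.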
Microstructures, including crystallographic fabric, within the margin of streaming ice can exert strong control on flow dynamics. To characterize a natural setting, we retrieved three cores, two of which reached the bed, from the flank of Jarvis Glacier, eastern Alaska Range, Alaska. The core sites lie ~1 km downstream of the source, with abundant water present in the extracted cores and at the base of the glacier. All cores exhibit dipping layers, a combination of debris bands and bubble-free domains. Grain sizes coarsen on average approaching the lateral margin. Crystallographic orientations are more clustered, with c-axes closer to horizontal, nearer the lateral margin. The measured fabric is sufficiently weak to induce little mechanical anisotropy, but the data suggest that despite the challenging conditions of warm ice, abundant water and a short flow distance, many aspects of the microstructure, including measurable crystallographic fabric, evolved in systematic ways.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for primary care depression RCTs that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
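The gain in explained variance reported above is the difference in R² between a model using baseline severity alone and one adding the four prognostic factors. A minimal sketch on simulated data (the variable names and effect sizes are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Simulated illustration (not the CIS-R trial data): baseline severity plus
# four extra prognostic factors (e.g. durations, comorbid panic, prior
# antidepressant treatment), each contributing to the 3-4 month outcome.
severity = rng.normal(size=n)
extra = rng.normal(size=(n, 4))
outcome = 0.4 * severity + extra @ np.array([0.2, 0.2, 0.15, 0.15]) + rng.normal(size=n)

def r_squared(X, y):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_severity = r_squared(severity[:, None], outcome)
r2_full = r_squared(np.column_stack([severity[:, None], extra]), outcome)
# r2_full exceeds r2_severity, mirroring the reported 16% -> 27% improvement
```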
When adults seek treatment for depression, clinicians should routinely assess the duration of anxiety, duration of depression, comorbid panic disorder, and history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Subthreshold post-traumatic stress disorder (PTSD) is more prevalent than PTSD, yet its role as a potential risk factor for PTSD is unknown. To address this gap, we analysed data from a 7-year, prospective national cohort of USA veterans. Of veterans with subthreshold PTSD at wave 1, 34.3% developed PTSD compared with 7.6% of trauma-exposed veterans without subthreshold PTSD (relative risk ratio 6.4). Among veterans with subthreshold PTSD, specific PTSD symptoms, greater age, cognitive difficulties, lower dispositional optimism and new-onset traumas predicted incident PTSD. Results suggest that preventive interventions targeting subthreshold PTSD and associated factors may help mitigate risk for PTSD in USA veterans.
The National Neuropsychology Network (NNN) is a multicenter clinical research initiative funded by the National Institute of Mental Health (NIMH; R01 MH118514) to facilitate neuropsychology’s transition to contemporary psychometric assessment methods with resultant improvement in test validation and assessment efficiency.
The NNN includes four clinical research sites (Emory University; Medical College of Wisconsin; University of California, Los Angeles (UCLA); University of Florida) and Pearson Clinical Assessment. Pearson Q-interactive (Q-i) is used for data capture for Pearson published tests; web-based data capture tools programmed by UCLA, which serves as the Coordinating Center, are employed for remaining measures.
NNN is acquiring item-level data from 500–10,000 patients across 47 widely used Neuropsychology (NP) tests and sharing these data via the NIMH Data Archive. Modern psychometric methods (e.g., item response theory) will specify the constructs measured by different tests and determine their positive/negative predictive power regarding diagnostic outcomes and relationships to other clinical, historical, and demographic factors. The Structured History Protocol for NP (SHiP-NP) helps standardize acquisition of relevant history and self-report data.
NNN is a proof-of-principle collaboration: by addressing logistical challenges, NNN aims to engage other clinics to create a national and ultimately an international network. The mature NNN will provide mechanisms for data aggregation enabling shared analysis and collaborative research. NNN promises ultimately to enable robust diagnostic inferences about neuropsychological test patterns and to promote the validation of novel adaptive assessment strategies that will be more efficient, more precise, and more sensitive to clinical contexts and individual/cultural differences.
Mass asymptomatic SARS-CoV-2 nucleic acid amplified testing of healthcare personnel (HCP) was performed at a large tertiary health system. A low period-prevalence of positive HCP was observed. Of those who tested positive, half had mild symptoms in retrospect. HCP with even mild symptoms should be isolated and tested.
Prenatal choline is a key nutrient, like folic acid and vitamin D, for fetal brain development and subsequent mental function. We sought to determine whether effects of higher maternal plasma choline concentrations on childhood attention and social problems, found in an initial clinical trial of choline supplementation, are observed in a second cohort.
Of 183 mothers enrolled from an urban safety net hospital clinic, 162 complied with gestational assessments and brought their newborns for study at 1 month of age; 83 continued assessments through 4 years of age. Effects of maternal plasma choline concentrations ⩾7.07 μM at 16 weeks of gestation, 1 s.d. below the mean level obtained with supplementation in the previous trial, were compared to effects of lower levels. The Attention Problems and Withdrawn Syndrome scales of the Child Behavior Checklist 1½–5 were the principal outcomes.
Higher maternal plasma choline was associated with lower mean Attention Problems percentiles in children, and for male children, with lower Withdrawn percentiles. Higher plasma choline concentrations also reduced Attention Problems percentiles for children of mothers who used cannabis during gestation as well as children of mothers who had gestational infection.
Prenatal choline's positive associations with early childhood behaviors are found in a second, more diverse cohort. Increases in attention problems and social withdrawal in early childhood are associated with later mental illnesses including attention deficit disorder and schizophrenia. Choline concentrations in the pregnant women in this study replicate other research findings suggesting that most pregnant women do not have adequate choline in their diets.
Demand for building competencies in implementation research (IR) outstrips supply of training programs, calling for a paradigm shift. We used a bootstrap approach to leverage external resources and create IR capacity through a novel 2-day training for faculty scientists across the four Texas Clinical & Translational Science Awards (CTSAs). The Workshop combined internal and external expertise, targeted nationally established IR competencies, incorporated new National Institutes of Health/National Cancer Institute OpenAccess online resources, employed well-known adult education principles, and measured impact. CTSA leader buy-in was reflected in financial support. Evaluation showed increased self-reported IR competency; statewide initiatives expanded. The project demonstrated that, even with limited onsite expertise, it was possible to bootstrap resources and build IR capacity de novo in the CTSA community.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of Earth's sensitivity to carbon dioxide, and finds that permafrost thaw could release more carbon emissions than expected and that the uptake of carbon in tropical ecosystems is weakening. Adverse impacts on human society include increasing water shortages and impacts on mental health. Options for solutions emerge from rethinking economic models, rights-based litigation, strengthened governance systems and a new social contract. The disruption caused by COVID-19 could be seized as an opportunity for positive change, directing economic stimulus towards sustainable investments.
A synthesis is made of ten fields within climate science where there have been significant advances since mid-2019, through an expert elicitation process with broad disciplinary scope. Findings include: (1) a better understanding of equilibrium climate sensitivity; (2) abrupt thaw as an accelerator of carbon release from permafrost; (3) changes to global and regional land carbon sinks; (4) impacts of climate change on water crises, including equity perspectives; (5) adverse effects on mental health from climate change; (6) immediate effects on climate of the COVID-19 pandemic and requirements for recovery packages to deliver on the Paris Agreement; (7) suggested long-term changes to governance and a social contract to address climate change, learning from the current pandemic; (8) updated positive cost–benefit ratio and new perspectives on the potential for green growth in the short- and long-term perspective; (9) urban electrification as a strategy to move towards low-carbon energy systems; and (10) rights-based litigation as an increasingly important method to address climate change, with recent clarifications on the legal standing and representation of future generations.
Social media summary
Stronger permafrost thaw, COVID-19 effects and growing mental health impacts among highlights of latest climate science.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Frailty is a common clinical syndrome in older adults that carries an increased risk for poor health outcomes. Early identification of frailty may help optimize quality of care. Fried's frailty criteria are often used as the gold standard of frailty. However, measuring these criteria in daily practice is time-consuming and requires a hand grip strength meter. Screening instruments for frailty, such as the Groningen Frailty Indicator (GFI) and the Tilburg Frailty Indicator (TFI), are available. However, it is not yet certain whether the usual cut-off values are applicable to older psychiatric patients.
To determine the internal consistency, sensitivity, specificity and area under the curve (AUC) of the receiver operating characteristic curve (ROC curve) of the GFI and TFI using validated cut-off values, and to determine the optimal cut-off value in older psychiatric patients.
Baseline data of an ongoing prospective cohort study were used. In this study, the GFI, TFI and Fried criteria were determined in hospitalized and non-hospitalized psychiatric patients over 65 years of age.
A total of 145 participants were enrolled, of whom 90 were hospitalized and 55 were non-hospitalized. Median age was 75.2 (SD = 7) years, and 108 participants were female. Prevalence of frailty according to the Fried criteria was 29.7%. Internal consistency (Cronbach's alpha) was 0.76 for the GFI and 0.75 for the TFI. Using the validated cut-off values and the Fried criteria as reference, sensitivity of the GFI (≥4) was 0.95 (95% CI 0.83–0.99) and specificity 0.27 (95% CI 0.19–0.37). Sensitivity of the TFI (≥5) was 0.98 (95% CI 0.86–1.00) and specificity 0.31 (95% CI 0.23–0.41). The optimal cut-off value for both the GFI and the TFI was ≥8. The AUC of the ROC curve was 0.82 (95% CI 0.75–0.90) for the GFI and 0.79 (95% CI 0.72–0.87) for the TFI.
We found an acceptable internal consistency and AUC for both the GFI and the TFI in older psychiatric patients. Raising the cut-off values of both the GFI and TFI seems necessary to reduce the number of false positives in this population.
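The trade-off driving the cut-off recommendation above (high sensitivity but poor specificity at the validated cut-offs) is a direct consequence of how sensitivity, specificity and AUC are computed. A minimal sketch with hypothetical scores, not the study's GFI/TFI measurements:

```python
# Illustrative sensitivity/specificity/AUC computation for a frailty
# screening score against a reference standard (hypothetical data).

def sens_spec(y_true, score, cutoff):
    """Sensitivity and specificity of the rule `score >= cutoff` vs y_true (0/1)."""
    tp = sum(1 for t, s in zip(y_true, score) if t and s >= cutoff)
    fn = sum(1 for t, s in zip(y_true, score) if t and s < cutoff)
    tn = sum(1 for t, s in zip(y_true, score) if not t and s < cutoff)
    fp = sum(1 for t, s in zip(y_true, score) if not t and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def auc(y_true, score):
    """AUC as the probability that a frail case outscores a non-frail one."""
    pos = [s for t, s in zip(y_true, score) if t]
    neg = [s for t, s in zip(y_true, score) if not t]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

frail = [1, 1, 1, 0, 0, 0, 0]   # reference standard (e.g. Fried criteria), made up
gfi   = [9, 8, 5, 7, 4, 3, 2]   # hypothetical screening scores

sens_lo, spec_lo = sens_spec(frail, gfi, 4)   # low cutoff: fully sensitive, less specific
sens_hi, spec_hi = sens_spec(frail, gfi, 8)   # raising the cutoff trades the other way
```

Raising the cut-off converts false positives into true negatives at the cost of some sensitivity, which is exactly the adjustment the abstract recommends for this population.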
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NHs and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member-quarter), hospital bed days per member-quarter, and expenditures per member-quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident.
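The per-quarter percent reductions above come from a segmented (interrupted time-series) regression: the slope-change coefficient on a log scale, exponentiated, gives the quarterly percent change after the intervention. A minimal sketch on simulated quarterly rates (the intervention quarter index and all values are assumptions, not the SHIELD OC Medicaid data):

```python
import numpy as np

# Simulated quarterly hospitalization rates, Q4 2015 .. Q2 2019 (15 quarters),
# with an assumed intervention start at quarter index 6 (Q2 2017).
rng = np.random.default_rng(1)
quarters = np.arange(15)
post = (quarters >= 6).astype(float)
time_since = np.where(post > 0, quarters - 6, 0.0)

# Build rates with a true 4% per-quarter post-intervention decline plus noise.
log_rate = -2.0 + np.log(0.96) * time_since
rates = np.exp(log_rate + rng.normal(0, 0.005, quarters.size))

# Segmented regression design: intercept, baseline trend, level change, slope change.
X = np.column_stack([np.ones(quarters.size), quarters, post, time_since])
beta, *_ = np.linalg.lstsq(X, np.log(rates), rcond=None)

# Exponentiate the slope-change coefficient to report percent per quarter.
slope_change_pct = (np.exp(beta[3]) - 1) * 100   # recovers roughly -4% per quarter
```

A published analysis would also report segmented-regression standard errors (and often adjust for autocorrelation); the sketch only shows how the point estimate maps to a "percent per quarter" figure.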
Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
Background: The Hospital-Acquired Condition Reduction Program (HACRP) is a pay-for-performance Medicare program that promotes reducing patient harm, particularly healthcare-associated infections (HAIs). We examined the association between infection-control–related activities and the number of penalties a hospital received between fiscal years 2015 and 2018. Methods: We used logistic regression with ordered categories to assess infection control resource use and the number of penalties, an ordered categorical dependent variable with 5 categories ranging from 0 to 4, as of 2018. Data sources included the National Healthcare Safety Network, American Hospital Association Annual Survey, Medicare Impact and Cost Report files, and Data.Medicare.gov. We excluded hospitals lacking data to calculate any HACRP score or component score for HAI and hospitals missing observations for model variables (301 hospitals). We assessed the following model variables: teaching hospital status, infection preventionists (IP) per 1,000 beds, surveillance hours per week per bed, other infection control activities per week per bed, nurse-to-bed ratio, housekeeping expenditure per 10,000 beds, nursing position vacancies per bed, bed size, electronic health record (EHR) implementation, number of skilled nursing beds, rural or urban location, and Medicare patient case-mix (cmi_quartiles). Results: The final data set consisted of 3,004 US hospitals. In our model, negative logit point estimates indicated that increased values of a variable were associated with lower odds of having a higher number of penalties. Lower penalties were significantly associated with a higher IP-to-bed ratio. Although the point estimates were <1, an association between lower penalties and higher nurse-to-bed ratios or electronic health records was not demonstrated (Table 1).
Conclusions: Our results suggest that, after controlling for selected hospital structural factors, incremental resources related to infection control have a protective association with HACRP penalties.
On Hawai‘i Island, an increase in human neuroangiostrongyliasis cases has been primarily associated with the accidental ingestion of Angiostrongylus cantonensis L3 in snails or slugs, or potentially, from larvae left behind in the slug's slime or feces. We evaluated more than 40 different treatments in vitro for their ability to kill A. cantonensis larvae with the goal of identifying a safe and effective fruit and vegetable wash in order to reduce the risk of exposure. Our evaluation of treatment lethality was carried out in two phases; initially using motility as an indicator of larval survival after treatment, followed by the development and application of a propidium iodide staining assay to document larval mortality. Treatments tested included common household products, consumer vegetable washes and agricultural crop washes. We found minimal larvicidal efficacy among consumer-grade fruit and vegetable washes, botanical extracts such as those from ginger or garlic, and acid solutions such as vinegar. Alkaline solutions, on the other hand, as well as oxidizers such as bleach and chlorine dioxide, did show larvicidal potential. Surfactants, detergent ingredients that lower surface tension, had variable results, but dodecylbenzene sulfonic acid as a 70% w/w solution in 2-propanol was very effective, both in terms of the speed and the thoroughness with which it killed A. cantonensis L3 nematodes. Thus, our results suggest promising directions for future investigation.
The emphasis on team science in clinical and translational research increases the importance of collaborative biostatisticians (CBs) in healthcare. Adequate training and development of CBs ensure appropriate conduct of robust and meaningful research and, therefore, should be considered as a high-priority focus for biostatistics groups. Comprehensive training enhances clinical and translational research by facilitating more productive and efficient collaborations. While many graduate programs in Biostatistics and Epidemiology include training in research collaboration, it is often limited in scope and duration. Therefore, additional training is often required once a CB is hired into a full-time position. This article presents a comprehensive CB training strategy that can be adapted to any collaborative biostatistics group. This strategy follows a roadmap of the biostatistics collaboration process, which is also presented. A TIE approach (Teach the necessary skills, monitor the Implementation of these skills, and Evaluate the proficiency of these skills) was developed to support the adoption of key principles. The training strategy also incorporates a “train the trainer” approach to enable CBs who have successfully completed training to train new staff or faculty.