This review systematically explores the current available evidence on the effectiveness of interventions provided to first responders to prevent and/or treat the mental health effects of responding to a disaster.
A systematic review of Medline, Scopus, PsycINFO, and gray literature was conducted. Studies describing the effectiveness of interventions provided to first responders to prevent and/or treat the mental health effects of responding to a disaster were included. Quality was assessed using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) criteria, and the Critical Appraisal Skills Programme (CASP) checklist.
A total of 3869 manuscripts met the initial search criteria; 25 studies met the criteria for in-depth analysis, including 22 quantitative and 3 qualitative studies; 6 were performed in low- and middle-income countries (LMICs). Eighteen studies evaluated a psychological intervention; of these, 13 found a positive impact, 4 found no impact, and 1 demonstrated worsened symptoms after the intervention. Pre-event training decreased psychiatric symptoms in each of the 3 studies evaluating its effectiveness.
This review demonstrates that there are likely effective interventions to both prevent and treat psychiatric symptoms in first responders in high-, middle-, and low-income countries.
Inattentive respondents introduce noise into data sets, weakening correlations between items and increasing the likelihood of null findings. “Screeners” have been proposed as a way to identify inattentive respondents, but questions remain regarding their implementation. First, what is the optimal number of Screeners for identifying inattentive respondents? Second, what types of Screener questions best capture inattention? In this paper, we address both of these questions. Using item-response theory to aggregate individual Screeners, we find that four Screeners are sufficient to identify inattentive respondents. Moreover, two grid and two multiple-choice questions work well. Our findings have relevance for applied survey research in political science and other disciplines. Most importantly, our recommendations enable the standardization of Screeners on future surveys.
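The intuition behind aggregating multiple Screeners can be sketched in a few lines. This is a hypothetical illustration: the paper aggregates Screeners with item-response theory, whereas the sketch below uses a much simpler pass-count rule, and the respondent data are invented.

```python
# Minimal sketch: flag inattentive respondents by counting failed Screeners.
# (A hypothetical count-based rule, not the authors' IRT aggregation.)

def flag_inattentive(screener_results, min_passed=3):
    """screener_results: list of booleans, True = Screener passed.

    With four Screeners, requiring at least `min_passed` passes retains
    respondents with a single lapse while flagging chronic inattention.
    """
    return sum(screener_results) < min_passed

# Invented respondents for illustration.
respondents = {
    "r1": [True, True, True, True],     # fully attentive
    "r2": [True, False, True, True],    # one lapse: retained
    "r3": [False, False, True, False],  # flagged as inattentive
}
flags = {rid: flag_inattentive(res) for rid, res in respondents.items()}
```

In practice an IRT model weights each Screener by its difficulty and discrimination rather than treating all passes equally; the count rule above is only the degenerate equal-weight case.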
Major depressive disorder and neuroticism (Neu) share a large genetic basis. We sought to determine whether this shared basis could be decomposed to identify genetic factors that are specific to depression.
We analysed summary statistics from genome-wide association studies (GWAS) of depression (from the Psychiatric Genomics Consortium, 23andMe and UK Biobank) and compared them with GWAS of Neu (from UK Biobank). First, we used a pairwise GWAS analysis to classify variants as associated with only depression, with only Neu or with both. Second, we estimated partial genetic correlations to test whether depression's genetic link with other phenotypes was explained by shared overlap with Neu.
We found evidence that most genomic regions (25/37) associated with depression are likely to be shared with Neu. The overlapping common genetic variance of depression and Neu was genetically correlated primarily with psychiatric disorders. We found that the genetic contributions to depression that were not shared with Neu were positively correlated with metabolic phenotypes and cardiovascular disease, and negatively correlated with the personality trait conscientiousness. After removing the shared genetic overlap with Neu, depression still had a specific association with schizophrenia, bipolar disorder, coronary artery disease and age of first birth. Independent of depression, Neu had specific genetic correlates in ulcerative colitis, pubertal growth, anorexia and education.
Our findings demonstrate that, while genetic risk factors for depression are largely shared with Neu, there are also non-Neu-related features of depression that may be useful for further patient or phenotypic stratification.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and to more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, thereby affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
To prevent environmental transmission of pathogens, hospital rooms housing patients on transmission-based precautions are cleaned extensively and disinfected with ultraviolet (UV) light. To do so consistently requires time and coordination, and these procedures must avoid patient flow delays and associated safety risks. We sought to improve room turnover efficiency to allow for UV disinfection.
A 60-day quality improvement and implementation project.
A quaternary academic pediatric referral facility.
A multidisciplinary healthcare team participated in a 60-day before-and-after trial that followed the Toyota Production System Lean methodology. We used value-stream mapping and manual time studies to identify areas for improvement. Areas addressed included room breakdown, room cleaning, and wait time between cleaning and disinfection. Room turnover was measured as the time in minutes from a discharged patient exiting an isolation room to UV disinfection completion. Impact was measured using postintervention manual time studies.
Median room turnover decreased from 130 minutes (range, 93–294 minutes) to 65 minutes (range, 48–95 minutes; P < .0001). Other outcomes included decreased median time between room breakdown to cleaning start time (from 10 to 3 minutes; P = .004), room cleaning complete to UV disinfection start (from 36 to 8 minutes; P < .0001), and the duration of room cleaning and curtain changing (from 57 to 37 minutes; P < .0001).
We decreased room turnover time by half in 60 days by decreasing times between and during routine tasks. Utilizing Lean methodology and manual time study can help teams understand and improve hospital processes and systems.
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) queries about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed equivalency of total score correlations and the diagnostic accuracy to detect major depression of the PHQ-8 and PHQ-9.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
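The cutoff-based accuracy comparison above can be illustrated with a small sketch. The scores and reference labels below are invented; the paper's estimates come from a bivariate random-effects meta-analysis of individual patient data, not from this naive single-sample computation.

```python
# Sketch: sensitivity and specificity of a questionnaire cutoff against a
# diagnostic-interview reference standard (hypothetical data).

def diagnostic_accuracy(scores, has_depression, cutoff=10):
    """Classify score >= cutoff as screen-positive and compare with reference."""
    tp = sum(s >= cutoff and d for s, d in zip(scores, has_depression))
    fn = sum(s < cutoff and d for s, d in zip(scores, has_depression))
    tn = sum(s < cutoff and not d for s, d in zip(scores, has_depression))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, has_depression))
    sensitivity = tp / (tp + fn)  # proportion of true cases detected
    specificity = tn / (tn + fp)  # proportion of non-cases correctly excluded
    return sensitivity, specificity

scores = [4, 12, 9, 15, 11, 3, 10, 7]                              # PHQ totals
ref    = [False, True, False, True, True, False, False, False]     # interview
sens, spec = diagnostic_accuracy(scores, ref, cutoff=10)
```

Lowering the cutoff raises sensitivity at the cost of specificity, which is why the paper reports accuracy across the full range of cutoffs rather than at 10 alone.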
The initial classic Fontan utilising a direct right atrial appendage to pulmonary artery anastomosis led to numerous complications. Adults with such complications may benefit from conversion to a total cavo-pulmonary connection, the current standard palliation for children with univentricular hearts.
A single institution, retrospective chart review was conducted for all Fontan conversion procedures performed from July, 1999 through January, 2017. Variables analysed included age, sex, reason for Fontan conversion, age at Fontan conversion, and early mortality or heart transplant within 1 year after Fontan conversion.
A total of 41 Fontan conversion patients were identified. Average age at Fontan conversion was 24.5 ± 9.2 years. Dominant left ventricular physiology was present in 37/41 (90.2%) patients. Right-sided heart failure occurred in 39/41 (95.1%) patients and right atrial dilation was present in 33/41 (80.5%) patients. The most common causes for Fontan conversion included atrial arrhythmia in 37/41 (90.2%), NYHA class II HF or greater in 31/41 (75.6%), ventricular dysfunction in 23/41 (56.1%), and cirrhosis or fibrosis in 7/41 (17.1%) patients. Median post-surgical follow-up was 6.2 ± 4.9 years. Survival rates at 30 days, 1 year, and greater than 1-year post-Fontan conversion were 95.1, 92.7, and 87.8%, respectively. Two patients underwent heart transplant: the first within 1 year of Fontan conversion for heart failure and the second at 5.3 years for liver failure.
Fontan conversion should be considered early when atrial arrhythmias become common rather than waiting for severe heart failure to ensue, and Fontan conversion can be accomplished with an acceptable risk profile.
Background: Hereditary transthyretin-mediated (hATTR) amyloidosis is a hereditary, multi-systemic and life-threatening disease resulting in neuropathy and cardiomyopathy. In the APOLLO study, patisiran, an investigational RNAi therapeutic targeting hepatic TTR production, resulted in significant improvement in neuropathy and QoL compared with placebo and was generally well tolerated. Methods: APOLLO, a Phase 3 study of patisiran vs. placebo (NCT01960348), prespecified a cardiac subpopulation (n = 126 of 225 total) that included patients with baseline left ventricular (LV) wall thickness ≥ 13 mm and no medical history of aortic valve disease or hypertension. Cardiac measures included structure and function by echocardiography, changes in NT-proBNP, and 10-MWT gait speed. Results: At 18 months, patisiran treatment resulted in a mean reduction in LV wall thickness of 1 mm (p = 0.017) compared with baseline, which was associated with significant improvements relative to placebo in LV end-diastolic volume (+8.31 mL, p = 0.036), global longitudinal strain (−1.37%, p = 0.015) and NT-proBNP (55% reduction, p = 7.7 × 10⁻⁸) (Figure 1). Gait speed was also improved relative to placebo (+0.35 m/sec, p = 7.4 × 10⁻⁹). The rate of death or hospitalization was lower with patisiran. mNIS+7 results in the cardiac subpopulation will also be presented. Conclusions: These data suggest patisiran has the potential to halt or reverse the cardiac manifestations of hATTR amyloidosis.
Fast ice flow is associated with the deformation of subglacial sediment. Seismic shear velocities, Vs, increase with the rigidity of material and hence can be used to distinguish soft sediment from hard bedrock substrates. Depth profiles of Vs can be obtained from inversions of Rayleigh wave dispersion curves, from passive or active sources, but these can be highly ambiguous and lack depth sensitivity. Our novel Bayesian transdimensional algorithm, MuLTI, circumvents these issues by adding independent depth constraints to the inversion, also allowing comprehensive uncertainty analysis. We apply MuLTI to the inversion of a Rayleigh wave dataset, acquired using active-source (Multichannel Analysis of Surface Waves) techniques, to characterise sediment distribution beneath the frontal margin of Midtdalsbreen, an outlet of Norway's Hardangerjøkulen ice cap. Ice thickness (0–20 m) is constrained using co-located GPR data. Outputs from MuLTI suggest that partly-frozen sediment (Vs 500–1000 m s−1), overlying bedrock (Vs 2000–2500 m s−1), is present in patches with a thickness of ~4 m, although this approaches the resolvable limit of our Rayleigh wave frequencies (14–100 Hz). Uncertainties immediately beneath the glacier bed are <280 m s−1, implying that MuLTI can not only distinguish bedrock and sediment substrates but does so with an accuracy sufficient for resolving variations in sediment properties.
The USA is currently enduring an opioid crisis. Identifying cost-effective, easy-to-implement behavioral measures that predict treatment outcomes in opioid misusers is a crucial scientific, therapeutic, and epidemiological goal.
The current study used a mixed cross-sectional and longitudinal design to test whether a behavioral choice task, previously validated in stimulant users, was associated with increased opioid misuse severity at baseline, and whether it predicted change in opioid misuse severity at follow-up. At baseline, data from 100 prescription opioid-treated chronic pain patients were analyzed; at follow-up, data were analyzed in 34 of these participants who were non-misusers at baseline. During the choice task, participants chose under probabilistic contingencies whether to view opioid-related images in comparison with affectively pleasant, unpleasant, and neutral images. Following previous procedures, we also assessed insight into choice behavior, operationalized as whether (yes/no) participants correctly self-reported the image category they chose most often.
At baseline, higher choice to view opioid images in direct comparison with pleasant images was associated with opioid misuse and with impaired insight into choice behavior; the combination of the two produced especially elevated opioid-related choice behavior. In longitudinal analyses of individuals who were initially non-misusers, higher baseline opioid v. pleasant choice behavior predicted more opioid misuse behaviors at follow-up.
These results indicate that greater relative allocation of behavior toward opioid stimuli and away from stimuli depicting natural reinforcement is associated with concurrent opioid misuse and portends vulnerability toward future misuse. The choice task may provide important medical information to guide opioid-prescribing practices.
Chemical weed control remains a widely used component of integrated weed management strategies because of its cost-effectiveness and rapid removal of crop pests. Additionally, dicamba-plus-glyphosate mixtures are a commonly recommended herbicide combination to combat herbicide resistance, specifically in recently commercially released dicamba-tolerant soybean and cotton. However, increased spray drift concerns and antagonistic interactions require that the application process be optimized to maximize biological efficacy while minimizing environmental contamination potential. Field research was conducted in 2016, 2017, and 2018 across three locations (Mississippi, Nebraska, and North Dakota) for a total of six site-years. The objectives were to characterize the efficacy of a range of droplet sizes [150 µm (Fine) to 900 µm (Ultra Coarse)] using a dicamba-plus-glyphosate mixture and to create novel weed management recommendations utilizing pulse-width modulation (PWM) sprayer technology. Results across pooled site-years indicated that a droplet size of 395 µm (Coarse) maximized weed mortality from a dicamba-plus-glyphosate mixture at 94 L ha–1. However, droplet size could be increased to 620 µm (Extremely Coarse) to maintain 90% of the maximum weed mortality while further mitigating particle drift potential. Although generalized droplet size recommendations could be created across site-years, optimum droplet sizes within each site-year varied considerably and may be dependent on weed species, geographic location, weather conditions, and herbicide resistance(s) present in the field. The precise, site-specific application of a dicamba-plus-glyphosate mixture using the results of this research will allow applicators to more effectively utilize PWM sprayers, reduce particle drift potential, maintain biological efficacy, and reduce the selection pressure for the evolution of herbicide-resistant weeds.
Music festivals are globally attended events that bring together performers and fans for a defined period of time. These festivals often have on-site medical care to help reduce the impact on local health care systems. Historically, the literature suggests that patient transfers off-site are frequently related to complications of substance use. However, there is a gap in understanding why patients are transferred to hospital when an on-site medical team, one that blends first aid services with a higher level of care (HLC) team, is present.
The purpose of this study is to better understand patterns of injuries and illnesses that necessitate transfer when physician-led HLC teams are accessible on-site.
This is a prospective, descriptive case series analyzing patient encounter documentation from four large-scale, North American, multi-day music festivals.
On-site medical teams that included HLC team members were present for the duration of each festival, so every team was able to “treat and release” when clinically appropriate. Over the course of the combined 34 event days, there were 10,406 patient encounters resulting in 156 individuals being transferred off-site for assessment, diagnostic testing, and/or treatment. A minority of patients seen were transferred off-site (1.5%). The patient presentation rate (PPR) was 16.5/1,000. The ambulance transfer rate (ATR) was 0.12/1,000 attendees, whereas the total transfer-to-hospital rate (TTHR), when factoring in non-ambulance transport, was 0.25/1,000. In contrast to existing literature on transfers from music festivals, the most common reason for transfer off-site was musculoskeletal (MSK) injuries (53.8%) that required imaging.
The presence of on-site HLC teams impacted the case mix of patients transferred to hospital, and may reduce the number of transfers for intoxication. Confounding preconceptions, patients in the present study were transferred largely for injuries that required specialized imaging and testing that could not be performed in an out-of-hospital setting. These results suggest that a better understanding of the specific effects on-site HLC teams have on avoiding off-site transfers will aid in improving planning for music festivals. The findings also identify areas for further improvement in on-site care, such as integrated on-site radiology, which could potentially further reduce the impact of music festivals on local health services. The role of non-emergency transport vehicles (NETVs) deserves further attention.
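The reported event-medicine rates (PPR, ATR, TTHR) are per-1,000-attendee ratios. The sketch below back-calculates an approximate attendance from the abstract's own figures, since total attendance is not reported there; the derived attendance is an assumption for illustration only.

```python
# Sketch: event-medicine rates per 1,000 attendees, using figures from the
# abstract (10,406 encounters, 156 transfers, PPR = 16.5/1,000).

def rate_per_1000(events, attendance):
    """Rate of events per 1,000 attendees."""
    return 1000 * events / attendance

# Attendance is not given in the abstract; back-calculate it from the PPR
# (assumption for illustration): attendance = encounters / PPR * 1000.
attendance = 10_406 / 16.5 * 1000   # ~630,000 attendees

ppr = rate_per_1000(10_406, attendance)   # patient presentation rate
tthr = rate_per_1000(156, attendance)     # total transfer-to-hospital rate
```

The recovered TTHR of roughly 0.25/1,000 matches the abstract, a quick consistency check on the reported denominators.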
Turris SA, Callaghan CW, Rabb H, Munn MB, Lund A. On the Way Out: An Analysis of Patient Transfers from Four Large-Scale North American Music Festivals Over Two Years. Prehosp Disaster Med. 2019;34(1):72–81.
To assess trends of mortality attributable to child and maternal undernutrition (CMU), overweight/obesity and dietary risks of non-communicable diseases (NCD) in sub-Saharan Africa (SSA) using data from the Global Burden of Disease (GBD) Study 2015.
For each risk factor, a systematic review of data was used to compute the exposure level and the effect size. A Bayesian hierarchical meta-regression analysis was used to estimate the exposure level of the risk factors by age, sex, geography and year. The burden of all-cause mortality attributable to CMU, fourteen dietary risk factors (eight diets, five nutrients and fibre intake) and overweight/obesity was estimated.
All age groups and both sexes.
In 2015, CMU, overweight/obesity and dietary risks of NCD accounted for 826 204 (95% uncertainty interval (UI) 737 346, 923 789), 266 768 (95% UI 189 051, 353 096) and 558 578 (95% UI 453 433, 680 197) deaths, respectively, representing 10·3% (95% UI 9·1, 11·6%), 3·3% (95% UI 2·4, 4·4%) and 7·0% (95% UI 5·8, 8·3%) of all-cause mortality. While the age-standardized proportion of all-cause mortality accounted for by CMU decreased by 55·2% between 1990 and 2015 in SSA, it increased by 63·3% and 17·2% for overweight/obesity and dietary risks of NCD, respectively.
The increasing burden of diet- and obesity-related diseases and the reduction of mortality attributable to CMU indicate that SSA is undergoing a rapid nutritional transition. To tackle the impact in SSA, interventions and international development agendas should also target dietary risks associated with NCD and overweight/obesity.
The South Caucasus occupies the divide between ancient Mesopotamia and prehistoric Europe, and was thus crucial in the development of Old World societies. Chronologies for the region, however, have lacked the definition achieved in surrounding areas. Concentrating on the Tsaghkahovit Plain of north-western Armenia, Project ArAGATS's multi-site radiocarbon dataset has now produced Bayesian modelling, which provides tight chronometric support for tracing the transmission of technology, population movement and social developments that shaped the Eurasian Bronze and Iron Ages.
A transannular patch is often used in the contemporary surgical repair of tetralogy of Fallot. This can lead to significant pulmonary insufficiency, increased right ventricular volumes, and ultimately pulmonary valve replacement. Cardiopulmonary exercise testing is used to assess exercise capacity in tetralogy of Fallot patients before pulmonary valve replacement. There is little published literature on how lung function affects functional capacity in tetralogy of Fallot patients repaired with a transannular patch.
A retrospective chart review was done from 2015 to 2017 on patients with tetralogy of Fallot who underwent maximal effort cardiopulmonary exercise testing with cycle ergometry and with concurrent pulmonary function testing. Tetralogy of Fallot patients repaired with a transannular patch without pulmonary valve replacement were compared with age, gender, and size-matched normal controls.
In the tetralogy of Fallot group, 24 out of 57 patients underwent primary repair with a transannular patch. When compared with the normal controls, they demonstrated abnormal predicted forced expiratory volume in one second (79 ± 23.1% versus 90.7 ± 14.1%, p < 0.05) and predicted maximal voluntary ventilation (74 ± 18% versus 90.5 ± 16.2%, p < 0.05), while having low-normal predicted forced vital capacity (80.5 ± 17.2% versus 90.2 ± 12.4%, p < 0.05) and a normal breathing reserve percentage (50.3 ± 11.3% versus 47.5 ± 17.3%, p = 0.52). Cardiopulmonary exercise testing abnormalities included significantly lower percent predicted oxygen consumption (63.2 ± 12.2% versus 87 ± 12.1%, p < 0.05), maximal heart rate (171.8 ± 18.9 versus 184.6 ± 13.6, p < 0.05), and percent predicted maximum workload (61.7 ± 15.9% versus 88.3 ± 21.5%, p < 0.05).
Tetralogy of Fallot patients repaired with a transannular patch can have abnormal pulmonary function testing with poor exercise capacity in addition to chronotropic incompetence and impaired muscular power.