In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.1
Brain imaging studies have shown altered amygdala activity during emotion processing in children and adolescents with oppositional defiant disorder (ODD) and conduct disorder (CD) compared to typically developing children and adolescents (TD). Here we aimed to assess whether aggression-related subtypes (reactive and proactive aggression) and callous-unemotional (CU) traits predicted variation in amygdala activity and skin conductance (SC) response during emotion processing.
We included 177 participants (n = 108 cases with disruptive behaviour and/or ODD/CD and n = 69 TD), aged 8–18 years, across nine sites in Europe, as part of the EU Aggressotype and MATRICS projects. All participants performed an emotional face-matching functional magnetic resonance imaging task.
Differences between cases and TD in affective processing, as well as the specificity of activation patterns for aggression subtypes and CU traits, were assessed. Simultaneous SC recordings were acquired in a subsample (n = 63). Cases showed higher amygdala activity than TD in response to negative faces (fearful and angry) v. shapes. Subtyping cases according to aggression-related subtypes did not significantly influence amygdala activity, while stratification based on CU traits was more sensitive and revealed decreased amygdala activity in the high-CU group. SC responses were significantly lower in cases and negatively correlated with CU traits and with reactive and proactive aggression.
Our results showed differences in amygdala activity and SC responses to emotional faces between cases with ODD/CD and TD, while CU traits moderated both central (amygdala) and peripheral (SC) responses. Our insights regarding subtypes and trait-specific aggression could be used for improved diagnostics and personalized treatment.
Opioid antagonists may mitigate medication-associated weight gain and/or metabolic dysregulation. ENLIGHTEN-2 evaluated a combination of olanzapine and the opioid antagonist samidorphan (OLZ/SAM) vs olanzapine for effects on weight gain and metabolic parameters over 24 weeks in adults with stable schizophrenia.
This phase 3, double-blind study (ClinicalTrials.gov: NCT02694328) enrolled adults aged 18–55 years with stable schizophrenia, randomized 1:1 to once-daily OLZ/SAM or olanzapine. Co-primary endpoints were percent change from baseline in body weight and proportion of patients with ≥10% weight gain at week 24. Waist circumference and fasting metabolic parameters were also measured. Completers could enter a 52-week open-label safety extension.
561 patients were randomized: 550 were dosed, 538 had ≥1 post-baseline weight assessment, and 352 (64%) completed; 10.9% discontinued due to AEs. At week 24, least squares mean (SE) percent weight change from baseline was 4.21 (0.68)% with OLZ/SAM and 6.59 (0.67)% with olanzapine (difference, −2.38 [0.76]%; P=0.003). Fewer patients treated with OLZ/SAM (17.8%) had ≥10% weight gain vs olanzapine (29.8%; odds ratio=0.50; P=0.003). The change from baseline in waist circumference was significantly smaller with OLZ/SAM (P<0.001). Common AEs (≥10%) with OLZ/SAM and olanzapine were weight increased (24.8%, 36.2%), somnolence (21.2%, 18.1%), dry mouth (12.8%, 8.0%), and increased appetite (10.9%, 12.3%), respectively. Metabolic parameter changes were generally small and remained stable with long-term OLZ/SAM treatment.
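The reported odds ratio for ≥10% weight gain can be approximately recovered from the two percentages alone. The sketch below (plain Python; the variable names are ours) recomputes it, though the published value of 0.50 is presumably model-based and may differ slightly from this back-of-the-envelope figure.

```python
# Recompute the odds ratio for >=10% weight gain (OLZ/SAM vs olanzapine)
# from the proportions reported in the abstract; variable names are
# illustrative, not from the trial.
p_olz_sam = 0.178  # OLZ/SAM: 17.8% of patients gained >=10% body weight
p_olz = 0.298      # olanzapine: 29.8%

odds_olz_sam = p_olz_sam / (1 - p_olz_sam)
odds_olz = p_olz / (1 - p_olz)
odds_ratio = odds_olz_sam / odds_olz  # ~0.51, close to the reported 0.50
```

The small discrepancy against the published 0.50 most likely reflects rounding of the percentages or covariate adjustment in the trial's statistical model.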
OLZ/SAM treatment limited weight gain associated with olanzapine. Metabolic parameter changes were generally small, similar between groups over 24 weeks, and remained stable over an additional 52 weeks of open-label OLZ/SAM treatment.
In the 2015 review paper ‘Petawatt Class Lasers Worldwide’, a comprehensive overview of the current status of high-power laser facilities was presented. This was largely based on facility specifications, with some description of their uses, for instance in fundamental ultra-high-intensity interactions, secondary source generation, and inertial confinement fusion (ICF). With the 2018 Nobel Prize in Physics awarded to Professors Donna Strickland and Gérard Mourou for the development of the technique of chirped pulse amplification (CPA), which made these lasers possible, we celebrate by providing a comprehensive update of the current status of ultra-high-power lasers and demonstrate how the technology has developed. We are now in the era of multi-petawatt facilities coming online, with 100 PW lasers being proposed and even under construction. In addition, there is a pull towards the development of industrial and multi-disciplinary applications, which demands much higher repetition rates, delivering high average powers with higher efficiencies, and the use of alternative wavelengths: mid-IR facilities. So, apart from a comprehensive update of the current global status, we look at what technologies are to be deployed to reach these new regimes and at some of the critical issues facing their development.
The “Stop the Bleed” campaign advocates for non-medical personnel to be trained in basic hemorrhage control. However, it is not clear what type or duration of instruction is needed to meet that requirement. The objective of this study was to determine the impact of a brief hemorrhage control educational curriculum on the willingness of laypersons to respond during a traumatic emergency.
This “Stop the Bleed” education initiative was conducted by the University of Texas Health San Antonio Office of the Medical Director (San Antonio, Texas USA) between September 2016 and March 2017. Individuals with formal medical certification were excluded from this analysis. Trainers used a pre-event questionnaire to assess participants’ knowledge and attitudes about tourniquets and responding to traumatic emergencies. Each training course included an individual evaluation of tourniquet placement, 20 minutes of didactic instruction on hemorrhage control techniques, and hands-on instruction with tourniquet application on both adult and child mannequins. The primary outcome in this study was the willingness to use a tourniquet in response to a traumatic medical emergency.
Of 236 participants, 218 met the eligibility criteria. When initially asked if they would use a tourniquet in real life, 64.2% (140/218) responded “Yes.” Following training, 95.6% (194/203) of participants responded that they would use a tourniquet in real life. When participants were asked about their comfort level with using a tourniquet in real life, there was a statistically significant improvement between their initial response and their post-training response (2.5 versus 4.0 on a 5-point Likert scale; P<.001).
In this hemorrhage control education study, it was found that a short educational intervention can improve laypersons’ self-efficacy and reported willingness to use a tourniquet in an emergency. Identified barriers to action should be addressed when designing future hemorrhage control public health education campaigns. Community education should continue to be a priority of the “Stop the Bleed” campaign.
Ross EM, Redman TT, Mapp JG, Brown DJ, Tanaka K, Cooley CW, Kharod CU, Wampler DA. Stop the Bleed: The Effect of Hemorrhage Control Education on Laypersons’ Willingness to Respond During a Traumatic Medical Emergency. Prehosp Disaster Med. 2018;33(2):127–132.
To evaluate the effectiveness of a computerized clinical decision support intervention aimed at reducing inappropriate Clostridium difficile testing.
Retrospective cohort study
University of Pennsylvania Health System, comprising 3 large tertiary-care hospitals
All adult patients admitted over a 2-year period
Providers were required to use an order set integrated into a commercial electronic health record to order C. difficile toxin testing. The order set identified patients who had received laxatives within the previous 36 hours and displayed a message asking providers to consider stopping laxatives and reassessing in 24 hours prior to ordering C. difficile testing. Providers had the option to continue or discontinue laxatives and to proceed with or forgo testing. The primary endpoint was the change in inappropriate C. difficile testing, as measured by the number of patients who had C. difficile testing ordered while receiving laxatives.
Compared to the 1-year baseline period, the intervention resulted in a decrease in the proportion of inappropriate C. difficile testing (29.6% vs 27.3%; P=.02). The intervention was associated with an increase in the number of patients who had laxatives discontinued and did not undergo C. difficile testing (5.8% vs 46.4%; P<.01) and who had their laxatives discontinued and underwent testing (5.4% vs 35.2%; P<.01). We observed a nonsignificant increase in the proportion of patients with C. difficile-related complications (5.0% vs 8.9%; P=.11).
A C. difficile order set was successful in decreasing inappropriate C. difficile testing and improving the timely discontinuation of laxatives.
Psychological models of conversion disorder (CD) traditionally assume that psychosocial stressors are identifiable around symptom onset. In the face of limited supportive evidence, such models are being challenged.
Forty-three motor CD patients, 28 depression patients and 28 healthy controls were assessed using the Life Events and Difficulties Schedule in the year before symptom onset. A novel ‘escape’ rating for events was developed to test the Freudian theory that physical symptoms of CD could provide escape from stressors, a form of ‘secondary gain’.
CD patients had significantly more severe life events and ‘escape’ events than controls. In the month before symptom onset at least one severe event was identified in 56% of CD patients – significantly more than 21% of depression patients [odds ratio (OR) 4.63, 95% confidence interval (CI) 1.56–13.70] and healthy controls (OR 5.81, 95% CI 1.86–18.2). In the same time period 53% of CD patients had at least one ‘high escape’ event – again significantly higher than 14% in depression patients (OR 6.90, 95% CI 2.05–23.6) and 0% in healthy controls. Previous sexual abuse was more commonly reported in CD than controls, and in one third of female patients was contextually relevant to life events at symptom onset. The majority (88%) of life events of potential aetiological relevance were not identified by routine clinical assessments. Nine per cent of CD patients had no identifiable severe life events.
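The ‘high escape’ odds ratio and its confidence interval can be approximately reconstructed from the reported group sizes and percentages. The sketch below (plain Python; the cell counts are our back-calculation from the abstract, not the authors' published table) uses the standard Wald interval on the log odds ratio.

```python
import math

# Back-calculated 2x2 cell counts (our assumption, rounded):
# ~53% of 43 CD patients -> 23 with a 'high escape' event;
# ~14% of 28 depression patients -> 4.
cd_yes, cd_no = 23, 43 - 23
dep_yes, dep_no = 4, 28 - 4

odds_ratio = (cd_yes / cd_no) / (dep_yes / dep_no)  # ~6.90

# Wald 95% CI: exp(log(OR) +/- 1.96 * SE), where
# SE = sqrt of the sum of reciprocal cell counts.
se = math.sqrt(1 / cd_yes + 1 / cd_no + 1 / dep_yes + 1 / dep_no)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)   # ~2.04
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)  # ~23.3
```

This lands close to the published 95% CI of 2.05–23.6; the small differences plausibly reflect rounding in the reported percentages.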
Evidence was found supporting the psychological model of CD, the Freudian notion of escape and the potential aetiological relevance of childhood traumas in some patients. Uncovering stressors of potential aetiological relevance requires thorough psychosocial evaluation.
We analyzed birth order differences in means and variances of height and body mass index (BMI) in monozygotic (MZ) and dizygotic (DZ) twins from infancy to old age. The data were derived from the international CODATwins database. The total number of height and BMI measures from 0.5 to 79.5 years of age was 397,466. As expected, first-born twins had greater birth weight than second-born twins. With respect to height, first-born twins were slightly taller than second-born twins in childhood. After adjusting the results for birth weight, the birth order differences decreased and were no longer statistically significant. First-born twins had greater BMI than the second-born twins over childhood and adolescence. After adjusting the results for birth weight, birth order was still associated with BMI until 12 years of age. No interaction effect between birth order and zygosity was found. Only limited evidence was found that birth order influenced variances of height or BMI. The results were similar among boys and girls and also in MZ and DZ twins. Overall, the differences in height and BMI between first- and second-born twins were modest even in early childhood, while adjustment for birth weight reduced the birth order differences but did not remove them for BMI.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
The ApRES (autonomous phase-sensitive radio-echo sounder) instrument is a robust, lightweight and relatively inexpensive radar that has been designed to allow long-term, unattended monitoring of ice-shelf and ice-sheet thinning. We describe the instrument and demonstrate its capabilities and limitations by presenting results from three trial campaigns conducted in different Antarctic settings. Two campaigns were ice sheet-based – Pine Island Glacier and Dome C – and one was conducted on the Ross Ice Shelf. The ice-shelf site demonstrates the ability of the instrument to collect a time series of basal melt rates; the two grounded ice applications show the potential to recover profiles of vertical strain rate and also demonstrate some of the limitations of the present system.
Cognitive therapy (CT) has considerable utility for psychosomatic medicine (PM) in acute medical settings but, to date, no such cohesive adaptation has been developed. Part I delineated a CT model for acute medical settings focusing on assessment and formulation. In Part II, we review how CT can be applied to common PM clinical challenges. A pragmatic approach is helpful because this review targets PM trainees and educators.
Narrative review is used to discuss the application of CT strategies to common challenges in acute medical settings. Treatment complexities and limitations associated with the PM setting are detailed. Exemplary dialogues are used to model techniques.
We present CT approaches to eight common scenarios: (1) distressed or hopeless patients; (2) patients expressing pivotal distorted cognitions/images; (3) patients who catastrophize; (4) patients who benefit from distraction and activation strategies; (5) panic and anxiety; (6) suicidal patients; (7) patients who are stuck and helpless; (8) inhibited patients. Limitations are discussed.
Significance of results:
A CT-informed PM assessment, formulation, and early intervention with specific techniques offers a novel integrative framework for psychotherapy with the acutely medically ill. Future efforts should focus on dissemination, education of fellows, and building research efficacy data.