Social anxiety disorder (SoAD) in youth is often treated with a generic form of cognitive behavioural therapy (CBT). Some studies have suggested that primary SoAD is associated with lower recovery rates following generic CBT compared with other anxiety disorders.
Aims:
This systematic review and meta-analysis investigated recovery rates following generic CBT for youth with primary SoAD versus other primary anxiety disorders.
Method:
Five databases (PsycINFO, Web of Science, PubMed, Embase, Medline) were searched for randomised controlled trials of generic CBT for child and/or adolescent anxiety.
Results:
Ten trials met criteria for inclusion in the systematic review, six of which presented sufficient data for inclusion in the meta-analysis. Sixty-seven trials did not report data on recovery rates relative to primary diagnosis. While most individual studies included in the systematic review were not sufficiently powered to detect a difference in recovery rates between diagnoses, there was a pattern of lower recovery rates for youth with primary SoAD. Across the trials included in the meta-analysis, the post-CBT recovery rate from primary SoAD (35%) was significantly lower than the recovery rate from other primary anxiety disorders (54%).
Conclusions:
Recovery from primary SoAD is significantly less likely than recovery from any other primary anxiety disorder following generic CBT in youth. This suggests a need for research to enhance the efficacy of CBT for youth SoAD.
Recent studies have used Mendelian randomization (MR) to investigate the observational association between low birth weight (BW) and increased risk of cardiometabolic outcomes, specifically cardiovascular disease, glycemic traits, and type 2 diabetes (T2D), and inform on the validity of the Barker hypothesis. We used simulations to assess the validity of these previous MR studies, and to determine whether a better formulated model can be used in this context. Genetic and phenotypic data were simulated under a model of no direct causal effect of offspring BW on cardiometabolic outcomes and no effect of maternal genotype on offspring cardiometabolic risk through intrauterine mechanisms; where the observational relationship between BW and cardiometabolic risk was driven entirely by horizontal genetic pleiotropy in the offspring (i.e. offspring genetic variants affecting both BW and cardiometabolic disease simultaneously rather than a mechanism consistent with the Barker hypothesis). We investigated the performance of four commonly used MR analysis methods (weighted allele score MR (WAS-MR), inverse variance weighted MR (IVW-MR), weighted median MR (WM-MR), and MR-Egger) and a new approach, which tests the association between maternal genotypes related to offspring BW and offspring cardiometabolic risk after conditioning on offspring genotype at the same loci. We caution against using traditional MR analyses, which do not take into account the relationship between maternal and offspring genotypes, to assess the validity of the Barker hypothesis, as results are biased in favor of a causal relationship. In contrast, we recommend the aforementioned conditional analysis framework utilizing maternal and offspring genotypes as a valid test of not only the Barker hypothesis, but also to investigate hypotheses relating to the Developmental Origins of Health and Disease more broadly.
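The conditional framework described above can be illustrated with a small simulation. All specifics below (a single biallelic locus, the effect sizes, the sample size) are invented for illustration; the point is only that under pure offspring pleiotropy a maternal genotype shows a spurious association with offspring outcome that vanishes once offspring genotype is conditioned on.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
maf = 0.3

# One biallelic locus: maternal and paternal allele counts (0/1/2).
g_mat = rng.binomial(2, maf, n)
g_pat = rng.binomial(2, maf, n)
# The offspring inherits one allele from each parent.
g_off = rng.binomial(1, g_mat / 2) + rng.binomial(1, g_pat / 2)

# Pure horizontal pleiotropy: the offspring variant lowers birth weight
# and raises cardiometabolic risk; birth weight has NO causal effect.
bw = -0.3 * g_off + rng.normal(size=n)
risk = 0.3 * g_off + rng.normal(size=n)

def ols_beta(predictors, y):
    """OLS coefficients for y ~ predictors (intercept added)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

# The observational BW-risk association is negative despite no causation.
obs = ols_beta([bw], risk)[0]
# Unadjusted, the maternal genotype looks associated with offspring risk
# (it is correlated with the pleiotropic offspring genotype).
naive = ols_beta([g_mat], risk)[0]
# Conditioning on offspring genotype removes the artefact.
conditional = ols_beta([g_mat, g_off], risk)[0]

print(f"observational BW-risk slope:  {obs:.3f}")
print(f"maternal effect, unadjusted:  {naive:.3f}")
print(f"maternal effect, conditional: {conditional:.3f}")
```

The conditional estimate hovers near zero, matching the simulated absence of any intrauterine or causal BW effect.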
Despite a growing understanding of disorders of consciousness following severe brain injury, our knowledge of the association between long-term impairment of consciousness, spontaneous brain oscillations, and underlying subcortical damage, and of the ability of such information to aid patient diagnosis, remains incomplete.
Methods
Cross-sectional observational sample of 116 patients with a disorder of consciousness secondary to brain injury, collected prospectively at a tertiary center between 2011 and 2013. Multimodal analyses relating clinical measures of impairment, electroencephalographic measures of spontaneous brain activity, and magnetic resonance imaging data of subcortical atrophy were conducted in 2018.
Results
In the final analyzed sample of 61 patients, systematic associations were found between electroencephalographic power spectra and subcortical damage. Specifically, the ratio of beta-to-delta relative power was negatively associated with greater atrophy in regions of the bilateral thalamus and globus pallidus (both left > right) previously shown to be preferentially atrophied in chronic disorders of consciousness. Power spectrum total density was also negatively associated with widespread atrophy in regions of the left globus pallidus, right caudate, and in the brainstem. Furthermore, we showed that the combination of demographics, encephalographic, and imaging data in an analytic framework can be employed to aid behavioral diagnosis.
Conclusions
These results ground, for the first time, electroencephalographic presentation detected with routine clinical techniques in the underlying brain pathology of disorders of consciousness and demonstrate how multimodal combination of clinical, electroencephalographic, and imaging data can be employed in potentially mitigating the high rates of misdiagnosis typical of this patient cohort.
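As a rough illustration of the beta-to-delta relative power measure used above, a ratio can be computed from a Welch power spectrum. The band edges, sampling rate and synthetic signals below are common conventions assumed for the sketch, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import welch

def band_ratio(x, fs, delta=(1, 4), beta=(13, 30)):
    """Beta-to-delta relative power ratio for one EEG channel.
    Band edges are illustrative conventions, not the study's."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    def rel_power(band):
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum() / psd.sum()
    return rel_power(beta) / rel_power(delta)

# Synthetic check: a 2 Hz dominated trace vs a 20 Hz dominated trace.
rng = np.random.default_rng(1)
fs = 250
t = np.arange(0, 30, 1 / fs)
slow = np.sin(2 * np.pi * 2 * t) + 0.1 * rng.normal(size=t.size)
fast = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)
print(band_ratio(slow, fs), band_ratio(fast, fs))
```

A delta-dominated trace, as reported for patients with greater thalamic and pallidal atrophy, yields a low ratio; a beta-dominated trace yields a high one.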
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
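A polygenic risk score of the kind analyzed here is, at its simplest, a weighted sum of risk-allele counts across variants, with weights taken from an external GWAS. A minimal sketch (genotypes and effect sizes below are invented for illustration):

```python
import numpy as np

def polygenic_score(genotypes, weights):
    """Polygenic risk score: weighted sum of risk-allele counts.
    genotypes: (individuals, variants) array of 0/1/2 allele counts;
    weights: per-variant effect sizes from an external GWAS."""
    return genotypes @ weights

# Toy data; the effect sizes are made up for illustration.
rng = np.random.default_rng(2)
geno = rng.binomial(2, 0.4, size=(5, 3))
w = np.array([0.12, -0.05, 0.30])
print(polygenic_score(geno, w))
```

In practice the weights come from a discovery cohort for the trait of interest (e.g. hematocrit or C-reactive protein levels), and the resulting scores are tested for association with septic shock susceptibility or mortality in the target cohort.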
The scarcity of Romano-British human remains from north-west England has hindered understanding of burial practice in this region. Here, we report on the excavation of human and non-human animal remains and material culture from Dog Hole Cave, Haverbrack. Foetal and neonatal infants had been interred alongside a horse burial and puppies, lambs, calves and piglets in the very latest Iron Age to early Romano-British period, while the mid- to late Roman period is characterised by burials of older individuals with copper-alloy jewellery and beads. This material culture is more characteristic of urban sites, while isotope analysis indicates that the later individuals were largely from the local area. We discuss these results in terms of burial ritual in Cumbria and rural acculturation. Supplementary material is available online (https://doi.org/10.1017/S0068113X20000136), and contains further information about the site and excavations, small finds, zooarchaeology, human osteology, site taphonomy, the palaeoenvironment, isotope methods and analysis, and finds listed in Benson and Bland 1963.
A range of decision-makers, including policy-makers, NGOs and local communities, have a stake in developing conservation interventions that are to be implemented on the ground. In order to ensure that decision-making is evidence-informed, the science community needs to engage these communities of policy and practice effectively. This chapter brings together work which explores how scientists can work effectively with decision-makers, using global case studies from South America, Australia, New Zealand and elsewhere to identify what works. It identifies 10 key tips for successful engagement: (1) know who you need to talk to, (2) engage early, (3) make it easy to engage, (4) include multiple knowledges, perspectives and worldviews, (5) think hard about power, (6) build trust, (7) good facilitation is key, (8) learn new engagement skills, (9) make use of existing spaces of collaboration, and (10) don't give up. While executing these tips will not guarantee successful engagement in every case, it will improve the chances for mutually beneficial relationships and hence better conservation outcomes.
Dr Nick Martin has made enormous contributions to the field of behavior genetics over the past 50 years. Of his many seminal papers that have had a profound impact, we focus on his early work on the power of twin studies. He was among the first to recognize the importance of sample size calculation before conducting a study to ensure sufficient power to detect the effects of interest. The elegant approach he developed, based on the noncentral chi-squared distribution, has been adopted by subsequent researchers for other genetic study designs, and today remains a standard tool for power calculations in structural equation modeling and other areas of statistical analysis. The present brief article discusses the main aspects of his seminal paper, and how it led to subsequent developments, by him and others, as the field of behavior genetics evolved into the present era.
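The noncentral chi-squared approach to power calculation can be sketched as follows. The per-pair noncentrality value below is invented for illustration; it is the approach, not these numbers, that the paper discussed here established.

```python
from scipy.stats import chi2, ncx2

def power_from_ncp(ncp, df=1, alpha=0.05):
    """Power of a likelihood-ratio test with noncentrality ncp:
    probability the noncentral chi-squared statistic exceeds the
    central chi-squared critical value."""
    crit = chi2.ppf(1 - alpha, df)
    return ncx2.sf(crit, df, ncp)

def required_n(ncp_per_pair, df=1, alpha=0.05, target=0.80):
    """Smallest number of twin pairs reaching the target power,
    assuming noncentrality grows linearly with sample size."""
    n = 1
    while power_from_ncp(n * ncp_per_pair, df, alpha) < target:
        n += 1
    return n

# Illustrative only: if each pair contributes 0.02 to the
# noncentrality, 80% power needs a few hundred pairs.
print(required_n(0.02))
```

The same logic underlies power calculations in structural equation modeling today: fit the true and the constrained model, take the expected difference in fit per observation as the per-unit noncentrality, and scale up to the target power.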
Blood cell concentrations for most cell types are highly heritable. Nick Martin's twin registry provided much of the data for the early heritability and linkage studies of blood cell related traits and has contributed significantly to more recent genome-wide association studies that have successfully identified individual genetic loci.
Nick Martin is a pioneer in recognizing the need for large sample sizes to study complex, heterogeneous and polygenic common mental disorders. In the predigital era, questionnaires were mailed to thousands of twin pairs around Australia. Always quick to adopt new technology, Nick's studies progressed to phone interviews and then online. Moreover, Nick was early to recognize the value of collecting DNA samples. As genotyping technologies improved over the years, these twin and family cohorts were used for linkage, candidate gene and genome-wide association studies. These cohorts have underpinned many analyses to disentangle the complex web of genetic and lifestyle factors associated with mental health. With characteristic foresight, Nick is chief investigator of our Australian Genetics of Depression Study, which has recruited 16,000 people with self-reported depression (plus DNA samples) over a time frame of a few months; analyses are currently ongoing. The mantra of sample size, sample size, sample size has guided Nick's research over the last 30 years and continues to do so.
The classical twin design relies on a number of strong assumptions in order to yield unbiased estimates of heritability. These include the equal environments assumption (that monozygotic and dizygotic twins experience similar degrees of environmental similarity), an assumption that is likely to be violated in practice for many traits of interest. An alternative method of estimating heritability that does not suffer from many of these limitations is to model trait similarity between sibling pairs as a function of their empirical genome-wide identity-by-descent sharing, estimated from genetic markers. In this review, I recount the story behind Nick Martin's and my development of this method, our first attempts at applying it in a human population, and more recent studies using the original and related methods to estimate trait heritability.
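The core idea can be sketched with a Haseman-Elston style regression of squared sib-pair trait differences on genome-wide IBD sharing. The simulation parameters below are invented, and this simple regression is a stand-in for the likelihood-based estimation used in the published work.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pairs, h2 = 200_000, 0.6

# Genome-wide IBD sharing of full siblings: mean 0.5, SD roughly 0.04.
pi_hat = np.clip(rng.normal(0.5, 0.04, n_pairs), 0.0, 1.0)

# Standardized sib phenotypes whose correlation is pi * h2
# (a purely additive model; shared environment is ignored here).
r = pi_hat * h2
y1 = rng.normal(size=n_pairs)
y2 = r * y1 + np.sqrt(1 - r**2) * rng.normal(size=n_pairs)

# Haseman-Elston style regression: E[(y1 - y2)^2] = 2 - 2 * pi * h2,
# so the slope of squared differences on pi estimates -2 * h2.
slope = np.polyfit(pi_hat, (y1 - y2) ** 2, 1)[0]
h2_hat = -slope / 2
print(f"heritability estimate: {h2_hat:.2f}")  # should land near 0.6
```

Because actual IBD sharing varies only slightly around 0.5 among full sibs, very large samples of sib pairs are needed for precise estimates, which is part of the story the review recounts.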
Despite decades of suicide research, our ability to predict suicide has not changed. Why is this the case? We outline the unique challenges facing suicide research. Borrowing successful strategies from other medical fields, we propose specific research directions that aim to translate scientific findings into meaningful clinical impact.
We first published on the subject of pregnancy management via fetal reduction (FR) 30 years ago [1]. Dramatic changes have occurred in medical technology, outcomes, and patient choices – large demographic and cultural shifts that have driven the pace and direction of progress and research [2, 3].
The cognitive process of worry, which keeps negative thoughts in mind and elaborates the content, contributes to the occurrence of many mental health disorders. Our principal aim was to develop a straightforward measure of general problematic worry suitable for research and clinical treatment. Our secondary aim was to develop a measure of problematic worry specifically concerning paranoid fears.
Methods
An item pool concerning worry in the past month was evaluated in 250 non-clinical individuals and 50 patients with psychosis in a worry treatment trial. Exploratory factor analysis and item response theory (IRT) informed the selection of scale items. IRT analyses were repeated with the scales administered to 273 non-clinical individuals, 79 patients with psychosis and 93 patients with social anxiety disorder. Other clinical measures were administered to assess concurrent validity. Test-retest reliability was assessed with 75 participants. Sensitivity to change was assessed with 43 patients with psychosis.
Results
A 10-item general worry scale (Dunn Worry Questionnaire; DWQ) and a five-item paranoia worry scale (Paranoia Worries Questionnaire; PWQ) were developed. All items were highly discriminative (DWQ a = 1.98–5.03; PWQ a = 4.10–10.7), indicating that small increases in latent worry lead to a high probability of item endorsement. The DWQ was highly informative across a wide range of the worry distribution, whilst the PWQ had greatest precision at clinical levels of paranoia worry. The scales demonstrated excellent internal reliability, test-retest reliability, concurrent validity and sensitivity to change.
Conclusions
The new measures of general problematic worry and worry about paranoid fears have excellent psychometric properties.
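The discrimination parameters reported above come from a two-parameter logistic IRT model. A small sketch (parameter values are illustrative, not the fitted item parameters) shows how a high discrimination a makes endorsement probability change sharply around the item threshold:

```python
import math

def item_prob(theta, a, b):
    """Two-parameter logistic IRT model: probability of endorsing an
    item given latent worry theta, discrimination a and threshold b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A highly discriminative item (a = 5, in the range reported for the
# scales) flips from rare to near-certain endorsement over a narrow
# band of latent worry; a = 1 changes much more gradually.
for theta in (-0.5, 0.0, 0.5):
    print(f"theta={theta:+.1f}  a=1: {item_prob(theta, 1, 0):.2f}"
          f"  a=5: {item_prob(theta, 5, 0):.2f}")
```

Items with steep curves of this kind contribute most of the scale's information near their thresholds, which is why the PWQ is most precise at clinical levels of paranoia worry.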
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
In the National Institutes of Health (NIH) Clinical Center, patients colonized or infected with vancomycin-resistant Enterococcus (VRE) are placed in contact isolation until they are deemed “decolonized,” defined as having 3 consecutive perirectal swabs negative for VRE. Some decolonized patients later develop recurrent growth of VRE from surveillance or clinical cultures (ie, “recolonized”), although that finding may represent recrudescence or new acquisition of VRE. We describe the dynamics of VRE colonization and infection and their relationship to receipt of antibiotics.
Methods:
In this retrospective cohort study of patients at the National Institutes of Health Clinical Center, baseline characteristics were collected via chart review. Antibiotic exposure and hospital days were calculated as proportions of VRE decolonized days. Using survival analysis, we assessed the relationship between antibiotic exposure and time to VRE recolonization in a subcohort analysis of 72 decolonized patients.
Results:
In total, 350 patients were either colonized or infected with VRE. Among polymerase chain reaction (PCR)-positive, culture (Cx)-negative (PCR+/Cx−) patients, PCR had a 39% positive predictive value for colonization. Colonization with VRE was significantly associated with VRE infection. Among 72 patients who met decolonization criteria, 21 (29%) subsequently became recolonized. VRE recolonization was 4.3 (P = .001) and 2.0 (P = .22) times higher in patients with proportions of antibiotic days and antianaerobic antibiotic days above the median, respectively.
Conclusion:
Colonization is associated with clinical VRE infection and increased mortality. Despite negative perirectal cultures, re-exposure to antibiotics increases the risk of VRE recolonization.
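The time-to-recolonization analysis rests on survival methods. A minimal Kaplan-Meier estimator over invented follow-up data sketches the idea; the patient numbers below are not from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from follow-up times and event flags
    (1 = recolonized, 0 = censored). Returns (time, S(t)) step points."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, steps = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = n_at_t = 0
        while i < len(order) and times[order[i]] == t:  # group ties
            deaths += events[order[i]]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            steps.append((t, surv))
        at_risk -= n_at_t
    return steps

# Hypothetical days to recolonization for six decolonized patients.
times = [5, 8, 8, 12, 20, 30]
events = [1, 1, 0, 1, 0, 1]
print(kaplan_meier(times, events))
```

Comparing curves between patients above and below the median proportion of antibiotic days, as the study does, then reduces to a standard two-group survival comparison.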
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from < 1 km² to over 300,000 km² and have an approximately log-normal size distribution (median = 125.0 km², mean = 1,202.6 km²). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere.
Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
Compositional zoning is rarely observed in chrome-spinel grains from slowly cooled layered intrusions because diffusion of cations continues within the spinel to low temperatures. However, in certain circumstances, such gradational zoning of both divalent and trivalent cations is observed and may be useful in deciphering the thermal history of the host intrusions. The accessory chrome-spinels of the Kabanga mafic-ultramafic chonolith intrusions of the Kibaran igneous event in north-western Tanzania are notable because they have preserved gradational compositional zoning. This zoning is demonstrated to predate and be independent of later hydrous alteration of the silicate assemblage. At Kabanga, most chrome-spinel grains within olivine-rich cumulate rocks are gradationally and cryptically zoned from Fe²⁺-Cr³⁺-rich cores to more Mg²⁺-Al³⁺-rich rims (normal zoning). A few grains are zoned from Mg²⁺-Al³⁺-rich cores to more Fe²⁺-Cr³⁺-rich rims (reverse zoning). The zoning of divalent cations is proportional to that of trivalent cations, with Mg²⁺ following Al³⁺ and Fe²⁺ following Cr³⁺ from core to rim. The zoning of trivalent and tetravalent cations is interpreted to be caused by either new growth from an evolving melt or peritectic reactions between evolved or contaminated melt and adjacent Al-Cr-bearing ferromagnesian minerals, which is preserved by relatively rapid initial cooling in the small chonolith intrusions. Divalent cation zoning is controlled by sub-solidus exchange of Fe²⁺ and Mg with adjacent ferromagnesian minerals and continues to lower temperatures, indicated to be 580 to 630°C by the spinel-olivine geothermometer. Preservation of such zoning is more likely in the smaller chonolith intrusions that typically host magmatic nickel-copper sulfide deposits and can be used as an exploration indicator when interpreting chromite compositions in regional heavy indicator mineral surveys.
Edited by
J. M. Larrazabal, University of the Basque Country, San Sebastian; D. Lascar, Université de Paris VII (Denis Diderot); G. Mints, Stanford University, California
This study sought to identify trajectories of DSM-IV based internalizing (INT) and externalizing (EXT) problem scores across childhood and adolescence and to provide insight into the comorbidity by modeling the co-occurrence of INT and EXT trajectories. INT and EXT were measured repeatedly between age 7 and age 15 years in over 7,000 children and analyzed using growth mixture models. Five trajectories were identified for both INT and EXT, including very low, low, decreasing, and increasing trajectories. In addition, an adolescent onset trajectory was identified for INT and a stable high trajectory was identified for EXT. Multinomial regression showed that similar EXT and INT trajectories were associated. However, the adolescent onset INT trajectory was independent of high EXT trajectories, and persisting EXT was mainly associated with decreasing INT. Sex and early life environmental risk factors predicted EXT and, to a lesser extent, INT trajectories. The association between trajectories indicates the need to consider comorbidity when a child presents with INT or EXT disorders, particularly when symptoms start early. This is less necessary when INT symptoms start at adolescence. Future studies should investigate the etiology of co-occurring INT and EXT and the specific treatment needs of these severely affected children.