There is a growing body of evidence highlighting the presence of a single general dimension of psychopathology that can account for multiple associations across mental and substance use disorders. However, relatively little evidence has emerged regarding the validity of this model with respect to a range of factors that have been previously implicated across multiple disorders. The current study utilized a cross-sectional population survey of adolescents (n = 2,003) to examine the extent to which broad psychopathology factors account for specific associations between psychopathology and key validators: poor sleep, self-harm, suicidality, risky sexual behavior, and low self-esteem. Confirmatory factor models, latent class models, and factor mixture models were estimated to identify the best structure of psychopathology. Structural equation models were then estimated to examine the broad and specific associations between each psychopathology indicator and the validators. A confirmatory factor model with three lower-order factors, representing internalizing, externalizing, and psychotic-like experiences, and a single higher-order factor evidenced the best fit. The associations between manifest indicators of psychopathology and validators were largely nonspecific. However, significant and large direct effects were found between several pairwise associations. These findings have implications for the identification of potential targets for intervention and/or tailoring of prevention programs.
Despite advances in the treatment of pulmonary hypertension (PH) and improvements in obstetric care, PH remains a leading cause of cardiac maternal death in the developed world. The last three decades have seen the development of effective therapies for specific forms of PH, improving patients’ symptoms and more than doubling survival in some forms of the disease. Consequently, there is an increasing number of women of childbearing potential with PH. Women may present with PH for the first time during pregnancy or in the early post-partum period, or women with known PH may consider pregnancy despite counselling regarding the high risks.
The net benefit from investing in any technology is a function of the cost of implementation and the expected return in revenue. The objective of the present study was to quantify, using deterministic equations, the net monetary benefit from investing in genotyping of commercial females. Three case studies were presented reflecting dairy cows, beef cows and ewes based on Irish population parameters; sensitivity analyses were also performed. Parameters considered in the sensitivity analyses included the accuracy of genomic evaluations, replacement rate, proportion of female selection candidates retained as replacements, the cost of genotyping, the sire parentage error rate and the age of the female when it first gave birth. Results were presented as an annualised monetary net benefit over the lifetime of an individual, after discounting for the timing of expressions. In the base scenarios, the net benefit was greatest for dairy, followed by beef and then sheep. The net benefit improved as the reliability of the genomic evaluations improved and, in fact, a negative net benefit of genotyping was less frequent when the reliability of the genomic evaluations was high. The impact of a 10% point increase in genomic reliability was, however, greatest in sheep, followed by beef and then dairy. The net benefit of genotyping female selection candidates reduced as replacement rate increased. As genotyping costs increased, the net benefit reduced irrespective of the percentage of selection candidates kept, the replacement rate or even the population considered. Nonetheless, the association between the genotyping cost and the net benefit of genotyping differed by the percentage of selection candidates kept. Across all replacement rates evaluated, retaining 25% of the selection candidates resulted in the greatest net benefit when genotyping cost was low but the lowest net benefit when genotyping cost was high. Genotyping breakeven cost was non-linearly associated with the percentage of selection candidates retained, reaching a maximum when 50% of selection candidates were retained, irrespective of replacement rate, genomic reliability or the population. The genotyping breakeven cost was also non-linearly associated with replacement rate. The approaches outlined within provide the back-end framework for a decision support tool to quantify the net benefit of genotyping, once parameterised by the relevant population metrics.
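As a rough illustration only (the study's deterministic equations are not reproduced here), the sketch below computes a discounted net benefit per genotyped female, an annualised figure and the corresponding breakeven genotyping cost; every parameter value and the response formula are illustrative assumptions, not the Irish population parameters used in the study.

# Illustrative sketch only: a simplified, discounted net-benefit calculation for
# genotyping female selection candidates. The parameter values and the response
# formula are assumptions for demonstration, not the study's deterministic equations.
from math import exp, pi, sqrt
from statistics import NormalDist

def discounted_annuity(annual_value, years, rate):
    """Present value of a benefit received once a year for `years` years."""
    return sum(annual_value / (1 + rate) ** t for t in range(1, years + 1))

def net_benefit_per_candidate(
    sd_index=150.0,          # euro standard deviation of the selection index (assumed)
    rel_parent_avg=0.30,     # reliability of the parent-average evaluation (assumed)
    rel_genomic=0.50,        # reliability of the genomic evaluation (assumed)
    prop_retained=0.25,      # proportion of candidates kept as replacements
    genotyping_cost=30.0,    # euro cost of genotyping one candidate (assumed)
    productive_years=4,      # number of years the benefit is expressed (assumed)
    discount_rate=0.05,
):
    # Selection intensity for truncation selection: i = z / p, where z is the
    # standard normal density at the truncation point.
    x = NormalDist().inv_cdf(1 - prop_retained)
    intensity = exp(-x * x / 2) / (sqrt(2 * pi) * prop_retained)

    # Extra genetic superiority of retained females from the more accurate
    # (genomic) evaluation: i * (accuracy gain) * index SD, in euro per year.
    gain_per_year = intensity * (sqrt(rel_genomic) - sqrt(rel_parent_avg)) * sd_index

    # Every candidate is genotyped, but only retained candidates express the gain.
    benefit = prop_retained * discounted_annuity(gain_per_year, productive_years, discount_rate)
    return benefit - genotyping_cost

nb = net_benefit_per_candidate()
annuity = discounted_annuity(1.0, years=4, rate=0.05)
print(f"Discounted net benefit per genotyped candidate: {nb:.2f} euro")
print(f"Annualised over the expression period: {nb / annuity:.2f} euro/year")
# Breakeven genotyping cost: the cost at which the net benefit falls to zero.
print(f"Breakeven genotyping cost: {net_benefit_per_candidate(genotyping_cost=0.0):.2f} euro")

Varying prop_retained, rel_genomic and genotyping_cost in this toy calculation reproduces the qualitative pattern described above: the net benefit rises with genomic reliability and falls as genotyping cost or the replacement demand changes the balance between candidates genotyped and candidates expressing the gain.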
Few data exist on the methodology of contextualising version two of the Mental Health Gap Action Programme Intervention Guide (mhGAP-IG) for resource-poor settings. This paper describes the contextualisation and pilot testing of the guide in Kilifi, Kenya.
Methods
Contextualisation was conducted as a collaboration between the KEMRI-Wellcome Trust Research Programme (KWTRP) and the Kilifi County Government's Department of Health (KCGH) between 2016 and 2018. It adopted a mixed-methods design involving a situational analysis, stakeholder engagement, local adaptation and pilot testing of the adapted guide. Qualitative data were analysed using content analysis to identify key facilitators of, and barriers to, the implementation process. Pre- and post-training scores on the adapted guide were compared using the Wilcoxon signed-rank test (see the sketch below).
Results
Human resources for mental health in Kilifi are strained, with limited infrastructure and outdated legislation. Barriers to implementation included few specialists for referral, an unreliable drug supply, difficulty in translating the guide into Kiswahili, a lack of clarity about the roles of KWTRP and KCGH in the implementation process, and the unwillingness of biomedical practitioners to collaborate with traditional health practitioners to enhance referrals to hospital. During adaptation, stakeholders recommended excluding the child and adolescent mental and behavioural problems module and the dementia module from the final version of the guide. Pilot testing of the adapted guide showed a significant improvement from pre- to post-training scores: 66.3% (95% CI 62.4–70.8) v. 76.6% (95% CI 71.6–79.2) (p < 0.001).
Conclusion
The adapted mhGAP-IG version two can be used across coastal Kenya to train primary healthcare providers. However, successful implementation in Kilifi will require a review of new evidence on the burden of disease, improvements in the mental health system and sustained dialogue among stakeholders.
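The pre- and post-training comparison reported above is a paired, non-parametric test. A minimal sketch of that kind of comparison with SciPy, using invented scores rather than the Kilifi pilot data:

# Minimal sketch of a pre- vs post-training comparison with the Wilcoxon
# signed-rank test. The scores below are invented for illustration; they are
# not the Kilifi pilot data.
import numpy as np
from scipy.stats import wilcoxon

pre = np.array([62, 70, 58, 66, 71, 60, 68, 64, 73, 59], dtype=float)   # % correct before training
post = np.array([75, 78, 70, 74, 80, 69, 77, 72, 83, 71], dtype=float)  # % correct after training

stat, p_value = wilcoxon(pre, post)  # paired, non-parametric test
print(f"Wilcoxon W = {stat:.1f}, p = {p_value:.4f}")
print(f"Median pre = {np.median(pre):.1f}%, median post = {np.median(post):.1f}%")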
Accurately dating when people first colonized new areas is vital for understanding the pace of past cultural and environmental changes, including questions of mobility, human impacts and human responses to climate change. Establishing effective chronologies of these events requires the synthesis of multiple radiocarbon (14C) dates. Various “chronometric hygiene” protocols have been used to refine 14C dating of island colonization, but they can discard up to 95% of available 14C dates leaving very small datasets for further analysis. Despite their foundation in sound theory, without independent tests we cannot know if these protocols are apt, too strict or too lax. In Iceland, an ice core-dated tephrochronology of the archaeology of first settlement enables us to evaluate the accuracy of 14C chronologies. This approach demonstrated that the inclusion of a wider range of 14C samples in Bayesian models improves the precision, but does not affect the model outcome. Therefore, based on our assessments, we advocate a new protocol that works with a much wider range of samples and where outlying 14C dates are systematically disqualified using Bayesian Outlier Models. We show that this approach can produce robust termini ante quos for colonization events and may be usefully applied elsewhere.
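The Bayesian Outlier Model idea can be illustrated in a deliberately simplified form: each 14C determination is treated as either "on target" or an outlier drawn from a much wider distribution, and the posterior for a single event date is evaluated on a grid. The sketch below uses invented determinations, Gaussian errors and no calibration curve, so it is a toy version of the approach rather than the OxCal-style models applied in the study.

# Toy sketch of a Bayesian outlier model for combining 14C determinations.
# Simplifying assumptions: a calendar-like scale with Gaussian errors and NO
# calibration curve; the measurements below are invented.
import numpy as np
from scipy.stats import norm

dates = np.array([1120.0, 1105.0, 1130.0, 1115.0, 1260.0])  # 14C ages BP (invented)
sigmas = np.array([25.0, 30.0, 20.0, 25.0, 30.0])           # 1-sigma lab errors

p_outlier = 0.05       # prior probability that any one date is an outlier (assumed)
outlier_scale = 200.0  # extra spread allowed for outlying dates (assumed)

theta_grid = np.linspace(1000.0, 1350.0, 2001)  # candidate event dates (flat prior)

# Marginal likelihood of each date given theta: mixture of an "on target"
# component and an "outlier" component with inflated variance.
log_post = np.zeros_like(theta_grid)
for y, s in zip(dates, sigmas):
    on_target = (1 - p_outlier) * norm.pdf(y, loc=theta_grid, scale=s)
    outlier = p_outlier * norm.pdf(y, loc=theta_grid, scale=np.hypot(s, outlier_scale))
    log_post += np.log(on_target + outlier)

post = np.exp(log_post - log_post.max())
post /= np.trapz(post, theta_grid)

mean_theta = np.trapz(theta_grid * post, theta_grid)
cdf = np.cumsum(post) * (theta_grid[1] - theta_grid[0])
lo, hi = np.interp([0.025, 0.975], cdf, theta_grid)
print(f"Posterior mean event date: {mean_theta:.0f} BP (95% interval {lo:.0f}-{hi:.0f})")

# Posterior probability that each date is an outlier, evaluated at the posterior mean.
for y, s in zip(dates, sigmas):
    w_out = p_outlier * norm.pdf(y, mean_theta, np.hypot(s, outlier_scale))
    w_in = (1 - p_outlier) * norm.pdf(y, mean_theta, s)
    print(f"  date {y:.0f} +/- {s:.0f}: P(outlier) = {w_out / (w_in + w_out):.2f}")

In this toy version the anomalous determination is down-weighted automatically rather than discarded, which is the core of the protocol advocated above: all samples stay in the model and the outlier probabilities do the screening.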
Non-communicable diseases are projected to become the most common causes of death in Africa by 2030. The impact on health of epidemiological and nutritional transitions in sub-Saharan Africa remains unclear. To assess trends in dietary fatty acids over time in Uganda, we examined fatty acids in serum collected from individuals in rural south-west Uganda at three time points over two decades. Independent cross-sectional samples of 915 adults and children were selected from the general population cohort in 1990 (n 281), 2000 (n 283) and 2008 (n 351). Serum phospholipid fatty acids were measured by GC. Multivariate regression analyses were performed to compare the geometric means of fatty acids by time period. Serum fatty acid profiling showed high proportions of SFA, cis-MUFA and industrial trans-fatty acids (iTFA), likely to be biomarkers of high consumption of palm oil and hydrogenated fats. In contrast, proportions of n-6 and n-3 PUFA from vegetable oils and fish were low. From 1990 to 2008, serum phospholipids showed increases in absolute amounts of SFA (17·3 % increase in adults and 26·4 % in children), MUFA (16·7 % increase in adults and 16·8 % in children) and n-6:n-3 PUFA (40·1 % increase in adults and 39·8 % in children). The amount of elaidic acid, an iTFA from hydrogenated fats, increased in children (60·1 % increase). In this rural Ugandan population, we show evidence of unfavourable trends in dietary fatty acids over time.
Natural disasters are increasing in frequency and severity. They cause widespread hardship and are associated with detrimental effects on mental health.
Aims
Our aim is to provide the best estimate of the effects of natural disasters on mental health through a systematic review and meta-analysis of the rates of psychological distress and psychiatric disorder following such events.
Method
This systematic review and meta-analysis was limited to studies that met predetermined quality criteria. We required included studies to make comparisons with pre-disaster or non-disaster-exposed controls and to sample representative populations. Key studies were identified through a comprehensive search of PubMed, EMBASE and PsycINFO from 1980 to 3 March 2017. Random-effects meta-analyses were performed for studies that reported key outcomes with appropriate statistics (the pooling step is sketched below).
Results
Forty-one studies were identified by the literature search, of which 27 contributed to the meta-analyses. Continuous measures of psychological distress were increased after natural disasters (combined standardised mean difference 0.63, 95% CI 0.27–0.98, P = 0.005). Psychiatric disorders were also increased (combined odds ratio 1.84, 95% CI 1.43–2.38, P < 0.001). Rates of post-traumatic stress disorder and depression were significantly increased after disasters. Findings for anxiety and alcohol misuse/dependence were not significant. High rates of heterogeneity suggest that disaster-specific factors and, to a lesser degree, methodological factors contribute to the variance between studies.
Conclusions
Increased rates of psychological distress and psychiatric disorders follow natural disasters. High levels of heterogeneity between studies suggest that disaster variables and post-disaster response have the potential to mitigate adverse effects.
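The pooled estimates above come from random-effects models. A minimal sketch of that kind of pooling, using the DerSimonian–Laird estimator and invented study-level effects rather than the reviewed data:

# Minimal sketch of a random-effects meta-analysis (DerSimonian-Laird).
# The effect sizes and standard errors below are invented for illustration;
# they are not the studies pooled in the review.
import numpy as np

# Per-study standardised mean differences and their standard errors (invented).
effects = np.array([0.40, 0.85, 0.30, 0.95, 0.55])
se = np.array([0.15, 0.20, 0.12, 0.25, 0.18])

v = se ** 2
w_fixed = 1.0 / v
mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird moment estimator.
q = np.sum(w_fixed * (effects - mu_fixed) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights and pooled estimate with a 95% confidence interval.
w_re = 1.0 / (v + tau2)
mu_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = (mu_re - 1.96 * se_re, mu_re + 1.96 * se_re)

i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I^2: % variance from heterogeneity
print(f"Pooled SMD = {mu_re:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}), tau^2 = {tau2:.3f}, I^2 = {i2:.0f}%")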
We report daptomycin minimum inhibitory concentrations (MICs) for vancomycin-resistant Enterococcus faecium isolated from bloodstream infections over a 4-year period. The daptomycin MIC increased over time hospital-wide for initial isolates and increased over time within patients, culminating in 40% of patients having daptomycin-nonsusceptible isolates in the final year of the study.
The aim of this study was to describe patient-level costing methods and to develop a database of healthcare resource use and cost in patients with advanced heart failure (AHF) receiving ventricular assist device (VAD) therapy.
Methods:
Patient-level micro-costing was used to identify documented activity in the years preceding and following VAD implantation, and preceding heart transplant, for a cohort of seventy-seven consecutive patients listed for heart transplantation (2009–12). Clinician interviews verified the documented activity, established the time required for each activity, and added undocumented activities. Costs were sourced from general ledger, salary, stock price and pharmacy formulary data, and from national medical benefits and prostheses lists. Analyses of linked administrative data on activity external to the implanting institution used National Weighted Activity Units (NWAU), the 2014 efficient price and admission complexity cost weights, and were compared with micro-costed data for the implanting admission.
Results:
The database produced includes patient level activity and costs associated with the seventy-seven patients across thirteen resource areas including hospital activity external to the implanting center. The median cost of the implanting admission using linked administrative data was $246,839 (interquartile range [IQR] $246,839–$271,743), versus $270,716 (IQR $211,740–$378,482) for the institutional micro-costing (p = .08).
Conclusions:
Linked administrative data provide a useful alternative for imputing costs incurred external to the implanting center and, combined with institutional data, can illuminate both the pathways to transplant referral and the hospital activity generated by patients experiencing the terminal phases of heart failure in the year before transplant, cf-VAD implant, or death.
The goal of this study was to examine the mental health needs of children and youth who present to the emergency department (ED) for mental health care and to describe the type of, and satisfaction with, follow-up mental health services accessed.
Methods
A 6-month to 1.5-year prospective cohort study was conducted in three Canadian pediatric EDs and one general ED, with a 1-month follow-up post-ED discharge. Measures included 1) clinician rating of mental health needs, 2) patient and caregiver self-reports of follow-up services, and 3) interviews regarding satisfaction with follow-up. Data analysis included descriptive statistics and Fisher’s exact test to compare sites (sketched below).
Results
The cohort consisted of 373 children and youth (61.1% female; mean age 15.1 years, standard deviation 1.5). The main reason for ED presentation was a mental health crisis. The three most frequent areas of need requiring action were mood (43.8%), suicide risk (37.4%), and parent-child relational problems (34.6%). During the ED visit, 21.6% of patients received medical clearance, 40.9% received a psychiatric consult, and 19.4% were admitted to inpatient psychiatric care. At 1 month post-ED visit, 84.3% of patients/caregivers had received mental health follow-up. Ratings of service recommendations were generally positive: 60.9% of patients obtained the recommended follow-up care and 13.9% were wait-listed.
Conclusions
Children and youth and their families presenting to the ED with mental health needs had substantial clinical morbidity, were connected with services, were satisfied with their ED visit, and, with some variability, accessed follow-up care within 1 month.
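The between-site comparisons above use Fisher's exact test on contingency tables. A minimal sketch with SciPy, using an invented 2×2 table rather than the study data:

# Minimal sketch of a between-site comparison with Fisher's exact test.
# The 2x2 table below is invented for illustration; it is not the study data.
from scipy.stats import fisher_exact

# Rows: site A, site B; columns: received follow-up, did not receive follow-up.
table = [[80, 15],
         [60, 25]]

odds_ratio, p_value = fisher_exact(table)
print(f"Odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")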
Background: Paediatric specialist dental practitioners are often faced with the challenge of disruptive behaviour or refusal to comply with treatment. Behaviour management skills are an essential component of their role. However, little is known of the confidence or competence of practitioners in these approaches. Aim: To identify paediatric dentists’ knowledge of behavioural management principles as applied to paediatric dentistry. Method: Postal questionnaire survey of all specialists in Paediatric Dentistry on the General Dental Council UK register (n = 234), using the Knowledge of Behavioural Principles as Applied to Children Questionnaire (KBPACQ; O'Dell, 1979) adapted for the dental setting. Information was also gathered on experience in using behavioural management techniques and demographics. Results: Responses were received from 105 practitioners (45%). Participants gave the correct answer, on average, to 38% of the items (range 0 to 75%). Conclusion: Knowledge of behavioural principles amongst paediatric dentists in the United Kingdom is poor, despite their widespread reported use of such techniques.
No existing models of alcohol prevention concurrently adopt universal and selective approaches. This study aims to evaluate the first combined universal and selective approach to alcohol prevention.
Method
A total of 26 Australian schools with 2190 students (mean age: 13.3 years) were randomized to receive: universal prevention (Climate Schools); selective prevention (Preventure); combined prevention (Climate Schools and Preventure; CAP); or health education as usual (control). Primary outcomes were alcohol use, binge drinking and alcohol-related harms at 6, 12 and 24 months.
Results
Climate, Preventure and CAP students demonstrated significantly lower growth in their likelihood to drink and binge drink, relative to controls over 24 months. Preventure students displayed significantly lower growth in their likelihood to experience alcohol harms, relative to controls. While adolescents in both the CAP and Climate groups demonstrated slower growth in drinking compared with adolescents in the control group over the 2-year study period, CAP adolescents demonstrated faster growth in drinking compared with Climate adolescents.
Conclusions
Findings support universal, selective and combined approaches to alcohol prevention. Particularly novel are the findings of no advantage of the combined approach over universal or selective prevention alone.
Globally, the Series 2 – Series 3 boundary of the Cambrian System coincides with a major carbon isotope excursion, sea-level changes and trilobite extinctions. Here we examine the sedimentology, sequence stratigraphy and carbon isotope record of this interval in the Cambrian strata (Durness Group) of NW Scotland. Carbonate carbon isotope data from the lower part of the Durness Group (Ghrudaidh Formation) show that the shallow-marine, Laurentian margin carbonates record two linked sea-level and carbon isotopic events. Whilst the carbon isotope excursions are not as pronounced as those expressed elsewhere, correlation with global records (Sauk I – Sauk II boundary and Olenellus biostratigraphic constraint) identifies them as representing the local expression of the ROECE and DICE. The upper part of the ROECE is recorded in the basal Ghrudaidh Formation whilst the DICE is seen around 30 m above the base of this unit. Both carbon isotope excursions co-occur with surfaces interpreted to record regressive–transgressive events that produced amalgamated sequence boundaries and ravinement/flooding surfaces overlain by conglomerates of reworked intraclasts. The ROECE has been linked with redlichiid and olenellid trilobite extinctions, but in NW Scotland, Olenellus is found after the negative peak of the carbon isotope excursion but before sequence boundary formation.
Toxigenic strains of Vibrio cholerae serogroups O1 and O139 have caused cholera epidemics, but other serogroups – such as O75 or O141 – can also produce cholera toxin and cause severe watery diarrhoea similar to cholera. We describe 31 years of surveillance for toxigenic non-O1, non-O139 V. cholerae infections in the United States and map these infections to the state where the exposure probably originated. While serogroups O75 and O141 are closely related pathogens, they differ in how and where they infect people. Oysters were the main vehicle for O75 infection. The vehicles for O141 infection include oysters, clams, and freshwater in lakes and rivers. The patients infected with serogroup O75 who had food traceback information available ate raw oysters from Florida. Patients infected with O141 ate oysters from Florida and clams from New Jersey, and those who only reported being exposed to freshwater were exposed in Arizona, Michigan, Missouri, and Texas. Improving the safety of oysters, specifically, should help prevent future illnesses from these toxigenic strains and similar pathogenic Vibrio species. Post-harvest processing of raw oysters, such as individual quick freezing, heat-cool pasteurization, and high hydrostatic pressurization, should be considered.
The yields of spring barley during a medium-term (7 years) compost and slurry addition experiment, and the soil carbon (C) and nitrogen (N) contents, bacterial community structure, soil microbial biomass and soil respiration rates, were determined to assess the effects of repeated, and in some cases very large, organic amendments on soil and crop parameters. For compost, total additions were equivalent to up to 119 t C/ha and 1·7 t N/ha, and for slurry they were 25 t C/ha and 0·35 t N/ha over 7 years, which represented very large additions compared to control soil C and N contents (69 t C/ha and 0·3 t N/ha in the 0–30 cm soil depth). Barley yield initially responded positively to compost and slurry addition, but over the experiment the yield differential between the amounts of compost addition declined, indicating that repeated addition of compost at a lower rate over several years had the same cumulative effect as a large single compost application. By the end of the experiment it was clear that the addition of compost and slurry increased soil C and N contents, especially towards the top of the soil profile, as well as soil respiration rates. However, the increases in soil C and N contents were not proportional to the amount of C and N added, suggesting that: (i) a portion of the added C and N was more vulnerable to loss; (ii) the addition rendered another C or N pool in the soil more susceptible to loss; or (iii) the C inputs from additional crop productivity did not increase in line with the organic amendments. Soil microbial biomass was depressed at the highest rate of organic amendment, and whilst this may have been due to genuine toxic or inhibitory effects of large amounts of compost, it could also be due to the inaccuracy of the substrate-induced respiration approach used for determining soil biomass when there is a large supply of organic matter. At the highest rate of compost addition, the bacterial community structure was significantly altered, indicating that the amendments changed soil community dynamics.
In November 1975 workers at the Herzberg Institute of Astrophysics in Canada (Avery et al. 1976) discovered cyanodiacetylene in Sgr B2. This molecule is the heaviest yet detected in interstellar space, having a molecular weight of 75 amu, and is the longest linear molecule known.
In western Canada, more money is spent on herbicides to control wild oat than any other weed, and wild oat resistance to herbicides is the most widespread resistance issue. A direct-seeded field experiment was conducted from 2010 to 2014 at eight Canadian sites to determine the combined effects of crop life cycle, crop species, crop seeding rate, crop usage, and herbicide rate on wild oat management and canola yield. Combining 2× seeding rates of early-cut barley silage with 2× seeding rates of winter cereals and excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to wild oat density, aboveground wild oat biomass, wild oat seed density in the soil, and canola yield similar to those of a repeated canola–wheat rotation under a full wild oat herbicide rate regime. Wild oat was similarly well managed after 3 yr of perennial alfalfa without wild oat herbicides. Forgoing wild oat herbicides in only 2 of 5 yr of exclusively summer annual crop rotations resulted in higher wild oat density, biomass, and seed banks. Management systems that effectively combine diverse and optimal cultural practices against weeds, and limit herbicide use, reduce selection pressure for weed resistance to herbicides and prolong the utility of threatened herbicide tools.
Few studies have explored therapists’ views on computerized cognitive behavioural therapy (cCBT), and this study aimed to provide an in-depth understanding of accredited therapists’ views on cCBT's role in treating depression. Twelve therapists constituted this self-selected sample (eight female, four male); mean age was 52 years (range 46–61). Data obtained from a semi-structured questionnaire were analysed using thematic analysis. Three themes were identified and discussed: (1) the standardized nature of cCBT for depression, (2) the importance of the therapeutic relationship in cCBT, and (3) the pros and cons of cCBT as an alternative to CBT. The therapists in this study emphasized that innovations in CBT delivery formats (e.g. internet-based, computerized) show promise. However, participants expressed some views that clash with the evidence-based viewpoint. More work is needed to improve the implementation of evidence-based practice and policy.