With one in ten young people affected by mental ill health, and stigma regularly cited as a factor limiting access to early intervention services, focusing resources on school-based stigma-reduction strategies seems prudent. ‘Headucate’, a student society, designed a 50-minute workshop that aims to increase mental health literacy and decrease stigma.
Repeated cross-sectional surveys were carried out at three time points: (1) immediately before (n=77), (2) immediately after (n=81) and (3) three months post-workshop (n=73). The surveys were paper-based versions of the Reported and Intended Behaviour Scale (RIBS) and the Mental Health Knowledge Schedule (MAKS), utilising a social distance scale.
Four Year 10 classes (pupils aged 14–15) were recruited. Post hoc t-tests were carried out when one-way ANOVAs were significant.
Disorder knowledge (from MAKS) and intended contact (from RIBS) significantly increased between time points one and two (p<0.01 and p<0.004 respectively) but then decreased.
Analysis of the question pertaining to knowing where to access help showed a statistically significant increase (p<0.001) between time points one and two and then a decrease at time point three, albeit to a higher value than at time point one (3.45 compared to 3.13, p=0.088).
Headucate workshops offer a low-resource option that is well accepted by students. As with other school-based stigma-reduction strategies, a dramatic increase was seen between immediately before and immediately after the workshop, indicating that it resonates with pupils, but there was little sustained change in attitudes.
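The analysis pipeline described above (a one-way ANOVA across the three cross-sectional samples, followed by post hoc t-tests only where the ANOVA is significant) can be sketched as follows. The scores, means and spreads here are simulated for illustration; they are not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated questionnaire scores for the three survey time points
# (illustrative values only, not the study's data).
before = rng.normal(20, 3, 77)    # time point 1, n=77
after = rng.normal(23, 3, 81)     # time point 2, n=81
followup = rng.normal(21, 3, 73)  # time point 3, n=73

# One-way ANOVA across the three cross-sectional samples
f_stat, p_anova = stats.f_oneway(before, after, followup)

if p_anova < 0.05:
    # Post hoc pairwise t-tests, run only when the ANOVA is significant
    pairs = {"1 vs 2": (before, after),
             "1 vs 3": (before, followup),
             "2 vs 3": (after, followup)}
    for label, (a, b) in pairs.items():
        t, p = stats.ttest_ind(a, b)
        print(f"{label}: t={t:.2f}, p={p:.4f}")
```

Because the surveys were repeated cross-sections rather than a followed cohort, independent-samples tests are the appropriate choice here.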
Negative interactions between people and large carnivores are common and will probably increase as the human population and livestock production continue to expand. Livestock predation by wild carnivores can significantly affect the livelihoods of farmers, resulting in retaliatory killings and subsequent conflicts between local communities and conservationists. A better understanding of livestock predation patterns could help guide measures to improve both human relationships and coexistence with carnivores. Environmental variables can influence the intensity of livestock predation, are relatively easy to monitor, and could potentially provide a useful predictive framework for targeting mitigation. We chose lion predation of livestock as a model to test whether variations in environmental conditions trigger changes in predation. Analysing 6 years of incident reports for Pandamatenga village in Botswana, an area of high human–lion conflict, we used generalized linear models to show that significantly more attacks coincided with lower moonlight levels and temperatures, and attack severity increased significantly with extreme minimum temperatures. Furthermore, we found a delayed effect of rainfall: lower rainfall was followed by a significantly increased severity of attacks in the following month. Our results suggest that preventative measures, such as introducing deterrents or changing livestock management, could be implemented adaptively based on environmental conditions. This could be a starting point for investigating similar effects in other large carnivores, to reduce livestock attacks and work towards wider human–wildlife coexistence.
Introduction: The Institute of Medicine (IOM) has recommended that high-quality, evidence-based guidelines be developed for emergency medical services (EMS). The National Association of EMS Physicians (NAEMSP) has outlined a strategy that will see this task fulfilled, consisting of multiple working groups focused on all aspects of guideline development and implementation. A first step, and our objective, was a cataloguing and appraisal of the current guidelines targeting EMS providers. Methods: A systematic search of the literature was conducted in MEDLINE (1175), EMBASE (519), PubMed (14), Trip (416), and guidelines.gov (64) through May 1, 2016. Two independent reviewers screened titles for relevance to prehospital care, and then abstracts for essential guideline features, including a systematic review, a grading system, and an association between level of evidence and strength of recommendation. All disagreements were moderated by a third party. Citations meeting inclusion criteria were appraised with the AGREE II tool, which looks at six different domains of guideline quality, containing a total of 23 items rated from 1 to 7. Each guideline was appraised by three separate reviewers, and composite scores were calculated by averaging the scaled domain totals. Results: After primary (kappa 97%) and secondary (kappa 93%) screening, 49 guidelines were retained for full review. Only three guidelines obtained a score of >90%, the topics of which included aeromedical transport, analgesia in trauma, and resuscitation of avalanche victims. Only two guidelines scored between 80% and 90%, the topics of which included stroke and pediatric seizure management. One guideline, splinting in an austere environment, scored between 70% and 80%. Nine guidelines scored between 60% and 70%, the topics of which included ischemic stroke, cardiovascular life support, hemorrhage control, intubation, triage, hypothermia, and fibrinolytic use. 
Of the remaining guidelines, 14 scored between 50% and 60%, and 20 obtained a score of <50%. Conclusion: There are few high-quality, evidence-based guidelines in EMS. Of those that are published, the majority fail to meet established quality measures. Although a lack of randomized controlled trials (RCTs) conducted in the prehospital field continues to limit guideline development, suboptimal methodology is also commonplace within the existing literature.
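The composite scoring described above can be illustrated with a short sketch. The function below follows the standard AGREE II scaled domain score formula, (obtained − minimum possible)/(maximum possible − minimum possible) × 100; the ratings and domain size are hypothetical examples, not data from this review.

```python
def scaled_domain_score(item_scores, n_appraisers):
    """AGREE II scaled domain score (%) for one domain.

    item_scores: flat list of 1-7 ratings, one per item per appraiser.
    """
    n_items = len(item_scores) // n_appraisers
    obtained = sum(item_scores)
    minimum = 1 * n_items * n_appraisers  # all items rated 1
    maximum = 7 * n_items * n_appraisers  # all items rated 7
    return 100 * (obtained - minimum) / (maximum - minimum)

# Hypothetical example: a 3-item domain rated by 3 appraisers
ratings = [6, 7, 5, 6, 6, 7, 5, 6, 6]
print(round(scaled_domain_score(ratings, n_appraisers=3), 1))  # → 83.3
```

A guideline's composite score is then the average of its scaled domain totals, as described in the abstract.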
Perennial grain crops are expected to sequester soil carbon (C) and improve soil health due to their large and extensive root systems. To examine the rate of initial soil C accumulation in a perennial grain crop, we compared soil under perennial intermediate wheatgrass (IWG) with that under annual winter wheat 4 years after the crops were first planted. In addition, we tested the effect of three nitrogen (N) sources on C pools: low available N (Low N, organic; 90 kg N ha−1 poultry litter), moderately available N (Mid N; 90 kg N ha−1 urea) and high available N (High N; 135 kg N ha−1 urea). We measured aboveground C (grain + straw), and coarse and fine root C to a depth of 1 m. Particulate organic matter (POM-C), fractionated by size, was used to indicate labile and more stabilized soil C pools. At harvest, IWG had 1.9 times more straw C and up to 15 times more root C compared with wheat. There were no differences in the size of the large (6 mm–250 µm) or medium (250–53 µm) POM-C fractions between wheat and IWG (P > 0.05) in surface horizons (0–10 cm). Large POM-C under IWG ranged from 3.6 ± 0.3 to 4.0 ± 0.7 g C kg soil−1 across the three N rates, similar to wheat, under which large POM-C ranged from 3.6 ± 1.4 to 4.7 ± 0.7 g C kg soil−1. Averaged across N levels, medium POM-C was 11.1 ± 0.8 and 11.3 ± 0.7 g C kg soil−1 for IWG and wheat, respectively. Despite IWG's greater above- and belowground biomass (to 70 cm), POM-C fractions in IWG and wheat were similar. Post hoc power analysis revealed that in order to detect differences in the labile C pool at 0–10 cm with acceptable power (~80%), a 15% difference would be required between wheat and IWG. This demonstrates that on sandy soils with low cation exchange capacity, perennial IWG will need to be in place for longer than 4 years in order to detect an accumulated soil C difference > 15%.
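A post hoc power analysis of the kind mentioned above can be sketched with a noncentral-t power calculation for a two-sided, two-sample t-test. The effect size, group sizes and alpha below are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy import stats

def two_sample_power(effect_size, n1, n2, alpha=0.05):
    """Power of a two-sided two-sample t-test (equal variances),
    computed from the noncentral t distribution."""
    df = n1 + n2 - 2
    ncp = effect_size * np.sqrt(n1 * n2 / (n1 + n2))  # noncentrality
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(t_crit, df, ncp)
            + stats.nct.cdf(-t_crit, df, ncp))

# Classic benchmark: Cohen's d = 0.5 with 64 samples per group
# gives roughly 80% power at alpha = 0.05.
print(round(two_sample_power(0.5, 64, 64), 2))
```

Inverting this calculation (solving for the effect size that yields ~80% power at the study's replication level) gives the minimum detectable difference, which is how a "15% difference would be required" statement is derived.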
Universal screening for postpartum depression is recommended in many countries. Knowledge of whether the disclosure of depressive symptoms in the postpartum period differs across cultures could improve detection and provide new insights into the pathogenesis. Moreover, it is a necessary step to evaluate the universal use of screening instruments in research and clinical practice. In the current study we sought to assess whether the Edinburgh Postnatal Depression Scale (EPDS), the most widely used screening tool for postpartum depression, measures the same underlying construct across cultural groups in a large international dataset.
Ordinal regression and measurement invariance were used to explore the association between culture, operationalized as education, ethnicity/race and continent, and endorsement of depressive symptoms using the EPDS on 8209 new mothers from Europe and the USA.
Education, but not ethnicity/race, influenced the reporting of postpartum depression [difference between robust comparative fit indexes (∆*CFI) < 0.01]. The structure of EPDS responses significantly differed between Europe and the USA (∆*CFI > 0.01), but not between European countries (∆*CFI < 0.01).
Investigators and clinicians should be aware of the potential differences in the expression of the postpartum depression phenotype that women of different educational backgrounds may manifest. The increasing cultural heterogeneity of societies, together with the tendency towards globalization, requires a culturally sensitive approach to patients, research and policies that takes into account, beyond rhetoric, the context of a person's experiences and the context in which the research is conducted.
Epidemiology formed the basis of ‘the Barker hypothesis’, the concept of ‘developmental programming’ and today’s discipline of the Developmental Origins of Health and Disease (DOHaD). Animal experimentation provided proof of the underlying concepts, and continues to generate knowledge of underlying mechanisms. Interventions in humans, based on DOHaD principles, will be informed by experiments in animals. As knowledge in this discipline has accumulated, from studies of humans and other animals, the complexity of interactions between genome, environment and epigenetics has been revealed. The vast nature of programming stimuli and breadth of effects is becoming known. As a result of our accumulating knowledge, we now appreciate the impact of many variables that contribute to programmed outcomes. To guide further animal research in this field, the Australia and New Zealand DOHaD Society (ANZ DOHaD) Animal Models of DOHaD Research Working Group convened at the 2nd Annual ANZ DOHaD Congress in Melbourne, Australia in April 2015. This review summarizes the contributions of animal research to the understanding of DOHaD, and makes recommendations for the design and conduct of animal experiments to maximize relevance, reproducibility and translation of knowledge into improving health and well-being.
Historically, alloy development with better radiation performance has been focused on traditional alloys with one or two principal element(s) and minor alloying elements, where enhanced radiation resistance depends on microstructural or nanoscale features to mitigate displacement damage. In sharp contrast to traditional alloys, recent advances of single-phase concentrated solid solution alloys (SP-CSAs) have opened up new frontiers in materials research. In these alloys, a random arrangement of multiple elemental species on a crystalline lattice results in disordered local chemical environments and unique site-to-site lattice distortions. Based on closely integrated computational and experimental studies using a novel set of SP-CSAs in a face-centered cubic structure, we have explicitly demonstrated that increasing chemical disorder can lead to a substantial reduction in electron mean free paths, as well as electrical and thermal conductivity, which results in slower heat dissipation in SP-CSAs. The chemical disorder also has a significant impact on defect evolution under ion irradiation. Considerable improvement in radiation resistance is observed with increasing chemical disorder at electronic and atomic levels. The insights into defect dynamics may provide a basis for understanding elemental effects on evolution of radiation damage in irradiated materials and may inspire new design principles of radiation-tolerant structural alloys for advanced energy systems.
Many adults with autism spectrum disorder (ASD) remain undiagnosed. Specialist assessment clinics enable the detection of these cases, but such services are often overstretched. It has been proposed that unnecessary referrals to these services could be reduced by prioritizing individuals who score highly on the Autism-Spectrum Quotient (AQ), a self-report questionnaire measure of autistic traits. However, the ability of the AQ to predict who will go on to receive a diagnosis of ASD in adults is unclear.
We studied 476 adults, seen consecutively at a national ASD diagnostic referral service for suspected ASD. We tested AQ scores as predictors of ASD diagnosis made by expert clinicians according to International Classification of Diseases (ICD)-10 criteria, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G) and Autism Diagnostic Interview-Revised (ADI-R) assessments.
Of the participants, 73% received a clinical diagnosis of ASD. Self-report AQ scores did not significantly predict receipt of a diagnosis. While AQ scores provided high sensitivity of 0.77 [95% confidence interval (CI) 0.72–0.82] and positive predictive value of 0.76 (95% CI 0.70–0.80), the specificity of 0.29 (95% CI 0.20–0.38) and negative predictive value of 0.36 (95% CI 0.22–0.40) were low. Thus, 64% of those who scored below the AQ cut-off were ‘false negatives’ who did in fact have ASD. Co-morbidity data revealed that generalized anxiety disorder may ‘mimic’ ASD and inflate AQ scores, leading to false positives.
The AQ's utility for screening referrals was limited in this sample. Recommendations supporting the AQ's role in the assessment of adult ASD, e.g. UK NICE guidelines, may need to be reconsidered.
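The screening metrics reported above all derive from a standard 2×2 table of screening result against clinical diagnosis. A minimal sketch, using arbitrary illustrative counts rather than the study's data (note how PPV and NPV, unlike sensitivity and specificity, shift with the prevalence of the condition in the referred sample):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 screening table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives / all with condition
        "specificity": tn / (tn + fp),  # true negatives / all without condition
        "ppv": tp / (tp + fp),          # P(condition | screen positive)
        "npv": tn / (tn + fn),          # P(no condition | screen negative)
    }

# Hypothetical counts: 100 people with the condition, 100 without.
m = screening_metrics(tp=77, fp=71, fn=23, tn=29)
print({k: round(v, 2) for k, v in m.items()})
```

In a referral sample where most patients do have the condition, even a modest NPV implies that many of those screening negative are false negatives, which is the pattern the abstract describes.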
Phase coherent interferometers with intercontinental baselines became possible because of the development of stable frequency standards. With sufficiently stable frequency standards, no connection is necessary between the two ends of an interferometer. The first VLBI experiments were conducted by a group at the University of Florida who used an intensity interferometer with independent tape recorders for observations of Jupiter. Later, they changed to a coherent system using crystal-controlled oscillators. Since then, several interferometer systems have been developed. A Canadian group developed a system using video tape recorders at each end of the interferometer. They recorded the data in analogue form and managed to bring the two tapes together and to synchronize them to an accuracy of better than a microsecond. After synchronization, the outputs were combined and fringes extracted. Their system has a bandwidth of about 4 MHz. No-one else has attempted a wide-band analogue system.
This study aimed to monitor the microbiological effect of cleaning near-patient sites over a 48-hour period with a novel disinfectant, electrolyzed water.
One ward dedicated to acute care of the elderly population in a district general hospital in Scotland.
Lockers, left and right cotsides, and overbed tables in 30 bed spaces were screened for aerobic colony count (ACC), methicillin-susceptible Staphylococcus aureus (MSSA), and methicillin-resistant S. aureus (MRSA) before cleaning with electrolyzed water. Sites were rescreened at varying intervals from 1 to 48 hours after cleaning. Microbial growth was quantified as colony-forming units (CFUs) per square centimeter and presence or absence of MSSA and MRSA at each site. The study was repeated 3 times at monthly intervals.
There was an early and significant reduction in average ACC (360 sampled sites) from a before-cleaning level of 4.3 to 1.65 CFU/cm² at 1 hour after disinfectant cleaning (P < .0001). Average counts then increased to 3.53 CFU/cm² at 24 hours and 3.68 CFU/cm² at 48 hours. Total MSSA/MRSA (34 isolates) decreased by 71% at 4 hours after cleaning but then increased to 155% (53 isolates) of precleaning levels at 24 hours.
Cleaning with electrolyzed water reduced ACC and staphylococci on surfaces beside patients. ACC remained below precleaning levels at 48 hours, but MSSA/MRSA counts exceeded original levels at 24 hours after cleaning. Although disinfectant cleaning quickly reduces bioburden, additional investigation is required to clarify the reasons for rebound contamination of pathogens at near-patient sites.
Infect Control Hosp Epidemiol 2014;35(12):1505–1510
This paper presents an integrated design and costing method for large stiffened panels for the purpose of investigating the influence and interaction of lay-up technology and production rate on manufacturing cost. A series of wing cover panels (≈586 kg, 19·9 m²) have been sized with realistic requirements considering manual and automated lay-up routes. The integrated method has enabled the quantification of component unit cost sensitivity to changes in annual production rate and employed equipment maximum deposition rate. Moreover, the results demonstrate the interconnected relationship between lay-up process, panel design and unit cost. The optimum unit cost solution when using automated lay-up is a combination of the minimum deposition rate and minimum number of lay-up machines needed to meet the required production rate. However, the location of the optimum unit cost, at the boundaries between the numbers of lay-up machines required, can make unit cost very sensitive to small changes in component design, production rate, and equipment maximum deposition rate.
Previous evidence has suggested an association between cryptosporidiosis and consumption of unfiltered drinking water from Loch Katrine in Scotland. Before September 2007, the water was only micro-strained and chlorinated; however, since that time, coagulation and rapid gravity filtration have been installed. In order to determine risk factors associated with cryptosporidiosis, including drinking water, we analysed data on microbiologically confirmed cases of cryptosporidiosis from 2004 to 2010. We identified an association between the incidence of cryptosporidiosis and unfiltered Loch Katrine drinking water supplied to the home (odds ratio 1·86, 95% confidence interval 1·11–3·11, P = 0·019). However, while filtration appears to be associated with initially reduced rates of cryptosporidiosis, evidence suggests it may paradoxically make those consumers more susceptible to other transmission routes in the long term. These findings support implementation of similar treatment for other unfiltered drinking-water supplies as a means of reducing cryptosporidiosis associated with drinking water.
An analysis was undertaken to measure age-specific vaccine effectiveness (VE) of 2010/11 trivalent seasonal influenza vaccine (TIV) and monovalent 2009 pandemic influenza vaccine (PIV) administered in 2009/2010. The test-negative case-control study design was employed based on patients consulting primary care. Overall TIV effectiveness, adjusted for age and month, against confirmed influenza A(H1N1)pdm 2009 infection was 56% (95% CI 42–66); age-specific adjusted VE was 87% (95% CI 45–97) in <5-year-olds and 84% (95% CI 27–97) in 5- to 14-year-olds. Adjusted VE for PIV was only 28% (95% CI −6 to 51) overall and 72% (95% CI 15–91) in <5-year-olds. For confirmed influenza B infection, TIV effectiveness was 57% (95% CI 42–68) and in 5- to 14-year-olds 75% (95% CI 32–91). TIV provided moderate protection against the main circulating strains in 2010/2011, with higher protection in children. PIV administered during the previous season provided residual protection after 1 year, particularly in the <5 years age group.
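In a test-negative design such as the one above, crude VE is computed as (1 − OR) × 100, where the odds ratio compares the odds of vaccination in test-positive cases against test-negative controls. A minimal sketch with hypothetical counts; the published estimates were additionally adjusted for age and month (typically via logistic regression), which this crude 2×2 calculation omits.

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases,
                          vacc_controls, unvacc_controls):
    """Crude VE (%) from a test-negative design: VE = (1 - OR) * 100."""
    odds_ratio = ((vacc_cases * unvacc_controls)
                  / (unvacc_cases * vacc_controls))
    return (1 - odds_ratio) * 100

# Hypothetical counts, not the study's data:
# cases = test-positive patients, controls = test-negative patients.
ve = vaccine_effectiveness(vacc_cases=20, unvacc_cases=180,
                           vacc_controls=100, unvacc_controls=300)
print(round(ve, 1))  # OR = 1/3, so VE ≈ 66.7%
```

An OR of 1 (vaccination equally common among cases and controls) corresponds to 0% effectiveness, and smaller ORs to higher VE.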
Fifty years after the hyporheic zone was first defined (Orghidan, 1959), there are still gaps in the knowledge regarding the role of biodiversity in hyporheic processes. First, some methodological questions remain unanswered regarding the interactions between biodiversity and physical processes, both for the study of habitat characteristics and interactions at different scales. Furthermore, many questions remain to be addressed to help inform our understanding of invertebrate community dynamics, especially regarding the trophic niches of organisms, the functional groups present within sediment, and their temporal changes. Understanding microbial community dynamics would require investigations of their relationship with the physical characteristics of the sediment, their diversity, their relationship with metabolic pathways, their interactions with invertebrates, and their response to environmental stress. Another fundamental research question is that of the importance of the hyporheic zone in the global metabolism of the river, which must be explored in relation to organic matter recycling, the effects of disturbances, and the degradation of contaminants. Finally, the application of this knowledge requires the development of methods for the estimation of hydrological exchanges, especially for the management of sediment clogging, the optimization of self-purification, and the integration of climate change in environmental policies. The development of descriptors of hyporheic zone health and of new metrology is also crucial to include specific targets in water policies for the long-term management of the system and a clear evaluation of restoration strategies.