The aim of this study is to enrich public health emergency management (PHEM) curricula and increase the workforce readiness of graduates by implementing an innovative curriculum structure, centered on simulation and authentic learning experiences, within a mastery-based Disaster Preparedness graduate certificate program launched in 2016 at the Colorado School of Public Health. Learners progress through a sequence of increasingly complex discussion- and operations-based exercises designed to align with training methodologies used by future employers in the disaster response field, covering PHEM fundamentals and domestic and international disaster preparedness and response. Preliminary feedback has been overwhelmingly positive, with learners equating the experience to securing an internship. Embedding simulation-based exercises and authentic learning environments in graduate curricula exposes learners to diverse disaster scenarios, provides occasions to practice critical thinking and dynamic problem solving, increases familiarity with anticipated emergency situations, and builds the confidence necessary for exercising judgment in real-world situations. This novel curriculum can serve as a model for graduate programs wishing to enrich traditional training tactics using typical school-of-public-health support and alignment with community resources. (Disaster Med Public Health Preparedness. 2019;13:777–781)
Measles is a target for elimination in all six WHO regions by 2020, and over the last decade there has been considerable progress towards this goal. Surveillance is recognised as a cornerstone of elimination programmes, allowing early identification of outbreaks, thus enabling control and preventing re-emergence. Fever–rash surveillance is increasingly available across WHO regions, and this symptom-based reporting is broadly used for measles surveillance. However, as measles control increases, symptom-based cases are increasingly likely to reflect infection with other diseases with similar symptoms, such as rubella, which affects the same populations and can have a similar seasonality. The WHO recommends that cases from suspected measles outbreaks be laboratory-confirmed to identify ‘true’ cases, corresponding to measles IgM titres exceeding a threshold indicative of infection. Although serological testing for IgM has been integrated into the fever–rash surveillance systems of many countries, the logistics of sending in every suspected case are often beyond the health system's capacity. We show how age data from serologically confirmed cases can be leveraged to infer the status of non-tested samples, thus strengthening the information we can extract from symptom-based surveillance. Applying an age-specific confirmation model to data from three countries with divergent epidemiology across Africa, we identify the proportion of cases that need to be serologically tested to achieve target levels of accuracy in the estimated numbers infected, and discuss how this varies depending on the epidemiological context. Our analysis provides an approach to refining estimates of incidence that leverages all available data, which has the potential to improve the allocation of resources and thus contribute to rapid and efficient control of outbreaks.
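A minimal sketch of the idea behind an age-specific confirmation model (the counts, age bins, and prior below are illustrative assumptions, not the study's actual model or data): estimate the probability that a suspected case is true measles within each age group from the serologically tested subset, then apply those probabilities to the untested cases.

```python
import numpy as np

# Hypothetical line-list counts by age bin (illustrative only).
age_labels = ["<1", "1-4", "5-9", "10-14", "15+"]
tested_pos = np.array([40, 55, 30, 10, 5])      # IgM-confirmed measles
tested_neg = np.array([10, 25, 40, 30, 20])     # tested, not measles (e.g. rubella)
untested   = np.array([120, 200, 150, 60, 30])  # suspected cases never tested

# Age-specific confirmation probability, smoothed with a weak Beta(1, 1) prior.
p_confirm = (tested_pos + 1) / (tested_pos + tested_neg + 2)

# Expected true measles burden = confirmed cases + inferred share of the untested.
expected = tested_pos + p_confirm * untested
for label, n in zip(age_labels, np.round(expected, 1)):
    print(f"{label:>6}: {n}")
```

Testing a larger fraction of suspected cases tightens the estimate of p_confirm in each age group, which is the accuracy-versus-logistics trade-off the study quantifies.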
Background: Continuous video-EEG (cvEEG) monitoring is the standard of care for the diagnosis and management of neonatal seizures; however, it is labour-intensive. We aimed to establish consistency in the monitoring of newborns by utilising NICU nurses. Methods: Neonatal nurses were trained to apply scalp electrodes and troubleshoot technical issues. Guidelines, checklists and visual training modules were developed. A central network system allowed remote access to the cvEEGs by the epileptologist for timely interpretation and feedback. We compared 100 infants with moderate to severe HIE before and after the training program. Results: 192 cvEEGs were performed. Among the 100 infants compared, time to initiation of brain monitoring decreased by an average of 31.5 hours, electrographic seizure detection increased (20% compared to 34%), clinical misdiagnosis of seizures decreased (65% compared to 36%), and anti-seizure medication burden decreased. Conclusions: Training experienced NICU nurses to set up, start and monitor cvEEG can decrease the time to initiate cvEEG, which may lead to better seizure diagnosis and management.
Background: Despite advances in neonatal care, neonates with moderate to severe HIE are at high risk of mortality and morbidity. We report the impact of a dedicated NNCC team on short-term mortality and morbidities. Methods: A retrospective cohort study of neonates with moderate to severe HIE between July 1st 2008 and December 31st 2017. Primary outcome: a composite of death and/or brain injury on MRI. Secondary outcomes: rate of cooling, length of hospital stay, anti-seizure medication burden, and use of inotropes. A regression analysis was done adjusting for gestational age, birth weight, gender, out-born status, Apgar score at 10 minutes, cord blood pH, and HIE clinical staging. Results: 216 neonates were included, 109 before NNCC implementation and 107 thereafter. The NNCC program resulted in a reduction in the primary outcome (AOR: 0.28, CI: 0.14–0.54, p<0.001) and in brain injury (AOR: 0.28, CI: 0.14–0.55, p<0.001). It decreased the average length of stay per infant by 5 days (p=0.03), improved the cooling rate (73% compared to 93%, p<0.001), and reduced seizure misdiagnosis (71% compared to 23%, p<0.001), anti-seizure medication burden (p=0.001), and inotrope use (34% compared to 53%, p=0.004). Conclusions: The NNCC program decreased mortality and brain injury, shortened the length of hospital stay, and improved the care of neonates with significant HIE.
Rubella virus infection typically presents as a mild illness in children; however, infection during pregnancy may cause the birth of an infant with congenital rubella syndrome (CRS). In February 2017, India began introducing rubella-containing vaccine (RCV) into the public-sector childhood vaccination programme. Low-level RCV coverage among children over several years can result in an increase in CRS incidence by increasing the average age of infection without sufficiently reducing rubella incidence. We evaluated the impact of RCV introduction on CRS incidence across India's heterogeneous demographic and epidemiological contexts. We used a deterministic age-structured model that reflects Indian states’ rural and urban area-specific demography and vaccination coverage levels to simulate rubella dynamics and estimate CRS incidence with and without RCV introduction to the public sector. Our analysis suggests that current low-level private-sector vaccination has already slightly increased the burden of CRS in India. We additionally found that the effect of public-sector RCV introduction depends on the basic reproductive number, R0, of rubella. If R0 is five, a value empirically estimated from an array of settings, CRS incidence post-RCV introduction will likely decrease. However, if R0 is seven or nine, some states may experience short-term or annual increases in CRS, even if a long-term (30-year) reduction in total cases is expected. Investment in population-based serological surveys and India's fever/rash surveillance system will be key to monitoring the success of the vaccination programme.
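The mechanism by which low coverage can raise CRS incidence is the classic rise in the average age of infection. A back-of-envelope sketch using the standard endemic approximation A ≈ L / (R_eff − 1) from textbook transmission theory (not the authors' age-structured, state-level model; L = 65 years is an assumed life expectancy):

```python
# Mean age at infection under routine infant vaccination at coverage p,
# which lowers the effective reproductive number to R0 * (1 - p).
def mean_age_of_infection(R0: float, coverage: float, L: float = 65.0) -> float:
    R_eff = R0 * (1.0 - coverage)
    if R_eff <= 1.0:
        return float("inf")  # transmission interrupted; no endemic circulation
    return L / (R_eff - 1.0)

for R0 in (5.0, 7.0, 9.0):
    ages = [mean_age_of_infection(R0, p) for p in (0.0, 0.3, 0.6)]
    print(f"R0={R0}: mean age at 0/30/60% coverage =",
          [round(a, 1) for a in ages])
```

Partial coverage pushes the mean age upward, potentially into the childbearing years; a higher R0 keeps the mean age lower at any given coverage, which is why the conclusions above hinge on whether R0 is five, seven or nine.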
To determine the patterns and predictors of treatment response trajectories for veterans with post-traumatic stress disorder (PTSD).
Conditional latent growth mixture modelling was used to identify classes and predictors of class membership. In total, 2686 veterans treated for PTSD between 2002 and 2015 across 14 hospitals in Australia completed the PTSD Checklist at intake, discharge, and 3 and 9 months follow-up. Predictor variables included co-morbid mental health problems, relationship functioning, employment and compensation status.
Five distinct classes were found: those with the most severe PTSD at intake separated into a relatively large class (32.5%) with small change, and a small class (3%) with a large change. Those with slightly less severe PTSD separated into one class comprising 49.9% of the total sample with large change effects, and a second class comprising 7.9% with extremely large treatment effects. The final class (6.7%) with least severe PTSD at intake also showed a large treatment effect. Of the multiple predictor variables, depression and guilt were the only two found to predict differences in response trajectories.
These findings highlight the importance of assessing guilt and depression prior to treatment for PTSD, and for severe cases with co-morbid guilt and depression, considering an approach to trauma-focused therapy that specifically targets guilt and depression-related cognitions.
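As a rough illustration of the growth-mixture idea (the study fitted conditional latent growth mixture models in full; this two-stage approximation on synthetic data is only a sketch, and the timepoints, score ranges and class structure are assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic PCL-like scores at four timepoints (months are assumed spacing).
rng = np.random.default_rng(0)
times = np.array([0.0, 3.0, 6.0, 12.0])  # intake, discharge, 3 and 9 months
n = 500
true_class = rng.integers(0, 3, n)
intercepts = np.array([75, 65, 50])[true_class] + rng.normal(0, 5, n)
slopes = np.array([-0.5, -2.5, -1.0])[true_class] + rng.normal(0, 0.3, n)
scores = intercepts[:, None] + slopes[:, None] * times + rng.normal(0, 3, (n, 4))

# Stage 1: per-person least-squares growth parameters (intercept, slope).
X = np.column_stack([np.ones_like(times), times])
params = np.linalg.lstsq(X, scores.T, rcond=None)[0].T  # shape (n, 2)

# Stage 2: mixture model over growth parameters; BIC guides the class count.
for k in (2, 3, 4, 5):
    gm = GaussianMixture(n_components=k, random_state=0).fit(params)
    print(k, "classes: BIC =", round(gm.bic(params), 1))
```

A full LGMM estimates trajectories and class membership jointly, and a conditional model additionally regresses class membership on predictors such as depression and guilt; the sketch only conveys the clustering-of-trajectories intuition.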
Middle East respiratory syndrome (MERS) is caused by a novel coronavirus (MERS-CoV) discovered in 2012. Since then, 1806 cases, including 564 deaths, have been reported by the Kingdom of Saudi Arabia (KSA) and affected countries as of 1 June 2016. Previous literature attributed increases in MERS-CoV transmission to camel breeding season as camels are likely the reservoir for the virus. However, this literature review and subsequent analysis indicate a lack of seasonality. A retrospective, epidemiological cluster analysis was conducted to investigate increases in MERS-CoV transmission and reports of household and nosocomial clusters. Cases were verified and associations between cases were substantiated through an extensive literature review and the Armed Forces Health Surveillance Branch's Tiered Source Classification System. A total of 51 clusters were identified, primarily nosocomial (80·4%), and most occurred in KSA (45·1%). Clusters corresponded temporally with the majority of periods of greatest incidence, suggesting a strong correlation between nosocomial transmission and notable increases in cases.
Job loss, debt and financial difficulties are associated with increased risk of mental illness and suicide in the general population. Interventions targeting people in debt or unemployed might help reduce these effects.
We searched MEDLINE, Embase, The Cochrane Library, Web of Science, and PsycINFO (January 2016) for randomized controlled trials (RCTs) of interventions to reduce the effects of unemployment and debt on mental health in general population samples. We assessed papers for inclusion, extracted data and assessed risk of bias.
Eleven RCTs (n = 5303 participants) met the inclusion criteria. All recruited participants were unemployed. Five RCTs assessed ‘job-club’ interventions, two assessed cognitive behaviour therapy (CBT), and a single RCT assessed each of emotional competency training, expressive writing, guided imagery and debt advice. All studies were at high risk of bias. ‘Job club’ interventions led to improvements in levels of depression up to 2 years post-intervention; effects were strongest among those at increased risk of depression (improvements of up to 0.2–0.3 s.d. in depression scores). There was mixed evidence for the effectiveness of group CBT on symptoms of depression. An RCT of debt advice found no effect but had poor uptake. Single trials of three other interventions showed no evidence of benefit.
‘Job-club’ interventions may be effective in reducing depressive symptoms in unemployed people, particularly those at high risk of depression. Evidence for CBT-type interventions is mixed; further trials are needed. However, the studies are old and at high risk of bias. Future intervention studies should follow CONSORT guidelines and address issues of poor uptake.
North American studies show bipolar disorder is associated with elevated rates of problem gambling; however, little is known about rates in the different presentations of bipolar illness.
To determine the prevalence and distribution of problem gambling in people with bipolar disorder in the UK.
The Problem Gambling Severity Index was used to measure gambling problems in 635 participants with bipolar disorder.
Moderate to severe gambling problems were four times higher in people with bipolar disorder than in the general population, and were associated with type 2 disorder (OR = 1.74, P = 0.036), history of suicidal ideation or attempt (OR = 3.44, P = 0.02) and rapid cycling (OR = 2.63, P = 0.008).
Approximately 1 in 10 patients with bipolar disorder may be at moderate to severe risk of problem gambling, possibly associated with suicidal behaviour and a rapid cycling course. Elevated rates of gambling problems in type 2 disorder highlight the probable significance of modest but unstable mood disturbance in the development and maintenance of such problems.
A randomised controlled trial (RCT) of high-dose v. low-dose fish oil in recent-onset rheumatoid arthritis (RA) demonstrated that the group allocated to high-dose fish oil had increased remission and decreased failure of disease-modifying anti-rheumatic drug (DMARD) therapy. This study examines the relationships between plasma phospholipid levels of the n-3 fatty acids in fish oil, EPA and DHA, and remission and DMARD use in recent-onset RA. EPA and DHA were measured in blood samples from both groups of the RCT. The data were analysed as a single cohort, and Cox proportional hazards models were used to examine relationships between plasma phospholipid (PL) EPA and DHA and various outcome measures. When analysed as a single cohort, plasma PL EPA was related to time to remission, with a one unit increase in EPA (1 % total fatty acids) associated with a 12 % increase in the probability of remission at any time during the study period (hazard ratio (HR)=1·12; 95 % CI 1·02, 1·23; P=0·02). Adjustment for smoking, anti-cyclic citrullinated peptide antibodies and ‘shared epitope’ HLA-DR allele status did not change the HR. Plasma PL EPA, adjusted for the same variables, was negatively related to time to DMARD failure (HR=0·85; 95 % CI 0·72, 0·99; P=0·047). The HRs for DHA and time to remission or DMARD failure were similar in magnitude to those for EPA, but not statistically significant. Biomarkers of n-3 status, such as plasma PL EPA, have the potential to predict clinical outcomes relevant to standard drug treatment of RA patients.
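A minimal sketch of this kind of analysis using the lifelines library (the data frame is fabricated for illustration; the column names and values are assumptions, not the trial's data):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: time to remission (months), event indicator,
# plasma PL EPA (% of total fatty acids), and one adjustment covariate.
df = pd.DataFrame({
    "months_to_remission": [6, 14, 9, 24, 11, 30, 8, 18, 20, 5],
    "remitted":            [1, 1, 1, 0, 1, 0, 1, 1, 0, 1],  # 0 = censored
    "pl_epa_pct":          [2.1, 0.8, 1.9, 1.0, 2.4, 0.5, 3.0, 1.2, 0.9, 2.6],
    "smoker":              [0, 1, 0, 0, 1, 1, 0, 1, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_remission", event_col="remitted")

# exp(coef) for pl_epa_pct is the hazard ratio per 1% increase in EPA:
# a value above 1 means higher EPA is associated with a higher probability
# of remission at any given time, mirroring the HR reported above.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```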
Methylation of the fragile X related epigenetic element 2 (FREE2), positioned at the fragile X mental retardation 1 (FMR1) exon 1/intron 1 boundary, reveals skewed X-chromosome inactivation (XCI) in fragile X syndrome full mutation (FM: CGG > 200) females. XCI skewing has also been linked to abnormal X-linked gene expression, with broader clinical impact for sex chromosome aneuploidies (SCAs). In this study, 10 FREE2 CpG sites were targeted using methylation-specific quantitative melt analysis (MS-QMA), including 3 sites that could not be analysed with the previously used EpiTYPER system. The method was applied for the detection of skewed XCI in FM females and in different types of SCA. We tested venous blood and saliva DNA collected from 107 controls (CGG < 40), 148 FM and 90 SCA individuals. MS-QMA identified: (i) most SCAs, if combined with a Y chromosome test; (ii) locus-specific XCI skewing towards the hypomethylated state in FM females; and (iii) skewed XCI towards the hypermethylated state in SCAs with 3 or more X chromosomes, and in 5% of the 47,XXY individuals. MS-QMA output also showed significant correlation with the EpiTYPER reference method in FM males and females (P < 0.0001) and SCAs (P < 0.05). In conclusion, we demonstrate the use of MS-QMA to quantify skewed XCI in two applications with diagnostic utility.
Solar irradiance and precipitation are the most likely drivers of the seasonal variation of net primary productivity (NPP) in tropical forests. Since their roles remain poorly understood, we use litter traps, dendrometer bands and census data collected from one-hectare permanent plots to quantify the seasonality of above-ground NPP components and weather parameters in 13 sites distributed along a 2800-m altitudinal gradient ranging from lowland Amazonia to the high Andes. We combine canopy leaf area index and litterfall data to describe the seasonality of canopy production. We hypothesize that solar irradiance is the primary driver of canopy phenology in wetter sites, whereas precipitation drives phenology in drier systems. The seasonal rhythm of canopy NPP components is in synchrony with solar irradiance at all altitudes. Leaf litterfall peaks in the late dry season, both in lowland (averaging 0.54 ± 0.08 Mg C ha⁻¹ y⁻¹, n = 5) and montane forests (averaging 0.29 ± 0.04 Mg C ha⁻¹ y⁻¹, n = 8). Peaks in above-ground coarse woody NPP appear to be triggered by the onset of rainfall in seasonal lowland rain forests (averaging 0.26 ± 0.04 Mg C ha⁻¹ y⁻¹, n = 5, in November), but not in montane cloud forests.
A calcium phosphate ceramic waste-form has been developed at AWE for the immobilisation of chloride-containing wastes arising from the pyrochemical reprocessing of plutonium. In order to determine the long-term durability of the waste-form, aging trials have been carried out at PNNL. Ceramics were prepared using Pu-239 and Pu-238; these were characterised by PXRD at regular intervals and by Single Pass Flow Through (SPFT) tests after approximately 5 years.
While XRD indicated some loss of crystallinity in the Pu-238 samples after exposure to 2.8 × 10¹⁸ α decays, SPFT tests indicated that accelerated aging had not had a detrimental effect on the durability of the Pu-238 samples compared to the Pu-239 waste-forms.
The Nuclear Decommissioning Authority (NDA) is developing a safety case for the long-term management of higher activity wastes. This includes safety assessments of transport to and operations at the repository. One of the main faults and hazards to be considered is waste package response to impact accidents.
The impact performance criteria for waste packages are based upon the activity release of particulates generated by break-up of the waste form during impact. The NDA approach to impact performance is based upon waste package response from finite element modelling in combination with break-up tests.
Previous break-up research commissioned by the NDA has concentrated on commercial graphite and glass samples. These extended studies, undertaken by the National Nuclear Laboratory in collaboration with the Department of Aerosol Technology of the Fraunhofer Institute of Toxicology and Experimental Medicine, provide break-up data specific to nuclear facilities and waste materials. These include archived unirradiated graphite used to construct Magnox reactor cores and reflectors, simulant high-level waste glass, selected grout formulations and selected metal-in-grout formulations.
Programmes for the geological disposal of radioactive wastes are by nature extremely complex. A structured approach for making and documenting varied kinds of decisions is required to support programme design and implementation. At each programme stage, the decision-making process must be able to identify and justify key priorities for work, to reduce uncertainties.
To support structured decision-making, evidence support logic (ESL) has been developed and applied to varied complex projects, nationally and internationally, in several industries. Evidence support logic involves breaking down a hypothesis that informs a decision into a hierarchical 'decision tree'. Examples of hypotheses are 'the geology associated with site x will provide sufficient disposal capacity', 'container x will contain waste form y for z years' and 'the engineered barrier system will provide the required safety functions'. Independent evaluations of confidence 'for' and 'against' bottom-level hypotheses allow the level of remaining uncertainty (or conflict) to be recognized explicitly, and the overall confidence (and uncertainty) relevant to the overall decision, and key sensitivities, to be represented clearly and succinctly.
Thus ESL can help (1) break down decisions into a manageable and logical structure, assisting clear presentation; (2) identify key uncertainties and sensitivities to inform prioritization; and (3) test whether the outcomes of specific studies have improved confidence.
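A toy sketch of the tree structure ESL describes (the hypotheses, weights and confidence values below are invented for illustration, not taken from any real assessment): leaf hypotheses carry independent 'for' and 'against' confidences, the remainder being uncommitted belief, and a parent's confidence is a weighted average of its children's.

```python
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    name: str
    confidence_for: float = 0.0      # evidence supporting the hypothesis
    confidence_against: float = 0.0  # evidence refuting it
    children: list = field(default_factory=list)  # (weight, Hypothesis) pairs

    def evaluate(self):
        # Propagate confidence up the tree as a weighted average of children.
        if self.children:
            total = sum(w for w, _ in self.children)
            results = [(w, c.evaluate()) for w, c in self.children]
            self.confidence_for = sum(w * f for w, (f, _) in results) / total
            self.confidence_against = sum(w * a for w, (_, a) in results) / total
        return self.confidence_for, self.confidence_against

    @property
    def uncommitted(self):
        # Explicit residual uncertainty: belief assigned to neither side.
        return 1.0 - self.confidence_for - self.confidence_against

root = Hypothesis("Container x will contain waste form y for z years", children=[
    (2.0, Hypothesis("Corrosion allowance is sufficient", 0.7, 0.1)),
    (1.0, Hypothesis("Weld quality meets specification", 0.5, 0.2)),
    (1.0, Hypothesis("Internal pressurisation stays within limits", 0.4, 0.3)),
])
f, a = root.evaluate()
print(f"for={f:.2f} against={a:.2f} uncommitted={root.uncommitted:.2f}")
```

The uncommitted share at the root makes remaining uncertainty explicit, which is what lets ESL flag where further studies would most improve overall confidence.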