Measles is a target for elimination in all six WHO regions by 2020, and over the last decade, there has been considerable progress towards this goal. Surveillance is recognised as a cornerstone of elimination programmes, allowing early identification of outbreaks, thus enabling control and preventing re-emergence. Fever–rash surveillance is increasingly available across WHO regions, and this symptom-based reporting is broadly used for measles surveillance. However, as measles control increases, symptom-based cases are increasingly likely to reflect infection with other diseases with similar symptoms such as rubella, which affects the same populations, and can have a similar seasonality. The WHO recommends that cases from suspected measles outbreaks be laboratory-confirmed, to identify ‘true’ cases, corresponding to measles IgM titres exceeding a threshold indicative of infection. Although serological testing for IgM has been integrated into the fever–rash surveillance systems in many countries, the logistics of sending in every suspected case are often beyond the health system's capacity. We show how age data from serologically confirmed cases can be leveraged to infer the status of non-tested samples, thus strengthening the information we can extract from symptom-based surveillance. Applying an age-specific confirmation model to data from three countries with divergent epidemiology across Africa, we identify the proportion of cases that need to be serologically tested to achieve target levels of accuracy in estimated infected numbers and discuss how this varies depending on the epidemiological context. Our analysis provides an approach to refining estimates of incidence leveraging all available data, which has the potential to improve allocation of resources, and thus contribute to rapid and efficient control of outbreaks.
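The core idea of the age-specific confirmation approach can be illustrated with a minimal sketch (this is not the authors' model, and all numbers below are hypothetical): estimate the proportion of IgM-confirmed cases per age band from the tested subset, then apply those proportions to the untested fever–rash reports in the same bands.

```python
# Hypothetical illustration of age-specific confirmation: infer the expected
# number of true measles cases among untested fever-rash reports using
# confirmation proportions estimated from the serologically tested subset.
# Age bands, counts and proportions below are invented for the sketch.

tested = {            # age band -> (tested cases, IgM-confirmed measles)
    "0-4":   (120, 90),
    "5-9":   (80, 48),
    "10-14": (40, 12),
}
untested = {"0-4": 300, "5-9": 200, "10-14": 100}  # untested reports per band

expected_measles = 0.0
for band, (n, k) in tested.items():
    p = k / n                                    # age-specific confirmation proportion
    expected_measles += k + p * untested[band]   # confirmed + inferred untested

print(round(expected_measles))  # expected total true measles cases
```

In practice the confirmation proportions would be estimated with uncertainty (e.g. binomial confidence intervals), which is what drives the question of how many samples must be tested to reach a target accuracy.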
Although measles incidence has reached historic lows in many parts of the world, the disease still causes substantial morbidity globally. Even where control programs have succeeded in driving measles locally extinct, unless vaccination coverage is maintained at extremely high levels, susceptible numbers may increase sufficiently to spark large outbreaks. Human mobility will drive potentially infectious contacts and interact with the landscape of susceptibility to determine the pattern of measles outbreaks. These interactions have proved difficult to characterise empirically. We explore the degree to which new sources of data combined with existing public health data can be used to evaluate the landscape of immunity and the role of spatial movement for measles introductions by retrospectively evaluating our ability to predict measles outbreaks in vaccinated populations. Using inferred spatial patterns of accumulation of susceptible individuals and travel data, we predicted the timing of epidemics in each district of Pakistan during a large measles outbreak in 2012–2013 with over 30 000 reported cases. We combined these data with mobility data extracted from over 40 million mobile phone subscribers during the same time frame in the country to quantify the role of connectivity in the spread of measles. We investigate how different approaches could contribute to targeting vaccination efforts to reach districts before outbreaks started. While some prediction was possible, accuracy was low and we discuss key uncertainties linked to existing data streams that impede such inference and detail what data might be necessary to robustly infer timing of epidemics.
Background: Despite advances in neonatal care, neonates with moderate to severe HIE are at high risk of mortality and morbidity. We report the impact of a dedicated NNCC team on short-term mortality and morbidities. Methods: A retrospective cohort study of neonates with moderate to severe HIE between July 1st 2008 and December 31st 2017. The primary outcome was a composite of death and/or brain injury on MRI. Secondary outcomes were rate of cooling, length of hospital stay, anti-seizure medication burden, and use of inotropes. A regression analysis was performed adjusting for gestational age, birth weight, gender, out-born status, Apgar score at 10 minutes, cord blood pH, and HIE clinical staging. Results: 216 neonates were included, 109 before NNCC implementation and 107 thereafter. The NNCC program resulted in a reduction in the primary outcome (AOR 0.28, CI 0.14–0.54, p < 0.001) and in brain injury (AOR 0.28, CI 0.14–0.55, p < 0.001). It decreased the average length of stay per infant by 5 days (p = 0.03), improved the cooling rate (73% compared with 93%, p < 0.001), and reduced seizure misdiagnosis (71% compared with 23%, p < 0.001), anti-seizure medication burden (p = 0.001), and inotrope use (34% compared with 53%, p = 0.004). Conclusions: The NNCC program decreased mortality and brain injury, shortened the length of hospital stay, and improved care of neonates with significant HIE.
Background: Continuous video-EEG (cvEEG) monitoring is the standard of care for the diagnosis and management of neonatal seizures. However, it is labour-intensive. We aimed to establish consistency in monitoring of newborns utilising NICU nurses. Methods: Neonatal nurses were trained to apply scalp electrodes and troubleshoot technical issues. Guidelines, checklists and visual training modules were developed. A central network system allowed remote access to the cvEEGs by the epileptologist for timely interpretation and feedback. We compared 100 infants with moderate to severe HIE before and after the training program. Results: 192 cvEEGs were performed. Among the 100 infants compared, the time to initiate brain monitoring decreased by an average of 31.5 hours, electrographic seizure detection increased (20% compared with 34%), clinical misdiagnosis of seizures decreased (65% compared with 36%), and anti-seizure medication burden decreased. Conclusions: Training experienced NICU nurses to set up, start and monitor cvEEG can decrease the time to initiate cvEEG, which may lead to better seizure diagnosis and management.
Rubella virus infection typically presents as a mild illness in children; however, infection during pregnancy may cause the birth of an infant with congenital rubella syndrome (CRS). In February 2017, India began introducing rubella-containing vaccine (RCV) into the public-sector childhood vaccination programme. Low-level RCV coverage among children over several years can result in an increase in CRS incidence by increasing the average age of infection without sufficiently reducing rubella incidence. We evaluated the impact of RCV introduction on CRS incidence across India's heterogeneous demographic and epidemiological contexts. We used a deterministic age-structured model that reflects Indian states’ rural and urban area-specific demography and vaccination coverage levels to simulate rubella dynamics and estimate CRS incidence with and without RCV introduction to the public sector. Our analysis suggests that current low-level private-sector vaccination has already slightly increased the burden of CRS in India. We additionally found that the effect of public-sector RCV introduction depends on the basic reproductive number, R0, of rubella. If R0 is five, a value empirically estimated from an array of settings, CRS incidence post-RCV introduction will likely decrease. However, if R0 is seven or nine, some states may experience short-term or annual increases in CRS, even if a long-term (30-year) total reduction in cases is expected. Investment in population-based serological surveys and India's fever/rash surveillance system will be key to monitoring the success of the vaccination programme.
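The mechanism by which low-level coverage can raise CRS risk follows from standard endemic theory, not from the authors' age-structured model: in a simple endemic setting the mean age of infection is roughly A = L / (R0 − 1) for life expectancy L, and vaccinating a fraction v of each birth cohort scales the effective reproductive number to R0·(1 − v), pushing infections into older (childbearing) ages whenever v is below the elimination threshold 1 − 1/R0. A back-of-envelope sketch (life expectancy is an assumed illustrative value):

```python
# Back-of-envelope sketch of how partial vaccination raises the mean age of
# infection (standard endemic SIR approximation; L = 65 years is assumed).

def mean_age_of_infection(R0, coverage, life_expectancy=65.0):
    """Approximate endemic mean age of infection under cohort vaccination."""
    R_eff = R0 * (1.0 - coverage)   # effective reproductive number
    if R_eff <= 1.0:
        return None                 # above the elimination threshold: no endemic circulation
    return life_expectancy / (R_eff - 1.0)

# With R0 = 5, modest coverage shifts infection into older age groups:
for v in (0.0, 0.3, 0.6):
    print(v, mean_age_of_infection(R0=5.0, coverage=v))
```

This is why the abstract's conclusion hinges on R0: the higher R0 is, the higher the coverage needed before the age shift stops translating into extra CRS cases.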
Horn growth rate does not appear to be related to the amino acid profile of the major protein source for dairy cattle (Offer, Logue & Roberts, 1997), but it is possible that sulphur amino acids are limiting in early lactation, when the homeorhetic drive to milk production is most extreme. Supplementation of a high production ration for dairy cows with protected methionine should increase milk production, and allow any sulphur amino acid limitation on horn growth to be alleviated (Mengal, Galbraith, Souri & Scaife, 1997).
A total of 60 in-calf Holstein heifers were divided into two groups in a randomised block design. The study ran from approximately three weeks pre-calving until 26 weeks post-calving, and animals were housed in one of three systems for the duration. Two diets were formulated based on a grass silage:maize silage mixture (50:50 on a DM basis), with rolled wheat, soya bean meal, sugar beet feed and rapeseed meal, and were offered from approximately five days post-calving for the remainder of the 26-week period.
Measles is a major cause of childhood morbidity and mortality in many parts of the world. Estimates of the case-fatality rate (CFR) of measles have varied widely from place to place, as well as in the same location over time. Amongst populations that have experienced famine or armed conflict, measles CFR can be especially high, although past work has mostly focused on refugee populations. Here, we estimate measles CFR between 1970 and 1991 in a rural region of Bangladesh, which experienced civil war and famine in the 1970s. We use historical measles mortality data and a mechanistic model of measles transmission to estimate the CFR of measles. We first demonstrate the ability of this model to recover the CFR in the absence of incidence data, using simulated mortality data. Our method produces CFR estimates that correspond closely to independent estimates from surveillance data, and we can capture both the magnitude and the change in CFR suggested by these previous estimates. We use this method to quantify the sharp increase in CFR that resulted in a large number of deaths during a measles outbreak in the region in 1976. Most of the children who died during this outbreak were born during a famine in 1974, or in the 2 years preceding the famine. Our results suggest that the period of turmoil during and after the 1971 war and the sustained effects of the famine are likely to have contributed to the high fatality burden of the 1976 measles outbreak in Matlab.
This study examines the effect of subglacial abrasion on the basal sliding term of the gravitational energy balance of the dynamic, temperate Nisqually Glacier on Mount Rainier, Washington, U.S.A. Subglacial water flux is estimated as 3 × 10⁷ m³ a⁻¹ and suspended-sediment flux as 3 × 10⁷ kg a⁻¹. Suspended-sediment flux is assumed to represent, within an order of magnitude, the annual mass eroded by subglacial abrasion.
Subglacial abrasion involves both brittle fracture and plastic deformation. Field observations of bas-relief and grooved depression striations appear to have exact counterparts in rock-mechanics experiments approximating subglacial velocities and normal stresses. Boulton's (1974) abrasion model and a new attritivity model proposed herein are shown to predict subglacial abrasion rates within the limits of natural variability and the error range of measurements. The first crude gravitational energy balance for lower Nisqually Glacier (1.96 km²) is attempted and probably has only order-of-magnitude accuracy. The importance of subglacial abrasion in dissipating basal sliding energy at Nisqually Glacier is confirmed.
As part of a program to study surge-type glaciers, a radar-depth survey, using a frequency of 620 MHz, has been made of Trapridge Glacier, Yukon Territory. Soundings were taken at 26 locations on the glacier surface and a maximum ice thickness of 143 m was measured. A rapid change in surface slope in the lower ablation region marks the boundary between active and stagnant ice and is suggestive of an “ice dam” or the water “collection zone” postulated by Robin and Weertman for surging glaciers.
Laboratory pH analyses of glacial melt waters are unrepresentative of in situ values, due primarily to CO2 gas exchange between the sample and the atmosphere, and solute enrichment from chemical reaction with sediment and colloidal particles. A method is presented which enables field pH measurements that are reproducible within ±0.04 pH units to be made in glacial melt waters, using commonly available digital pH meters with combination electrodes.
During initial spring snow melt in May 1981 at Gornergletscher, Switzerland, melt waters in the proglacial stream leaving the glacier terminus were oversaturated with respect to atmospheric p(CO2), and their pH rapidly increased during CO2 outgassing at in situ temperature and pressure. Summer ice melt from glaciers that are temperate in the ablation zone is usually undersaturated by about ten times with respect to atmospheric p(CO2), and its pH rapidly falls to achieve equilibrium upon encountering the atmosphere, as observed at Gornergletscher during July and August 1981. Gornergletscher summer proglacial stream waters sometimes show pH increases from rock weathering, with the rate limited by the transfer rate of CO2 across the air–water interface to drive the weathering reactions. Throughout the year, any water parcel at equilibrium with atmospheric CO2 is generally at an equilibrium pH value, provided filtration prohibits solute enrichment. For these reasons, laboratory pH measurements are unacceptable for quantitative studies of melt-water chemistry and should be discontinued.
Job loss, debt and financial difficulties are associated with increased risk of mental illness and suicide in the general population. Interventions targeting people in debt or unemployed might help reduce these effects.
We searched MEDLINE, Embase, The Cochrane Library, Web of Science, and PsycINFO (January 2016) for randomized controlled trials (RCTs) of interventions to reduce the effects of unemployment and debt on mental health in general population samples. We assessed papers for inclusion, extracted data and assessed risk of bias.
Eleven RCTs (n = 5303 participants) met the inclusion criteria. All recruited participants were unemployed. Five RCTs assessed ‘job-club’ interventions, two cognitive behaviour therapy (CBT) and a single RCT assessed each of emotional competency training, expressive writing, guided imagery and debt advice. All studies were at high risk of bias. ‘Job club’ interventions led to improvements in levels of depression up to 2 years post-intervention; effects were strongest among those at increased risk of depression (improvements of up to 0.2–0.3 s.d. in depression scores). There was mixed evidence for effectiveness of group CBT on symptoms of depression. An RCT of debt advice found no effect but had poor uptake. Single trials of three other interventions showed no evidence of benefit.
‘Job-club’ interventions may be effective in reducing depressive symptoms in unemployed people, particularly those at high risk of depression. Evidence for CBT-type interventions is mixed; further trials are needed. However, the included studies are old and at high risk of bias. Future intervention studies should follow CONSORT guidelines and address issues of poor uptake.
The seasonality and periodicity of infections, and the mechanisms underlying observed dynamics, can have implications for control efforts. This is particularly true for acute childhood infections. Among these, the dynamics of measles is the best understood and has been extensively studied, most notably in the UK prior to the start of vaccination. Less is known about the dynamics of other childhood diseases, particularly outside Europe and the United States. In this paper, we leverage a unique dataset to examine the epidemiology of six childhood infections – measles, mumps, rubella, varicella, scarlet fever and pertussis – across 32 states in Mexico from 1985 to 2007. This dataset provides us with a spatio-temporal probe into the dynamics of six common childhood infections, and allows us to compare them in the same setting over the same time period. We examine three key epidemiological characteristics of these infections – the age profile of infections, spatio-temporal dynamics, and seasonality in transmission – and compare them with predictions from existing theory and past findings. Our analysis reveals interesting epidemiological differences between the six pathogens, and variations across space. We find signatures of term-time forcing (reduced transmission during the summer) for measles, mumps, rubella, varicella, and scarlet fever; for pertussis, a lack of term-time forcing could not be rejected.
We are searching for the coolest white dwarf stars in the galactic disk and halo. The Sloan survey, in due course, will identify an enormous number of new white dwarf stars which will better define the white dwarf luminosity function—an important tool for understanding the age and history of the stellar population of the galaxy. The broadband filter data obtained in the digital photometry phase of the survey will not permit identification of the most interesting of these, the coolest white dwarf stars. This is because the cool main sequence and subdwarf stars become indistinguishable from the white dwarfs in the various color–color diagrams. We have interference filters designed to separate out these classes of objects. We have obtained photometry of test fields to complement the Sloan data and identify the population of cool white dwarf stars. These data will ultimately resolve the controversies, based for the most part on small-number statistics, of the location of the turndown in the white dwarf luminosity function for the disk. If the halo is significantly older than the disk, we will find a second peak in the white dwarf luminosity function, at lower luminosities than the disk turndown. Our data will provide the first meaningful constraints on the location of the turndown in the halo white dwarf luminosity function.
Few studies have investigated the impact of parental suicide attempt (SA) on offspring outcomes other than mental health. We investigated the association of parental SA with offspring educational attainment in the Avon Longitudinal Study of Parents and Children (ALSPAC).
Parental SA was prospectively recorded from pregnancy until the study children were 11 years old. National school test results (ages 11–16 years) were obtained by record linkage. Multilevel regression models quantified the association between parental SA and offspring outcomes.
Data were available for 6667 mother–child and 3054 father–child pairs. Adolescents whose mothers had attempted suicide were less likely than their peers to achieve the expected educational level by age 14 years [adjusted odds ratio (aOR) 0.63, 95% confidence interval (CI) 0.41–0.95] in models controlling for relevant confounders, including parental education and depression. At age 16 years, adolescents whose mothers had attempted suicide were less likely to obtain the expected educational level (five or more qualifications at grade A*–C) (aOR 0.66, 95% CI 0.43–1.00) in models controlling for relevant confounders and parental education; however, after additionally controlling for maternal depression the results were consistent with chance (aOR 0.74, 95% CI 0.48–1.13). Findings in relation to paternal SA were consistent with those of maternal SA but power was limited due to lower response rate amongst fathers.
Maternal SA was associated with diminished educational performance at age 14 years. Educational attainment during adolescence can have a substantial effect on future opportunities and well-being, and these offspring may benefit from interventions.
Litigation in surgery is increasing and liabilities are becoming unsustainable. This study aimed to analyse trends in claims, and identify areas for potential risk reduction, improved patient safety and a reduction in the number, and cost, of future claims.
Ten years of retrospective data on claims in otorhinolaryngology (2003–2013) were obtained from the National Health Service Litigation Authority via a Freedom of Information request. Data were re-entered into a spreadsheet and coded for analysis.
A total of 1031 claims were identified; of these, 604 were successful and 427 were unsuccessful. Successful claims cost a total of £41 000 000 (mean, £68 000). The most common areas for successful claims were: failure or delay in diagnosis (137 cases), intra-operative problems (116 cases), failure or delay in treatment (66 cases), failure to warn – informed consent issue (54 cases), and inappropriate treatment (47 cases).
Over half of the claims in ENT relate to the five most common areas of liability. Recent policy changes by the National Health Service Litigation Authority, over the level of information divulged, limit our learning from claims.