England has recently started a new paediatric influenza vaccine programme using a live-attenuated influenza vaccine (LAIV). There is uncertainty over how well the vaccine protects against more severe end-points. A test-negative case–control study was used to estimate vaccine effectiveness (VE) against laboratory-confirmed influenza hospitalisation in vaccine-eligible children aged 2–16 years in England in the 2015–2016 season, using a national sentinel laboratory surveillance system. Logistic regression was used to estimate VE with adjustment for sex, risk group, age group, region, ethnicity, deprivation and month of sample collection. A total of 977 individuals were included in the study (348 cases and 629 controls). The overall adjusted VE for all study ages and vaccine types was 33.4% (95% confidence interval (CI) 2.3–54.6). Risk group was shown to be an important confounder. The adjusted VE for all influenza types was 41.9% (95% CI 7.3–63.6) for the live-attenuated vaccine and 28.8% (95% CI −31.1 to 61.3) for the inactivated vaccine. The study provides evidence of the effectiveness of influenza vaccination in preventing hospitalisation due to laboratory-confirmed influenza in children in 2015–2016 and continues to support the rollout of the LAIV childhood programme.
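In a test-negative design, VE is one minus the odds ratio (OR) of vaccination in cases versus controls, expressed as a percentage. A minimal sketch of the crude (unadjusted) calculation, using illustrative counts rather than the study's data:

```python
def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Crude VE (%) under the test-negative design: VE = (1 - OR) * 100,
    where OR is the odds ratio of vaccination in cases vs. controls."""
    odds_ratio = (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)
    return (1.0 - odds_ratio) * 100.0

# Illustrative counts (hypothetical, not the study's data):
ve = vaccine_effectiveness(60, 288, 180, 449)
print(f"Crude VE: {ve:.1f}%")
```

The study's reported estimates are adjusted via multivariable logistic regression; the crude OR above illustrates only the core identity.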
The biogeographic histories of parasites and pathogens are infrequently compared with those of free-living species, including their hosts. Documenting the frequency with which parasites and pathogens disperse across geographic regions contributes to understanding not only their evolution, but also the likelihood that they may become emerging infectious diseases. Haemosporidian parasites of birds (parasite genera Plasmodium, Haemoproteus and Leucocytozoon) are globally distributed, dipteran-vectored parasites. To date, over 2000 avian haemosporidian lineages have been designated by molecular barcoding methods. To achieve their current distributions, some lineages must have dispersed long distances, often over water. Here we quantify such events using the global avian haemosporidian database MalAvi and additional records primarily from the Americas. We scored lineages as belonging to one or more global biogeographic regions based on infection records. Most lineages were restricted to a single region but some were globally distributed. We also used part of the cytochrome b gene to create genus-level parasite phylogenies and scored well-supported nodes as having descendant lineages in regional sympatry or allopatry. Descendant sister lineages of Plasmodium, Haemoproteus and Leucocytozoon were distributed in allopatry in 11, 16 and 15% of investigated nodes, respectively. Although a small but significant fraction of the molecular variance in cytochrome b of all three genera could be explained by biogeographic region, global parasite dispersal likely contributed to the majority of the unexplained variance. Our results suggest that avian haemosporidian parasites have faced few geographic barriers to dispersal over their evolutionary history.
Significant increases in excess all-cause mortality, particularly in the elderly, were observed during the winter of 2014/15 in England. With influenza A(H3N2) the dominant circulating influenza A subtype, this paper determines the contribution of influenza to this excess, controlling for weather. A standardised multivariable Poisson regression model was employed with weekly all-cause deaths as the dependent variable for the period 2008–2015. Adjusting for extreme temperature, a total of 26 542 (95% CI 25 301–27 804) deaths in those aged 65+ years and 1942 (95% CI 1834–2052) in 15–64-year-olds were associated with influenza from week 40, 2014 to week 20, 2015. This is compatible with the circulation of influenza A(H3N2), and is the largest estimated number of influenza-related deaths in England since at least 2008/09. The findings highlight the potential health impact of influenza and the important role of the annual influenza vaccination programme in protecting the population, including the elderly, who are vulnerable to a severe outcome.
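The attribution step behind such estimates can be illustrated crudely: once a regression supplies an expected baseline of weekly deaths absent influenza, the influenza-associated total is the sum of observed minus expected deaths over the season. A toy sketch with invented numbers (the paper's model is a full Poisson regression that also adjusts for extreme temperature):

```python
def attributable_deaths(observed, expected):
    """Sum of weekly (observed - expected) deaths, floored at zero,
    as a crude seasonal attribution once a baseline model is fitted."""
    return sum(max(o - e, 0) for o, e in zip(observed, expected))

# Toy weekly series (invented, not the paper's data):
observed = [10200, 10950, 11800, 11400, 10600]
expected = [10000, 10100, 10150, 10120, 10080]
print(attributable_deaths(observed, expected))
```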
Post-harvest drying prolongs seed survival in air-dry storage; previous research has shown a benefit of drying moist rice seeds at temperatures greater than those recommended for genebanks (5–20°C). The aim of this study was to determine whether there is a temperature limit for safely drying rice seeds, and to explore whether the benefit to longevity is caused by high-temperature stress or continued seed development. Seeds of two rice varieties were harvested at different stages of development and dried initially either over silica gel, or intermittently (8 h day⁻¹) or continuously (24 h day⁻¹) over MgCl2 at temperatures between 15 and 60°C for up to 3 days. Seeds dried more rapidly the warmer the temperature. Subsequent seed longevity in hermetic storage (45°C and 10.9% moisture content) was substantially improved by increasing the drying temperature up to 45°C in both cultivars, and by a further increase from 45 to 60°C in cv. ‘Macassane’. The benefit of high-temperature drying to subsequent longevity tended to diminish the later the stage of development at seed harvest. Intermittent and continuous drying at high temperatures provided broadly similar improvements to longevity, with the greatest improvements detected in a few treatment combinations with continuous drying. Heated-air drying of rice seeds harvested before maturity improved their subsequent storage longevity more than continued development in planta did, which may result from the triggering of protection mechanisms in response to high-temperature stress.
Human parainfluenza virus (HPIV) infections are one of the commonest causes of upper and lower respiratory tract infections. To determine whether there have been any recent changes in HPIV epidemiology in England and Wales, laboratory surveillance data between 1998 and 2013 were analysed. The UK national laboratory surveillance database, LabBase, and the newly established laboratory-based virological surveillance system, the Respiratory DataMart System (RDMS), were used. Descriptive analysis was performed to examine the distribution of cases by year, age, sex and serotype, and to examine the overall temporal trend using the χ² test. A random-effects model was also employed to model the number of cases. Sixty-eight per cent of all HPIV detections were due to HPIV type 3 (HPIV-3). HPIV-3 infections were detected all year round but peaked annually between March and June. HPIV-1 and HPIV-2 circulated at lower levels, accounting for 20% and 8% of detections, respectively, and peaked during the last quarter of the year with a biennial cycle. HPIV-4 was detected in smaller numbers, accounting for only 4%, and was also mainly observed in the last quarter of the year. In recent years, however, HPIV-4 detection has been reported much more commonly, increasing from 0% in 1998 to 3·7% in 2013. Although an overall higher proportion of HPIV infections was reported in infants (43·0%), a long-term decreasing trend in this proportion was observed, with a corresponding increase in older age groups. Continuous surveillance will be important in tracking any future changes.
The effects of simulated additional rain (ear wetting, 25 mm) or of rain shelter imposed at different periods after anthesis on grain quality at maturity, and on the dynamics of grain filling and desiccation, were investigated in UK field-grown crops of wheat (Triticum aestivum L., cvar Tybalt) in 2011 and in 2012, when June–August rainfall was 255·0 and 214·6 mm, respectively, both above the decadal mean (157·4 mm). Grain filling and desiccation were quantified well by broken-stick regressions and Gompertz curves, respectively. Rain shelter for 56 (2011) or 70 days (2012) after anthesis, and to a lesser extent during late maturation only, resulted in more rapid desiccation and hence progress to harvest maturity, whereas ear wetting had negligible effects, even when applied four times. Grain-filling duration was affected similarly in 2011, but with no significant effect in 2012. In both years, there were strong positive associations between final grain dry weight and duration of filling. The treatments affected all grain quality traits in 2011: nitrogen (N) and sulphur (S) concentrations, N : S ratio, sodium dodecyl sulphate (SDS) sedimentation volume, Hagberg Falling Number (HFN), and the incidence of blackpoint. Only N concentration and blackpoint were affected significantly by treatments in 2012. Rain shelter throughout grain filling reduced N concentration; rain shelter also reduced the incidence of blackpoint, whereas ear wetting increased it. In 2011, rain shelter throughout reduced S concentration, increased the N : S ratio and reduced SDS sedimentation volume. Treatment effects on HFN were not consistent within or between years. Nevertheless, a comparison between the extreme treatment means in 2012 indicated that damage from late rain combined with ear wetting reduced HFN by c. 0·7 s per mm of August rainfall, while a comparison between samples taken immediately after ear wetting at harvest maturity and 7 days later suggested recovery from damage to HFN upon re-drying in planta.
Hence, the incidence of blackpoint was the only grain quality trait affected consistently by the diverse treatments. The remaining aspects of grain quality were comparatively resilient to rain incident upon developing and maturing ears of cvar Tybalt. No consistent temporal patterns of sensitivity to shelter or ear wetting were detected for any aspect of grain quality.
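The two curve forms named above (broken-stick for grain filling, Gompertz for desiccation) can be sketched as simple functions; parameterisations and values here are illustrative, not fitted values from the study:

```python
import math

def broken_stick(t, rate, y_max):
    """Broken-stick growth: linear increase at `rate` per unit time,
    then a plateau at y_max (illustrative two-segment form)."""
    return min(rate * t, y_max)

def gompertz(t, a, b, k):
    """Gompertz sigmoid, y = a * exp(-b * exp(-k * t)): rises slowly,
    accelerates, then saturates at asymptote a (illustrative form)."""
    return a * math.exp(-b * math.exp(-k * t))
```

Fitting either curve to weekly grain dry-weight or moisture data would be done by nonlinear least squares; the functions above only capture the shapes being compared.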
The positive effects of dietary fibre on health are now widely recognised; however, our understanding of the mechanisms involved in producing such benefits remains unclear. There are even uncertainties about how dietary fibre in plant foods should be defined and analysed. This review attempts to clarify the confusion regarding the mechanisms of action of dietary fibre and deals with current knowledge on the wide variety of dietary fibre materials, comprising mainly non-starch polysaccharides (NSP) that are not digested by enzymes of the gastrointestinal (GI) tract. These non-digestible materials range from intact cell walls of plant tissues to the individual polysaccharide solutions often used in mechanistic studies. We discuss how the structure and properties of fibre are affected during food processing and how this can impact on nutrient digestibility. Dietary fibre can have multiple effects on GI function, including altered GI transit time and increased digesta viscosity, thereby affecting flow and mixing behaviour. Cell wall encapsulation also influences macronutrient digestibility by limiting access of digestive enzymes to their substrates and limiting product release; encapsulation of starch can further limit the extent of gelatinisation during hydrothermal processing of plant foods. Emphasis is placed on the effects of diverse forms of fibre on the rates and extents of starch and lipid digestion, and on why a better understanding of these interactions with the physiology and biochemistry of digestion is needed. In conclusion, we point to areas of further investigation that are expected to contribute to realising the full potential of dietary fibre for the health and well-being of humans.
The epidemiology of laboratory-confirmed respiratory syncytial virus (RSV) infections in young children has not recently been described in England; describing it is an essential step in identifying optimal target groups for future licensed RSV vaccines. We used two laboratory surveillance systems to examine the total number of RSV tests and the number of positive results in children aged <5 years in England from 2010 to 2014. We derived odds ratios (ORs) with 95% confidence intervals (CIs) comparing children by birth month, using multivariable logistic regression models adjusted for age, season and sex. Forty-seven per cent of RSV tests (29 851/63 827) and 57% (7405/13 034) of positive results in children aged <5 years were in infants aged <6 months; 38% (4982/13 034) of positive results were in infants aged <3 months. Infants born in September, October and November had the highest odds of a positive RSV test during their first year of life compared with infants born in January (OR 2·1, 95% CI 1·7–2·7; OR 2·4, 95% CI 2·1–2·8; and OR 2·4, 95% CI 2·1–2·7, respectively). Our results highlight the importance of young age and of a birth month near the beginning of the RSV season to the risk of laboratory-confirmed RSV infection. Future control measures should consider protection for these groups.
Seasonal respiratory infections place an increased burden on health services annually. We used a sentinel emergency department syndromic surveillance system to understand the factors driving respiratory attendances at emergency departments (EDs) in England. Trends in different respiratory indicators were observed to peak at different points during winter, with further variation observed in the distribution of attendances by age. Multiple linear regression analysis revealed acute respiratory infection and bronchitis/bronchiolitis ED attendances in patients aged 1–4 years were particularly sensitive indicators for increasing respiratory syncytial virus activity. Using near real-time surveillance of respiratory ED attendances may provide early warning of increased winter pressures in EDs, particularly driven by seasonal pathogens. This surveillance may provide additional intelligence about different categories of attendance, highlighting pressures in particular age groups, thereby aiding planning and preparation to respond to acute changes in EDs, and thus the health service in general.
Several private boarding schools in England have established universal influenza vaccination programmes for their pupils. We evaluated the impact of these programmes on the burden of respiratory illnesses in boarders. Between November 2013 and May 2014, age-specific respiratory disease incidence rates in boarders were compared between schools offering and not offering influenza vaccine to healthy boarders. We adjusted for age, sex, school size and week using negative binomial regression. Forty-three schools comprising 14 776 boarders participated. Almost all boarders (99%) were aged 11–17 years. Nineteen (44%) schools vaccinated healthy boarders against influenza, with a mean uptake of 48·5% (range 14·2–88·5%). Over the study period, 1468 respiratory illnesses were reported in boarders (5·66/1000 boarder-weeks); of these, 33 were influenza-like illnesses (ILIs, 0·26/1000 boarder-weeks) in vaccinating schools and 95 were ILIs (0·74/1000 boarder-weeks) in non-vaccinating schools. The impact of vaccinating healthy boarders was a 54% reduction in ILI in all boarders [rate ratio (RR) 0·46, 95% confidence interval (CI) 0·28–0·76]. Disease rates were also reduced for upper respiratory tract infections (RR 0·72, 95% CI 0·61–0·85) and chest infections (RR 0·18, 95% CI 0·09–0·36). These findings demonstrate a significant impact of influenza vaccination on ILI and other clinical endpoints in secondary-school boarders. Additional research is needed to investigate the impact of influenza vaccination in non-boarding secondary-school settings.
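The rate ratios behind such comparisons are events per unit person-time in one group divided by the same quantity in the other; the study's reported RRs are additionally adjusted via negative binomial regression. A sketch with the crude ILI figures above (person-time back-calculated from the reported rates, so approximate):

```python
def rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """Crude incidence rate ratio: (events_a/pt_a) / (events_b/pt_b)."""
    return (events_a / person_time_a) / (events_b / person_time_b)

# ILI counts from the abstract; boarder-weeks recovered from the
# quoted rates of 0.26 and 0.74 per 1000 boarder-weeks:
rr = rate_ratio(33, 33 / 0.26 * 1000, 95, 95 / 0.74 * 1000)
print(f"Crude ILI rate ratio: {rr:.2f}")
```

The crude ratio (≈0.35) differs from the reported adjusted RR of 0.46, which accounts for age, sex, school size and week.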
The relationship between risk of death following influenza A(H1N1)pdm09 infection and ethnicity and deprivation during the 2009/2010 pandemic period and the first post-pandemic season of 2010/2011 in England was examined. Poisson regression models were used to estimate the mortality risk, adjusted for age, gender and place of residence. Those of non-White ethnicity experienced an increased mortality risk compared to White populations during the 2009/2010 pandemic [10·5/1000 vs. 6·0/1000 general population; adjusted risk ratio (RR) 1·84, 95% confidence interval (CI) 1·39–2·54], with the highest risk in those of Pakistani ethnicity. However, no significant difference between ethnicities was observed during the following 2010/2011 season. Persons living in areas with the highest level of deprivation had a significantly higher risk of death (RR 2·08, 95% CI 1·49–2·91) compared to the lowest level for both periods. These results highlight the importance of rapid identification of groups at higher risk of severe disease in the early stages of future pandemics to enable the implementation of optimal prevention and control measures for vulnerable populations.
Personalised nutrition (PN) has the potential to reduce disease risk and optimise health and performance. Although previous research has shown good acceptance of the concept of PN in the UK, preferences regarding the delivery of a PN service (e.g. online v. face-to-face) are not fully understood. The presence in the UK of a healthcare system that is free at the point of delivery, the National Health Service (NHS), is anticipated to influence end-user preferences for how such a service is delivered. To determine this, supplementary analysis of qualitative data obtained from focus group discussions on PN service delivery, collected as part of the Food4Me project in the UK and Ireland, was undertaken. The Irish data provided a comparison with a healthcare system that is not free of charge at the point of delivery for the entire population. Analyses were conducted using the ‘framework approach’ described by Rabiee (Focus-group interview and data analysis. Proc Nutr Soc 63, 655–660). There was a preference for services to be led by the government and delivered face-to-face, which was perceived to increase trust and transparency and to add value. Discussants in both countries associated paying for nutritional advice with increased commitment and motivation to follow guidelines. Contrary to Ireland, however, and despite the perceived benefit of paying, UK discussants still expected PN services to be delivered free of charge by the NHS. Consideration of this unique challenge posed by the free healthcare embedded in NHS culture will be crucial when introducing PN to the UK.
General Practitioner consultation rates for influenza-like illness (ILI) are monitored through several geographically distinct schemes in the UK, providing early warning to government and health services of community circulation and intensity of activity each winter. Following on from the 2009 pandemic, there has been a harmonization initiative to allow comparison across the distinct existing surveillance schemes each season. The moving epidemic method (MEM), proposed by the European Centre for Disease Prevention and Control for standardizing reporting of ILI rates, was piloted in 2011/12 and 2012/13 along with the previously proposed UK method of empirical percentiles. The MEM resulted in thresholds that were lower than traditional thresholds but more appropriate as indicators of the start of influenza virus circulation. The intensity of the influenza season assessed with the MEM was similar to that reported through the percentile approach. The MEM pre-epidemic threshold has now been adopted for reporting by each country of the UK. Further work will continue to assess intensity of activity and apply standardized methods to other influenza-related data sources.
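A much-simplified illustration of the pre-epidemic threshold idea follows. The full MEM first locates each historical season's epidemic period and selects its highest pre-epidemic values; the sketch below skips that step and simply takes an upper one-sided confidence limit of the mean of already-selected values, using a normal approximation and invented rates:

```python
from statistics import NormalDist, mean, stdev

def pre_epidemic_threshold(pre_epidemic_values, confidence=0.95):
    """Upper one-sided confidence limit of the mean of pooled
    pre-epidemic values (normal approximation; the full MEM differs
    in how values are selected and how the limit is computed)."""
    z = NormalDist().inv_cdf(confidence)
    n = len(pre_epidemic_values)
    return mean(pre_epidemic_values) + z * stdev(pre_epidemic_values) / n ** 0.5

# Invented highest pre-epidemic ILI rates per 100 000 from past seasons:
rates = [9.8, 12.1, 10.5, 11.2, 13.0, 10.9]
threshold = pre_epidemic_threshold(rates)
print(f"Pre-epidemic threshold: {threshold:.1f} per 100 000")
```

A weekly ILI rate crossing this threshold would then signal the likely start of influenza circulation for the season.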
An analysis was undertaken to measure age-specific vaccine effectiveness (VE) of the 2010/11 trivalent seasonal influenza vaccine (TIV) and the monovalent 2009 pandemic influenza vaccine (PIV) administered in 2009/2010. The test-negative case–control study design was employed based on patients consulting primary care. Overall TIV effectiveness, adjusted for age and month, against confirmed influenza A(H1N1)pdm09 infection was 56% (95% CI 42–66); age-specific adjusted VE was 87% (95% CI 45–97) in <5-year-olds and 84% (95% CI 27–97) in 5- to 14-year-olds. Adjusted VE for PIV was only 28% (95% CI −6 to 51) overall and 72% (95% CI 15–91) in <5-year-olds. For confirmed influenza B infection, TIV effectiveness was 57% (95% CI 42–68) overall and 75% (95% CI 32–91) in 5- to 14-year-olds. TIV provided moderate protection against the main circulating strains in 2010/2011, with higher protection in children. PIV administered during the previous season provided residual protection after 1 year, particularly in the <5 years age group.
In approaching the issue of how the universe started, it is generally accepted that we have to face up to the unsolved problem of quantum gravity: the domain where Einstein's theory of gravity (general relativity, GR) is expected to break down because quantum effects become so dominant that they affect the very nature of space and time. Comparing the gravitational constants of nature with those from quantum theory leads to the Planck length ℓP ≈ 10⁻³³ cm, which is taken to be the characteristic scale at which quantum gravity dominates. By contrast, most (but not all) variant classical gravitational theories modify GR at low energies (see Chapter 14).
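The value quoted follows from dimensional analysis: the only length that can be formed from Newton's constant G, the reduced Planck constant ħ and the speed of light c is

```latex
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-33}\,\mathrm{cm},
```

below which classical descriptions of spacetime geometry are expected to fail.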
Quantum gravity processes are presumed to have dominated the very earliest times, preceding inflation: the geometry and quantum state that provide the initial data for any inflationary epoch are themselves usually assumed to come from the as yet unknown quantum gravity theory. There are many theories of the quantum origin of the universe, but none has attained dominance. The problem is that we do not have a good theory of quantum gravity (Rovelli, 2004; Weltman, Murugan and Ellis, 2010), so all these attempts are essentially different proposals for extrapolating known physics into the unknown. A key issue is whether quantum effects can remove the initial singularity and make possible universes without a beginning.
In addition, the weakness of the gravitational force implies that it will be very difficult, though perhaps not impossible, to observationally test theories of quantum gravity.