General Practitioner consultation rates for influenza-like illness (ILI) are monitored through several geographically distinct schemes in the UK, providing early warning to government and health services of community circulation and intensity of activity each winter. Following on from the 2009 pandemic, there has been a harmonization initiative to allow comparison across the distinct existing surveillance schemes each season. The moving epidemic method (MEM), proposed by the European Centre for Disease Prevention and Control for standardizing reporting of ILI rates, was piloted in 2011/12 and 2012/13 along with the previously proposed UK method of empirical percentiles. The MEM resulted in thresholds that were lower than traditional thresholds but more appropriate as indicators of the start of influenza virus circulation. The intensity of the influenza season assessed with the MEM was similar to that reported through the percentile approach. The MEM pre-epidemic threshold has now been adopted for reporting by each country of the UK. Further work will continue to assess intensity of activity and apply standardized methods to other influenza-related data sources.
Cystic fibrosis (CF) is an inherited childhood-onset life-shortening disease. It is characterized by increased respiratory mucus production, leading to airway obstruction, chronic lung infection and inflammatory reactions. The most common bacterium causing persistent infections in people with CF is Pseudomonas aeruginosa. Superparamagnetic Fe3O4 iron oxide nanoparticles (NPs) conjugated to the antibiotic (tobramycin), guided by a gradient of the magnetic field or subjected to an oscillating magnetic field, show promise in improving the drug delivery across the mucus and P. aeruginosa biofilm to the bacteria. The question remains whether tobramycin needs to be released from the NPs after the penetration of the mucus barrier in order to act upon the pathogenic bacteria. We used a zero-length 1-ethyl-3-[3-dimethylaminopropyl] carbodiimide hydrochloride (EDC) crosslinking agent to couple tobramycin, via its amine groups, to the carboxyl groups on Fe3O4 NPs capped with citric acid. The therapeutic efficiency of Fe3O4 NPs attached to the drug versus that of the free drug was investigated in P. aeruginosa culture.
This article reports the results of a study that uses social network analysis to compare the persuasiveness of legal precedents in the diffusion of the strict liability rule for manufacturing defects. This new study tests which legal precedents were most influential and also whether certain state judicial variables influenced the diffusion process. The results are striking. The federal circuit regions appear to define an important reference group in the diffusion process, and social network effects dominate economic and political variables. In addition, the de facto separation of powers in the enactment of new state legislation appears to influence courts' propensities to adopt the strict liability rule. When the executive and legislative branches were controlled by the same political party, regardless of whether it was Republican or Democratic, state courts were more inclined to adopt the strict liability rule.
An analysis was undertaken to measure age-specific vaccine effectiveness (VE) of 2010/11 trivalent seasonal influenza vaccine (TIV) and monovalent 2009 pandemic influenza vaccine (PIV) administered in 2009/2010. The test-negative case-control study design was employed based on patients consulting primary care. Overall TIV effectiveness, adjusted for age and month, against confirmed influenza A(H1N1)pdm 2009 infection was 56% (95% CI 42–66); age-specific adjusted VE was 87% (95% CI 45–97) in <5-year-olds and 84% (95% CI 27–97) in 5- to 14-year-olds. Adjusted VE for PIV was only 28% (95% CI −6 to 51) overall and 72% (95% CI 15–91) in <5-year-olds. For confirmed influenza B infection, TIV effectiveness was 57% (95% CI 42–68) and in 5- to 14-year-olds 75% (95% CI 32–91). TIV provided moderate protection against the main circulating strains in 2010/2011, with higher protection in children. PIV administered during the previous season provided residual protection after 1 year, particularly in the <5 years age group.
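In a test-negative design, vaccine effectiveness is estimated as one minus the odds ratio of vaccination among test-positive cases versus test-negative controls. A minimal sketch with invented counts (the study itself adjusted for age and month, which this toy calculation omits):

```python
def vaccine_effectiveness(cases_vacc, cases_unvacc,
                          controls_vacc, controls_unvacc):
    """Crude VE (%) from a test-negative 2x2 table:
    VE = (1 - OR) * 100, where OR is the odds ratio of
    vaccination in cases relative to controls."""
    odds_ratio = (cases_vacc / cases_unvacc) / (controls_vacc / controls_unvacc)
    return (1.0 - odds_ratio) * 100.0

# Invented illustrative counts: 20/100 cases vaccinated, 50/100 controls.
ve = vaccine_effectiveness(20, 80, 50, 50)  # OR = 0.25, VE = 75%
```

Confidence intervals, such as those reported in the abstract, would in practice come from a logistic regression that also adjusts for confounders.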
To report a large outbreak of Clostridium difficile infection (CDI; ribotype 027) between June 2007 and August 2008, describe infection control measures, and evaluate the impact of restricting the use of fluoroquinolones in controlling the outbreak.
Outbreak investigation in 3 acute care hospitals of the Northern Health and Social Care Trust in Northern Ireland.
Implementation of a series of CDI control measures that targeted high-risk antibiotic agents (ie, restriction of fluoroquinolones), infection control practices, and environmental hygiene.
A total of 318 cases of CDI were identified during the outbreak, which was the result of the interaction between C. difficile ribotype 027 being introduced into the affected hospitals for the first time and other predisposing risk factors (ranging from host factors to suboptimal compliance with antibiotic guidelines and infection control policies). The 30-day all-cause mortality rate was 24.5%; however, CDI was the attributable cause of death for only 2.5% of the infected patients. Time series analysis showed that restricting the use of fluoroquinolones was associated with a significant reduction in the incidence of CDI (coefficient, −0.054; lag time, 4 months; P = .003).
These findings provide additional evidence to support the value of antimicrobial stewardship as an essential element of multifaceted interventions to control CDI outbreaks. The present CDI outbreak was ended following the implementation of an action plan improving communication, antibiotic stewardship, infection control practices, environmental hygiene, and surveillance.
The UK was one of few European countries to document a substantial wave of pandemic (H1N1) 2009 influenza in summer 2009. The First Few Hundred (FF100) project ran from April–June 2009 gathering information on early laboratory-confirmed cases across the UK. In total, 392 confirmed cases were followed up. Children were predominantly affected (median age 15 years, IQR 10–27). Symptoms were mild and similar to seasonal influenza, with the exception of diarrhoea, which was reported by 27%. Eleven per cent of all cases had an underlying medical condition, similar to the general population. The majority (92%) were treated with antiviral drugs with 12% reporting adverse effects, mainly nausea and other gastrointestinal complaints. Duration of illness was significantly shorter when antivirals were given within 48 h of onset (median 5 vs. 9 days, P=0·01). No patients died, although 14 were hospitalized, of whom three required mechanical ventilation. The FF100 identified key clinical and epidemiological characteristics of infection with this novel virus in near real-time.
Patients whose symptoms are ‘unexplained by disease’ often have a poor symptomatic outcome after specialist consultation, but we know little about which patient factors predict this. We therefore aimed to determine predictors of poor subjective outcome for new neurology out-patients with symptoms unexplained by disease 1 year after the initial consultation.
The Scottish Neurological Symptom Study was a 1-year prospective cohort study of patients referred to secondary care National Health Service neurology clinics in Scotland (UK). Patients were included if the neurologist rated their symptoms as ‘not at all’ or only ‘somewhat explained’ by organic disease. Patient-rated change in health was rated on a five-point Clinical Global Improvement (CGI) scale (‘much better’ to ‘much worse’) 1 year later.
The 12-month outcome data were available on 716 of 1144 patients (63%). Poor outcome on the CGI (‘unchanged’, ‘worse’ or ‘much worse’) was reported by 482 (67%) out of 716 patients. The only strong independent baseline predictors were patients' beliefs [expectation of non-recovery (odds ratio [OR] 2.04, 95% confidence interval [CI] 1.40–2.96), non-attribution of symptoms to psychological factors (OR 2.22, 95% CI 1.51–3.26)] and the receipt of illness-related financial benefits (OR 2.30, 95% CI 1.37–3.86). Together, these factors predicted 13% of the variance in outcome.
Of the patients, two-thirds had a poor outcome at 1 year. Illness beliefs and financial benefits are more useful in predicting poor outcome than the number of symptoms, disability and distress.
In March 1988, there was an outbreak of infection by a strain of Salmonella saint-paul with a distinctive antigenic marker. A total of 143 reports were received between 1 March and 7 June. Preliminary investigations suggested that raw beansprouts were a possible source of infection and a case-control study confirmed the association. S. saint-paul of the epidemic type was isolated from samples of beansprouts on retail sale in different cities in the United Kingdom and from mung bean seeds on the premises of the producer who was most strongly associated with cases. In addition, Salmonella virchow PT34 was isolated from samples of raw beansprouts and was subsequently associated with seven cases of infection. Four other serotypes of salmonella were also isolated from beansprouts. On 8 April the public were advised to boil beansprouts for 15 seconds before consumption, and the premises of the one producer associated with many cases were closed. As a result of these actions there was a significant decrease in the number of infections with S. saint-paul.
Evidence of past zoonotic infection was investigated serologically in randomly selected Northern Ireland farmers. The percentage of farmers with antibody was: Brucella abortus (0·7), Leptospira interrogans serovars (8·1), Borrelia burgdorferi (14·3), Toxoplasma gondii (73·5), Coxiella burnetii (28·0), Chlamydia psittaci (11·1) and Hantavirus (1·2).
The results show that Northern Ireland farmers have been exposed in the past to zoonotic infections. It is not known if these infections contributed to ill health in farmers but it is now time for the health of farm workers and their medical services to be reassessed.
An outbreak of Salmonella typhimurium DT 124 infection which affected 101 people in England in December 1987 and January 1988 was detected through surveillance of laboratory reports from medical microbiology laboratories of the NHS and PHLS. Within 1 week of noting the increase in reports, epidemiological and microbiological investigations identified a small German salami stick as the vehicle of infection and the product was withdrawn from sale. The epidemiological investigation highlighted the occurrence of a long incubation period and of bloody diarrhoea. Prompt recognition and investigation of the outbreak prevented further cases of severe infection.
The study was designed to provide quantifiable information on both within- and between-herd variation in pig growth rate from birth to slaughter and to examine how this was influenced by moving pigs at a common age to a common environment. Five litters were selected from each of eight pig herds in Northern Ireland with varying growth performance. All eight herds were offered the same nutritional regime. Five pigs (three boars and two gilts) were selected from each litter. In each herd, 22 pigs (12 boars and 10 gilts) were weighed individually, every 4 weeks, from 4 to 20 weeks of age. At 4 weeks of age (weaning) three non-sibling boars were taken from each herd and brought to a common environment where they received medication, were housed individually from 6 weeks of age and offered the same dietary regime. They were weighed and feed intakes were recorded twice weekly. A growth rate difference of 61 g/day (P < 0.001), 112 g/day (P < 0.01) and 170 g/day (P < 0.001) was observed on farm, between the top and bottom quartile of herds during 4 to 8, 8 to 12 and 12 to 20 weeks of age, respectively. This difference in growth rate equated to an average difference in cost of production of ¢13/kg carcass on a birth to bacon unit. When pigs from the different herds were housed in the common environment, large variation in growth performance (143 g/day (P < 0.01) and 243 g/day (P < 0.001) for 8 to 12 and 12 to 20 weeks, respectively) was also observed between the top and bottom quartile of herds. Although feed efficiency was similar, a significant feed intake difference of 329 g/day (P < 0.01) and 655 g/day (P < 0.001) between 8 to 12 and 12 to 20 weeks of age was observed. The variation in growth rate between pigs whether managed on farm or in the common environment was similar (variation in days to 100 kg on farm and in the common environment was 18 and 19 days, respectively). 
When housed in the common environment, although the top and bottom quartile of pigs converted feed equally efficiently, pigs in the top quartile had significantly higher feed intakes suggesting greater appetites. It is difficult to assess the extent to which these differences can be attributed to genetic effects or pre-weaning environment, and how much the effects of management, disease or genetics contributed to the variation between and within herds.
The Keck Interferometer Nuller (KIN) is one of the major scientific and technical precursors to the Terrestrial Planet Finder Interferometer (TPF-I) mission. KIN's primary objective is to measure the level of exo-zodiacal mid-infrared emission around nearby main sequence stars, which requires deep broad-band nulling of astronomical sources of a few Janskys at 10 microns. A number of new capabilities are needed in order to reach that goal with the Keck telescopes: mid-infrared coherent recombination, interferometric operation in “split pupil” mode, N-band optical path stabilization using K-band fringe tracking and internal metrology, and eventually, active atmospheric dispersion correction. We report here on the progress made implementing these new functionalities, and discuss the initial levels of extinction achieved on the sky.
The equation of state of Fo90 hydrous ringwoodite has been measured using X-ray powder diffraction to 45 GPa at the GSECARS beam line at the Advanced Photon Source synchrotron at Argonne National Laboratory. The sample was synthesized at 1400°C and 20 GPa in the 5000-ton multi-anvil press at Bayerisches Geoinstitut in Bayreuth. The sample has the formula Mg1.70Fe²⁺0.19Fe³⁺0.02H0.13Si1.00O4 as determined by electron microprobe, Fourier transform infrared and Mössbauer spectroscopies, and contains ~0.79% H2O by weight. Compression of the sample had been measured previously to 11 GPa by single crystal X-ray diffraction. A third-order Birch-Murnaghan equation of state fit to all of the data gives V0 = 530.49±0.07 Å3, K0 = 174.6±2.7 GPa and K' = 6.2±0.6. The effect of 1% H incorporation in the structure on the bulk modulus is large and roughly equivalent to an increase in the temperature of ∼600°C at low pressure. The large value of K' indicates significant stiffening of the sample with pressure so that the effect of hydration decreases with pressure.
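The third-order Birch-Murnaghan equation of state relates pressure to compression as P = (3/2)K0[(V0/V)^(7/3) − (V0/V)^(5/3)]{1 + (3/4)(K′ − 4)[(V0/V)^(2/3) − 1]}. A short sketch evaluating it with the fitted parameters from this abstract (the function name is ours; pressures in GPa, unit-cell volumes in Å³):

```python
def birch_murnaghan_3rd(V, V0=530.49, K0=174.6, Kp=6.2):
    """Third-order Birch-Murnaghan pressure (GPa) at unit-cell
    volume V (Å^3), using the fitted V0, K0, K' from the text."""
    x = (V0 / V) ** (1.0 / 3.0)  # linear compression ratio
    # P = 1.5*K0*(x^7 - x^5) * [1 + 0.75*(K' - 4)*(x^2 - 1)]
    return 1.5 * K0 * (x**7 - x**5) * (1.0 + 0.75 * (Kp - 4.0) * (x**2 - 1.0))

# Pressure at zero compression is zero by construction:
p0 = birch_murnaghan_3rd(530.49)
# Compressing the cell to 500 Å^3 yields a pressure of roughly 12 GPa:
p_compressed = birch_murnaghan_3rd(500.0)
```

Because K′ = 6.2 exceeds 4, the bracketed correction term grows with compression, which is the stiffening with pressure noted in the abstract.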
By-product-based diets generally contain lower levels of energy than cereal-based diets due to higher levels of fibre (Bakker et al., 1995). Supplementation with oil is a common method of improving the digestible energy content of by-product-based diets and it has been reported that this practice may also improve energy digestibility. However, the results of McCann et al. (2004) suggested that the method of oil application to finishing pig diets may affect the digestibility of dietary nutrients. The aim of this experiment was to compare apparent digestibility coefficients determined in finishing pigs offered either by-product-based diets or cereal-based diets, with and without vegetable oil blend supplementation applied using two different methods (either directly incorporated into the pellet (IN) or sprayed (SP) on after pelleting).
Cereals have traditionally been used in the pig industry as the main source of energy in pig diets. However, as a result of cereal availability and price, alternative sources of energy have been considered, for example the addition of oil to cereal by-product-based diets. By-product-based diets commonly contain higher levels of fibre than cereal-based diets and several studies (e.g. Bakker et al 1995) have reported them to be less digestible in terms of dry matter (DM), energy, crude protein (CP) and oil. The lower DM digestibility of by-product-based diets may lead to a higher level of slurry output, which is an increasing environmental concern. The aim of this work was to examine the differences in digestibility between by-product-based diets supplemented with oil and cereal-based diets.