Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and generate/evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training and accuracy/precision of predictions on-farm. The new wave of digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MM that have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. In order to identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways in which synergy may be achieved to advance both our predictive capabilities and system understanding, namely: (1) building and utilizing data streams (e.g. 
intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real-time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, a ML framework is augmented by MM-generated parameters or predicted outcomes and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within a MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
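Hybridization option (2) above can be illustrated with a minimal sketch: a mechanistic model's prediction is fed to a data-driven layer as an additional input, and the data-driven layer learns a farm-specific correction on top of the mechanistic prior. All equations, coefficients and variable names below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of MM/DD hybridization: a toy mechanistic model
# generates a prediction that is used both as an extra feature and as a
# prior that a simple data-driven correction is fitted against.
# All coefficients here are assumed for illustration only.

def mechanistic_milk_yield(dm_intake_kg, body_weight_kg):
    """Toy mechanistic prediction: intake above maintenance converts to yield."""
    maintenance = 0.02 * body_weight_kg              # assumed maintenance requirement
    return max(0.0, 1.5 * (dm_intake_kg - maintenance))

def hybrid_features(dm_intake_kg, body_weight_kg, activity_index):
    """Sensor-derived features plus the MM prediction as an extra input."""
    mm_pred = mechanistic_milk_yield(dm_intake_kg, body_weight_kg)
    return [dm_intake_kg, activity_index, mm_pred]

def fit_correction(mm_preds, observed):
    """Fit observed = a * mm_pred + b by ordinary least squares (pure Python),
    i.e. a data-driven recalibration of the mechanistic output."""
    n = len(mm_preds)
    mx = sum(mm_preds) / n
    my = sum(observed) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(mm_preds, observed))
    sxx = sum((x - mx) ** 2 for x in mm_preds)
    a = sxy / sxx
    return a, my - a * mx
```

In a fuller implementation the linear correction would be replaced by an ML model trained per farm or per animal cluster, as the abstract proposes.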
The aim of this study was to assess the impact of a urinary tract infection (UTI) management bundle to reduce the treatment of asymptomatic bacteriuria (AB) and to improve the management of symptomatic UTIs.
Before-and-after intervention study.
Consecutive sample of inpatients with positive single or mixed-predominant urine cultures collected and reported while admitted to the hospital.
The UTI management bundle consisted of nursing and prescriber education, modification of the reporting of positive urine cultures, and pharmacists’ prospective audit and feedback. A retrospective chart review of consecutive inpatients with positive urinary cultures was performed before and after implementation of the management bundle.
Prior to the implementation of the management bundle, 276 patients met eligibility criteria for chart review. Of these 276 patients, 165 (59·8%) were found to have AB; of these 165 patients with AB, 111 (67·3%) were treated with antimicrobials. Moreover, 268 patients met eligibility criteria for postintervention review. Of these 268, 133 patients (49·6%) were found to have AB; of these 133 with AB, 22 (16·5%) were treated with antimicrobials. Thus, a 75·5% reduction of AB treatment was achieved. Educational components of the bundle resulted in a substantial decrease in nonphysician-directed urine sample submission. Adherence to a UTI management algorithm improved substantially in the intervention period, with a notable decrease in fluoroquinolone prescription for empiric UTI treatment.
A UTI management bundle resulted in a dramatic improvement in the management of urinary tract infection, particularly a reduction in the treatment of AB and improved management of symptomatic UTI.
Introduction: Emergency departments (ED) across Canada acknowledge the need to transform in order to provide high quality care for the increasing proportion of older patients presenting for treatment. Older people are more complex than younger ED users. They have a disproportionately high use of EDs, increased rates of hospitalization, and are more likely to suffer adverse events. The objective of this initiative was to develop minimum standards for the care of older people in the emergency department. Methods: We created a panel of international leaders in geriatrics and emergency medicine to develop a policy framework on minimum standards for care of older people in the ED. We conducted a literature review of international guidelines, frameworks, recommendations, and best practices for the acute care of older people and developed a draft standards document. This preliminary document was circulated to interdisciplinary members of the International Federation of Emergency Medicine (IFEM) geriatric emergency medicine (GEM) group. Following review, the standards were presented to the IFEM clinical practice group. At each step, verbal, written and online feedback were gathered and integrated into the final minimum standards document. Results: Following the developmental process, a series of eight minimum standard statements were created and accepted by IFEM. These standards utilise the IFEM Framework for Quality and Safety in the ED, and are centred on the recognition that older people are a core population of emergency health service users whose care needs are different from those of children and younger adults. They cover key areas, including the overall approach to older patients, the physical environment and equipment, personnel and training, policies and protocols, and strategies for navigating the health-care continuum. 
Conclusion: These standards aim to improve the evaluation, management and integration of care of older people in the ED in an effort to improve outcomes. The minimum standards represent a first step on which future activities can be built, including the development of specific indicators for each of the minimum standards. The standards are designed to apply across the spectrum of EDs worldwide, and it is hoped that they will act as a catalyst to change.
In autumn 2014, enterovirus D68 (EV-D68) cases presenting with severe respiratory or neurological disease were described in countries worldwide. To describe the epidemiology and virological characteristics of EV-D68 in England, we collected clinical information on laboratory-confirmed EV-D68 cases detected in secondary care (hospitals), between September 2014 and January 2015. In primary care (general practitioners), respiratory swabs collected (September 2013–January 2015) from patients presenting with influenza-like illness were tested for EV-D68. In secondary care 55 EV-D68 cases were detected. Among those, 45 cases had clinical information available and 89% (40/45) presented with severe respiratory symptoms. Detection of EV-D68 among patients in primary care increased from 0.4% (4/1074; 95% CI 0.1–1.0) (September 2013–January 2014) to 0.8% (11/1359; 95% CI 0.4–1.5) (September 2014–January 2015). Characterization of EV-D68 strains circulating in England since 2012 and up to winter 2014/2015 indicated that those strains were genetically similar to those detected in 2014 in USA. We recommend reinforcing enterovirus surveillance through screening respiratory samples of suspected cases.
The UK has longstanding problems with psychiatry recruitment. Various initiatives aim to improve psychiatry's image among medical students, but few involve research and none are student-led. Providing opportunities to take part in psychiatry research and quality improvement could increase the number of students who choose to enter the speciality.
We have developed the student psychiatry audit and research collaborative (SPARC), a student-led initiative for nationwide collaboration in high-quality research and audits.
Our model is inspired by the success of the UK Student Audit and Research in Surgery collaborative (STARSurg). Area teams, located in medical schools, take part in multi-centre projects. The area teams consist of medical students, who have the main responsibility for collecting data; a junior doctor, to supervise the process; and a consultant, with overall responsibility for patient care. The data are collected centrally and analysed by a team of medical students and doctors. Student leads from each site are named authors on resulting papers. All other students are acknowledged and are able to present the work.
We have completed our first audits in Cardiff and London; other sites will return data in 2017. Student feedback indicated a high level of satisfaction with the project and interest in psychiatry as a future career.
This initiative aims to tackle the recruitment problems in psychiatry by giving students a chance to take part in high quality research and audits.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Aberrant microbiota composition and function have been linked to several pathologies, including type 2 diabetes. In animal models, prebiotics induce favourable changes in the intestinal microbiota, intestinal permeability (IP) and endotoxaemia, which are linked to concurrent improvement in glucose tolerance. This is the first study to investigate the link between IP, glucose tolerance and intestinal bacteria in human type 2 diabetes. In all, twenty-nine men with well-controlled type 2 diabetes were randomised to a prebiotic (galacto-oligosaccharide mixture) or placebo (maltodextrin) supplement (5·5 g/d for 12 weeks). Intestinal microbial community structure, IP, endotoxaemia, inflammatory markers and glucose tolerance were assessed at baseline and post intervention. IP was estimated by the urinary recovery of oral 51Cr-EDTA and glucose tolerance by insulin-modified intravenous glucose tolerance test. Intestinal microbial community analysis was performed by high-throughput next-generation sequencing of 16S rRNA amplicons and quantitative PCR. Prebiotic fibre supplementation had no significant effects on clinical outcomes or bacterial abundances compared with placebo; however, changes in the bacterial family Veillonellaceae correlated inversely with changes in glucose response and IL-6 levels (r −0·90, P=0·042 for both) following prebiotic intake. The absence of significant changes to the microbial community structure at a prebiotic dosage/length of supplementation shown to be effective in healthy individuals is an important finding. We propose that concurrent metformin treatment and the high heterogeneity of human type 2 diabetes may have played a significant role. The current study does not provide evidence for the role of prebiotics in the treatment of type 2 diabetes.
The horse is a non-ruminant herbivore adapted to eating plant-fibre or forage-based diets. Some horses are stabled for most of the day with limited or no access to fresh pasture and are fed preserved forage, typically as hay or haylage and sometimes silage. This raises questions with respect to the quality and suitability of these preserved forages (considering production, nutritional content, digestibility as well as hygiene) and the quantities required. Especially for performance horses, forage is often replaced with energy-dense feedstuffs, which can result in a reduction in the proportion of the diet that is forage based. This may adversely affect the health, welfare, behaviour and even performance of the horse. In the past 20 years a large body of research work has contributed to a better and deeper understanding of equine forage needs and the physiological and behavioural consequences if these are not met. Recent nutrient requirement systems have incorporated some, but not all, of this new knowledge into their recommendations. This review paper amalgamates recommendations based on the latest understanding in forage feeding for horses, defining forage types and preservation methods, hygienic quality, feed intake behaviour, typical nutrient composition, digestion and digestibility as well as health and performance implications. Based on this, consensual applied recommendations for feeding preserved forages are provided.
This study uses a field experiment involving 251 adult participants to determine which messages related to climate change, extreme weather events, and decaying infrastructure are most effective in encouraging people to pay more for investments that could alleviate future water-quality risks. The experiment also assesses whether people prefer the investments to be directed toward gray or green infrastructure projects. Messages about global-warming-induced climate change and decaying infrastructure lead to larger contributions than messages about extreme weather events. The results suggest that people are likely to pay more for green infrastructure projects than for gray infrastructure projects.
To assess whether diet quality before or during pregnancy predicts adverse pregnancy and birth outcomes in a sample of Australian women.
The Dietary Questionnaire for Epidemiological Studies was used to calculate diet quality using the Australian Recommended Food Score (ARFS) methodology modified for pregnancy.
A population-based cohort participating in the Australian Longitudinal Study on Women’s Health (ALSWH).
A national sample of Australian women, aged 20–25 and 31–36 years, who were classified as preconception or pregnant when completing Survey 3 or Survey 5 of the ALSWH, respectively. The 1907 women with biologically plausible energy intake estimates were included in regression analyses of associations between preconception and pregnancy ARFS and subsequent pregnancy outcomes.
Preconception and pregnancy groups were combined as no significant differences were detected for total and component ARFS. Women with gestational hypertension, compared with those without, had lower scores for total ARFS, vegetable, fruit, grain and nuts/bean/soya components. Women with gestational diabetes had a higher score for the vegetable component only, and women who had a low-birth-weight infant had lower scores for total ARFS and the grain component, compared with those who did not report these outcomes. Women with the highest ARFS had the lowest odds of developing gestational hypertension (OR=0·4; 95 % CI 0·2, 0·7) or delivering a child of low birth weight (OR=0·4; 95 % CI 0·2, 0·9), which remained significant for gestational hypertension after adjustment for potential confounders.
A high-quality diet before and during pregnancy may reduce the risk of gestational hypertension for the mother.
We studied the spread of influenza in the community between 1993 and 2009 using primary-care surveillance data to investigate if the onset of influenza was age-related. Virus detections [A(H3N2), B, A(H1N1)] and clinical incidence of influenza-like illness (ILI) in 12·3 million person-years in the long-running Royal College of General Practitioners-linked clinical-virological surveillance programme in England & Wales were examined. The number of days between symptom onset and the all-age peak ILI incidence was compared by age group for each influenza type/subtype. We found that virus detections and ILI incidence increased, peaked and decreased in unison. The mean interval from symptom onset to peak ILI incidence in virus detections (all ages) was: A(H3N2) 20·5 [95% confidence interval (CI) 19·7–21·6] days; B, 18·8 (95% CI 15·8–21·7) days; and A(H1N1) 17·0 (95% CI 15·6–18·4) days. Differences by age group were examined using the Kruskal–Wallis test. For A(H3N2) and A(H1N1) viruses the interval was similar in each age group. For influenza B there were highly significant differences by age group (P = 0·0001). Clinical incidence rates of ILI reported in the 8 weeks preceding the period of influenza virus activity were used to estimate a baseline incidence and threshold value (upper 95% CI of estimate) which was used as a marker of epidemic progress. Differences between the age groups in the week in which the threshold was reached were small and not localized to any age group. In conclusion we found no evidence to suggest that influenza A(H3N2) and A(H1N1) occur in the community in one age group before another. For influenza B, virus detection was earlier in children aged 5–14 years than in persons aged ⩾25 years.
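The baseline-and-threshold approach described above can be sketched briefly: the mean ILI incidence over the pre-epidemic weeks gives a baseline, and the upper 95% confidence limit of that estimate serves as the epidemic threshold. The week counts, rates and normal approximation (z = 1.96) below are illustrative assumptions, not the study's exact computation.

```python
# Illustrative sketch: estimate a baseline from pre-epidemic weekly ILI rates
# and use the upper ~95% confidence limit of the mean as an epidemic threshold.
from math import sqrt
from statistics import mean, stdev

def epidemic_threshold(baseline_rates, z=1.96):
    """Upper ~95% confidence limit of the mean of pre-epidemic weekly rates
    (normal approximation assumed for simplicity)."""
    se = stdev(baseline_rates) / sqrt(len(baseline_rates))  # SE of the mean
    return mean(baseline_rates) + z * se

def week_threshold_crossed(weekly_rates, threshold):
    """First week index (0-based) whose rate exceeds the threshold, else None."""
    for week, rate in enumerate(weekly_rates):
        if rate > threshold:
            return week
    return None
```

Comparing the crossing week across age-group series is then a direct way to test whether any group reaches the threshold systematically earlier.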
We present the first experimentally determined oscillator strengths for the Pb II transitions at 1203.6 Å and 1433.9 Å, obtained from lifetime measurements made using beam-foil techniques. We also present new detections of these lines in the interstellar medium from an analysis of archival spectra acquired by the Space Telescope Imaging Spectrograph onboard the Hubble Space Telescope. Our observations of the Pb II λ1203 line represent the first detection of this transition in interstellar gas. Our experimental f-values for the Pb II λ1203 and λ1433 transitions are consistent with recent theoretical results, including our own relativistic calculations, but are significantly smaller than previous values based on older calculations. Our new f-value for Pb II λ1433 (0.321 ± 0.034) yields an increase in the interstellar abundance of Pb of 0.43 dex over estimates based on the f-value listed by Morton. With our revised f-values, and with our new detections of Pb II λ1203 and λ1433, we find that the depletion of Pb onto interstellar grains is not nearly as severe as previously thought, and is very similar to the depletions seen for elements such as Zn and Sn, which have similar condensation temperatures.
General Practitioner consultation rates for influenza-like illness (ILI) are monitored through several geographically distinct schemes in the UK, providing early warning to government and health services of community circulation and intensity of activity each winter. Following on from the 2009 pandemic, there has been a harmonization initiative to allow comparison across the distinct existing surveillance schemes each season. The moving epidemic method (MEM), proposed by the European Centre for Disease Prevention and Control for standardizing reporting of ILI rates, was piloted in 2011/12 and 2012/13 along with the previously proposed UK method of empirical percentiles. The MEM resulted in thresholds that were lower than traditional thresholds but more appropriate as indicators of the start of influenza virus circulation. The intensity of the influenza season assessed with the MEM was similar to that reported through the percentile approach. The MEM pre-epidemic threshold has now been adopted for reporting by each country of the UK. Further work will continue to assess intensity of activity and apply standardized methods to other influenza-related data sources.
Military trainees are at high risk for skin and soft-tissue infections (SSTIs), especially those caused by methicillin-resistant Staphylococcus aureus (MRSA). A multicomponent hygiene-based SSTI prevention strategy was implemented at a military training center. After implementation, we observed 30% and 64% reductions in overall and MRSA-associated SSTI rates, respectively.
A total of 855 sera from dogs in Greece were tested for antibodies to strains belonging to the Pomona, Grippotyphosa and Australis serogroups of Leptospira to assess exposure levels to these serogroups, possible associations with clinical disease and to evaluate whether these findings support the inclusion of additional serovars in dog vaccines. Antibodies were detected in 110 (12·9%) dogs. The highest seroprevalence (4·9%) was to the proposed novel serovar Altodouro belonging to the Pomona serogroup. This serovar also showed a statistically significant association with clinical disease. Serovar Bratislava antibodies were found in 3·4% of sera. Consideration should be given to the inclusion of serovars belonging to the Pomona serogroup and serovar Bratislava in future dog vaccines for the Greek market.
An analysis was undertaken to measure age-specific vaccine effectiveness (VE) of 2010/11 trivalent seasonal influenza vaccine (TIV) and monovalent 2009 pandemic influenza vaccine (PIV) administered in 2009/2010. The test-negative case-control study design was employed based on patients consulting primary care. Overall TIV effectiveness, adjusted for age and month, against confirmed influenza A(H1N1)pdm 2009 infection was 56% (95% CI 42–66); age-specific adjusted VE was 87% (95% CI 45–97) in <5-year-olds and 84% (95% CI 27–97) in 5- to 14-year-olds. Adjusted VE for PIV was only 28% (95% CI −6 to 51) overall and 72% (95% CI 15–91) in <5-year-olds. For confirmed influenza B infection, TIV effectiveness was 57% (95% CI 42–68) and in 5- to 14-year-olds 75% (95% CI 32–91). TIV provided moderate protection against the main circulating strains in 2010/2011, with higher protection in children. PIV administered during the previous season provided residual protection after 1 year, particularly in the <5 years age group.
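In a test-negative design like the one above, vaccine effectiveness is conventionally derived as VE = (1 − OR) × 100, where the odds ratio compares vaccination odds in test-positive cases against test-negative controls. The sketch below shows this calculation; the counts used in the example are made up for illustration and are not the study's data.

```python
# Hedged sketch of VE estimation in a test-negative case-control design:
# VE (%) = (1 - odds ratio) * 100. Counts are illustrative only.

def odds_ratio(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    """Odds of vaccination among cases divided by odds among controls."""
    return (vacc_cases / unvacc_cases) / (vacc_controls / unvacc_controls)

def vaccine_effectiveness(vacc_cases, unvacc_cases,
                          vacc_controls, unvacc_controls):
    """Crude VE (%); a full analysis would adjust for age and month,
    as in the abstract, via logistic regression."""
    orr = odds_ratio(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls)
    return (1.0 - orr) * 100.0
```

For example, if 10 of 100 test-positive cases and 50 of 100 test-negative controls were vaccinated, the crude VE would be roughly 89%.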
Dientamoeba fragilis is an intestinal protozoan in humans that is commonly associated with diarrhoea and other gastrointestinal complaints. Studies conducted to investigate the biology of this parasite are limited by methods for in vitro cultivation. The objective of this study was to improve a biphasic culture medium, based on the Loeffler's slope, by further supplementation in order to increase the yield of trophozoites in culture. The current in vitro culture of D. fragilis is a xenic culture with a mix of bacteria. Three different liquid overlays were evaluated, including Earle's balanced salt solution (EBSS), PBS and Dulbecco's modified PBS (DPBS), for their ability to support the in vitro growth of D. fragilis trophozoites. Of these three overlays, EBSS gave the greatest increase in trophozoite numbers. The effect of supplementation was analysed by supplementing EBSS with ascorbic acid, ferric ammonium citrate, L-cysteine, cholesterol and alpha-lipoic acid and quantifying in vitro growth by cell counts. A new liquid overlay is here described, based upon EBSS supplemented with cholesterol and ferric ammonium citrate, that, in conjunction with the Loeffler's slope, supports the growth of D. fragilis trophozoites in vitro. This modified overlay supported a 2-fold increase in the number of trophozoites in culture from all 4 D. fragilis isolates tested, when compared to a PBS overlay. These advances enable the harvest of the larger numbers of trophozoites needed for further studies on this parasite.
Introduction: Disasters and mass-casualty scenarios may overwhelm medical resources regardless of the level of preparation. Disaster response requires medical equipment, such as ventilators, that can be operated under adverse circumstances and should be able to provide respiratory support for a variety of patient populations.
Objective: The objective of this study was to evaluate the performance of three portable ventilators designed to provide ventilatory support outside the hospital setting and in mass-casualty incidents, and their adherence to the Task Force for Mass Critical Care recommendations for mass-casualty care ventilators.
Methods: Each device was evaluated at minimum and maximum respiratory rate and tidal volume settings to determine the accuracy of set versus delivered VT at lung compliance settings of 0.02, 0.08 and 0.1 L/cm H2O with corresponding resistance settings of 10, 25, and 5 cm H2O/L/sec, to simulate patients with ARDS, severe asthma, and normal lungs. Additionally, different FIO2 settings with each device (if applicable) were evaluated to determine accuracy of FIO2 delivery and evaluate the effect on delivered VT. Ventilators also were tested for duration of battery life.
Results: VT decreased with all three devices as compliance decreased. The decrease was more pronounced when the internal compressor was activated. At the 0.65 FIO2 setting on the MCV 200, the measured FIO2 varied widely depending on the set VT. Battery life range was 311–582 minutes, with the 73X having the longest battery life. Delivered VT decreased toward the end of battery life, with the SAVe having the largest decrease. The respiratory rate on the SAVe also decreased approaching the end of battery life.
Conclusion: The 73X and MCV 200 were the closest to satisfying the Task Force for Mass Critical Care requirements for mass casualty ventilators, although neither had the capability to provide PEEP. The 73X provided the most consistent tidal volume delivery across all compliances, had the longest battery duration and the least decline in VT at the end of battery life.