We sought to address the prior limitations of symptom checker accuracy by analysing the diagnostic and triage feasibility of online symptom checkers using a consecutive series of real-life emergency department (ED) patient encounters, focusing on a complex patient population: those with hepatitis C or HIV. We aimed to study the diagnostic and triage accuracy of these symptom checkers relative to an emergency physician-determined diagnosis. A retrospective analysis was performed on 8363 consecutive adult ED patients. Eligible patients comprised 90 with HIV, 67 with hepatitis C and 11 with both HIV and hepatitis C. Five online symptom checkers were used for diagnosis (Mayo Clinic, WebMD, Symptomate, Symcat, Isabel), three of which also offered triage. Symptom checker output was compared with the ED physician-determined diagnosis with regard to diagnostic accuracy and differential diagnosis listing, along with triage advice. All symptom checkers, whether for combined HIV and hepatitis C, HIV alone or hepatitis C alone, had poor diagnostic accuracy: Top1 (<20%), Top3 (<35%), Top10 (<40%) and Listed at All (<45%). Significant variation existed between individual symptom checkers: some were more accurate at ranking the diagnosis near the top of the differential, whereas others were more likely to list the diagnosis at all. With regard to ED triage data, a significantly higher percentage of hepatitis C patients (59.7%; 40/67) than HIV patients (35.6%; 32/90) had an initial diagnosis meeting emergent criteria. Symptom checker diagnostic capabilities remain markedly inferior to those of physicians. Complex patients such as those with HIV or hepatitis C may carry a more specific differential diagnosis, warranting diagnostic algorithms in symptom checkers that account for such complexity.
Symptom checkers carry the potential for real-time epidemiologic monitoring of patient symptoms, as symptom entries and the resulting symptom checker diagnoses could give health officials a means to track illnesses in specific patient populations and geographic regions. Achieving this, however, requires accurate and reliable symptom checkers.
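The Top1/Top3/Top10/"Listed at All" measures used above are ranking checks against the physician diagnosis. A minimal sketch of that calculation follows; the cases are hypothetical toy data, not records from the study:

```python
def topk_accuracy(cases, k=None):
    """Fraction of cases whose physician diagnosis appears in the
    symptom checker's differential: within the first k entries,
    or anywhere in the list if k is None ("Listed at All")."""
    hits = 0
    for true_dx, differential in cases:
        window = differential if k is None else differential[:k]
        if true_dx in window:
            hits += 1
    return hits / len(cases)

# Hypothetical toy cases: (physician diagnosis, checker differential)
cases = [
    ("pneumonia",    ["bronchitis", "pneumonia", "influenza"]),
    ("cellulitis",   ["deep vein thrombosis", "gout"]),
    ("appendicitis", ["gastroenteritis", "ovarian cyst", "appendicitis"]),
]

top1 = topk_accuracy(cases, k=1)      # diagnosis ranked first
top3 = topk_accuracy(cases, k=3)      # diagnosis in top three
listed = topk_accuracy(cases)         # diagnosis listed at all
```

The same function covers all four reported metrics by varying `k`.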
Hospitalized patients placed in isolation due to a carrier state or infection with resistant or highly communicable organisms report higher rates of anxiety and loneliness and have fewer physician encounters, room entries, and vital sign records. We hypothesized that isolation status might adversely impact patient experience as reported through Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys, particularly regarding communication.
Retrospective analysis of HCAHPS survey results over 5 years.
A 1,165-bed, tertiary-care, academic medical center.
Patients on any type of isolation for at least 50% of their stay were the exposure group. Those never in isolation served as controls.
Multivariable logistic regression, adjusting for age, race, gender, payer, severity of illness, length of stay and clinical service, was used to examine associations between isolation status and “top-box” experience scores. Dose response to increasing percentage of days in isolation was also analyzed.
Patients in isolation reported worse experience, primarily with staff responsiveness (help toileting 63% vs 51%; adjusted odds ratio [aOR], 0.77; P = .0009) and overall care (rate hospital 80% vs 73%; aOR, 0.78; P < .0001), but they reported similar experience in other domains. No dose-response effect was observed.
Isolated patients do not report adverse experience for most aspects of provider communication regarded to be among the most important elements for safety and quality of care. However, patients in isolation had worse experiences with staff responsiveness for time-sensitive needs. The absence of a dose-response effect suggests that isolation status may be a marker for other factors, such as illness severity. Regardless, hospitals should emphasize timely staff response for this population.
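The "top-box" comparisons above are expressed as odds ratios. A crude (unadjusted) odds ratio can be computed directly from the reported proportions; note that the abstract's aORs additionally adjust for age, race, gender, payer, severity, length of stay and clinical service, so the crude value below differs from the adjusted 0.77:

```python
def odds_ratio(p_exposed, p_control):
    """Crude odds ratio for a binary 'top-box' response:
    odds in the exposed (isolated) group over odds in controls."""
    return (p_exposed / (1 - p_exposed)) / (p_control / (1 - p_control))

# Help-toileting top-box: 51% for isolated patients vs 63% for controls
or_toileting = odds_ratio(0.51, 0.63)  # crude, unadjusted
```

An OR below 1 here means isolated patients had lower odds of a top-box response.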
Although food from grazed animals is increasingly sought by consumers because of perceived animal welfare advantages, grazing systems provide the farmer and the animal with unique challenges. The system depends almost daily on the climate for feed supply, and the importation of large amounts of feed from off farm, with its associated labour and mechanisation costs, can reduce economic viability. Furthermore, the cow may have to walk long distances and be able to harvest feed efficiently in a highly competitive environment because of the need for high levels of pasture utilisation. She must also be: (1) highly fertile, with a requirement for pregnancy within ~80 days post-calving; (2) ‘easy care’, because of the need for the management of large herds with limited labour; (3) able to walk long distances; and (4) robust to changes in feed supply and quality, so that short-term nutritional insults do not unduly influence her production and reproduction cycles. These demands differ from, and come in addition to, those placed on cows in housed systems offered pre-made mixed rations. Furthermore, additional demands in environmental sustainability and animal welfare, in conjunction with the need for greater system-level biological efficiency (i.e. ‘sustainable intensification’), will add to the ‘robustness’ requirements of cows in the future. Increasingly, there is evidence that certain genotypes of cows perform better or worse in grazing systems, indicating a genotype×environment interaction. This has led to the development of tailored breeding objectives within countries for important heritable traits to maximise the profitability and sustainability of their production system. To date, these breeding objectives have focussed on the more easily measured traits and those of highest relative economic importance.
In the future, there will be greater emphasis on more difficult-to-measure traits that are important to the quality of life of the animal in each production system and that reduce the system’s environmental footprint.
Body condition score (BCS) is a subjective assessment of the proportion of body fat an animal possesses and is independent of frame size. There is a growing awareness of the importance of mature animal live-weight given its contribution to the overall costs of production of a sector. Because of the known relationship between BCS and live-weight, strategies to reduce live-weight could contribute to the favouring of animals with lesser body condition. The objective of the present study was to estimate the average difference in live-weight per incremental change in BCS, measured subjectively on a scale of 1 to 5. The data used consisted of 19 033 BCS and live-weight observations recorded on the same day from 7556 ewes on commercial and research flocks; the breeds represented included purebred Belclare (540 ewes), Charollais (1484 ewes), Suffolk (885 ewes), Texel (1695 ewes), Vendeen (140 ewes), as well as crossbreds (2812 ewes). All associations were quantified using linear mixed models with the dependent variable of live-weight; ewe parity was included as a random effect. The independent variables were BCS, breed (n=6), stage of the inter-lambing interval (n=6; pregnancy, lambing, pre-weaning, at weaning, post-weaning and mating) and parity (1, 2, 3, 4 and 5+). In addition, two-way interactions were used to investigate whether the association between BCS and live-weight differed by parity, period of the inter-lambing interval or breed. The association between BCS and live-weight differed by parity, by period of the inter-lambing interval and by breed. Across all data, a one-unit difference in BCS was associated with 4.82 (SE=0.08) kg live-weight, but this differed by parity from 4.23 kg in parity 1 ewes to 5.82 kg in parity 5+ ewes.
The correlation between BCS and live-weight across all data was 0.48 (0.47 when adjusted for nuisance factors in the statistical model), but this varied from 0.48 to 0.53 by parity, from 0.36 to 0.63 by stage of the inter-lambing interval and from 0.41 to 0.62 by breed. Results demonstrate that consideration should be taken of differences in BCS when comparing ewes on live-weight as differences in BCS contribute quite substantially to differences in live-weight; moreover, adjustments for differences in BCS should consider the population stratum, especially breed.
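The headline estimate, roughly 4.82 kg live-weight per BCS unit, is a regression slope of live-weight on BCS. A minimal ordinary least-squares sketch on synthetic data recovers a slope of this kind; the real analysis used linear mixed models with ewe parity as a random effect, so this is an illustration of the slope concept only, and all numbers below except the 4.82 target are invented:

```python
import random

random.seed(0)  # deterministic synthetic data
TRUE_SLOPE = 4.82  # kg live-weight per BCS unit, as reported across all data

# Synthetic ewes: BCS on the 1-to-5 scale, live-weight generated from the slope
bcs = [random.uniform(1.5, 4.5) for _ in range(1000)]
lwt = [55.0 + TRUE_SLOPE * b + random.gauss(0.0, 3.0) for b in bcs]

# Ordinary least-squares slope: cov(BCS, weight) / var(BCS)
n = len(bcs)
mx = sum(bcs) / n
my = sum(lwt) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(bcs, lwt))
         / sum((x - mx) ** 2 for x in bcs))
```

With 1000 observations the fitted slope lands close to the generating value.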
Understanding how critical sow live-weight and back-fat depth during gestation are in ensuring optimum sow productivity is important. The objective of this study was to quantify the association of sow parity, live-weight and back-fat depth during gestation with subsequent sow reproductive performance. Records of 1058 sows and 13 827 piglets from 10 trials on two research farms between the years 2005 and 2015 were analysed. Sows ranged from parity 1 to 6, with the number of sows per parity distributed as follows: 232, 277, 180, 131, 132 and 106, respectively. Variables that were analysed included total born (TB), born alive (BA), piglet birth weight (BtWT), pre-weaning mortality (PWM), piglet wean weight (WnWT), number of piglets weaned (Wn), wean-to-service interval (WSI), piglets born alive in the subsequent farrowing and sow lactation feed intake. Calculated variables included the within-litter CV in birth weight (LtV), pre-weaning growth rate per litter (PWG), total litter gain (TLG), lactation efficiency and litter size reared after cross-fostering. Data were analysed using linear mixed models accounting for covariance among records. Third and fourth parity sows had more (P<0.05) TB, BA and heavier BtWT compared with gilts and parity 6 sow contemporaries. Parity 2 and 3 sows weaned more (P<0.05) piglets than older sows. These piglets had heavier (P<0.05) birth weights than those from gilt litters. LtV and PWM were greater (P<0.01) in litters born to parity 5 sows than those born to younger sows. Sow live-weight and back-fat depth at service and at days 25 and 50 of gestation were not associated with TB, BA, BtWT, LtV, PWG, WnWT or lactation efficiency (P>0.05). Heavier sow live-weight throughout gestation was associated with an increase in PWM (P<0.01) and reduced Wn and lactation feed intake (P<0.05).
Deeper back-fat in late gestation was associated with fewer (P<0.05) BA but heavier (P<0.05) BtWT, whereas deeper back-fat throughout gestation was associated with reduced (P<0.01) lactation feed intake. Sow back-fat depth was not associated with LtV, PWG, TLG, WSI or piglets born alive in the subsequent farrowing (P>0.05). In conclusion, this study showed that sow parity, live-weight and back-fat depth can be used as indicators of reproductive performance. In addition, this study provides validation for future development of a benchmarking tool to monitor and improve the productivity of modern sow herds.
Early detection of karyotype abnormalities, including aneuploidy, could aid producers in identifying animals which, for example, would not be suitable candidate parents. Genome-wide genetic marker data in the form of single nucleotide polymorphisms (SNPs) are now being routinely generated on animals. The objective of the present study was to describe the statistics that could be generated from the allele intensity values from such SNP data to diagnose karyotype abnormalities; of particular interest was whether detection of aneuploidy was possible with both commonly used genotyping platforms in agricultural species, namely the Applied Biosystems™ Axiom™ and the Illumina platform. The hypothesis was tested using a case study of a set of dizygotic X-chromosome monosomy 53,X sheep twins. Genome-wide SNP data were available from the Illumina platform (11 082 autosomal and 191 X-chromosome SNPs) on 1848 male and 8954 female sheep and from the Axiom™ platform (11 128 autosomal and 68 X-chromosome SNPs) on 383 female sheep. Genotype allele intensity values, either as their original raw values or transformed to logarithm intensity ratio (LRR), were used to accurately diagnose two dizygotic (i.e. fraternal) twin 53,X sheep, both of which received their single X chromosome from their sire. This is the first reported case of 53,X dizygotic twins in any species. Relative to the X-chromosome SNP genotype mean allele intensity values of normal females, the mean allele intensity value of SNP genotypes on the X chromosome of the two females monosomic for the X chromosome was 7.45 to 12.4 standard deviations less, and was easily detectable using either the Axiom™ or Illumina genotype platform; the next lowest mean allele intensity value of a female was 4.71 or 3.3 standard deviations less than the population mean, depending on the platform used.
Both 53,X females could also be detected based on the genotype LRR although this was more easily detectable when comparing the mean LRR of the X chromosome of each female to the mean LRR of their respective autosomes. On autopsy, the ovaries of the two sheep were small for their age and evidence of prior ovulation was not appreciated. In both sheep, the density of primordial follicles in the ovarian cortex was lower than normally found in ovine ovaries and primary follicle development was not observed. Mammary gland development was very limited. Results substantiate previous studies in other species that aneuploidy can be readily detected using SNP genotype allele intensity values generally already available, and the approach proposed in the present study was agnostic to genotype platform.
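The screen described above flags females whose mean X-chromosome allele intensity falls several standard deviations below that of karyotypically normal females. A simplified sketch computes a z-score against a reference panel; the intensity values here are hypothetical, not from the study:

```python
import statistics

def monosomy_x_zscores(reference, candidates):
    """Standard deviations *below* the reference-panel mean for each
    candidate's mean X-chromosome allele intensity. A large positive
    z-score suggests a missing X chromosome (53,X in sheep)."""
    mu = statistics.mean(reference)
    sd = statistics.stdev(reference)
    return {name: (mu - value) / sd for name, value in candidates.items()}

# Hypothetical mean X-chromosome intensities for known-normal ewes
reference = [1.00, 0.98, 1.02, 0.99, 1.01, 1.00, 0.97, 1.03, 1.00, 1.00]

# Two candidates: one normal-looking, one with a clearly depressed intensity
z = monosomy_x_zscores(reference, {"ewe_a": 1.01, "ewe_b": 0.60})
flagged = [name for name, score in z.items() if score > 3]
```

Using a reference panel of known-normal animals keeps the monosomic outlier from inflating the standard deviation it is judged against.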
The overall objective of a series of experiments to investigate ‘metabolic stress’ was to examine the relationships between ‘metabolic load’, disease and other parameters associated with the welfare of the dairy cow. In the main, these comprised several well-controlled, herd-based studies complemented by more basic and strategic investigations. In this paper we compare and contrast practical aspects of health and welfare in two high genetic merit herds managed at the extremes of inputs and outputs for dairy farming in south-west Scotland. The hypothesis was that high output herds would have more health and welfare problems than low input herds. Two herds (70 Holstein-Friesian cows each) at SAC Acrehead Dumfries of a similar genetic background (overall in the top 5% of UK cows by PIN and ITEM) were housed in identical buildings and tended by the same herdsman. Both herds had autumn- and spring-calving cattle. The ‘low input’ herd (LI) was given a minimum of concentrate (approx. 0.5 t per cow per year), milked twice a day and had a restricted quota of 385 000 l. The ‘high output’ herd (HO) was managed for high yields (unrestricted quota), given concentrates (2 t per cow per year) and forage ad libitum and milked three times daily. In 1995-96 the sole source of winter forage was grass/clover silage (LI) or grass silage (HO), but in 1996-1998 ensiled cereal and fodder beet were included in both diets. ‘Metabolic load’ could only be inferred from overall inputs, milk outputs, weight loss, body condition score and behaviour. There were significant differences in 305-day lactation yields between herds and seasons of calving, especially in 1995-96 (LI autumn: 5952 l at 30 g/kg protein (P); LI spring: 5741 l at 32.5 g/kg P; HO autumn: 9541 l at 32.8 g/kg P; HO spring: 8402 l at 32.6 g/kg P).
LI weight and body condition-score losses were greatest in this year, and behavioural studies showed substantial differences in feeding time (HO < LI, P < 0.05) and total lying time (LI < HO; P < 0.05). However, these differences were much less marked in subsequent years. There was a significant difference in the prevalence and incidence of clinical lameness between herds (HO > LI; P < 0.05) and seasons (autumn > spring, P < 0.05) but not for mastitis or metabolic disease. An in-depth study of subclinical claw horn lesion development in first-calving heifers showed significant differences between herds in 1996-97 (LI > HO, P < 0.05) but none in 1995-96. There was a significant difference for season in both years (autumn > spring, P < 0.05). Analysis of blood biochemistry parameters from samples taken at approximately 1 month after calving showed some significant differences between LI and HO, generally indicating a greater ‘metabolic load’ for LI. Although the full effects of ‘metabolic load’ on immune function and reproduction are dealt with elsewhere, our preliminary data showed no significant differences between herds for the former but some significant differences for the latter; in particular, there were differences in aspects of the progesterone profiles between herds and, more importantly, between seasons. However, these latter differences were not clearly reflected in conception rates. It was concluded that the hypothesis was not fully sustained and that both systems had pitfalls in terms of welfare. The three major areas causing difficulties for both systems were the need, first, to ensure adequate intake of forage; second, to limit the environmental challenge to the feet and udder; and finally, to marry these systems to the factors limiting reproduction, primarily calving season and the capacity of reproductive management.
Accurate genomic analyses are predicated on access to a large quantity of accurately genotyped and phenotyped animals. Because the cost of genotyping is often less than the cost of phenotyping, interest is increasing in generating genotypes for phenotyped animals. In some instances this implies a requirement to genotype older animals with greater phenotypic information content; biological material for these older, informative animals may, however, no longer exist. The objective of the present study was to quantify the ability to impute the 11 129 single nucleotide polymorphism (SNP) genotypes of non-genotyped animals (in this instance sires) from the genotypes of their progeny, with or without including the genotypes of the progeny’s dams (i.e. mates of the sire to be imputed). The impact on the accuracy of genotype imputation of including more progeny (and their dams’) genotypes in the imputation reference population was also quantified. When genotypes of the dams were not available, genotypes of 41 sires with at least 15 genotyped progeny were used for the imputation; when genotypes of the dams were available, genotypes of 21 sires with at least 10 genotyped progeny were used for the imputation. Imputation was undertaken exploiting family and population-level information. The mean and variability of the proportion of genotypes per individual that could not be imputed reduced as the number of progeny genotypes used per individual increased. Little improvement in the proportion of genotypes that could not be imputed was achieved once genotypes of seven progeny and their dams were used, or genotypes of 11 progeny without their respective dams’ genotypes were used. Mean imputation accuracy per individual (depicted by both concordance rates and the correlation between true and imputed genotypes) increased with increasing progeny group size. Moreover, the range in mean imputation accuracy per individual reduced as more progeny genotypes were used in the imputation.
If the genotype of the mate of the sire was also used, high accuracy of imputation (mean genotype concordance rate per individual of 0.988), with little additional benefit thereafter, was achieved with seven genotyped progeny. In the absence of genotypes on the dam, similar imputation accuracy could not be achieved even using genotypes on up to 15 progeny. Results therefore suggest, at least for the SNP density used in the present study, that it is possible to accurately impute the genotypes of a non-genotyped parent from the genotypes of its progeny and there is a benefit of also including the genotype of the sire’s mate (i.e. dam of the progeny).
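Imputation accuracy per individual was summarised above by the concordance rate between true and imputed genotypes. A minimal concordance calculation over 0/1/2-coded SNP calls, with toy genotypes rather than study data, might look like:

```python
def concordance(true_geno, imputed_geno):
    """Per-individual genotype concordance: the fraction of imputed SNP
    calls (0/1/2 allele-dosage coding) matching the true genotypes.
    Calls that failed to impute (None) are excluded from the denominator."""
    pairs = [(t, i) for t, i in zip(true_geno, imputed_geno) if i is not None]
    return sum(t == i for t, i in pairs) / len(pairs)

# Toy example: one discordant call and one call that could not be imputed
true_g    = [0, 1, 2, 1, 0,    2, 1, 1]
imputed_g = [0, 1, 2, 0, None, 2, 1, 1]
rate = concordance(true_g, imputed_g)  # 6 of 7 comparable calls match
```

The mean of this rate across individuals corresponds to the per-individual concordance reported in the abstract (e.g. 0.988 with seven progeny and their dams).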
Here we report on the materials chemistry following crystallization in the presence of water vapor of chlorinated formamidinium lead-triiodide (NH2CH=NH2PbI3−xClx) perovskite films. We found that in situ exposure to water vapor reduces, or possibly eliminates, the retention of chlorine (Cl) inside NH2CH=NH2PbI3−xClx crystals. There is a strong tendency toward Cl volatility, which underscores the sensitivity of these materials with respect to their integration into solar cells. Additional efforts focused on the mitigation of water-vapor exposure are therefore required. Based on the in situ results, hot casting (<100 °C) in dry conditions demonstrates improved film coverage and Cl retention, with efficiencies reaching 12.07%.
A range of precision farming technologies are used commercially for variable-rate application of nitrogen (N) to cereals, yet these usually adjust N rates from a pre-set value rather than predicting economically optimal N requirements on an absolute basis. This paper reports chessboard experiments set up to examine variation in N requirements, to develop and test systems for its prediction, and to assess its predictability. Results showed very substantial variability in fertiliser N requirements within fields, typically >150 kg ha−1, and large variation in optimal yields, typically >2 t ha−1. Despite this, calculated increases in yield and gross margin with N requirements perfectly matched across fields were surprisingly modest (compared with the uniform average rate). Implications are discussed, including the causes of the large remaining variation in grain yield after N limitations were removed.
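An "economically optimal N requirement" is the rate at which the marginal value of extra grain equals the marginal cost of fertiliser. For a quadratic yield response y = a + bN − cN², setting grain_price · dy/dN = n_price gives a closed form; the coefficients and prices below are hypothetical, chosen only to illustrate the calculation:

```python
def optimal_n(b, c, grain_price, n_price):
    """Economic optimum N rate for a quadratic response
    y = a + b*N - c*N**2 (yield in t/ha, N in kg/ha):
    solve grain_price * (b - 2*c*N) = n_price for N."""
    return (b - n_price / grain_price) / (2 * c)

# Hypothetical response coefficients and prices
# (b: t/ha yield gain per kg N at low rates; grain £/t; N £/kg)
n_opt = optimal_n(b=0.03, c=7e-5, grain_price=150.0, n_price=0.8)
```

Within-field variation in b and c is one way the >150 kg ha−1 spread in optimal rates reported above can arise.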
Optimising oilseed rape canopy size through correct management is crucial for maximising yield. Plant growth regulators (PGRs) and nitrogen (N) fertiliser are generally applied at a flat rate; however, variable applications may be useful for optimising canopy size. The aim of this paper was to understand the potential for spectral reflectance indices to predict green area index (GAI) and crop N content in winter oilseed rape, with specific focus on the Fritzmeier Isaria Crop Sensor. Three large oilseed rape chessboard experiments were set up in 2015 and 2016 in the UK. The results show good correlations between the Isaria indices and both GAI and crop N content, suggesting that the Isaria may be a useful tool for variably applying PGRs and N fertiliser to oilseed rape.
The relationship between childhood adversity and bipolar affective disorder remains unclear. We aimed to quantify the size and significance of this effect through a statistical synthesis of reported research. Search terms relating to childhood adversity and bipolar disorder were entered into Medline, EMBASE, PsycINFO and Web of Science. Eligible studies included a sample diagnosed with bipolar disorder, a comparison sample and a quantitative measure of childhood adversity. In 19 eligible studies, childhood adversity was 2.63 times (95% CI 2.00–3.47) more likely to have occurred in bipolar disorder compared with non-clinical controls. The effect of emotional abuse was particularly robust (OR = 4.04, 95% CI 3.12–5.22), but rates of adversity were similar to those in psychiatric controls. Childhood adversity is associated with bipolar disorder, which has implications for the treatment of this clinical group. Further prospective research could clarify temporal causality and explanatory mechanisms.
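The pooled odds ratios above come from a meta-analytic synthesis across the eligible studies. One common approach, sketched below, is fixed-effect inverse-variance pooling on the log-OR scale; the review may have used a random-effects model instead, and the study values here are made up for illustration:

```python
import math

def pooled_or(studies):
    """Fixed-effect inverse-variance pooled odds ratio.
    Each study is (OR, ci_low, ci_high); the standard error of log(OR)
    is recovered from the 95% CI as (ln(hi) - ln(lo)) / 3.92."""
    num = den = 0.0
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / 3.92
        weight = 1.0 / se ** 2          # inverse-variance weight
        num += weight * math.log(or_)
        den += weight
    return math.exp(num / den)

# Hypothetical studies (not the 19 from the review)
pooled = pooled_or([(2.0, 1.0, 4.0), (3.0, 1.5, 6.0)])
```

With equal-width CIs on the log scale, the pooled estimate reduces to the geometric mean of the study ORs.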
The objective of the present study was to quantify the extent of genetic variation in three health-related traits, namely dagginess, lameness and mastitis, in an Irish sheep population. Each of the health traits investigated has substantial welfare implications and imposes considerable economic costs on producers. Data were also available on four body-related traits, namely body condition score (BCS), live weight, muscle depth and fat depth. Animals were categorised as lambs (<365 days old) or ewes (⩾365 days old) and were analysed both separately and combined. After edits, 39 315 records from 264 flocks between the years 2009 and 2015 inclusive were analysed. Variance components were estimated using animal linear mixed models. Fixed effects included contemporary group, represented as a three-way interaction between flock, date of inspection and animal type (i.e. lamb, yearling ewe (i.e. females ⩾365 days but <730 days old that have not yet had a recorded lambing) or ewe), animal breed proportion, coefficients of heterosis and recombination, animal gender (lambs only), animal parity (ewes only; lambs were assigned a separate ‘parity’) and the difference in age of the animal from the median of the respective parity/age group. An additive genetic effect and a residual effect were both fitted as random terms, with maternal genetic and non-genetic components also considered for traits of the lambs. The direct heritability of dagginess was similar across age groups (0.14 to 0.15), whereas the direct heritability of lameness ranged from 0.06 (ewes) to 0.12 (lambs). The direct heritability of mastitis was 0.04. For dagginess, 13% of the phenotypic variation was explained by dam litter, whereas the maternal heritability of dagginess was 0.05. The genetic correlation between ewe and lamb dagginess was 0.38; the correlation between ewe and lamb lameness was close to zero but was associated with a large standard error.
Direct genetic correlations were evident between dagginess and BCS in ewes and between lameness and BCS in lambs. The present study has demonstrated that ample genetic variation exists for all three health traits investigated indicating that genetic improvement is indeed possible.
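Direct heritability in an animal model is the additive genetic fraction of total phenotypic variance. Using the dagginess components reported above (additive 0.14, maternal 0.05, dam litter 0.13 of phenotypic variance, with the residual treated here as the remainder on a unit phenotypic scale, an illustrative assumption):

```python
def heritability(var_additive, *other_components):
    """Direct heritability h^2 = additive variance / phenotypic variance,
    where phenotypic variance is the sum of all fitted components."""
    var_p = var_additive + sum(other_components)
    return var_additive / var_p

# Illustrative components on a unit phenotypic-variance scale:
# additive, maternal, dam litter, residual
h2_dagginess = heritability(0.14, 0.05, 0.13, 0.68)
```

The same ratio with the maternal component in the numerator gives the maternal heritability (0.05 here).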
Theory suggests that early experiences may calibrate the “threshold activity” of the hypothalamus–pituitary–adrenal axis in childhood. Particularly challenging or particularly supportive environments are posited to manifest in heightened physiological sensitivity to context. Using longitudinal data from the Family Life Project (N = 1,292), we tested whether links between maternal sensitivity and hypothalamus–pituitary–adrenal axis activity aligned with these predictions. Specifically, we tested whether the magnitude of the within-person relation between maternal sensitivity and children's cortisol levels, a proxy for physiological sensitivity to context, was especially pronounced for children who typically experienced particularly low or high levels of maternal sensitivity over time. Our results were consistent with these hypotheses. Between children, lower levels of mean maternal sensitivity (7–24 months) were associated with higher mean cortisol levels across this period (measured as a basal sample collected at each visit). However, the magnitude and direction of the within-person relation was contingent on children's average levels of maternal sensitivity over time. Increases in maternal sensitivity were associated with contemporaneous cortisol decreases for children with typically low-sensitive mothers, whereas sensitivity increases were associated with cortisol increases for children with typically high-sensitive mothers. No within-child effects were evident at moderate levels of maternal sensitivity.
A recent outbreak of Q fever was linked to an intensive goat and sheep dairy farm in Victoria, Australia, 2012-2014. Seventeen employees and one family member were confirmed with Q fever over a 28-month period, including two culture-positive cases. The outbreak investigation and management involved a One Health approach with representation from human, animal, environmental and public health. Seroprevalence in non-pregnant milking goats was 15% [95% confidence interval (CI) 7–27]; active infection was confirmed by positive quantitative PCR on several animal specimens. Genotyping of Coxiella burnetii DNA obtained from goat and human specimens was identical by two typing methods. A number of farming practices probably contributed to the outbreak, with similar precipitating factors to the Netherlands outbreak, 2007-2012. Compared with workers in a high-efficiency particulate arrestance (HEPA) filtered factory, administrative staff in an unfiltered adjoining office and those regularly handling goats and kids had 5·49 (95% CI 1·29–23·4) and 5·65 (95% CI 1·09–29·3) times the risk of infection, respectively, suggesting factory workers were protected from windborne spread of organisms. Reduction in the incidence of human cases was achieved through an intensive human vaccination programme plus environmental and biosecurity interventions. Subsequent non-occupational acquisition of Q fever in the spouse of an employee indicates that infection remains endemic in the goat herd and remains a challenge to manage without source control.
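The risk comparisons above are relative risks: the attack rate in an exposed group divided by the attack rate in the reference (HEPA-protected) group. With hypothetical counts, chosen only to illustrate the calculation:

```python
def risk_ratio(cases_exposed, n_exposed, cases_ref, n_ref):
    """Relative risk: attack rate among the exposed divided by the
    attack rate in the reference group."""
    return (cases_exposed / n_exposed) / (cases_ref / n_ref)

# Hypothetical counts: goat handlers vs HEPA-protected factory workers
rr_handlers = risk_ratio(11, 20, 2, 20)
```

The wide CIs quoted in the abstract (e.g. 1·09–29·3) reflect the small group sizes behind such ratios.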
‘Photovoice’, a community-based participatory research methodology, uses images as a tool to deconstruct problems by posing meaningful questions in a community to find actionable solutions. This community-enhancing technique was used to elicit experiences of climate change among women in rural Nepal. The current analysis employs mixed methods to explore the subjective mental health experience of participating in a 4- to 5-day photovoice process focused on climate change. A secondary objective of this work was to explore whether or not photovoice training, as a one-time 4- to 5-day intensive intervention, can mobilise people to be more aware of environmental changes related to climate change and to be more resilient to these changes, while providing positive mental health outcomes.