Measuring diet choice in grazing animals is challenging, complicating the assessment of feed efficiency in pasture-based systems. Furthermore, animals may modify their intake of a forage species depending on its nutritive value and on their own physiological status. Various fecal markers have been used to estimate feed intake in grazing animals, but plant-wax markers such as n-alkanes (ALK) and long-chain alcohols may provide reliable estimates of both dietary choices and intakes. Still, their use in beef cattle has been relatively limited. The present study was designed to test the reliability of the ALK technique for estimating diet choices in beef heifers. Twenty-two Angus-cross heifers were evaluated at both post-weaning and yearling age. At each age, they were offered both red clover and fescue hay as cubes. Following 3-week acclimation periods, intake of each forage species was assessed daily for 10 days. During the final 5 days, fecal grab samples were collected twice daily. Fecal ALK concentrations were adjusted using recovery fractions compiled from the literature. Diet composition was estimated using two statistical methods. Post-weaning, dietary choices were reliably estimated, with low residual error, regardless of the statistical approach adopted. The slope of the regression of observed on estimated red clover proportion ranged from 0.85±0.08 for fecal samples collected in the p.m. to 1.01±0.09 for averaged daily proportions. At yearling age, however, the estimates were less reliable: there was a tendency to overestimate the red clover proportion in diets of heifers preferring fescue, and vice versa, owing to greater variability in fecal ALK concentrations in the yearling heifers. Overall, the ALK technique provided a reliable tool for estimating diet choice in animals fed a simple forage diet. Although further refinements in the application of this methodology are needed, plant-wax markers provide opportunities for evaluating diet composition in cattle grazing systems.
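The abstract does not name the two statistical methods, but diet-composition estimates from faecal alkane profiles are commonly obtained by least-squares matching of a forage mixture to the recovery-corrected faecal concentrations. A minimal sketch of that idea follows; all alkane concentrations and recovery fractions are invented for illustration.

```python
# Hypothetical sketch: estimating the red clover proportion of a two-forage
# diet from faecal n-alkane (ALK) concentrations via non-negative least
# squares. All numbers below are invented, not measured values.
import numpy as np
from scipy.optimize import nnls

# Alkane concentrations (mg/kg DM) for five odd-chain alkanes (e.g. C27-C35)
alkanes_clover = np.array([35.0, 110.0, 190.0, 60.0, 20.0])  # red clover
alkanes_fescue = np.array([20.0, 250.0, 90.0, 140.0, 30.0])  # fescue
recovery = np.array([0.75, 0.82, 0.88, 0.92, 0.95])          # literature recovery fractions

faecal = np.array([28.0, 170.0, 150.0, 95.0, 24.0])          # observed faecal concentrations

# Correct the faecal concentrations for incomplete marker recovery, then
# solve for the non-negative forage proportions whose mixed alkane profile
# best matches the corrected faecal profile.
corrected = faecal / recovery
A = np.column_stack([alkanes_clover, alkanes_fescue])
props, residual = nnls(A, corrected)
props /= props.sum()  # normalise so the two proportions sum to 1

print(f"estimated red clover proportion: {props[0]:.2f} (residual {residual:.1f})")
```

Because only the relative pattern of alkanes carries the dietary signal, the fitted proportions are normalised to sum to one; with more than two forages the same non-negative least-squares call applies unchanged.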
If mental discipline and capacity for sustained thought are essential attributes of a formal philosopher, then Conrad Celtis, whom D. F. Strauss christened “the German arch-humanist,” was far from being one. On the big questions Celtis' thinking was protean and inconclusive. He showed a grand indifference to the need for consistency. In a remarkable way he combined Platonic mysticism with an Aristotelian view of nature. He savagely attacked the church, its clergy, and dogma, yet performed his religious duties as though no trace of doubt marred the serenity of a simple faith. He led an openly immoral life, if not so successfully as he boasted, and at the end turned to the comforts of a pious death. In a very real way these contradictions were not his alone, but less obviously those of his whole generation.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the association between serum 25-hydroxyvitamin D (25(OH)D) and pulmonary function. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the first second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and by 1·8 ml in AA (95 % CI 1·1, 2·5; P<0·0001) (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and by 1·5 ml in AA (95 % CI 0·8, 2·3; P=0·0001) (P for race difference=0·56). Among EA participants, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers than for never smokers, which supports the importance of vitamin D in vulnerable populations.
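For readers unfamiliar with how per-cohort estimates are pooled, a fixed-effects (inverse-variance-weighted) meta-analysis reduces to a few lines. The cohort slopes below are invented for illustration and are not the CHARGE estimates.

```python
# Minimal sketch of inverse-variance-weighted fixed-effects meta-analysis,
# combining per-cohort regression slopes (e.g. ml FEV1 per nmol/l 25(OH)D).
# The (beta, standard error) pairs are invented example values.
import math

cohort_estimates = [(1.0, 0.3), (1.3, 0.2), (0.9, 0.4), (1.2, 0.25)]

weights = [1 / se**2 for _, se in cohort_estimates]
beta_pooled = sum(w * b for (b, _), w in zip(cohort_estimates, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))

print(f"pooled beta = {beta_pooled:.2f} ml, 95% CI "
      f"({beta_pooled - 1.96 * se_pooled:.2f}, {beta_pooled + 1.96 * se_pooled:.2f})")
```

Each cohort is weighted by the precision of its estimate (the inverse of its squared standard error), so large, precise cohorts dominate the pooled slope.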
Machine learning methods are increasingly being applied to physical healthcare. In this article we describe some of the potential benefits, challenges and limitations of this approach in a mental health context, and provide a number of examples where machine learning could add value beyond conventional statistical modelling.
Out-of-area (OOA) placements occur when patients cannot be admitted to local facilities, which can be extremely stressful for patients and families. Thus, the Department of Health aims to eliminate the need for OOA admissions. Using data from a UK mental health trust, we developed a ‘virtual mental health ward’ to evaluate the potential impact of referral rates and length of stay (LOS) on OOA rates. The results indicated that OOA rates were equally sensitive to LOS and referral rate, suggesting that investment in community services that reduce both LOS and referral rates is required to meaningfully reduce OOA admission rates.
Declaration of interest
P.A.T. holds an honorary consultant contract with the Tees, Esk and Wear Valleys NHS Foundation Trust.
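The abstract does not describe the internals of the ‘virtual mental health ward’; one plausible reading is a bed-occupancy simulation in which a referral is placed out of area whenever no local bed is free. A toy sketch under that assumption follows, with invented bed numbers, referral rates and LOS.

```python
# Toy discrete-time bed-occupancy simulation illustrating how referral rate
# and length of stay (LOS) jointly drive out-of-area (OOA) placements.
# All parameters are invented; this is not the trust's actual model.
import numpy as np

def simulate_ooa(daily_referrals, mean_los, beds=20, days=3650, seed=1):
    """Return the fraction of referrals placed out of area."""
    rng = np.random.default_rng(seed)
    occupied = []                     # remaining LOS (days) per occupied bed
    referrals = ooa = 0
    for _ in range(days):
        occupied = [d - 1 for d in occupied if d > 1]       # discharges
        for _ in range(rng.poisson(daily_referrals)):       # new referrals
            referrals += 1
            if len(occupied) < beds:
                occupied.append(rng.exponential(mean_los))  # admit locally
            else:
                ooa += 1                                    # no local bed free
    return ooa / referrals

# OOA rate responds to both levers: halving LOS or halving referrals
print(simulate_ooa(daily_referrals=1.0, mean_los=18))
print(simulate_ooa(daily_referrals=1.0, mean_los=9))
print(simulate_ooa(daily_referrals=0.5, mean_los=18))
```

Running the three calls shows the OOA fraction falling when either lever is halved, mirroring the abstract's finding that LOS and referral rate matter roughly equally.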
To determine the feasibility and value of developing a regional antibiogram for community hospitals.
Multicenter retrospective analysis of antibiograms.
SETTING AND PARTICIPANTS
A total of 20 community hospitals in central and eastern North Carolina and south central Virginia participated in this study.
We combined antibiogram data from participating hospitals for 13 clinically relevant gram-negative pathogen–antibiotic combinations. From this combined antibiogram, we developed a regional antibiogram based on the mean susceptibilities of the combined data.
We combined a total of 69,778 bacterial isolates across 13 clinically relevant gram-negative pathogen–antibiotic combinations (median for each combination, 1100; range, 174–27,428). Across all pathogen–antibiotic combinations, 69% of local susceptibility rates fell within 1 SD of the regional mean susceptibility rate, and 97% of local susceptibilities fell within 2 SD of the regional mean susceptibility rate. No individual hospital had >1 pathogen–antibiotic combination with a local susceptibility rate >2 SD of the regional mean susceptibility rate. All hospitals’ local susceptibility rates were within 2 SD of the regional mean susceptibility rate for low-prevalence pathogens (<500 isolates cumulative for the region).
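As a concrete illustration of the 2-SD screen described above, the following sketch pools local susceptibility rates for a single pathogen–antibiotic combination and flags any hospital falling more than 2 SD from the regional mean. The rates are invented, and for simplicity the regional mean here is an unweighted mean of hospital rates rather than the isolate-weighted aggregation used in the study.

```python
# Sketch of a regional-antibiogram outlier screen for one pathogen-antibiotic
# combination. Local susceptibility rates (%) are invented example values.
import statistics

# e.g. local susceptibility of E. coli to ciprofloxacin, one value per hospital
local_rates = {"A": 78.0, "B": 82.5, "C": 74.0, "D": 80.0, "E": 90.5}

regional_mean = statistics.mean(local_rates.values())
regional_sd = statistics.stdev(local_rates.values())

for hospital, rate in local_rates.items():
    z = (rate - regional_mean) / regional_sd
    flag = "OUTLIER (>2 SD)" if abs(z) > 2 else "within 2 SD"
    print(f"hospital {hospital}: {rate:.1f}% ({z:+.1f} SD) -> {flag}")
```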
Small community hospitals frequently cannot develop an accurate antibiogram due to a paucity of local data. A regional antibiogram is likely to provide clinically useful information to community hospitals for low-prevalence pathogens.
Patient days and days present were compared to directly measured person time to quantify how choice of different denominator metrics may affect antimicrobial use rates. Overall, days present were approximately one-third higher than patient days. This difference varied among hospitals and units and was influenced by short length of stay.
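To make the denominator effect concrete, the short sketch below computes the same antimicrobial use numerator against the two denominators; all counts are invented, with days present set roughly one-third higher than patient days as in the abstract.

```python
# Illustration of how denominator choice changes an antimicrobial use rate.
# 'Patient days' count midnight censuses; 'days present' count every calendar
# day a patient spends any part of in the unit, so they run higher.
# All counts below are invented for illustration.
days_of_therapy = 4200   # antibiotic days of therapy (DOT) in a unit
patient_days = 10000     # midnight-census denominator
days_present = 13300     # roughly one-third higher, as in the abstract

rate_pd = 1000 * days_of_therapy / patient_days
rate_dp = 1000 * days_of_therapy / days_present
print(f"{rate_pd:.0f} DOT/1000 patient days vs {rate_dp:.0f} DOT/1000 days present")
```

The same prescribing activity yields a visibly lower rate per 1000 days present, which is why cross-hospital comparisons must hold the denominator metric fixed.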
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial.
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States.
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices presents unique challenges, including time pressures from bed-control personnel, the need to identify eligible rooms efficiently, negative perceptions from nurse managers, and high discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
Falconer and MacKay (1996) note that the measurement of a trait in two different environments may be considered as two traits rather than one. In this way it is possible, through the calculation of genetic correlations, to estimate to what extent the two measurements under different conditions are in fact the same characteristic and are determined by the same genes. The widespread use of AI in pig production has faltered owing to problems with the dilution and cryopreservation of semen, and yet an industry split, in which breeders and nucleus herds use AI extensively but multipliers and commercial producers do not, is becoming apparent. Reproductive traits are increasingly seen as an important component of overall pig production and, while the genetic correlation between reproductive and production traits has been explored, little work has focused on the genotype-by-environment interaction of such fertility traits. The present study reports the genetic relationship between number born alive (NBA) in litters conceived naturally and by AI, and the rate of weaning to first service (WTFS-1).
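The genetic correlation Falconer and MacKay refer to is a standard quantitative-genetics quantity rather than something defined in this abstract. Treating the trait measured in the two environments as two traits X1 and X2, it is

```latex
r_g = \frac{\operatorname{cov}_A(X_1, X_2)}
           {\sqrt{\sigma^2_A(X_1)\,\sigma^2_A(X_2)}}
```

where cov_A is the additive genetic covariance and the sigma^2_A terms are the additive genetic variances. An estimate of r_g near 1 suggests the same genes govern the trait in both environments (here, NBA under natural mating and under AI), while values well below 1 indicate genotype-by-environment interaction.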
Currently fewer than 50% of UK lambs produce carcasses of acceptable quality for the domestic and export markets, which compromises the competitiveness of sheep farming. Carcass quality can be changed by selection, and this is now being taken advantage of in terminal sire breeds and, to a lesser extent, in hill breeds. However, little attention has yet been focused on the crossing breeds, which have relatively poor carcass quality, in spite of the large impact such breeds have on the slaughter generation. Recently, a long-term project began to develop breeding programmes relevant to crossing sire (‘longwool’) breeds. Its objective is to produce a selection index to improve carcass quality without compromising the reproductive performance or maternal ability of these breeds. The Bluefaced Leicester is the most prevalent crossing sire breed with its crossbred (‘Mule’) daughters out of draft hill ewes accounting for 89% of crossbred (longwool x hill) ewes in the UK (Pollot, 1998).
With increasing emphasis in the meat sector on better and more consistent quality, carcass leanness and conformation are now important issues for sheep breeders. In 1999, only 47% of all carcasses in the UK met the target specifications for weight, fat and conformation (MLC, 2000), highlighting the potential for improvement. In the current stratified crossbreeding system, crossbred wether lambs are a by-product of the production of dam-line ewes for the lowland sector. If their carcass quality is sufficient, they can give a valuable boost to the economics of the breeding programme. Genetic improvement of carcass quality in crossing sire breeds would benefit the crossbred wethers, as well as filter through to the terminal sire cross lambs produced by the crossbred ewes. This work aims to assess the influence of selection index and live conformation score of crossing sires (in this case Bluefaced Leicesters) on growth and carcass quality traits of their crossbred progeny, as a first step towards designing a genetic improvement programme for crossing sire sheep.