We studied the association between chlorhexidine gluconate (CHG) concentration on skin and resistant bacterial bioburden. CHG was almost always detected on the skin, and detection of methicillin-resistant Staphylococcus aureus, carbapenem-resistant Enterobacteriaceae, and vancomycin-resistant Enterococcus on skin sites was infrequent. However, we found no correlation between CHG concentration and bacterial bioburden.
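As an illustration of the kind of analysis such a correlation finding rests on, here is a minimal sketch of a rank correlation between skin CHG concentration and bioburden; the values and the choice of Spearman's rho are hypothetical, not taken from the study.

```python
# Minimal sketch (hypothetical data): rank correlation between skin CHG
# concentration and bacterial bioburden.
from scipy.stats import spearmanr

chg_conc = [19.5, 78.1, 312.5, 4.9, 1250.0, 39.1]  # e.g. ug/mL per skin site
bioburden = [0, 12, 3, 45, 0, 7]                   # e.g. CFU per skin site

rho, p = spearmanr(chg_conc, bioburden)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```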
Mental disorders cause a high burden in adolescents, but adolescents often underutilise potentially beneficial treatments. Perceived need for and barriers to care may influence whether adolescents utilise services and which treatments they receive. Adolescents and parents are stakeholders in adolescent mental health care, but their perceptions regarding need for and barriers to care might differ. Understanding patterns of adolescent-parent agreement might help identify gaps in adolescent mental health care.
A nationally representative sample of Australian adolescents aged 13–17 years and their parents (N = 2310), recruited in 2013–2014, were asked about perceived need for four types of adolescent mental health care (counselling, medication, information, and skill training) and barriers to care. Perceived need was categorised as fully met, partially met, unmet, or no need. Cohen's kappa was used to assess adolescent-parent agreement. Multinomial logistic regressions were used to model variables associated with patterns of agreement.
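Cohen's kappa measures agreement beyond what chance alone would produce; as a minimal sketch with hypothetical adolescent and parent ratings (using scikit-learn, not necessarily the software the study used):

```python
# Hypothetical sketch of the agreement statistic: kappa = 0 means
# chance-level agreement, 1 means perfect agreement.
from sklearn.metrics import cohen_kappa_score

levels = ["fully met", "partially met", "unmet", "no need"]
adolescent = ["no need", "unmet", "fully met", "no need", "partially met", "no need"]
parent = ["no need", "fully met", "fully met", "unmet", "partially met", "no need"]

print(f"kappa = {cohen_kappa_score(adolescent, parent, labels=levels):.2f}")
```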
For almost half of the sample (46.5% (s.e. = 1.21)), either the adolescent or the parent reported a perceived need for some type of care. For both groups, perceived need was greatest for counselling and lowest for medication. Identified needs were fully met for a third of adolescents. Adolescent-parent agreement on perceived need was fair (kappa = 0.25 (s.e. = 0.01)), but poor regarding the extent to which needs were met (kappa = −0.10 (s.e. = 0.02)). Parents' lack of knowledge about their adolescents' feelings was positively associated with adolescent-parent agreement that needs were partially met or unmet and with disagreement about perceived need, compared to agreement that needs were fully met (relative risk ratio (RRR) = 1.91 (95% CI = 1.19–3.04) to RRR = 4.69 (95% CI = 2.38–9.28)). Having a probable disorder was positively associated with adolescent-parent agreement that needs were partially met or unmet (RRR = 2.86 (95% CI = 1.46–5.61)), and negatively associated with adolescent-parent disagreement on perceived need (RRR = 0.50 (95% CI = 0.30–0.82)). Adolescents most frequently reported attitudinal barriers to care (e.g. self-reliance: 55.1% (s.e. = 2.39)); parents most frequently reported that their child refused help (38.7% (s.e. = 2.69)). Adolescent-parent agreement was poor for attitudinal barriers (kappa = −0.03 (s.e. = 0.06)) and slight for structural barriers (kappa = 0.02 (s.e. = 0.09)).
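The RRRs above are exponentiated coefficients from a multinomial logistic regression; a minimal, self-contained sketch of how such estimates are obtained follows. The simulated data and variable names are illustrative only, not the study's dataset or model.

```python
# Hypothetical sketch: RRRs from a multinomial logit with
# 'agree fully met' (coded 0) as the reference outcome.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "parent_unaware": rng.integers(0, 2, n),     # illustrative predictor
    "probable_disorder": rng.integers(0, 2, n),  # illustrative predictor
})
# 0 = agree fully met, 1 = agree partially met/unmet, 2 = disagree
outcome = rng.integers(0, 3, n)

fit = sm.MNLogit(outcome, sm.add_constant(X)).fit(disp=False)
print(np.exp(fit.params))  # relative risk ratios per non-reference outcome
```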
There are gaps in the extent to which adolescent mental health care is meeting the needs of adolescents and their parents. It seems important to align adolescents' and parents' needs at the beginning and throughout treatment and to improve communication between adolescents and their parents. Both might provide opportunities to increase the likelihood that needs will be fully met. Campaigns directed towards adolescents and parents need to address different barriers to care. For adolescents, attitudinal barriers such as stigma and mental health literacy require attention.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
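CODATwins fits formal variance-component models, but the intuition behind twin-based heritability estimates can be conveyed with the classical Falconer approximation; the sketch below is illustrative only, and the correlations are invented.

```python
# Illustrative only: Falconer's approximation from twin correlations,
# not the ACE structural-equation models actually used in the project.
def falconer_ace(r_mz, r_dz):
    """(A, C, E) variance shares from MZ and DZ twin correlations."""
    a2 = 2 * (r_mz - r_dz)  # additive genetic (heritability)
    c2 = r_mz - a2          # shared environment (= 2*r_dz - r_mz)
    e2 = 1 - r_mz           # unique environment + measurement error
    return a2, c2, e2

# Hypothetical correlations for adult height
print(falconer_ace(r_mz=0.90, r_dz=0.50))  # ≈ (0.80, 0.10, 0.10)
```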
In cluster-randomized trials (CRT), groups rather than individuals are randomized to interventions. The aim of this study was to present critical design, implementation, and analysis issues to consider when planning a CRT in the healthcare setting and to synthesize characteristics of published CRT in the field of healthcare epidemiology.
A systematic review was conducted to identify CRT with infection control outcomes.
We identified the following 7 epidemiological principles: (1) identify design type and justify the use of CRT; (2) account for clustering when estimating sample size and report intraclass correlation coefficient (ICC)/coefficient of variation (CV); (3) obtain consent; (4) define level of inference; (5) consider matching and/or stratification; (6) minimize bias and/or contamination; and (7) account for clustering in the analysis. Among 44 included studies, the most common design was CRT with crossover (n = 15, 34%), followed by parallel CRT (n = 11, 25%) and stratified CRT (n = 7, 16%). Moreover, 22 studies (50%) offered justification for their use of CRT, and 20 studies (45%) demonstrated that they accounted for clustering at the design phase. Only 15 studies (34%) reported the ICC, CV, or design effect. Also, 15 studies (34%) obtained waivers of consent, and 7 (16%) sought consent at the cluster level. Only 17 studies (39%) matched or stratified at randomization, and 10 studies (23%) did not report efforts to mitigate bias and/or contamination. Finally, 29 studies (88%) accounted for clustering in their analyses.
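Principle (2) amounts to inflating an individually randomized sample size by the design effect, DEFF = 1 + (m − 1) × ICC for clusters of size m; a minimal sketch with hypothetical inputs:

```python
# Design-effect inflation for a parallel CRT with equal cluster sizes.
import math

def crt_sample_size(n_individual, cluster_size, icc):
    """Return (total participants, number of clusters) after inflation."""
    deff = 1 + (cluster_size - 1) * icc
    n_total = math.ceil(round(n_individual * deff, 6))  # guard float noise
    return n_total, math.ceil(n_total / cluster_size)

# Hypothetical: 400 patients needed under individual randomization,
# 20 patients per ward, ICC = 0.05
print(crt_sample_size(400, 20, 0.05))  # -> (780, 39)
```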
We must continue to improve the design and reporting of CRT to better evaluate the effectiveness of infection control interventions in the healthcare setting.
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database of knee osteoarthritis outcomes from non-RCT sources. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions made it possible to detect mathematical equipoise between the two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap in pain and functional outcomes between the treatments; it was pilot tested for usability and responsiveness and as support for shared decision-making.
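One simple way to operationalize "mathematical equipoise" from two such models is to check whether the patient-specific prediction intervals overlap; the sketch below is illustrative only (the coefficients, predictors, and interval widths are invented, not KOMET's actual models).

```python
# Hypothetical sketch: flag equipoise when 1-year outcome prediction
# intervals for the two treatment options overlap for a given patient.
def predict_interval(intercept, coefs, x, half_width):
    y = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return y - half_width, y + half_width

patient = [62, 7.5, 28.0, 1]  # made-up: age, baseline pain, BMI, sex

tkr = predict_interval(1.0, [-0.01, 0.55, 0.02, 0.3], patient, 1.8)
nonsurg = predict_interval(1.5, [0.00, 0.70, 0.03, 0.2], patient, 1.8)

equipoise = tkr[0] <= nonsurg[1] and nonsurg[0] <= tkr[1]
print(f"TKR {tkr}, nonsurgical {nonsurg}, equipoise: {equipoise}")
```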
The KOMET predictive regression model for knee pain had four patient-specific variables and an r² of 0.32; the model for physical functioning included six patient-specific variables and an r² of 0.34. These models were incorporated into prototype KOMET decision support software, pilot tested in clinics, and generally well received.
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before doffing and exiting the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and environment was 2.78 (95% confidence interval [CI], 0.99–0.77) and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
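Each unadjusted OR above reduces to a 2 × 2 exposure-by-contamination table; a minimal sketch of the standard calculation with a Woolf confidence interval, using hypothetical counts:

```python
# Odds ratio with Woolf 95% CI from a 2x2 table:
#              contaminated   clean
#   exposed         a           b
#   unexposed       c           d
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts for 'touched patient skin' vs not
print(odds_ratio_ci(a=30, b=120, c=41, d=278))
```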
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
Oxidative stress occurs when antioxidant defence mechanisms are overwhelmed by free radicals and may lead to damage to DNA, which has been implicated in processes such as ageing and cancer. The Comet assay allows detection of oxidative DNA damage in individual cells. As horses with recurrent airway obstruction (RAO) have been shown to demonstrate low antioxidant status and oxidative stress, we hypothesised that peripheral blood mononuclear cells (PBMC) of horses with RAO would demonstrate increases in DNA damage following natural allergen challenge.
Six horses (mean age 15 years, range 8–23 years) diagnosed with RAO (in remission) and 6 healthy breed-matched controls (mean age 9 years, range 5–15 years) were studied. Blood samples were collected 7 days before challenge, and immediately and 3 days after a 24 h period of stabling on mouldy hay and straw. All animals were kept at grass before and after the challenge period. Bronchoalveolar lavage (BAL) was performed and neutrophil counts were determined.
Crib-biting is a stereotypic behaviour performed by approximately 5% of captive domestic horses. Dietary factors have been strongly associated with the development of oral stereotypies, and risk factors for crib-biting identified in recent epidemiological studies include feeding high-concentrate and/or low-forage diets (Waters et al., 2002). Experimental work has shown that such diets are likely to result in increased gastric acidity (Murray and Eichorn, 1996; Nadeau et al., 2000). We therefore propose that young horses initiate crib-biting in an attempt to produce alkaline saliva to buffer their stomachs when alternative opportunities for mastication are limited. The aim of this study was to determine whether there was an association between crib-biting behaviour and stomach condition in foals.
Foals that had recently started to perform crib-biting were recruited into the study and compared with non-stereotypic foals. The stomachs of 15 crib-biting foals and 9 normal foals were examined using a video endoscope.
Middle East respiratory syndrome (MERS) is caused by a novel coronavirus (MERS-CoV) discovered in 2012. Since then, 1806 cases, including 564 deaths, have been reported by the Kingdom of Saudi Arabia (KSA) and affected countries as of 1 June 2016. Previous literature attributed increases in MERS-CoV transmission to the camel breeding season, as camels are likely the reservoir for the virus. However, this literature review and subsequent analysis indicate a lack of seasonality. A retrospective, epidemiological cluster analysis was conducted to investigate increases in MERS-CoV transmission and reports of household and nosocomial clusters. Cases were verified and associations between cases were substantiated through an extensive literature review and the Armed Forces Health Surveillance Branch's Tiered Source Classification System. A total of 51 clusters were identified, primarily nosocomial (80.4%), and most occurred in KSA (45.1%). Clusters corresponded temporally with the majority of periods of greatest incidence, suggesting a strong correlation between nosocomial transmission and notable increases in cases.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Despite the ubiquity of tunnel channels and valleys within formerly glaciated areas, their origin remains enigmatic. Few modern analogues exist for event-related subglacial erosion. This paper presents evidence of subglacial meltwater erosion and tunnel channel formation during the November 1996 jökulhlaup, Skeiðarárjökull, Iceland. The jökulhlaup reached a peak discharge of 45 000 to 50 000 m³ s⁻¹, with flood outbursts emanating from multiple outlets across the entire 23 km wide glacier snout. Subsequent retreat of the southeast margin of Skeiðarárjökull has revealed a tunnel channel excavated into the surrounding moraine sediment and ascending 11.5 m over a distance of 160 m from a larger trough to join the apex of an ice-contact fan formed in November 1996. The tunnel channel formed via hydro-mechanical erosion of 14 000 m³ to 24 000 m³ of unconsolidated glacier substrate, evidenced by copious rip-up clasts within the ice-contact fan. Flow reconstruction provides peak discharge estimates of 680 ± 140 m³ s⁻¹. The tunnel channel orientation, oblique to the local ice flow direction and within a col, suggests that local jökulhlaup routing was controlled by (a) subglacial topography and (b) the presence of a nearby proglacial lake. We describe the first modern example of tunnel channel formation and illustrate the importance of pressurized subglacial jökulhlaup flow in this process.
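The abstract does not state how the discharge was reconstructed; one common approach for palaeoflow estimates of this kind is the Gauckler–Manning equation, sketched here with invented channel geometry that happens to yield a flow of the order reported above.

```python
# Illustrative only: Gauckler-Manning discharge estimate,
# Q = (1/n) * A * R^(2/3) * S^(1/2)  [m^3 s^-1]
def manning_discharge(area_m2, hydraulic_radius_m, slope, n):
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2 / 3) * slope ** 0.5

# Hypothetical geometry: 180 m^2 cross-section, R = 2.5 m,
# slope 0.01, Manning's n = 0.05
q = manning_discharge(area_m2=180, hydraulic_radius_m=2.5, slope=0.01, n=0.05)
print(f"Q ~ {q:.0f} m^3/s")  # ~660 m^3/s
```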
To determine whether patients using the Centers for Medicare and Medicaid Services (CMS) Hospital Compare website (http://medicare.gov/hospitalcompare) can use nationally reported healthcare-associated infection (HAI) data to differentiate hospitals.
Secondary analysis of publicly available HAI data for calendar year 2013.
We assessed the availability of HAI data for geographically proximate hospitals (ie, hospitals within the same referral region) and then analyzed these data to determine whether they are useful to differentiate hospitals. We assessed data for the 6 HAIs reported by hospitals to the Centers for Disease Control and Prevention (CDC).
Data were analyzed for 4,561 hospitals, representing 88% of registered community and federal government hospitals in the United States. Healthcare-associated infection data are only useful for comparing hospitals if they are available for multiple hospitals within a geographic region. We found that data availability differed by HAI. Clostridium difficile infection (CDI) data were the most available, with 82% of geographic regions (ie, hospital referral regions) having >50% of hospitals reporting them. In contrast, surgical site infections (SSI) for hysterectomies had the lowest availability, with only 4% of geographic regions having >50% of member hospitals reporting them. The ability of HAI data to differentiate hospitals also differed by HAI: 72% of hospital referral regions had at least 1 pair of hospitals with statistically different risk-adjusted CDI rates (standardized infection ratios, SIRs), compared to 9% for SSI (hysterectomy).
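"Statistically different" here reduces to comparing standardized infection ratios (observed over predicted infections); a minimal sketch using exact Poisson intervals and a conservative non-overlap check, with hypothetical counts:

```python
# SIR = observed / predicted infections; exact Poisson 95% CI via chi-square.
from scipy.stats import chi2

def sir_ci(observed, predicted, alpha=0.05):
    lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * predicted) if observed else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * predicted)
    return observed / predicted, lo, hi

# Hypothetical CDI counts for two hospitals in one referral region
h1, h2 = sir_ci(48, 30.0), sir_ci(12, 28.0)
print(h1, h2, "different:", h1[1] > h2[2] or h2[1] > h1[2])
```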
HAI data generally are reported by enough hospitals to meet minimal criteria for useful comparisons in many geographic locations, though this varies by type of HAI. CDI and catheter-associated urinary tract infection (CAUTI) are more likely to differentiate hospitals than the other publicly reported HAIs.
Cognitive deficits are a core feature of schizophrenia, and impairments in most domains are thought to be stable over the course of the illness. However, cross-sectional evidence indicates that some areas of cognition, such as visuospatial associative memory, may be preserved in the early stages of psychosis, but become impaired in later established illness stages. This longitudinal study investigated change in visuospatial and verbal associative memory following psychosis onset.
In total, 95 first-episode psychosis (FEP) patients and 63 healthy controls (HC) were assessed on neuropsychological tests at baseline, with 38 FEP patients and 22 HCs returning for follow-up assessment at 5–11 years. Visuospatial associative memory was assessed using the Cambridge Neuropsychological Test Automated Battery Visuospatial Paired-Associate Learning task, and verbal associative memory was assessed using the Verbal Paired Associates subtest of the Wechsler Memory Scale–Revised.
Visuospatial and verbal associative memory at baseline did not differ significantly between FEP patients and HCs. However, over follow-up, visuospatial associative memory deteriorated significantly in the FEP group relative to healthy individuals. Conversely, verbal associative memory improved to a degree similar to that observed in HCs. In the FEP cohort, visuospatial (but not verbal) associative memory ability at baseline was associated with functional outcome at follow-up.
Areas of cognition that develop prior to psychosis onset, such as visuospatial and verbal associative memory, may be preserved early in the illness. Later deterioration in visuospatial memory ability may relate to progressive structural and functional brain abnormalities that occur following psychosis onset.
Currently it is estimated that about 1 billion people globally have non-alcoholic fatty liver disease (NAFLD), a condition in which liver fat exceeds 5% of liver weight in the absence of significant alcohol intake. Due to the central role of the liver in metabolism, the prevalence of NAFLD is increasing in parallel with the prevalence of obesity, insulin resistance and other risk factors of metabolic diseases. However, the contribution of liver fat to the risk of type 2 diabetes mellitus and CVD, relative to other ectopic fat depots and to other risk markers, is unclear. Various studies have suggested that the accumulation of liver fat can be reduced or prevented via dietary changes. However, the amount of liver fat reduction that would be physiologically relevant, and the timeframes and dose–effect relationships for achieving this through different diet-based approaches, are unclear. Also, it is still uncertain whether the changes in liver fat per se or the associated metabolic changes are relevant. Furthermore, the methods available to measure liver fat, or even individual fatty acids, differ in sensitivity and reliability. The present report summarises key messages of presentations from different experts and related discussions from a workshop intended to capture current views and research gaps relating to the points above.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed by expert user opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending 3 core scans (cardiac, lung, and IVC) plus other scans when indicated clinically. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial to assess whether the 3 recommended core SHoC protocol scans were chosen appropriately for this population.
Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP < 100 or shock index > 1), who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details, and study findings were collected prospectively. A threshold incidence of 10% for positive findings was set as significant for the purposes of assessing the appropriateness of the core recommendations.
Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%).
Conclusion: The 3 core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate greater than 10%. The 3 most frequent findings were cardiac and IVC abnormalities, followed by lung. Of note, peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare. These data, from the first RCT to compare PoCUS with standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
Sediments from the Antarctic continental margin may provide detailed palaeoenvironmental records for Antarctic shelf waters during the late Quaternary. Here we present results from a palaeoenvironmental study of two sediment cores recovered from the continental shelf off Mac. Robertson Land, East Antarctica. These gravity cores were collected approximately 90 km apart from locations on the inner and outer shelf. Both cores are apparently undisturbed sequences of diatom ooze mixed with fine, quartz-rich sand. Core stratigraphies have been established from radiocarbon analyses of bulk organic carbon. Down-core geochemical determinations include the lithogenic components Al and Fe, the biogenic components opal and organic carbon, and the palaeo-redox proxies Mn, Mo and U. We use the geochemical data to infer past variations in the deposition of biogenic and lithogenic materials, and the radiocarbon dates to estimate average sediment accumulation rates. The Holocene record of the outer-shelf core suggests three episodes of enhanced diatom export production at about 1.8, 3.8 and 5.5 ka BP, as well as less pronounced bloom episodes which occurred over shorter periods. Average sediment accumulation rates at this location range from 13.7 cm ka⁻¹ in the late Pleistocene–early Holocene to 82 cm ka⁻¹ in the late Holocene, and suggest that the inferred episodes of enhanced biogenic production lasted 100–1000 years. In contrast, data for the inner-shelf core suggest that a roughly constant proportion of biogenic and lithogenic material accumulated during the middle to late Holocene, with a greater proportion of biogenic material relative to the outer shelf. Notably, there is an approximately 7-fold increase in average sediment accumulation rate (from 24.5 to 179 cm ka⁻¹) at this inner-shelf location between the middle and late Holocene, with roughly comparable increases in the mass accumulation rates of both biogenic and lithogenic material. This may represent changes in sediment transport processes, or reflect real increases in pelagic sedimentation in this region during the Holocene. Our results suggest quite different sedimentation regimes at these two shelf locations during the middle to late Holocene.
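The quoted accumulation rates are simply depth differences divided by age differences between radiocarbon-dated horizons; a worked sketch with invented numbers:

```python
# Average sediment accumulation rate between two dated horizons, in cm ka^-1.
def accumulation_rate(depth_top_cm, depth_bot_cm, age_top_ka, age_bot_ka):
    return (depth_bot_cm - depth_top_cm) / (age_bot_ka - age_top_ka)

# Hypothetical: 155 cm of sediment between horizons dated 0.5 and 2.0 ka BP
print(f"{accumulation_rate(40, 195, 0.5, 2.0):.0f} cm/ka")  # -> 103 cm/ka
```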
To determine which comorbid conditions are considered causally related to central line-associated bloodstream infection (CLABSI) and surgical site infection (SSI), based on expert consensus.
Using the Delphi method, we administered an iterative, 2-round survey to 9 infectious disease and infection control experts from the United States.
Based on our selection of components from the Charlson and Elixhauser comorbidity indices, 35 different comorbid conditions were rated from 1 (not at all related) to 5 (strongly related) by each expert separately for CLABSI and SSI, based on perceived relatedness to the outcome. To assign expert consensus on causal relatedness for each comorbid condition, all 3 of the following criteria had to be met at the end of the second round: (1) a majority (>50%) of experts rating the condition at 3 (somewhat related) or higher, (2) interquartile range (IQR) ≤ 1, and (3) standard deviation (SD) ≤ 1.
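The three criteria translate directly into a small decision rule; a minimal sketch applied to one condition's ratings (the round-2 ratings shown are hypothetical):

```python
# Consensus rule used above: majority rating >= 3, IQR <= 1, and SD <= 1.
import statistics

def has_consensus(ratings):
    majority = sum(r >= 3 for r in ratings) / len(ratings) > 0.5
    q1, _, q3 = statistics.quantiles(ratings, n=4)
    return majority and (q3 - q1) <= 1 and statistics.stdev(ratings) <= 1

# Hypothetical ratings from the 9 experts for one comorbid condition
print(has_consensus([4, 4, 5, 4, 3, 4, 4, 5, 4]))  # -> True
```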
From round 1 to round 2, the IQR and SD, respectively, decreased for ratings of 21 of 35 (60%) and 33 of 35 (94%) comorbid conditions for CLABSI, and for 17 of 35 (49%) and 32 of 35 (91%) comorbid conditions for SSI, suggesting improvement in consensus among this group of experts. At the end of round 2, 13 of 35 (37%) and 17 of 35 (49%) comorbid conditions were perceived as causally related to CLABSI and SSI, respectively.
Our results yield a list of comorbid conditions that should be analyzed as risk factors for CLABSI and SSI and further explored for use in risk adjustment.
We investigated the physiology of two closely related albatross species relative to their breeding strategy: black-browed albatrosses (Thalassarche melanophris) breed annually, while grey-headed albatrosses (T. chrysostoma) breed biennially. From observations of breeding fate, blood samples collected at the end of breeding in one season, and feather corticosterone levels (fCort) sampled at the beginning of the next breeding season, we found that in both species some post-breeding physiological parameters differed according to breeding outcome (successful, failed, deferred). Correlations between post-breeding physiology and fCort, and links to future breeding decisions, were examined. In black-browed albatrosses, post-breeding physiology and fCort were not significantly correlated, but fCort independently predicted the breeding decision the next year, which we interpret as a possible migratory carry-over effect. In grey-headed albatrosses, post-breeding triglyceride levels were negatively correlated with fCort, but only in females, which we interpret as a potential cost of reproduction. However, this potential cost did not carry over to future breeding in the grey-headed albatrosses, and none of the measured variables predicted future breeding decisions in this species. We suggest that biennial breeding in the grey-headed albatross may have evolved as a strategy to buffer against the apparent susceptibility of females to negative physiological costs of reproduction. Future studies are needed to confirm this.
The horse is a non-ruminant herbivore adapted to eating plant-fibre- or forage-based diets. Some horses are stabled for most of the day with limited or no access to fresh pasture and are fed preserved forage, typically hay or haylage and sometimes silage. This raises questions about the quality and suitability of these preserved forages (considering production, nutritional content, digestibility and hygiene) and the quantities required. For performance horses especially, forage is often replaced with energy-dense feedstuffs, which can reduce the proportion of the diet that is forage based. This may adversely affect the health, welfare, behaviour and even performance of the horse. In the past 20 years a large body of research has contributed to a better and deeper understanding of equine forage needs and of the physiological and behavioural consequences when these are not met. Recent nutrient requirement systems have incorporated some, but not all, of this new knowledge into their recommendations. This review amalgamates recommendations based on the latest understanding of forage feeding for horses, covering forage types and preservation methods, hygienic quality, feed intake behaviour, typical nutrient composition, digestion and digestibility, and health and performance implications. On this basis, consensus-based applied recommendations for feeding preserved forages are provided.