An Early Intervention in Psychosis (EIP) programme aims to engage patients in early assessment and phase-specific interventions, which are the key elements of the Irish National Clinical Programme for psychosis. This study aims to describe and review the EIP programme offered by Cork’s North Lee Mental Health Services over a 5-year period.
A retrospective descriptive study design was adopted to describe and review the EIP programme, patient demographics and treatments offered in the service over a 5-year period.
A total of 139 patients were accepted into the programme over the 5-year period. The mean age of onset was 30 years (median = 28, SD = 9.9), and the mean duration of untreated psychosis was 8 months (median = 2.5, SD = 15.3). Two-thirds of patients were single on initial assessment, had a history of substance misuse and were unemployed. The majority of the cohort engaged with the keyworkers and occupational therapy but did not complete the full psychological or family programmes offered. Hospital admission was required for 12% of the cohort.
Patients experiencing their first episode of psychosis can successfully be treated in the community with appropriate professional and family support. However, deficiencies were noted in physical health monitoring, as well as in the availability and engagement with family and psychological therapies. Properly resourced early interventions in psychosis teams are necessary to deliver services at internationally recognised standards.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
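The stratified odds ratios above come from adjusted logistic regression models; the basic quantity they generalise can be sketched with an unadjusted 2×2 table. The counts and the `odds_ratio_ci` helper below are illustrative assumptions, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald odds ratio and 95% CI from a 2x2 table:
    a/b = cases exposed/unexposed, c/d = non-cases exposed/unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (e.g. hospitalisation by SES group).
or_, lo, hi = odds_ratio_ci(120, 80, 300, 340)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # → OR 1.70, 95% CI 1.23-2.35
```

An adjusted analysis like the study's would instead fit a logistic model with covariates, but the exponentiated coefficients are interpreted the same way.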
Introduction: The number of seniors presenting to emergency departments after a fall is increasing. Head injury concerns in this population often lead to a head CT scan. The CT rate among physicians is variable and the reasons for this are unknown. This study examined the role of patient characteristics and country of practice in the decision to order a CT. Methods: This study used a case-based survey of physicians across multiple countries. Each survey included 9 cases pertaining to an 82-year-old man who falls. Each case varied in one aspect compared to a base case (aspirin, warfarin, or rivaroxaban use, occipital hematoma, amnesia, dementia, and fall with no head trauma). For each case, participants indicated how “likely” they were to order a head CT scan, measured on a 100-point scale. A response of 80 or more was defined a priori as ‘likely to order a CT scan’. The survey was piloted among emergency residents for feedback on design and comprehension, and was published in French and English. Recruitment was through the Canadian Association of Emergency Physicians, Twitter and CanadiEM. For each case we compared the proportion of physicians who were ‘likely to scan’ relative to the base case. We also compared the proportion of participants who were ‘likely to scan’ each case in the USA, UK and Australia, relative to Canada. Results: Data were collected from 484 respondents (Canada-308, USA-64, UK-67, Australia-27, and 18 from other countries). Social media distribution limited our ability to estimate the response rate. Physicians were most likely to scan in the anticoagulation cases (90% likely to order a scan, compared to 36% for the base case; p < 0.001). Other features associated with increased scans were occipital hematoma (48%), multiple falls (68%), and amnesia (68%) (all p < 0.005). Compared to Canada, US physicians were more likely to order CT scans for all cases (p < 0.05).
Compared to Canada, UK physicians were significantly less likely to order CT for patients in every case except in the patient with amnesia. Finally, Australian physicians differed from Canadian physicians only for the occipital hematoma case, where they were significantly more likely to order a CT scan. Conclusion: Anticoagulation, amnesia and a history of multiple falls appear to drive the ordering of head CT scans in elderly patients who have fallen. We observed variations in practice between countries. Future clinical decision rules will likely have variable impact on head CT scan rates depending on baseline practice variation.
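The between-country comparisons of 'likely to scan' proportions can be illustrated with a standard two-proportion z-test. This is a sketch with made-up counts, not the survey's data, and `two_proportion_z` is a hypothetical helper:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test comparing two independent binomial proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p_value

# Illustrative counts only: 45/64 'likely to scan' in one country
# vs. 111/308 in another.
z, p = two_proportion_z(45, 64, 111, 308)
print(f"z = {z:.2f}, p = {p:.4f}")
```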
Introduction: Simulation has assumed an integral role in the Canadian healthcare system, with applications in quality improvement, systems development, and medical education. High-quality simulation-based research (SBR) is required to ensure the effective and efficient use of this tool. This study sought to establish national SBR priorities and describe the barriers and facilitators of SBR in Emergency Medicine (EM) in Canada. Methods: Simulation leads (SLs) from all fourteen Canadian Departments or Divisions of EM associated with an adult FRCP-EM training program were invited to participate in three surveys and a final consensus meeting. The first survey documented active EM SBR projects. Rounds two and three established and ranked priorities for SBR and identified the perceived barriers and facilitators to SBR at each site. Surveys were completed by SLs at each participating institution, and priority research themes were reviewed by senior faculty for broad input and review. Results: Twenty SLs representing all 14 invited institutions participated in all three rounds of the study. Sixty active SBR projects were identified, an average of 4.3 per institution (range 0-17). Forty-nine priorities for SBR in Canada were defined and summarized into seven priority research themes. An additional theme was identified by the senior reviewing faculty. Forty-one barriers and 34 facilitators of SBR were identified and grouped by theme. Fourteen SLs representing 12 institutions attended the consensus meeting and vetted the final list of eight priority research themes for SBR in Canada: simulation in CBME, simulation for interdisciplinary and inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology.
Conclusion: This study has summarized the current SBR activity in EM in Canada, as well as its perceived barriers and facilitators. We also provide a consensus on priority research themes in SBR in EM from the perspective of Canadian simulation leaders. This group of SLs has formed a national simulation-based research group which aims to address these identified priorities with multicenter collaborative studies.
The commercially available collar device MooMonitor+ was evaluated with regard to accuracy and application potential for measuring grazing behavior. These automated measurements are crucial because cows’ feed intake behavior at pasture is an important parameter of animal performance, health and welfare, as well as being an indicator of feed availability. Compared to laborious and time-consuming visual observation, the continuous and automated measurement of grazing behavior may support and improve the grazing management of dairy cows on pasture. Therefore, two experiments and a literature analysis were conducted to evaluate the MooMonitor+ under grazing conditions. The first experiment compared the automated measurement of the sensor against visual observation. In the second experiment, the MooMonitor+ was compared to a noseband sensor (RumiWatch), which also allows continuous measurement of grazing behavior. The first experiment, on n = 12 cows, revealed that the automated sensor MooMonitor+ and visual observation were highly correlated, as indicated by a Spearman’s rank correlation coefficient (rs) of 0.94 and a concordance correlation coefficient (CCC) of 0.97 for grazing time. An rs-value of 0.97 and a CCC of 0.98 were observed for rumination time. In the second experiment, with n = 12 cows over 24-h periods, a high correlation between the MooMonitor+ and the RumiWatch was observed for grazing time, as indicated by an rs-value of 0.91 and a CCC-value of 0.97. Similarly, a high correlation was observed for rumination time, with an rs-value of 0.96 and a CCC-value of 0.99. While a higher level of agreement between the MooMonitor+ and both visual observation and RumiWatch was observed for rumination time compared to grazing time, the overall results showed a high level of accuracy of the collar device in measuring grazing and rumination times. Therefore, the collar device can be applied to monitor cow behavior at pasture on farms.
With regard to the application potential of the collar device, it may be used not only on commercial farms but also for research questions when a data resolution of 15 min is sufficient. Thus, at farm level, the farmer can obtain an accurate and continuous measurement of the grazing behavior of each individual cow and may then use those data for decision-making to optimize animal management.
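The two agreement statistics reported above, Spearman's rs and Lin's CCC, can be sketched as follows. The paired grazing-time values are illustrative, not study data, and the tie-free Spearman implementation is a simplifying assumption:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement methods."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

def spearman_rs(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks
    (assumes no tied values)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

# Illustrative daily grazing minutes: visual observer vs. collar sensor.
observed = [412, 388, 305, 450]
sensor = [405, 392, 300, 446]
print(round(spearman_rs(observed, sensor), 2), round(lins_ccc(observed, sensor), 2))
```

Unlike plain correlation, the CCC penalises both location and scale shifts between the two methods, which is why it is the standard choice for device-validation studies like this one.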
Infant protein intake has been associated with child growth; however, research on maternal protein intake during pregnancy is limited. Insulin-like growth factors (IGF) play a role in early fetal development, and maternal protein intake may influence child body composition via IGF-1. The aim of this study was to investigate the association of maternal protein intake throughout pregnancy with cord blood IGF-1 and child body composition from birth to 5 years of age. Analysis was carried out on 570 mother–child dyads from the Randomised cOntrol trial of LOw glycaemic index diet study. Protein intake was recorded using 3-d food diaries in each trimester of pregnancy, and protein intake per kg of maternal weight (g/d per kg) was calculated. Cord blood IGF-1 was measured at birth. Infant anthropometry was measured at birth, 6 months, 2 and 5 years of age. Mixed modelling, linear regression, and mediation analysis were carried out. Birth weight centiles were positively associated with early-pregnancy protein intake (g/d per kg), while weight centiles from 6 months to 5 years were negatively associated (B=−21·6, P<0·05). These associations were not mediated by IGF-1. Our findings suggest that high protein intake in early-pregnancy may exert an in utero effect on offspring body composition with a higher weight initially at birth but slower growth rates into childhood. Further research is needed to elucidate the exact mechanisms by which dietary protein modulates fetal growth.
Important Bird and Biodiversity Areas (IBAs) are sites identified as being globally important for the conservation of bird populations on the basis of an internationally agreed set of criteria. We present the first review of the development and spread of the IBA concept since it was launched by BirdLife International (then ICBP) in 1979 and examine some of the characteristics of the resulting inventory. Over 13,000 global and regional IBAs have so far been identified and documented in terrestrial, freshwater and marine ecosystems in almost all of the world’s countries and territories, making this the largest global network of sites of significance for biodiversity. IBAs have been identified using standardised, data-driven criteria that have been developed and applied at global and regional levels. These criteria capture multiple dimensions of a site’s significance for avian biodiversity and relate to populations of globally threatened species (68.6% of the 10,746 IBAs that meet global criteria), restricted-range species (25.4%), biome-restricted species (27.5%) and congregatory species (50.3%); many global IBAs (52.7%) trigger two or more of these criteria. IBAs range in size from < 1 km² to over 300,000 km² and have an approximately log-normal size distribution (median = 125.0 km², mean = 1,202.6 km²). They cover approximately 6.7% of the terrestrial, 1.6% of the marine and 3.1% of the total surface area of the Earth. The launch in 2016 of the KBA Global Standard, which aims to identify, document and conserve sites that contribute to the global persistence of wider biodiversity, and whose criteria for site identification build on those developed for IBAs, is a logical evolution of the IBA concept. The role of IBAs in conservation planning, policy and practice is reviewed elsewhere.
Future technical priorities for the IBA initiative include completion of the global inventory, particularly in the marine environment, keeping the dataset up to date, and improving the systematic monitoring of these sites.
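Since a log-normal distribution has median e^μ and mean e^(μ+σ²/2), the reported median and mean IBA sizes pin down its two parameters directly. A quick back-calculation, assuming the approximately log-normal fit the review describes:

```python
import math

# Reported IBA size statistics (km^2); for a log-normal distribution,
# median = exp(mu) and mean = exp(mu + sigma^2 / 2).
median_km2 = 125.0
mean_km2 = 1202.6

mu = math.log(median_km2)                                # location parameter
sigma = math.sqrt(2 * math.log(mean_km2 / median_km2))   # shape parameter
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")  # → mu = 4.83, sigma = 2.13
```

The large σ is consistent with the enormous reported size range, from under 1 km² to over 300,000 km².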
Mycobacterium ulcerans is recognised as the third most common mycobacterial infection worldwide. It causes necrotising infections of skin and soft tissue and is classified as a neglected tropical disease by the World Health Organization (WHO). However, despite extensive research, the environmental reservoir of the organism and the mode of transmission of the infection to humans remain unknown. This limits the ability to design and implement public health interventions to effectively and consistently prevent the spread and reduce the incidence of this disease. In recent years, the epidemiology of the disease has changed. In most endemic regions of the world, the number of cases reported to the WHO is declining, with a 64% reduction in cases reported worldwide in the last 9 years. Conversely, in a smaller number of countries including Australia and Nigeria, reported cases are increasing at a rapid rate, new endemic areas continue to appear, and in Australia cases are becoming more severe. The reasons for this changing epidemiology are unknown. We review the epidemiology of M. ulcerans disease worldwide and document recent changes. We also outline and discuss the current state of knowledge on the ecology of M. ulcerans, possible transmission mechanisms to humans and what may be enabling the spread of M. ulcerans into new endemic areas.
Amorphous TiO2 and SnO2 electron transport layers (ETLs) were deposited by low-temperature atomic layer deposition (ALD). Surface morphology analysis and X-ray photoelectron spectroscopy (XPS) indicate uniform and pinhole-free coverage of these ALD hole-blocking layers. Both mesoporous and planar perovskite solar cells were fabricated based on these thin films, with aperture areas of 1.04 cm² for TiO2 and 0.09 cm² and 0.70 cm² for SnO2. The resulting cell performance of 18.3% power conversion efficiency (PCE) using planar SnO2 on 0.09 cm² and 15.3% PCE using mesoporous TiO2 on 1.04 cm² active areas is discussed in conjunction with the significance of growth parameters and ETL composition.
We compared sepsis “time zero” and Centers for Medicare and Medicaid Services (CMS) SEP-1 pass rates among 3 abstractors in 3 hospitals. Abstractors agreed on time zero in 29 of 80 (36%) cases. Perceived pass rates ranged from 9 of 80 cases (11%) to 19 of 80 cases (23%). Variability in time zero and perceived pass rates limits the utility of SEP-1 for measuring quality.
Current standard-of-care for glioblastoma (GBM) includes surgery, radiation and temozolomide. Most tumors recur within a year from diagnosis and median survival for recurrent GBM (rGBM) is 3-9 months. Unmethylated promoter status for O6-methylguanine-DNA-methyltransferase (MGMT) is a validated biomarker for temozolomide-resistance, exhibited by most GBM patients. VAL-083 is a DNA-targeting agent with a mechanism-of-action that is independent of MGMT. VAL-083 overcomes temozolomide-resistance in GBM cell-lines, cancer stem cells, and in vivo models. VAL-083 readily crosses the blood-brain barrier and accumulates in brain-tumor tissue. We recently completed a VAL-083 dose-escalation trial in temozolomide- and bevacizumab-refractory rGBM and determined that 40 mg/m²/day given intravenously on days 1, 2, 3 of a 21-day cycle is generally well-tolerated. This dosing regimen was selected for subsequent GBM trials, including an ongoing single-arm, biomarker-driven Phase 2 trial (N=48) in temozolomide-refractory, bevacizumab-naïve, MGMT-unmethylated rGBM (ClinicalTrials.gov: NCT02717962). The primary objective of this study is to determine if VAL-083 improves OS compared to a historical control of 7.15 months for MGMT-unmethylated rGBM patients treated with lomustine (EORTC26101). In addition, another single-arm, biomarker-driven Phase 2 study (N=25) of VAL-083 in combination with radiotherapy in newly diagnosed, MGMT-unmethylated GBM is ongoing (ClinicalTrials.gov: NCT03050736). This trial aims to determine a dose for further study of VAL-083 in combination with radiotherapy and explore if VAL-083 improves PFS and OS compared to historical results in newly diagnosed GBM. Enrollment and safety data updates will be provided at the meeting. The results of these studies, if successful, may support VAL-083 as part of a new chemotherapeutic treatment paradigm for GBM.
To date, Ireland has been a leading light in the provision of youth mental health services. However, notwithstanding the efforts of governmental and non-governmental agencies working in youth mental health, there is much to be done. Barriers to care, as well as discontinuity of care across the spectrum of services, remain key challenges. This editorial provides guidance for the next stage of development in youth mental health care and support, which will require significant national engagement and resource investment.
Introduction: Inspired by the Choosing Wisely® campaign, St. Michael’s Hospital (SMH) launched an initiative to reduce unnecessary tests, treatments and procedures that may cause patient harm. Stakeholder engagement identified inappropriate ordering of urine culture & sensitivities (C&S) in the emergency department (ED) as a focus area. Inappropriate urine C&S increase workload, healthcare costs and detection of asymptomatic bacteriuria, which can lead to unnecessary antibiotics. The project’s purposes were to describe the scope of inappropriately ordered urine C&S in the SMH ED and to conduct a root-cause analysis to inform future quality improvement interventions. Methods: Criteria for determining appropriateness were developed a priori using evidence-based guidelines from the University Health Network together with additional literature review. A retrospective chart review was performed on all urine C&S ordered in the ED from Jun 1 to Aug 30, 2016. Each chart was reviewed for order appropriateness, demographic information and ordering provider. All inappropriate urine C&S were reviewed to identify root causes, which were then grouped into common themes. A Pareto chart was constructed to analyze the frequency of causes. Results: Of 425 urine C&S ordered, 75 (17.7%) were inappropriate. The top 3 reasons were: inappropriate urosepsis work-ups (53%), order processing errors (17%) and inappropriate work-ups for weakness (16%). Inappropriate urosepsis work-ups were defined as urine C&S that were ordered empirically despite there being a clear focus for infection elsewhere (e.g. cough, cellulitis) and in the absence of urinary symptoms. Order processing errors were defined as urine C&S which were sent despite there being no documented order. Inappropriate testing was more likely to occur overnight, in females and when a urine routine and microscopy was not ordered prior to C&S. Of patients with inappropriate C&S, 29% received antibiotics.
Conclusion: 17.7% of urine C&S ordered in the SMH ED during the 3-month study period were inappropriate. The top cause was septic patients who were empirically tested despite having another source for infection identified from the outset. A possible reason for this is the recent ED emphasis on early recognition of sepsis which may encourage early use of antibiotics and empiric urine C&S. One question to resolve is whether a 17.7% overutilization rate is sufficient to make it a target for change. Interventions designed to reduce inappropriate urine C&S may inadvertently increase the number of missed cultures in patients admitted with sepsis not yet diagnosed. Next steps involve discussions between the ED, Internal Medicine, Infectious Disease and Microbiology, and patient partners to identify patient-centered change ideas and sustainable strategies. This may involve establishing guidelines for ordering urine C&S and incorporating lab services to provide oversight into urine C&S processing.
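The Pareto analysis described above ranks root causes by frequency and tracks their cumulative share, so that the few causes accounting for most inappropriate orders stand out. A minimal sketch; the top three shares are from the review, while the "other" remainder (14%) is an assumption so the shares sum to 100:

```python
# Root-cause shares (% of inappropriate urine C&S orders).
causes = {
    "inappropriate urosepsis work-up": 53,
    "order processing error": 17,
    "inappropriate weakness work-up": 16,
    "other": 14,   # assumed remainder, not reported in the abstract
}

# Sort descending by share and accumulate, as a Pareto chart would.
cumulative = 0
for cause, share in sorted(causes.items(), key=lambda kv: -kv[1]):
    cumulative += share
    print(f"{cause:33s} {share:3d}%  cumulative {cumulative:3d}%")
```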
Introduction: Head injury is a common presentation to all emergency departments. Previous research has shown that such injuries may be complicated by delayed intracranial hemorrhage (D-ICH) after the initial scan is negative. Exposure to anticoagulant or anti-platelet medications (ACAP) may be a risk factor for D-ICH. We have conducted a systematic review and meta-analysis to determine the incidence of delayed traumatic intracranial hemorrhage in patients taking anticoagulants, anti-platelets or both. Methods: The literature search was conducted in March 2017 with an update in April 2017. Keyword and MeSH terms were used to search OVID Medline, Embase and the Cochrane database, as well as grey literature sources. All cohort and experimental studies were eligible for selection. Inclusion criteria included pre-injury exposure to oral anticoagulant and/or anti-platelet medication and a negative initial CT scan of the brain (CT1). The primary outcome was delayed intracranial hemorrhage present on repeat CT scan (CT2) within 48 hours of the presentation. Only patients who underwent a repeat scan or, at a minimum, a period of observation were included. Clinically significant D-ICH were those that required neurosurgery, caused death or necessitated a change in management strategy, such as admission. Results: Fifteen primary studies were ultimately identified, comprising a total of 3801 patients. Of this number, 2111 had a control CT scan. Thirty-nine cases of D-ICH were identified, with the incidence of D-ICH calculated to be 1.31% (95% CI [0.56, 2.27]). No more than 12 of these patients had a clinically significant D-ICH, representing 0.09% (95% CI [0.00, 0.31]). Ten of them were on warfarin and two on aspirin. There were three deaths recorded and three patients needed neurosurgery. Conclusion: The relatively low incidence suggests that repeat CT should not be mandatory for patients without ICH on the first CT. This is further supported by the negligibly low rate of clinically significant D-ICH.
Evidence-based assessments should be utilised to indicate the appropriate discharge plan, with further research required to guide the balance between clinical observation and repeat CT.
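The reported 1.31% incidence is a meta-analytic pooled estimate across the fifteen studies; a crude single-proportion interval on the raw counts (39 D-ICH among 2111 rescanned patients) differs from it, but shows the general shape of the calculation. A sketch using the Wilson score interval:

```python
import math

def wilson_ci(x, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion x/n."""
    p = x / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Crude pooled counts from the abstract, not the random-effects estimate.
lo, hi = wilson_ci(39, 2111)
print(f"crude incidence {39 / 2111:.2%}, 95% CI {lo:.2%}-{hi:.2%}")
```

A proper meta-analysis would weight each study's proportion and allow for between-study heterogeneity, which is why the published pooled value sits below this crude figure.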
We present a multi-frequency study of the intermediate spiral SAB(r)bc type galaxy NGC 6744, using available data from the Chandra X-Ray telescope, radio continuum data from the Australia Telescope Compact Array and Murchison Widefield Array, and Wide-field Infrared Survey Explorer infrared observations. We identify 117 X-ray sources and 280 radio sources. Of these, we find nine sources in common between the X-ray and radio catalogues, one of which is a faint central black hole with a bolometric radio luminosity similar to the Milky Way’s central black hole. We classify 5 objects as supernova remnant (SNR) candidates, 2 objects as likely SNRs, 17 as H II regions, 1 source as an AGN; the remaining 255 radio sources are categorised as background objects and one X-ray source is classified as a foreground star. We find the star-formation rate (SFR) of NGC 6744 to be in the range 2.8–4.7 M⊙ yr−1, signifying the galaxy is still actively forming stars. The specific SFR of NGC 6744 is greater than that of late-type spirals such as the Milky Way, but considerably less than that of a typical starburst galaxy.
Early detection of karyotype abnormalities, including aneuploidy, could aid producers in identifying animals which, for example, would not be suitable candidate parents. Genome-wide genetic marker data in the form of single nucleotide polymorphisms (SNPs) are now being routinely generated on animals. The objective of the present study was to describe the statistics that could be generated from the allele intensity values from such SNP data to diagnose karyotype abnormalities; of particular interest was whether detection of aneuploidy was possible with both commonly used genotyping platforms in agricultural species, namely the Applied Biosystems™ Axiom™ platform and the Illumina platform. The hypothesis was tested using a case study of a set of dizygotic X-chromosome monosomy 53,X sheep twins. Genome-wide SNP data were available from the Illumina platform (11 082 autosomal and 191 X-chromosome SNPs) on 1848 male and 8954 female sheep, and from the Axiom™ platform (11 128 autosomal and 68 X-chromosome SNPs) on 383 female sheep. Genotype allele intensity values, either as their original raw values or transformed to the logarithm intensity ratio (LRR), were used to accurately diagnose two dizygotic (i.e. fraternal) twin 53,X sheep, both of which received their single X chromosome from their sire. This is the first reported case of 53,X dizygotic twins in any species. Relative to the X-chromosome SNP genotype mean allele intensity values of normal females, the mean allele intensity value of SNP genotypes on the X chromosome of the two females monosomic for the X chromosome was 7.45 to 12.4 standard deviations lower, and was easily detectable using either the Axiom™ or Illumina genotype platform; the next lowest mean allele intensity value of a female was 4.71 or 3.3 standard deviations below the population mean, depending on the platform used.
Both 53,X females could also be detected based on the genotype LRR, although this was more easily done when comparing the mean LRR of the X chromosome of each female to the mean LRR of her respective autosomes. On autopsy, the ovaries of the two sheep were small for their age and no evidence of prior ovulation was found. In both sheep, the density of primordial follicles in the ovarian cortex was lower than normally found in ovine ovaries and primary follicle development was not observed. Mammary gland development was very limited. Results substantiate previous findings in other species that aneuploidy can be readily detected using SNP genotype allele intensity values that are generally already available, and the approach proposed in the present study was agnostic to genotype platform.
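The detection logic described above, flagging animals whose mean X-chromosome allele intensity falls many standard deviations below the normal-female reference, can be sketched as a simple screen. This is a simplified illustration with made-up intensities, not the study's pipeline; the 5-SD threshold is an assumption chosen to sit between the reported outlier (7.45-12.4 SD) and next-lowest normal (3.3-4.71 SD) values:

```python
import numpy as np

def flag_low_x_intensity(sample_means, reference_means, threshold_sd=5.0):
    """Flag samples whose mean X-chromosome allele intensity lies more than
    threshold_sd standard deviations below a reference (normal female)
    population -- a simple screen for candidate 53,X monosomy cases."""
    ref = np.asarray(reference_means, dtype=float)
    mu, sd = ref.mean(), ref.std()
    z = (np.asarray(sample_means, dtype=float) - mu) / sd
    return [i for i, zi in enumerate(z) if zi < -threshold_sd]

# Illustrative values only: one animal with roughly half the usual intensity,
# as expected when a single X chromosome is present instead of two.
reference = [0.98, 0.99, 1.00, 1.01, 1.02]
print(flag_low_x_intensity([1.00, 0.50], reference))  # → [1]
```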
The majority of people living with dementia in Ireland reside in their own homes, some supported by formal or informal home care. This audit aimed to estimate the prevalence of dementia and suspected cognitive impairment (CI) among older adults, 65+ years, in receipt of formal home care (domiciliary care) in a defined health service area in North Dublin. A secondary objective of the audit was to explore factors associated with dementia or CI in this cohort.
A cross-sectional audit was conducted on all clients aged 65+ years actively receiving publicly funded home care packages (HCPs) during May 2016 in Healthcare Service Executive CHO9 Dublin North Central. A total of 935 urban community dwelling older adults were included in the study [mean age 83.7 (s.d. 7.4) years and 65% female]. Basic socio-demographic and health data were extracted from common summary assessment reports. Service users were categorised as having (a) dementia if a diagnosis of dementia or cognitive decline which impacts on independent living, was documented by a health professional or (b) suspected CI where a validated cognitive screening tool was applied and the score was indicative of mild CI.
Overall, the estimated prevalence of dementia and suspected CI was 37.1% and 8.7%, respectively. Factors significantly associated with dementia and suspected CI were higher dependency and home care hours, communication difficulty and being non-self-caring (p<0.001). Notably, half (51.6%) of those with either dementia or suspected CI lived alone.
Our findings suggest a high prevalence of dementia among HCP users, highlighting a need and opportunity for dementia-specific approaches to support older people in their homes.
In the context of water use for agricultural production, water footprints (WFs) have become an important sustainability indicator. To better understand the water demand for beef and sheep meat produced in pasture-based systems, a WF of individual farms is required. The main objective of this study was to determine the primary contributors to freshwater consumption up to the farm gate, expressed as a volumetric WF and associated impacts, for the production of 1 kg of beef and 1 kg of sheep meat from a selection of pasture-based farms for 2 consecutive years, 2014 and 2015. The WF included green water, from the consumption of soil moisture due to evapotranspiration, and blue water, from the consumption of ground and surface waters. The impact of freshwater consumption on global water stress from the production of beef and sheep meat in Ireland was also computed. The average WF of the beef farms was 8391 l/kg carcass weight (CW), of which 8222 l/kg CW was green water and 169 l/kg CW was blue water; water for the production of pasture (including silage and grass) contributed 88% to the WF, concentrate production 10% and on-farm water use 1%. The average stress-weighted WF of beef was 91 l H2O eq/kg CW, implying that each kg of beef produced in Ireland contributed to freshwater scarcity equivalent to the consumption of 91 l of freshwater by an average world citizen. The average WF of the sheep farms was 7672 l/kg CW, of which 7635 l/kg CW was green water and 37 l/kg CW was blue water; water for the production of pasture contributed 87% to the WF, concentrate production 12% and on-farm water use 1%. The average stress-weighted WF was 2 l H2O eq/kg CW for sheep. This study also evaluated the sustainability of recent intensification initiatives in Ireland and found that increases in productivity were supported through an increase in green water use and higher grass yields per hectare on both beef and sheep farms.
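The volumetric bookkeeping above is simple addition of the green and blue components. A sketch using the reported beef-farm averages; back-calculating an average water-stress characterisation factor from the reported figures assumes the stress weighting applies to blue water only, which the text implies but does not state:

```python
# Volumetric WF from the reported beef-farm averages (l/kg carcass weight).
green_wf = 8222   # soil moisture consumed via evapotranspiration
blue_wf = 169     # ground and surface water consumed
total_wf = green_wf + blue_wf
print(f"total WF: {total_wf} l/kg CW")  # → total WF: 8391 l/kg CW

# The stress-weighted WF scales blue-water consumption by a water-stress
# characterisation factor; back-calculating from the reported beef figures:
stress_weighted_wf = 91                        # l H2O eq/kg CW, as reported
implied_factor = stress_weighted_wf / blue_wf  # l H2O eq per l blue water
print(f"implied characterisation factor: {implied_factor:.2f}")
```

The much lower sheep figure (2 l H2O eq/kg CW from 37 l blue water) shows the factor varies with where and when the blue water is drawn, so this single back-calculated value is only indicative.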