Case management has been an integral part of psychiatric practice in the United States for over a decade and has generated a large body of literature. The application of case management principles to the care of people suffering from psychiatric disorders is becoming increasingly popular in the United Kingdom and Europe, and a literature is now beginning to emerge. However, no definitive statements about the efficacy of case management have been made, owing to a range of conceptual and methodological problems. The present paper is a critical review of the case management outcome literature. Reported outcomes are reviewed in the context of study design and service characteristics. The authors conclude that case management can have at least some impact on patients' use of services (including a marked decrease in in-patient bed days); satisfaction with services; engagement with services; and social networks and relationships when it is delivered as a direct, clinical service with high staff-to-patient ratios. A set of recommendations is suggested for the future practice and presentation of research into case management.
A primary barrier to translation of clinical research discoveries into care delivery and population health is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to reduce silos in knowledge and action. As the National Institutes of Health's (NIH) mechanism to advance translational research, Clinical and Translational Science Award (CTSA) awardees are uniquely positioned to bridge this gap. Delivering on this promise requires sustained collaboration and alignment between research institutions and public health and healthcare programs and services. We describe the collaboration of seven CTSA hubs with city, county, and state healthcare and public health organizations striving to realize this vision together. Partnership representatives convened monthly to identify key components, common and unique themes, and barriers in academic–public collaborations. All partnerships aligned the activities of the CTSA programs with the needs of the city/county/state partners by sharing resources, responding to real-time policy questions and training needs, promoting best practices, and advancing community-engaged research and dissemination and implementation science to narrow the knowledge-to-practice gap. Barriers included competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. Academic–public health/health system partnerships represent a unique and underutilized model with potential to enhance community and population health.
A low finishing weight and poor carcass characteristics are major causes of lower incomes in extensive sheep flocks; however, the use of terminal sire crossbreeding would improve lamb performance and carcass traits under these conditions. The aim of this study was to evaluate sire breed effects on the performance of lambs born to Corriedale ewes in extensive sheep systems in Western Patagonia. A total of 10 Corriedale, 10 Dorset, nine Suffolk and seven Texel sires, 16 of which were under a genetic recording scheme and 20 selected from flocks not participating in genetic improvement programmes, were used across six commercial farms for 2 successive years. Data were collected from 685 lambs of the four resulting genotypes. Overall, Corriedale lambs were 0.47 kg lighter at birth than crossbred lambs (P<0.001). Suffolk and Texel sired lambs required more assistance (P<0.01) at birth than Corriedale or Dorset sired lambs, with Suffolk sired lambs requiring the most assistance (8%). Ewes mated to Suffolk rams had larger (P<0.05) litters than ewes mated to Texel or Corriedale rams. Lamb live weight gain from birth to weaning was higher (P<0.001) in crossbred lambs than in Corriedale lambs; consequently, crossbred lambs averaged 2.9 kg heavier BW (P<0.001) than Corriedale lambs. A significant sire breed × sire source interaction was detected for lamb live weight gain (P<0.05) and lamb live weight at weaning (P<0.01), showing that the heaviest lambs were from recorded sires, except for Suffolk crossbred lambs. Mortality rate to weaning was highest (P<0.05) in Suffolk cross lambs (31%), with Corriedale lambs showing the lowest (17%) mortality. Terminal sire breeds increased (P<0.001) cold carcass weight, with 13.8, 16.0, 15.2 and 14.9 kg for the Corriedale, Dorset, Suffolk and Texel sired lambs, respectively. Carcass length, kidney knob and channel fat, fat grade, grade rule and fat depth measurements were not affected by sire breed (P>0.05).
Carcass conformation was higher in Texel sired lambs compared with Corriedale lambs (P<0.05), with Dorset and Suffolk sired lambs being intermediate. Crossbred lambs showed a larger (P<0.001) eye muscle than Corriedale lambs. Commercial cuts were affected by sire breed, as a result of the Corriedale lambs being smaller and having lighter carcasses than crossbred lambs. Significant improvements in lamb weights at weaning and carcass traits could be expected when using a terminal sire on Corriedale ewes in Western Patagonia. However, no advantages were detected with the use of recorded sires under these production systems.
Dementia cases are increasing worldwide; thus, investigators seek to identify interventions that might prevent or ameliorate cognitive decline in later life. Extensive research confirms the benefits of physical exercise for brain health, yet only a fraction of older adults exercise regularly. Interactive mental and physical exercise, as in aerobic exergaming, not only motivates, but has also been found to yield cognitive benefit above and beyond traditional exercise. This pilot study sought to investigate whether greater cognitive challenge while exergaming would yield differential outcomes in executive function and generalize to everyday functioning. Sixty-four community-based older adults (mean age=82) were randomly assigned to pedal a stationary bike while interactively engaging on-screen with: (1) a low cognitive demand task (bike tour), or (2) a high cognitive demand task (video game). Executive function (indices from Trails, Stroop and Digit Span) was assessed before and after both a single bout of exercise and a 3-month exercise intervention. Significant group × time interactions were found after a single bout (Color Trails) and after 3 months of exergaming (Stroop; among 20 adherents). Those in the high cognitive demand group performed better than those in the low cognitive dose condition. Everyday function improved across both exercise conditions. Pilot data indicate that for older adults, cognitive benefit while exergaming increased concomitantly with higher doses of interactive mental challenge. (JINS, 2015, 21, 768–779)
The aim of this study was to examine cross-sectionally whether higher cardiorespiratory fitness (CRF) might favorably modify amyloid-β (Aβ)-related decrements in cognition in a cohort of late-middle-aged adults at risk for Alzheimer’s disease (AD). Sixty-nine enrollees in the Wisconsin Registry for Alzheimer’s Prevention participated in this study. They completed a comprehensive neuropsychological exam, underwent 11C Pittsburgh Compound B (PiB)-PET imaging, and performed a graded treadmill exercise test to volitional exhaustion. Peak oxygen consumption (VO2peak) during the exercise test was used as the index of CRF. Forty-five participants also underwent lumbar puncture for collection of cerebrospinal fluid (CSF) samples, from which Aβ42 was immunoassayed. Covariate-adjusted regression analyses were used to test whether the association between Aβ and cognition was modified by CRF. There were significant VO2peak × PiB-PET interactions for Immediate Memory (p=.041) and Verbal Learning & Memory (p=.025). There were also significant VO2peak × CSF Aβ42 interactions for Immediate Memory (p<.001) and Verbal Learning & Memory (p<.001). Specifically, in the context of high Aβ burden, that is, increased PiB-PET binding or reduced CSF Aβ42, individuals with higher CRF exhibited significantly better cognition compared with individuals with lower CRF. In a late-middle-aged, at-risk cohort, higher CRF is associated with a diminution of Aβ-related effects on cognition. These findings suggest that exercise might play an important role in the prevention of AD. (JINS, 2015, 21, 841–850)
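The covariate-adjusted moderation analysis described above (testing whether CRF modifies the Aβ–cognition association via a VO2peak × amyloid interaction term) can be sketched with ordinary least squares. The data below are simulated and all variable names and numbers are hypothetical illustrations, not the study's data or code:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 69  # sample size reported in the study

# Simulated (hypothetical) predictors
vo2peak = rng.normal(25, 5, n)   # cardiorespiratory fitness index (ml/kg/min)
pib = rng.normal(1.2, 0.2, n)    # PiB-PET amyloid burden

# Hypothetical generative model: amyloid lowers memory scores,
# but higher fitness buffers the effect (positive interaction)
memory = (100 - 20 * (pib - 1.2)
          + 1.0 * (pib - 1.2) * (vo2peak - 25)
          + rng.normal(0, 2, n))

# Moderation model: memory ~ VO2peak + PiB + VO2peak:PiB
X = np.column_stack([np.ones(n), vo2peak, pib, vo2peak * pib])
beta, *_ = np.linalg.lstsq(X, memory, rcond=None)
print(f"interaction coefficient: {beta[3]:.2f}")
# A positive interaction mirrors the study's pattern: at high amyloid
# burden, higher CRF is associated with better cognition.
```

In practice covariates (age, sex, education) would be appended as additional columns of `X`, which is what "covariate-adjusted" denotes.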
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rates ranged from 0·2% to 11·7% among ward and intensive care unit admissions, respectively (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had shorter time to diagnosis (P = 0·02) and initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
To assess the prevalence of traumatic stress experienced by secondary responders to disaster events to determine if mental health education should be included in HAZWOPER training.
Preexisting survey tools for assessing posttraumatic stress disorder (PTSD), resiliency, and mental distress were combined to form a web-based survey tool that was distributed to individuals functioning in secondary response roles. Data were analyzed using the Fisher exact test, 1-way ANOVA, and 1-sample t tests.
Respondents reported elevated PTSD levels (32.9%) as compared to the general population. HAZWOPER-trained responders with disaster work experience were more likely to be classified as PTSD positive as compared to untrained, inexperienced responders and those possessing only training or experience. A majority (68.75%) scored below the mean resiliency level of 80.4 on the Connor-Davidson Resilience Scale. Respondents with only training or both training and experience were more likely to exhibit lower resiliency scores than those with no training or experience. PTSD positivity correlated with disaster experience. Among respondents, 91% indicated support for mental health education.
Given the results of the survey, consideration should be given to the inclusion of pre- and postdeployment mental health education in the HAZWOPER training regimen. (Disaster Med Public Health Preparedness. 2013;0:1-9)
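The Fisher exact test named in the methods above compares proportions (e.g. PTSD positivity across responder groups) in a 2 × 2 contingency table. A minimal self-contained sketch of the two-sided test follows; the counts are hypothetical illustrations, not the survey's data:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_of(x):
        # Hypergeometric probability of x "successes" in row 1
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_of(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Sum the probabilities of all tables at least as extreme as observed
    return sum(p_of(x) for x in range(lo, hi + 1) if p_of(x) <= p_obs + 1e-12)

# Hypothetical counts: rows = trained+experienced vs neither,
# columns = PTSD-positive vs PTSD-negative
p = fisher_exact_two_sided([[20, 30], [8, 42]])
print(f"p = {p:.4f}")
```

The exact test is preferred over a chi-square approximation when cell counts are small, which is common in responder subgroup analyses.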
Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership of young nearby associations where all stars share common kinematic, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low-mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of ≳50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary-mass companions at large separation.
A study was undertaken to investigate the performance of breeding ewes fed a range of forage- and concentrate-based diets in late pregnancy, balanced for supply of metabolizable protein (MP). For the final 6 weeks before lambing, 104 twin-bearing multiparous ewes were offered one of four diets: ad libitum precision-chop grass silage + 0.55 kg/day concentrates (GS); ad libitum maize silage + 0.55 kg/day concentrates (MS); a 1:1 mixture (on a dry matter (DM) basis) of grass silage and maize silage fed ad libitum + 0.55 kg/day concentrates (GSMS); or 1.55 kg/day concentrates + 50 g/day chopped barley straw (C). The crude protein (CP) content of the concentrates was varied between treatments (157 to 296 g/kg DM) with the aim of achieving an MP intake of 130 g/day across all treatments. Compared with ewes fed GS, forage DM intake was higher (P < 0.05) in ewes fed MS (+0.21 kg/day) and GSMS (+0.16 kg/day), resulting in higher (P < 0.001) total DM intakes with these treatments. C ewes had the lowest total DM intake of all the treatments examined (P < 0.001). C ewes lost more live weight (LW; P < 0.001) and body condition score (BCS; P < 0.05) during the first 3 weeks of the study, but there were no dietary effects on ewe LW or BCS thereafter. The incidence of dystocia was lower (P < 0.01) in C ewes compared with those offered silage-based diets (7.5% v. 37.4% of ewes), and was higher (P < 0.01) in ewes fed MS compared with GS or GSMS (50.7%, 34.7% and 26.9%, respectively). There were no significant dietary effects on the plasma metabolite concentrations of ewes in late pregnancy, pre-weaning lamb mortality, weaned lamb output per ewe or lamb growth rate. The results of this study demonstrate that both maize silage and all-concentrate diets can replace grass silage in pregnant ewe rations without impacting on performance, provided the supply of MP is non-limiting. The higher incidence of dystocia in ewes fed maize silage as the sole forage is a concern.
Sixty-five Holstein–Friesian calves were randomly allocated to one of eight nutritional treatments at 4 days of age. In this factorial design study, the treatments comprised four levels of milk replacer (MR) mixed in 6 l of water (500, 750, 1000 and 1250 g/day) × two crude protein (CP) concentrations (230 and 270 g CP/kg dry matter (DM)). MR was fed via automatic teat feeders and concentrates were offered via automated dispensers during the pre-wean period. MR and calf starter concentrate intake were recorded until weaning, with live weight and body measurements recorded throughout the rearing period until heifers entered the dairy herd at a targeted 24 months of age. There was no effect of MR protein concentration on concentrate or MR intake, and no effect on body size or live weight at any stage of development. During the pre-weaning period, for every 100 g increase in MR allowance, concentrate consumption was reduced by 39 g/day. Meanwhile, for every 100 g increase in the amount of MR offered, live weight at days 28 and 270 increased by 0.76 and 2.61 kg, respectively (P < 0.05). Increasing MR feed levels increased (P < 0.05) heart girth and body condition score at recordings during the first year of life, but these effects disappeared thereafter. Increasing MR feeding level tended to reduce both age at first observed oestrus and age at first service, but no significant effect on age at first calving was observed. Neither MR feeding level nor MR CP content affected post-calving live weight or subsequent milk production. Balance measurements conducted using 44 male calves during the pre-weaning period showed that increasing milk allowance increased energy and nitrogen (N) intake, diet DM digestibility, true N digestibility and the biological value of the dietary protein. Increasing the MR protein content had no significant effect on the apparent digestibility of N or DM.
The objectives of this study were to investigate the effects of fish oil supplementation on performance and muscle fatty acid composition of hill lambs finished on grass-based or concentrate-based diets, and to examine the interaction with selenium (Se) status. In September 2006, 180 entire male lambs of mixed breeds were sourced from six hill farms after weaning and finished on five dietary treatments: grazed grass (GG), grass +0.4 kg/day cereal-based concentrate (GC), grass +0.4 kg/day cereal-based concentrate enriched with fish oil (GF), ad libitum cereal-based concentrate (HC) and ad libitum fish oil-enriched concentrate (HF). Within each treatment, half of the lambs were also supplemented with barium selenate by subcutaneous injection. At the start of the trial, the proportion of lambs with a marginal (<0.76 μmol/l) or deficient (<0.38 μmol/l) plasma Se status was 0.84 and 0.39, respectively. Compared with control lambs, GG lambs treated with Se had higher (P < 0.01) plasma Se levels, whereas erythrocyte glutathione peroxidase activity was higher (P < 0.01) for Se-supplemented lambs fed diets GG and GF. However, Se supplementation had no effects on any aspect of animal performance. Fish oil increased (P < 0.05) levels of 22:5n-3 and 22:6n-3 in the Longissimus dorsi of HF lambs but otherwise had no effect on the health attributes of lamb meat. There were no significant effects of fish oil on dry matter intake, animal performance or lamb carcass characteristics. Daily carcass weight gain (CWG; P < 0.001), carcass weight (P < 0.01) and conformation score (P < 0.01) increased with increasing concentrate inputs. Lambs fed concentrate-based diets achieved a higher mean CWG (P < 0.001), dressing proportion (P < 0.001) and carcass weight (P < 0.011), and were slaughtered up to 8.3 days earlier (P < 0.05) and at 1.2 kg lower (P < 0.05) live weight than pasture-fed lambs. 
However, carcasses from grass-fed lambs contained lower levels of perinephric and retroperitoneal fat (P < 0.05), and had less fat over the Iliocostalis thoracis (P < 0.001) and Obliquus internus abdominis (P < 0.05). Meat from grass-fed lambs also had lower levels of 18:2n-6 and total n-6 fatty acids compared with those finished indoors. The results of this study demonstrate that fish oil supplementation has some benefits for the health attributes of meat from lambs fed concentrate-based diets but not grass-based diets. Supplementing Se-deficient lambs with barium selenate will improve Se status of lambs fed zero-concentrate diets, but has no additional benefit when lambs are already consuming their daily Se requirement from concentrates or when fish oil-enriched diets are fed.
Government policies relating to red meat production take account of the carbon footprint, environmental impact, and contributions to human health and nutrition, biodiversity and food security. This paper reviews the impact of grazing on these parameters and their interactions, identifying those practices that best meet governments’ strategic goals. The recent focus of research on livestock grazing and biodiversity has been on reducing grazing intensity on hill and upland areas. Although this produces rapid increases in sward height and herbage mass, changes in structural diversity and plant species are slower, with no appreciable short-term increases in biodiversity so that environmental policies that simply involve reductions in numbers of livestock may not result in increased biodiversity. Furthermore, upland areas rely heavily on nutrient inputs to pastures so that withdrawal of these inputs can threaten food security. Differences in grazing patterns among breeds increase our ability to manage biodiversity if they are matched appropriately to different conservation grazing goals. Lowland grassland systems differ from upland pastures in that additional nutrients in the form of organic and inorganic fertilisers are more frequently applied to lowland pastures. Appropriate management of these nutrient applications is required, to reduce the associated environmental impact. New slurry-spreading techniques and technologies (e.g. the trailing shoe) help reduce nutrient losses but high nitrogen losses from urine deposition remain a key issue for lowland grassland systems. Nitrification inhibitors have the greatest potential to successfully tackle this problem. Greenhouse gas (GHG) emissions are lower from indoor-based systems that use concentrates to shorten finishing periods. The challenge is to achieve the same level of performance from grass-based systems. 
Research has shown potential solutions through the use of forages containing condensed tannins or establishing swards with a high proportion of clover and high-sugar grasses. Relative to feeding conserved forage or concentrates, grazing fresh grass not only reduces GHG emissions but also enhances the fatty acid composition of meat in terms of consumer health. It is possible to influence biodiversity, nutrient utilisation, GHG emissions and the nutritional quality of meat in grass-based systems, but each of these parameters is intrinsically linked and should not be considered in isolation. Interactions between these parameters must be considered carefully when policies are being developed, in order to ensure that strategies designed to achieve positive gains in one category do not lead to a negative impact in another. Some win–win outcomes are identified.
The objectives of this study were to investigate the effect of dietary lipid source on the growth and carcass characteristics of lambs sourced from a range of crossbred hill ewes. Over a 2-year period, 466 lambs representing the progeny of Scottish Blackface (BF × BF), Swaledale (SW) × BF, North Country Cheviot (CH) × BF, Lleyn (LL) × BF and Texel (T) × BF ewes were sourced from six commercial hill flocks and finished on one of four diets: grass pellets (GP), cereal-based concentrate (CC), CC enriched with oilseed rape (CR) and CC enriched with fish oil (CF). Dry matter intake (DMI) was highest (P < 0.001) in lambs offered GP; however, carcass weight gain (CWG) and feed conversion efficiency were higher (P < 0.001) in lambs fed concentrate-based diets. For lambs offered concentrate-based diets, DMI and live weight gain were lower (P < 0.001) for CF than CC or CR. Lambs with T × BF dams achieved a higher (P < 0.05) daily CWG and CWG/kg DMI than BF × BF, SW × BF or LL × BF dams. When lambs were slaughtered at fat score 3, CH × BF, LL × BF and T × BF dams increased carcass weight by 0.8 to 1.4 kg (P < 0.001) and conformation score (CS) by 0.2 to 0.4 units (P < 0.001) compared with BF × BF or SW × BF dams. However, breed effects on carcass conformation were reduced by 50% when lambs were slaughtered at a constant carcass weight. Diets CC and CR increased carcass weight by 0.8 to 1.6 kg (P < 0.001) and CS by 0.1 to 0.3 units (P < 0.001) compared with GP and CF. Both dam breed and dietary effects on carcass conformation were associated with an increase (P < 0.001) in shoulder width of the lambs. Lambs fed CF and slaughtered at a constant carcass weight had more subcutaneous fat over the Longissimus dorsi (P < 0.05), Iliocostalis thoracis (P < 0.001) and Obliquus internus abdominis (P < 0.001) compared with those fed CC. However, these effects were removed when lambs were slaughtered at a constant fat score.
At both endpoints, lambs from T × BF dams contained less (P < 0.05) perinephric and retroperitoneal fat than SW × BF or LL × BF dams fed GP or CC, respectively. The results from this study show that using crossbred ewes sired by CH, LL or T sires will increase carcass weight and improve carcass conformation of lambs sourced from hill flocks. Inclusion of oilseed rape in lamb finishing diets had only minor effects on performance compared with a standard CC but feeding fish oil or GP impacted negatively on lamb growth and carcass quality.
The aim of this study was to evaluate the effects of age and breed on the reproductive performance and lamb output of crossbred hill ewes relative to purebred Scottish Blackface (BF). BF ewes were compared with Swaledale (SW) × BF, North Country Cheviot (CH) × BF, Lleyn (LL) × BF and Texel (T) × BF ewes on six commercial hill farms across Northern Ireland, on which all the ewes were born and reared. Ewes were mated to a range of sire breeds, balanced across breeds, for up to five successive breeding seasons. Mature live weight of adult BF, SW × BF, CH × BF, LL × BF and T × BF ewes was 52.8, 54.9, 60.3, 55.6 and 58.6 kg (P < 0.001), respectively. Compared with the pure BF, the number of lambs born per ewe lambed was higher with LL × BF and SW × BF (P < 0.05), whereas the number of lambs weaned per ewe lambed was greater for LL × BF and T × BF (P < 0.01). Total litter weight at birth of all the crossbred ewes was heavier (P < 0.01) than the pure BF, except in primiparous 2-year-old ewes. Lambs born to CH × BF and T × BF dams were 0.24 to 0.35 kg heavier at birth (P < 0.01) than the other ewe breeds, whereas lambs born to CH × BF, LL × BF and T × BF dams were, on average, 1.7, 1.3 and 1.5 kg, respectively, heavier (P < 0.01) at weaning than those from BF dams due to their higher (P < 0.05) average daily gain. Compared with the pure BF, total weaned lamb output per ewe lambed was 3.7, 4.8, 6.7 and 5.4 kg heavier (P < 0.05) for SW × BF, CH × BF, LL × BF and T × BF, respectively. However, as a result of the heavier live weight of the crossbred ewes, production efficiency (lamb output per kilogram live weight (W) and lamb output per kilogram metabolic live weight (W^0.75)) was higher (P < 0.001) for LL × BF ewes only. For all ewe breeds, litter size at birth per ewe lambed, total lamb birth weight per ewe lambed and litter size at weaning increased (P < 0.001) with age up to 5 years, but decreased in 6-year-old ewes.
Average lamb weaning weight and total weaned lamb output per ewe lambed increased (P < 0.001) with age up to 4 years. Production efficiency of the 6-year-old ewes was lower (P < 0.01) than that of the younger ewes. This study shows that adopting a flock replacement policy based on crossing BF ewes with LL, SW, T and CH sires can lead to significant improvements in the productivity of hill flocks.
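The production-efficiency measures used in this study, weaned lamb output per kilogram live weight (W) and per kilogram metabolic live weight (W^0.75), are simple ratios. A small sketch follows, using the mature ewe live weights reported above but hypothetical lamb outputs (the abstract reports only the between-breed differences, not the baseline):

```python
# Mature ewe live weights (kg) are taken from the study; the weaned lamb
# outputs (kg per ewe lambed) are hypothetical illustrative values.
breeds = {
    "BF":      {"live_wt": 52.8, "lamb_output": 45.0},
    "LL x BF": {"live_wt": 55.6, "lamb_output": 51.7},  # +6.7 kg vs BF, as reported
}

for name, d in breeds.items():
    w = d["live_wt"]
    eff_w = d["lamb_output"] / w            # kg lamb weaned per kg live weight
    eff_mw = d["lamb_output"] / w ** 0.75   # per kg metabolic live weight
    print(f"{name}: {eff_w:.2f} kg/kg W, {eff_mw:.2f} kg/kg W^0.75")
```

Scaling by W^0.75 rather than W penalises heavier ewes less, reflecting that maintenance energy requirements scale with metabolic rather than absolute body weight.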
A study was undertaken to compare the longevity and lifetime lamb output of purebred Scottish Blackface (BF) ewes with a range of crossbred genotypes from Scottish BF dams. For up to five successive breeding seasons, 1143 Scottish BF, Swaledale × BF (SW × BF), North Country Cheviot × BF (CH × BF), Lleyn × BF (LL × BF) and Texel × BF (T × BF) ewes were mated to a range of sire breeds on six hill farms across Northern Ireland. Dentition and lamb output were recorded annually until completion of the study or until the ewe was removed due to death or culling. Timing of mortality and the main reason for culling were also recorded. When survival analysis was undertaken, SW × BF and CH × BF ewes had better longevity (P < 0.05) than BF ewes due to their lower culling rate (P < 0.01) and lower mortality rate (P = 0.06), respectively. The relative proportion of LL × BF and T × BF ewes culled due to infertility was lower (P < 0.05) than SW × BF and CH × BF, while a higher (P < 0.05) proportion of LL × BF and T × BF ewes were culled for prolapses compared with the other breed crosses. SW × BF ewes had consistently higher bite scores (P < 0.001) compared with BF, LL × BF and T × BF, indicating a greater prevalence and degree of overshoot. In ewes aged 5.5 years old, SW × BF also had a higher incidence of tooth loss (P < 0.01) compared with the other breeds. However, the proportion of SW × BF culled due to poor teeth condition was lower (P < 0.05) than BF. Across all breeds, the chances of surviving to their next mating were influenced by ewe breed (P < 0.05), age at mating (P < 0.001), body condition score at weaning (P < 0.001), number of missing teeth (P < 0.001) and average daily live weight gain per litter (P < 0.05). The cumulative number and weight of lambs weaned per ewe over five successive matings was higher (P < 0.05) for crossbred compared with pure BF ewes; however, there were no differences in lifetime output between the different crossbred ewes studied.
This study demonstrates that the higher lamb output of crossbred hill ewes does not compromise their longevity compared with pure Blackface, resulting in greater total lifetime production. When the crossbred ewes are sired by a second hill breed, longevity may be improved.