Declines in mental health among youth during the COVID-19 pandemic have been observed, yet longitudinal studies on how housing may affect these declines are lacking.
Our aim was to determine whether changes in mental health among Danish youth were dependent on their housing conditions.
Young participants from the Danish National Birth Cohort, who had responded to an online questionnaire at 18 years of age, and later during the initial national Danish lockdown, were included. Associations between housing conditions (direct access to outdoor spaces, urbanicity, household density, and household composition) and changes in mental health (mental well-being, quality of life (QoL) and loneliness) were examined in multivariate linear and logistic regression analyses.
We included 7455 participants. Greater decreases in mental well-being were observed for youth with no direct access to outdoor spaces and for those living in denser households (mean differences -0.83 [95% CI -1.19, -0.48] and -0.30 [-0.43, -0.18], respectively). Onset of low mental well-being was associated with no access to outdoor spaces and with living alone (odds ratios (OR) 1.68 [1.15, 2.47] and 1.47 [1.05, 2.07], respectively). Household density was negatively associated with QoL (mean difference -0.21 [-0.30, -0.12]). Youth living alone experienced more loneliness (OR 2.12 [95% CI 1.59, 2.82]).
Changes in youth mental health from before to during lockdown were associated with housing conditions. Among the Danish youth in our study, greater decreases in mental health during lockdown were observed among those without access to outdoor spaces, those living alone and those living in denser households.
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and a high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. Latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: (i) the need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) the clinical consequences of inadequate vitamin D status and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
The influence of surface melt on the flow of Greenland's largest outlet glaciers remains poorly known and in situ observations are few. We use field observations to link surface meltwater forcing to glacier-wide diurnal velocity variations on East Greenland's Helheim Glacier over two summer melt seasons. We observe diurnal variations in glacier speed that peak ~6.5 h after daily maximum insolation and extend from the terminus region to the equilibrium line. Both the amplitude of the diurnal speed variation and its sensitivity to daily melt are largest at the glacier terminus and decrease up-glacier, suggesting that the magnitude of the response is controlled not only by melt input volume and temporal variability, but also by background effective pressure, which approaches zero at the terminus. Our results provide evidence that basal lubrication by meltwater drives diurnal velocity variations at Greenland's marine-terminating glaciers in a similar manner to alpine glaciers and Greenland's land-terminating outlet glaciers.
The Academic Development Study of Australian Twins was established in 2012 with the purpose of investigating the relative influence of genes and environments in literacy and numeracy capabilities across two primary and two secondary school grades in Australia. It is the first longitudinal twin project of its kind in Australia and comprises a sample of 2762 twin pairs, 40 triplet sets and 1485 nontwin siblings. Measures include standardized literacy and numeracy test data collected at Grades 3, 5, 7 and 9 as part of the National Assessment Program: Literacy and Numeracy. A range of demographic and behavioral data was also collected, some at multiple longitudinal time points. This article outlines the background and rationale for the study and provides an overview for the research design, sample and measures collected. Findings emerging from the project and future directions are discussed.
Two common approaches to identifying subgroups of patients with bipolar disorder are clustering methodology (mixture analysis) based on age of onset, and birth cohort analysis. This study investigates whether a birth cohort effect influences the results of clustering on age of onset, using a large, international database.
The database includes 4037 patients with a diagnosis of bipolar I disorder, previously collected at 36 collection sites in 23 countries. Generalized estimating equations (GEE) were used to adjust the data for country median age, and in some models, birth cohort. Model-based clustering (mixture analysis) was then performed on the age of onset data using the residuals. Clinical variables in subgroups were compared.
There was a strong birth cohort effect. Without adjusting for the birth cohort, three subgroups were found by clustering. After adjusting for the birth cohort or when considering only those born after 1959, two subgroups were found. With results of either two or three subgroups, the youngest subgroup was more likely to have a family history of mood disorders and a first episode with depressed polarity. However, without adjusting for birth cohort (three subgroups), family history and polarity of the first episode could not be distinguished between the middle and oldest subgroups.
These results using international data confirm prior findings using single country data, that there are subgroups of bipolar I disorder based on the age of onset, and that there is a birth cohort effect. Including the birth cohort adjustment altered the number and characteristics of subgroups detected when clustering by age of onset. Further investigation is needed to determine if combining both approaches will identify subgroups that are more useful for research.
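The two-step idea above - adjust age of onset for birth cohort, then cluster the adjusted values - can be sketched in toy form. The study used GEE adjustment and model-based (mixture) clustering; here a simple within-cohort centering and a plain 1-D k-means stand in, applied to invented data:

```python
from statistics import mean

def cohort_adjust(records):
    """Remove each birth cohort's mean age of onset, leaving residuals.

    records: iterable of (birth_cohort, age_of_onset) pairs.
    A crude stand-in for the GEE adjustment used in the study.
    """
    by_cohort = {}
    for cohort, age in records:
        by_cohort.setdefault(cohort, []).append(age)
    cohort_mean = {c: mean(ages) for c, ages in by_cohort.items()}
    return [age - cohort_mean[c] for c, age in records]

def kmeans_1d(xs, k=2, iters=50):
    """Plain 1-D k-means (the study used model-based mixture clustering;
    this is only an illustrative substitute). Returns sorted centers."""
    step = max(1, len(xs) // k)
    centers = sorted(xs)[::step][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in xs:
            nearest = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            groups[nearest].append(x)
        centers = [mean(g) if g else c for g, c in zip(groups, centers)]
    return sorted(centers)

# Invented example: two cohorts, each with an early- and a late-onset case
residuals = cohort_adjust([("pre1960", 20), ("pre1960", 40),
                           ("post1960", 15), ("post1960", 35)])
```

Real onset data are far noisier, and the number of clusters itself must be selected (e.g. by BIC in model-based clustering), which is exactly where the cohort adjustment changed the answer from three subgroups to two.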
Few studies have examined rate and predictors of self-harm in discharged psychiatric patients.
To investigate the rate, coding, timing, predictors and characteristics of self-harm-induced somatic admission after discharge from acute psychiatric admission.
A cohort study of 2827 unselected patients consecutively admitted to a psychiatric acute ward over three years. The mean observation period was 2.3 years. Register linkage was combined with manual data examination. Cox regression was used to investigate covariates for time to somatic admission due to self-harm, with covariates that changed during follow-up entered time-dependently.
During the observation period, 10.5% of the patients had 792 somatic self-harm admissions. The strongest risk factors were psychiatric admission due to non-suicidal self-harm, suicide attempt and suicidal ideation. The risk was increased throughout the first year of follow-up, during readmission, with increasing outpatient consultations and in patients diagnosed with recurrent depression, personality disorders, substance use disorders and anxiety/stress-related disorders. Only 49% of the somatic self-harm admissions were given a hospital self-harm diagnosis.
Self-harm-induced somatic admissions were highly prevalent during the first year after discharge from acute psychiatric admission. Underdiagnosis of self-harm in relation to somatic self-harm admissions may cause incorrect follow-up treatment and unreliable register data.
Delirium is a common, complex syndrome with serious outcomes such as increased mortality, physical morbidity and length of hospitalization. There are similarities with symptoms of mental illnesses, which may lead to under-diagnosis of delirium in psychiatric patients. Thus, in spite of the severity of delirium, it frequently goes unrecognised and is inadequately managed.
The literature about the incidence of delirium in psychiatric patients is sparse.
A description of the pattern of admission will help understand the clinical features of delirium.
To estimate the diagnostic incidence and describe the pattern of admission (inpatient, outpatient and emergency) of delirium in Danish psychiatric patients from 1994 to 2011.
We used a nationwide population-based mental health register to examine diagnoses of delirium from 1994 to 2011, a period in which ICD-10 diagnostic criteria were applied. The delirium diagnoses include delirium unspecified, delirium superimposed on dementia and delirium due to alcohol or drug use or in a withdrawal state. The incidence rates were age-standardized and the statistical analyses were performed with STATA.
Males dominate in all three groups of delirium diagnoses. The incidence rates of delirium unspecified in hospitalized patients have increased markedly (Graph 1). The incidence rates of delirium superimposed on dementia in hospitalized patients do not show the same increase.
The incidence rates of delirium unspecified in hospitalized psychiatric patients have increased.
Weight gain among psychiatric inpatients is a widespread phenomenon. This change in body mass index (BMI) can be caused by several factors. Based on recent research, we assume the following factors are related to weight gain during psychiatric inpatient treatment: psychiatric medication, psychiatric diagnosis, sex, age, weight on admission and geographic region of treatment.
Of the 2328 patients originally recruited, 876 met the criteria for our analysis. Patients were recruited and examined in mental health care centres in Nigeria (N=265), Japan (N=145) and Western Europe (Denmark, Germany and Switzerland; N=466).
There was a significant effect of psychiatric medication, psychiatric diagnoses and geographic region, but not of age or sex, on BMI changes. Geographic region had a significant effect on BMI change, with Nigerian patients gaining significantly more weight than Japanese and Western European patients. Moreover, geographic region influenced the type of psychiatric medication prescribed and the psychiatric diagnoses. The diagnoses and psychiatric medication prescribed had a significant effect on BMI change.
In conclusion, we consider weight gain as a multifactorial phenomenon that is influenced by several factors. One can discuss a number of explanations for our findings, such as different clinical practices in the geographical regions (prescribing or admission strategies and access-to-care aspects), as well as socio-economic and cultural differences.
Both blood- and milk-based biomarkers have been analysed for decades in research settings, although often only in one herd, and without focus on the variation in the biomarkers that are specifically related to herd or diet. Biomarkers can be used to detect physiological imbalance and disease risk and may have a role in precision livestock farming (PLF). For use in PLF, it is important to quantify normal variation in specific biomarkers and the source of this variation. The objective of this study was to estimate the between- and within-herd variation in a number of blood metabolites (β-hydroxybutyrate (BHB), non-esterified fatty acids, glucose and serum IGF-1), milk metabolites (free glucose, glucose-6-phosphate, urea, isocitrate, BHB and uric acid), milk enzymes (lactate dehydrogenase and N-acetyl-β-D-glucosaminidase (NAGase)) and composite indicators for metabolic imbalances (Physiological Imbalance-index and energy balance), to help facilitate their adoption within PLF. Blood and milk were sampled from 234 Holstein dairy cows from 6 experimental herds, each in a different European country, and offered a total of 10 different diets. Blood was sampled on 2 occasions at approximately 14 days-in-milk (DIM) and 35 DIM. Milk samples were collected twice weekly (in total 2750 samples) from DIM 1 to 50. Multilevel random regression models were used to estimate the variance components and to calculate the intraclass correlations (ICCs). The ICCs for the milk metabolites, when adjusted for parity and DIM at sampling, demonstrated that between 12% (glucose-6-phosphate) and 46% (urea) of the variation in the metabolites’ levels could be associated with the herd-diet combination. Intraclass Correlations related to the herd-diet combination were generally higher for blood metabolites, from 17% (cholesterol) to approximately 46% (BHB and urea). The high ICCs for urea suggest that this biomarker can be used for monitoring on herd level. 
The low within-cow variance for NAGase indicates that few samples would be needed to describe a cow's status, and potentially a general reference value could be used. The low ICC for most of the biomarkers, together with the larger within-cow variation, emphasises that multiple samples would be needed - most likely from individual cows - to make the biomarkers useful for monitoring. The majority of biomarkers were influenced by parity and DIM, which indicates that these should be accounted for if a biomarker is to be used for monitoring.
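The intraclass correlations quoted above have the standard variance-components form, i.e. the share of total variance sitting at the grouping level. A minimal sketch (the variance figures in the comment are illustrative, not taken from the study's models):

```python
def icc(between_var, within_var):
    """Intraclass correlation: the share of total variance attributable
    to the grouping level (here, the herd-diet combination)."""
    return between_var / (between_var + within_var)

# Illustrative variance components (not values from the study): a biomarker
# whose herd-diet variance is 0.46 of the total, with residual variance 0.54,
# has ICC = 0.46 - the upper end reported for urea in this work.
example = round(icc(0.46, 0.54), 2)
```

A high ICC (as for urea) means herd-level averages carry most of the signal, supporting herd-level monitoring; a low ICC means repeated sampling of individual cows is needed, as the abstract concludes.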
The main purpose of this study was to find several early factors affecting stayability in rabbit females. To reach this goal, 203 females were used from their first artificial insemination to their sixth parturition. Throughout that period, 48 traits were recorded, considered to be performance, metabolic and immunological indicators. These traits were initially recorded in the females’ first reproductive cycle. Later, females removed due to death or culling, and those not removed, were identified. A first analysis explored whether it was possible to classify females between those reaching and those not reaching the mean lifespan of a rabbit female (the fifth reproductive cycle) using information from the first reproductive cycle. The analysis results showed that 97% of the non-removed females were classified correctly, whereas only 60% of the removed females were classified as animals to be removed. The reason for this difference lies in the model’s characteristics: it was designed using early traits and was able to classify only the cases in which females would be removed due to performance, metabolic or immunological imbalances in their early lives. Our results suggest that the model defines the necessary conditions, but not the sufficient ones, for females to remain alive in the herd. The aim of a second analysis was to find out the main early differences between the non-removed and removed females. The live weight records taken in the first cycle indicated that the females removed in their first cycle were lighter, while those removed in their second cycle were heavier, with longer stayability (−203 and +202 g on average, respectively; P < 0.05). Non-removed females showed higher glucose and lower beta-hydroxybutyrate concentrations in the first cycle than the removed females (+4.8 and −10.7%, respectively; P < 0.05). The average B lymphocyte counts in the first cycle were 22.7% higher in the non-removed females group (P < 0.05).
The females removed in the first reproductive cycle presented a higher granulocytes/lymphocytes ratio in this cycle than those that at least reached the second cycle (4.81 v. 1.66; P < 0.001). Consequently, females not removed by the sixth parturition showed adequate body development and energy levels, less immunological stress and a more mature immune function in the first reproductive cycle. The females that deviated from this pattern were at higher risk of being removed from the herd.
Economic pressures continue to mount on modern-day livestock farmers, forcing them to increase herd sizes in order to be commercially viable. The natural consequence of this is to drive the farmer and the animal further apart. However, closer attention to the animal not only positively impacts animal welfare and health but can also increase the capacity of the farmer to achieve more sustainable production. State-of-the-art precision livestock farming (PLF) technology is one such means of bringing the animals closer to the farmer in the face of expanding systems. Contrary to some current opinions, it can offer an alternative philosophy to ‘farming by numbers’. This review addresses the key technology-oriented approaches to monitoring animals and demonstrates how image and sound analyses can be used to build ‘digital representations’ of animals, giving an overview of some of the core concepts of PLF tool development and value discovery during PLF implementation. The key to developing such a representation is measuring important behaviours and events in the livestock buildings. Image and sound analysis can enable more advanced applications and has enormous potential in the industry. In the end, the importance lies in the accuracy of the developed PLF applications in the commercial farming system, as this will encourage farmers to embrace the technological development and ensure progress within the PLF field in favour of the livestock animals and their well-being.
This paper reviews the effects of extended lactation (EXT) as a strategy in dairy cattle on milk production and persistency, reproduction, milk quality, lifetime performance of the cow and finally the economic effects on herd and farm levels as well as the impact on emission of greenhouse gas at product level. Primiparous cows are able to produce equal or more milk per feeding day during EXT compared with a standard 305-d lactation, whereas results for multiparous cows are inconsistent. Cows managed for EXT can achieve a higher lifetime production while delivering milk with unchanged or improved quality properties. Delaying insemination enhances mounting behaviour and allows insemination after the cow’s energy balance has become positive. However, in most cases EXT has no effect or a non-significant positive effect on reproduction. The EXT strategy sets off a cascade of effects at herd and farm level. Thus, the EXT strategy leads to fewer calvings and thereby expected fewer diseases, fewer replacement heifers and fewer dry days per cow per year. The optimal lifetime scenario for milk production was modelled to be an EXT of 16 months for first parity cows followed by an EXT of 10 months for later lactations. Modelling studies of herd dynamics indicate a positive effect of EXT on lifetime efficiency (milk per dry matter intake), mainly originating from benefits of EXT on daily milk yield in primiparous cows and the reduced number of replacement heifers. Consequently, EXT also leads to reduced total meat production at herd level. For the farmer, EXT can give the same economic return as a traditional lactation period. At farm level, EXT can contribute to a reduction in the environmental impact of dairy production, mainly as a consequence of the reduced production of beef. 
A wider dissemination of the EXT concept will be supported by methods to predict which cows may be most suitable for EXT, and clarification of how milking frequency and feeding strategy through the lactation can be organised to support milk yield and an appropriate body condition at the next calving.
Tear staining (TS) in the pig has been related to different stressors and may be a useful tool for assessing animal welfare on farm. The aim of the current study was to investigate TS across the finisher period and its possible relation to age, growth, sex and experimentally induced stressors. The study included 80 finisher pens divided between three batches. Within each batch, the pens either included pigs with docked or undocked tails, had straw provided (150 g/pig/day) or not, and had a low (1.21 m2/pig, 11 pigs) or high stocking density (0.73 m2/pig, 18 pigs). Tear staining (scores 1 to 4; from smaller to larger tear stain area, respectively) and tail damage were scored on each individual pig three times per week over the 9-week study period, and the individual maximum TS score within each week was chosen for further analysis. Data were analysed using logistic regression separately for each of the four possible TS score levels. The TS scores 1 and 2 decreased with weeks into the study period and were negatively related to the average daily gain (ADG) of the pigs, whereas the TS score 4 increased with weeks into the study period and was positively related to ADG. None of the TS scores differed between females and castrated males, and neither straw provision nor lowering the stocking density affected the TS scores. However, the TS score 1 decreased in the last week before an event of tail damage (at least one pig in the pen with a bleeding tail wound), whereas the TS score 4 increased. The results of the current study advocate a relation between TS and factors such as age, growth and stress in the pig, while no relation was found between TS and the environmental factors of straw provision and lowered stocking density. The relations to age and growth are important to take into consideration if TS is to be used as a welfare assessment measure in the pig in the future.
Tail damage within the production of finisher pigs is an animal welfare problem. Recent research suggests that removal of known risk factors may not be enough to eliminate tail biting, especially in undocked pigs, so a different strategy is worth investigating. This could be early detection of tail biting, using behavioural changes observed before tail damage. If these early stages of tail biting can be detected before tail damage occurs, then tail damage could be prevented by early interventions. The first step in developing such a strategy is to identify the types of behavioural change that emerge during early stages of tail biting. Thus, the aim of the current study was to investigate whether pen-level activity and object manipulation evolved differently during the last 7 days before the scoring of tail damage (day 0) for pens scored with tail damage (tail damage pens) and pens not scored with tail damage (matched control pens). The study included video recordings for twenty-four tail damage pens and thirty-two matched control pens. Activity level and object manipulation were observed during the last 7 days before day 0 in the morning (0600 to 0800 h), afternoon (1600 to 1800 h) and evening (2200 to 2400 h; activity level only). Both activity level and object manipulation were analysed using generalised linear mixed effects models, with a binomial distribution for activity level and a negative binomial distribution for object manipulation. The probability of being active was higher in tail damage pens than in control pens during the afternoon on the last 5 days before day 0 (P<0.001). This was due to a decrease in activity level in the control pens, which makes it difficult to identify future tail damage pens from this difference. Object manipulation was lower in tail damage pens than in the control pens on all 7 days before day 0, but only in pens with undocked pigs (P<0.01).
Thus, it is still unknown when this difference in object manipulation arose. It was concluded that both activity level and object manipulation seemed related to ongoing tail biting and should be investigated through more detailed observations over a longer time to establish the normal behaviour pattern for a particular pen. Thus, it is suggested that future research focuses on developing automatic monitoring methods for pen-level activity and object manipulation and applies algorithms that establish and detect deviations from the normal behaviour pattern of the pen before tail damage occurs.
Unbalanced metabolic status in the weeks after calving predisposes dairy cows to metabolic and infectious diseases. Blood glucose, IGF-I, non-esterified fatty acids (NEFA) and β-hydroxybutyrate (BHB) are used as indicators of the metabolic status of cows. This work aims to (1) evaluate the potential of milk mid-IR spectra to predict these blood components individually and (2) evaluate the possibility of predicting the metabolic status of cows based on a clustering of these blood components. Blood samples were collected from 241 Holstein cows on six experimental farms, at days 14 and 35 after calving. Blood samples were analyzed by reference analysis and metabolic status was defined by k-means clustering (k=3) based on the four blood components. Milk mid-IR analyses were undertaken on different instruments and the spectra were harmonized into a common standardized format. Quantitative models predicting blood components were developed using partial least squares regression, and discriminant models aiming to differentiate the metabolic status were developed with partial least squares discriminant analysis. Cross-validations were performed for both quantitative and discriminant models using four randomly constituted subsets. Blood glucose, IGF-I, NEFA and BHB were predicted with respective R2 of calibration of 0.55, 0.69, 0.49 and 0.77, and R2 of cross-validation of 0.44, 0.61, 0.39 and 0.70. Although these models were not able to provide precise quantitative values, they allow for screening of individual milk samples for high or low values. The clustering methodology partitioned the data set into three groups of cows representing healthy, moderately impacted and imbalanced metabolic status. The discriminant models allowed fair classification of the three groups, with a global percentage of correct classification of up to 74%.
When discriminating the cows with imbalanced metabolic status from cows with healthy and moderately impacted metabolic status, the models were able to distinguish the imbalanced group with a global percentage of correct classification of up to 92%. The performances were satisfactory considering that these variables are not present in milk and are consequently predicted indirectly. This work showed the potential of milk mid-IR analysis to provide new metabolic status indicators based on individual blood components or a combination of these variables into a global status. Models have been developed within a standardized spectral format, and although robustness should preferably be improved with additional data integrating different geographic regions, diets and breeds, they constitute rapid, cost-effective and large-scale tools for management and breeding of dairy cows.
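Because the four blood components sit on very different numeric scales, defining metabolic status by k-means as described implies the components are first standardized so that no single component dominates the distances. A minimal z-scoring sketch (not the authors' code; the toy values are invented):

```python
from statistics import mean, stdev

def zscore_columns(rows):
    """Standardize each column (one blood component per column) to
    mean 0, sd 1, so that k-means distances are not dominated by the
    component with the largest numeric range (e.g. IGF-I vs glucose)."""
    cols = list(zip(*rows))
    stats = [(mean(c), stdev(c)) for c in cols]
    return [[(v - m) / s for v, (m, s) in zip(row, stats)]
            for row in rows]

# Invented two-cow, two-component example: after z-scoring, both
# components contribute equally despite a 100-fold scale difference.
z = zscore_columns([[1, 100], [3, 300]])
```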
Laying hens housed in free-range systems have access to an outdoor range, and individual hens within a flock differ in their ranging behaviour. Whether there is a link between ranging and laying hen welfare remains unclear. We analysed the relationships between ranging by individual hens on a commercial free-range layer farm and behavioural, physiological and health measures of animal welfare. We hypothesised that hens that access the range more would (1) be less fearful in general and in response to novelty and humans, (2) have better health in terms of physical body condition and (3) show a reduced physiological stress response to behavioural tests of fear and health assessments, compared with hens that use the range less. Using radio frequency identification tracking across two flocks, we recorded individual hens’ frequency, duration and consistency of ranging. We also assessed how far hens ventured into the range based on three zones: 0 to 2.4, 2.4 to 11.4 or >11.4 m from the shed. We assessed hen welfare using a variety of measures, including tonic immobility, open field, novel object, human approach and human avoidance (HAV) behavioural tests; stress-induced plasma corticosterone response and faecal glucocorticoid metabolites; and live weight, comb colour, and beak, plumage, footpad and keel bone condition. Range use was positively correlated with plasma corticosterone response, faecal glucocorticoid metabolites, and greater flight distance during HAV. Hens that used the range more moved towards, rather than away from, the novel object more often than hens that ranged less. Distance ranged from the shed was significantly associated with comb colour and beak condition, in that hens with darker combs and more intact beaks ranged further. Overall, the findings suggest that there is no strong link between outdoor range usage and laying hen welfare.
Alternatively, it may be that hens that differed in their ranging behaviour showed few differences in measures of welfare because free-range systems provide hens with adequate choice to cope with their environment. Further research into the relationship between individual range access and welfare is needed to test this possibility.
Avian influenza virus (AIV) subtypes H5 and H7 can infect poultry causing low pathogenicity (LP) AI, but these LPAIVs may mutate to highly pathogenic AIV in chickens or turkeys causing high mortality, hence H5/H7 subtypes demand statutory intervention. Serological surveillance in the European Union provides evidence of H5/H7 AIV exposure in apparently healthy poultry. To identify the most sensitive screening method as the first step in an algorithm to provide evidence of H5/H7 AIV infection, the standard approach of H5/H7 antibody testing by haemagglutination inhibition (HI) was compared with an ELISA, which detects antibodies to all subtypes. Sera (n = 1055) from 74 commercial chicken flocks were tested by both methods. A Bayesian approach served to estimate diagnostic test sensitivities and specificities, without assuming any ‘gold standard’. Sensitivity and specificity of the ELISA was 97% and 99.8%, and for H5/H7 HI 43% and 99.8%, respectively, although H5/H7 HI sensitivity varied considerably between infected flocks. ELISA therefore provides superior sensitivity for the screening of chicken flocks as part of an algorithm, which subsequently utilises H5/H7 HI to identify infection by these two subtypes. With the calculated sensitivity and specificity, testing nine sera per flock is sufficient to detect a flock seroprevalence of 30% with 95% probability.
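The closing sample-size claim can be checked directly: with the ELISA sensitivity of 0.97 and a within-flock seroprevalence of 30%, the chance that at least one of n sampled sera tests positive follows a simple binomial complement (a sketch that ignores false positives, reasonable given the 99.8% specificity, and assumes a flock large enough that sampling is effectively independent):

```python
def flock_detection_prob(n, prevalence=0.30, sensitivity=0.97):
    """Probability that at least one of n sampled sera tests positive,
    given each sampled bird is seropositive with the stated prevalence
    and a true positive is detected with the stated sensitivity."""
    p_pos = prevalence * sensitivity   # chance a sampled serum tests positive
    return 1 - (1 - p_pos) ** n        # at least one positive among n
```

With the abstract's figures, nine sera give a detection probability just above 95%, while eight sera fall short, matching the stated design of nine sera per flock.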
To achieve functional but also productive females, we hypothesised that it is possible to modulate acquisition and allocation of animals from different genetic types by varying the main energy source of the diet. To test this hypothesis, we used 203 rabbit females belonging to three genetic types: H (n=66), a maternal line characterised by hyper-prolificacy; LP (n=67), a maternal line characterised by functional hyper-longevity; R (n=79), a paternal line characterised by growth rate. Females were fed with two isoenergetic and isoprotein diets differing in energy source: animal fat (AF) enhancing milk yield; cereal starch (CS) promoting body reserves recovery. Feed intake, weight, perirenal fat thickness (PFT), milk yield and blood traits were controlled during five consecutive reproductive cycles (RCs). Females fed with CS presented higher PFT (+0.2 mm, P<0.05) and those fed AF had higher milk yield (+11.7%, P<0.05). However, the effect of energy source varied with the genetic type and time. For example, R females presented a decrease in PFT at late lactation (−4.3%; P<0.05) significantly higher than that observed for H and LP lines (on av. −0.1%; P>0.05), particularly for those fed with AF. Moreover, LP females fed with AF progressively increased PFT across the RC, whereas those fed with CS increased PFT during early lactation (+7.3%; P<0.05), but partially mobilised it during late lactation (−2.8%; P<0.05). Independently of the diet offered, LP females reached weaning with similar PFT. H females fed with either of the two diets followed a similar trajectory throughout the RC. For milk yield, the effect of energy source was almost constant during the whole experiment, except for the first RC of females from the maternal lines (H and LP). These females yielded +34.1% (P<0.05) when fed with CS during this period. Results from this work indicate that the resource acquisition capacity and allocation pattern of rabbit females is different for each genetic type. 
Moreover, it seems that by varying the main energy source of the diet it is possible to modulate acquisition and allocation of resources of the different genetic types. However, the response of each one depends on its priorities over time.