Middle East respiratory syndrome coronavirus (MERS-CoV) is a zoonotic disease transmitted from dromedary camels to people, which can result in outbreaks with human-to-human transmission. Because it is a subclinical infection in camels, epidemiological measures other than prevalence are challenging to assess. This study estimated the force of infection (FOI) of MERS-CoV in camel populations from age-stratified serological data. A cross-sectional study of MERS-CoV was conducted in Kenya from July 2016 to July 2017. Seroprevalence was stratified into four age groups: <1, 1–2, 2–3 and >3 years old. Age-independent and age-dependent linear and quadratic generalised linear models were used to estimate FOI in pastoral and ranching camel herds. Models were compared based on computed AIC values. Among pastoral herds, the age-dependent quadratic FOI was the best fit model, while the age-independent FOI was the best fit for the ranching herd data. FOI provides an indirect estimate of infection risk, which is especially valuable where direct estimates of incidence and other measures of infection are challenging to obtain. The FOIs estimated in this study provide important insight about MERS-CoV dynamics in the reservoir species, and contribute to our understanding of the zoonotic risks of this important public health threat.
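The logic of estimating FOI from age-stratified seroprevalence can be sketched with a simple catalytic model, in which the probability of being seropositive by age a under a constant FOI λ is 1 − exp(−λa). The counts below are hypothetical (the Kenyan herd data are not reproduced in the abstract), and the study itself also fitted age-dependent linear and quadratic GLM variants compared via AIC; this is only the age-independent case.

```python
import math

# Hypothetical age-stratified serology (age midpoint in years, seropositive, tested);
# the actual Kenyan camel herd counts are not given in the abstract.
data = [(0.5, 12, 60), (1.5, 25, 55), (2.5, 34, 50), (5.0, 48, 55)]

def neg_log_lik(foi):
    """Binomial negative log-likelihood of a constant-FOI catalytic model,
    where P(seropositive by age a) = 1 - exp(-foi * a)."""
    nll = 0.0
    for age, pos, n in data:
        p = 1.0 - math.exp(-foi * age)
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # keep the logs finite
        nll -= pos * math.log(p) + (n - pos) * math.log(1.0 - p)
    return nll

# Simple grid search over plausible FOI values (per year)
best_nll, best_foi = min((neg_log_lik(i / 1000), i / 1000) for i in range(1, 3001))
aic = 2 * 1 + 2 * best_nll  # AIC = 2k - 2 ln L, with k = 1 fitted parameter
print(f"FOI = {best_foi:.3f} per year, AIC = {aic:.1f}")
```

Age-dependent variants would replace the single λ with a linear or quadratic function of age and add the corresponding parameters to k when computing AIC, exactly the comparison the study performed.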
Under current Australian industry pre-slaughter guidelines, lambs may be off feed for up to 48 h before slaughter. The purpose of this study was to examine what proportion of circulating metabolites at slaughter are due to stress and feed deprivation, and whether this response differs between Merino and Terminal genotypes. In addition, the effect of feed deprivation on carcass weight and meat quality was examined. Jugular blood samples were collected from 88 Merino and Terminal sired lambs at rest and at slaughter following 24, 36 and 48 h of feed deprivation, and plasma was analysed for glucose, lactate, non-esterified fatty acids (NEFA) and β-hydroxybutyrate (BHOB). From the same carcasses, hot carcass weight (HCWT) was measured along with a suite of meat quality traits such as M. longissimus lumborum (loin) and M. semitendinosus pH at 24 h postmortem. Loin samples were also analysed for intramuscular fat content and Warner–Bratzler shear force. Merino sired lambs had a higher NEFA response than Terminal sired lambs at slaughter after 24, 36 and 48 h of feed deprivation, with NEFA levels up to 35% higher than previously reported in the same animals at rest in animal house conditions, whereas the BHOB response to feed deprivation was not affected by sire type (P > 0.05) and was similar to that previously reported at rest. In addition to the metabolic effects, increasing feed deprivation from 36 h was associated with a 3% reduction in HCWT and dressing percentage, as well as an increased ultimate pH in the M. semitendinosus in Merino sired lambs. Findings from this study demonstrate that Merino and Terminal sired lambs differ in their metabolic response to feed deprivation under commercial slaughter conditions. In addition, commercial feed deprivation appears to have a negative effect on ultimate pH and carcass weight and warrants further investigation.
The aim of this study was to examine the metabolic response to feed deprivation up to 48 h in low and high yielding lamb genotypes. It was hypothesised that Terminal sired lambs would have decreased plasma glucose and increased plasma non-esterified fatty acids (NEFA) and β-hydroxybutyrate (BHOB) concentrations in response to feed deprivation compared to Merino sired lambs. In addition, it was hypothesised that the metabolic changes due to feed deprivation would also be greater in progeny of sires with breeding values for greater growth, muscling and leanness. Eighty-nine lambs (45 ewes, 44 wethers) from Merino dams with Merino or Terminal sires with a range in Australian Sheep Breeding Values (ASBVs) for post-weaning weight (PWT), post-weaning eye muscle depth and post-weaning fat depth (PFAT) were used in this experiment. Blood samples were collected via jugular cannulas every 6 h from time 0 to 48 h of feed deprivation for the determination of plasma glucose, NEFA, BHOB and lactate concentrations. From 12 to 48 h of feed deprivation, plasma glucose concentration decreased (P < 0.05) by 25% from 4.04 ± 0.032 mmol/l to 3.04 ± 0.032 mmol/l. From 6 h, NEFA concentration increased (P < 0.05) from 0.15 ± 0.021 mmol/l by almost 10-fold to 1.34 ± 0.021 mmol/l at 48 h of feed deprivation. Feed deprivation also influenced BHOB concentrations, which from 12 to 48 h increased (P < 0.05) from 0.15 ± 0.010 mmol/l to 0.52 ± 0.010 mmol/l. Merino sired lambs had an 8% greater reduction in glucose and 29% and 10% higher NEFA and BHOB responses, respectively, compared to Terminal sired lambs (P < 0.05). In Merino sired lambs, increasing PWT was also associated with an increase in glucose and a decline in NEFA and BHOB concentrations (P < 0.05). In Terminal sired lambs, increasing PFAT was associated with an increase in glucose and a decline in NEFA concentration (P < 0.05).
Contrary to the hypothesis, Merino sired lambs showed the greatest metabolic response to fasting, especially with regard to fat metabolism.
Jumping to conclusions (JTC), the tendency to require less information before forming beliefs or making a decision, has been related to the formation and maintenance of delusions. Using data from the National Institute of Health Research Biomedical Research Centre Genetics and Psychosis (GAP) case–control study of first-episode psychosis (FEP), we set out to test whether the presence of JTC would predict poor clinical outcome at 4 years.
One-hundred and twenty-three FEP patients were assessed with the Positive and Negative Syndrome Scale (PANSS), Global Assessment of Functioning (GAF) and the probabilistic reasoning ‘Beads’ Task at the time of recruitment. The sample was split into two groups based on the presence of JTC bias. Follow-up data over an average of 4 years were obtained concerning clinical course and outcomes (remission, intervention of police, use of involuntary treatment – the Mental Health Act (MHA) – and inpatient days).
FEP who presented JTC at baseline were more likely during the follow-up period to be detained under the MHA [adjusted OR 15.62, 95% confidence interval (CI) 2.92–83.54, p = 0.001], require intervention by the police (adjusted OR 14.95, 95% CI 2.68–83.34, p = 0.002) and have longer admissions (adjusted IRR = 5.03, 95% CI 1.91–13.24, p = 0.001). These associations were not accounted for by socio-demographic variables, IQ and symptom dimensions.
JTC in FEP is associated with poorer outcome, as indicated by greater use of compulsion, police intervention and longer periods of admission. Our findings raise the question of whether the implementation of specific interventions to reduce JTC, such as Metacognition Training, may be a useful addition in early psychosis intervention programmes.
The Meat Standards Australia (MSA) grading scheme has the ability to predict beef eating quality for each ‘cut×cooking method combination’ from animal and carcass traits such as sex, age, breed, marbling, hot carcass weight and fatness, ageing time, etc. Following MSA testing protocols, a total of 22 different muscles, cooked by four different cooking methods and to three different degrees of doneness, were tasted by over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia. Consumers scored the sensory characteristics (tenderness, flavor liking, juiciness and overall liking) and then allocated samples to one of four quality grades: unsatisfactory, good-every-day, better-than-every-day and premium. We observed that 26% of the beef was unsatisfactory. As previously reported, 68% of samples were allocated to the correct quality grades using the MSA grading scheme. Furthermore, only 7% of the beef unsatisfactory to consumers was misclassified as acceptable. Overall, we concluded that an MSA-like grading scheme could be used to predict beef eating quality and hence underpin commercial brands or labels in a number of European countries, and possibly the whole of Europe. In addition, such an eating quality guarantee system may allow the implementation of an MSA genetic index to improve eating quality through genetics as well as through management. Finally, such an eating quality guarantee system is likely to generate economic benefits to be shared along the beef supply chain from farmers to retailers, as consumers are willing to pay more for a better quality product.
Accurately quantifying a consumer’s willingness to pay (WTP) for beef of different eating qualities is intrinsically linked to the development of eating-quality-based meat grading systems, and therefore the delivery of consistent, quality beef to the consumer. Following Australian MSA (Meat Standards Australia) testing protocols, over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia were asked to detail their willingness to pay for beef from one of four categories that best described the sample: unsatisfactory, good-every-day, better-than-every-day or premium quality. These figures were subsequently converted to a proportion relative to the good-every-day category (P-WTP) to allow comparison between different currencies and time periods. Consumers also answered a short demographic questionnaire. Consumer P-WTP was found to be remarkably consistent between different demographic groups. After quality grade, by far the greatest influence on P-WTP was country of origin. This difference was unable to be explained by the other demographic factors examined in this study, such as occupation, gender, frequency of consumption and the importance of beef in the diet. Therefore, we can conclude that the P-WTP for beef is highly transferrable between different consumer groups, but not countries.
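The P-WTP conversion described above is a simple normalisation; a minimal sketch follows, with hypothetical WTP figures, since the per-country values are not reported in the abstract.

```python
# Hypothetical WTP figures (local currency per kg) for the four MSA quality grades;
# the real per-country survey values are not reproduced in the abstract.
wtp = {
    "unsatisfactory": 5.0,
    "good-every-day": 10.0,
    "better-than-every-day": 15.0,
    "premium": 20.0,
}

# Express each grade relative to good-every-day, making figures comparable
# across currencies and survey periods.
base = wtp["good-every-day"]
p_wtp = {grade: value / base for grade, value in wtp.items()}
print(p_wtp)  # premium -> 2.0, i.e. willing to pay twice the everyday price
```

Because each value is divided by the same within-survey baseline, currency units and inflation between survey periods cancel out, which is what allows the cross-country comparison.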
The role of the protozoan parasite Toxoplasma gondii in the pathogenesis of liver disease has recently gained much interest. The aim of this study was to determine the prevalence and risk factors associated with T. gondii infection in patients with liver disease from three cities in Shandong and Henan provinces, China. A case–control study was conducted from December 2014 to November 2015 and included 1142 patients with liver disease and 1142 healthy controls. Serum samples were collected from all individuals and were examined with enzyme-linked immunosorbent assay for the presence of anti-T. gondii IgG and IgM antibodies. Information on the demographic, clinical and lifestyle characteristics of the participants was collected from the medical records and by the use of a questionnaire. The prevalence of anti-T. gondii IgG was 19·7% in patients with liver disease compared with 12·17% in the controls. Only 13 patients had anti-T. gondii IgM antibodies compared with 12 control individuals (1·14% vs. 1·05%, respectively). The highest seroprevalence was detected in patients with liver cancer (22·13%), followed by hepatitis patients (20·86%), liver cirrhosis patients (20·42%), and steatosis patients (20%). Multivariate logistic regression analysis indicated that consumption of raw meat (odds ratio (OR) = 1·32; 95% confidence interval (CI) 1·01–1·71; P = 0·03) and source of drinking water from wells (OR = 1·56; 95% CI 1·08–2·27; P = 0·01) were independent risk factors for T. gondii infection in liver disease patients. These findings indicate that T. gondii infection is more likely to be present in patients with liver disease. Therefore, efforts should be directed toward health education of populations at high risk of T. gondii infection and measures should be taken to protect vulnerable patients with liver disease.
The phenotype of the human embryo conceived through in vitro fertilization (IVF), that is its morphology, developmental kinetics, physiology and metabolism, can be affected by numerous components of the laboratory and embryo culture system (which comprise the laboratory environment). The culture media formulation is important in determining embryo phenotype, but this exists within a culture system that includes oxygen, temperature, pH and whether an embryo is cultured individually or in a group, all of which can influence embryo development. Significantly, exposure of an embryo to one suboptimal component of the laboratory culture system typically predisposes the embryo to become more vulnerable to a second stressor, as has been well documented for atmospheric oxygen and individual culture, as well as for oxygen and ammonium. Furthermore, the inherent viability of the human embryo is derived from the quality of the gametes from which it is created. Patient age, aetiology, genetics and lifestyle (as well as ovarian stimulation in women) are all known to affect the developmental potential of gametes and hence the embryo. Thus, as well as considering the impact of the IVF laboratory environment, one needs to be aware of the status of the infertile couple, as this impacts how their gametes and embryos will respond to an in vitro environment. Although far from straightforward, analysing the interactions that exist between the human embryo and its environment will facilitate the creation of more effective and safer treatments for the infertile couple.
This study examined the response of forage crops to composted dairy waste (compost) applied at low rates and investigated effects on soil health. The evenness of spreading compost by commercial machinery was also assessed. An experiment was established on a commercial dairy farm with target rates of compost up to 5 t ha−1 applied to a field containing millet [Echinochloa esculenta (A. Braun) H. Scholz] and Pasja leafy turnip (Brassica hybrid). A pot experiment was also conducted to monitor the response of a legume forage crop (vetch; Vicia sativa L.) on three soils with equivalent rates of compost up to 20 t ha−1 with and without ‘additive blends’ comprising gypsum, lime or other soil treatments. Few significant increases in forage biomass were observed with the application of low rates of compost in either the field or pot experiment. In the field experiment, compost had little impact on crop herbage mineral composition, soil chemical attributes or soil fungal and bacterial biomass. However, small but significant increases were observed in gravimetric water content resulting in up to 22.4 mm of additional plant available water calculated in the surface 0.45 m of soil, 2 years after compost was applied in the field at 6 t ha−1 dried (7.2 t ha−1 undried), compared with the nil control. In the pot experiment, where the soil was homogenized and compost incorporated into the soil prior to sowing, there were significant differences in mineral composition in herbage and in soil. A response in biomass yield to compost was only observed on the sandier and lower fertility soil type, and yields only exceeded that of the conventional fertilizer treatment where rates equivalent to 20 t ha−1 were applied. With few yield responses observed, the justification for applying low rates of compost to forage crops and pastures seems uncertain. 
Our collective experience from the field and the glasshouse suggests that farmers might increase the response to compost by: (i) increasing compost application rates; (ii) applying it prior to sowing a crop; (iii) incorporating the compost into the soil; (iv) applying only to responsive soil types; (v) growing only responsive crops; and (vi) reducing weed burdens in crops following application. Commercial machinery incorporating a centrifugal twin disc mechanism was shown to deliver double the quantity of compost in the area immediately behind the spreader compared with the edges of the spreading swathe. Spatial variability in the delivery of compost could be reduced but not eliminated by increased overlapping, but this might represent a potential 20% increase in spreading costs.
The beef industry must become more responsive to the changing market place and consumer demands. An essential part of this is quantifying a consumer’s perception of the eating quality of beef and their willingness to pay for that quality, across a broad range of demographics. Over 19 000 consumers from Northern Ireland, Poland, Ireland and France each tasted seven beef samples and scored them for tenderness, juiciness, flavour liking and overall liking. These scores were weighted and combined to create a fifth score, termed the Meat Quality 4 score (MQ4) (0.3×tenderness, 0.1×juiciness, 0.3×flavour liking and 0.3×overall liking). They also allocated the beef samples into one of four quality grades that best described the sample: unsatisfactory, good-every-day, better-than-every-day or premium. After the completion of the tasting panel, consumers were then asked to detail, in their own currency, their willingness to pay for these four categories, which was subsequently converted to a proportion relative to the good-every-day category (P-WTP). Consumers also answered a short demographic questionnaire. The four sensory scores, the MQ4 score and the P-WTP were analysed separately, as dependent variables in linear mixed effects models. The answers from the demographic questionnaire were included in the model as fixed effects. Overall, there were only small differences in consumer scores and P-WTP between demographic groups. Consumers who preferred their beef cooked medium or well-done scored beef higher, except in Poland, where the opposite trend was found. This may be because Polish consumers were more likely to prefer their beef cooked well-done, but samples were cooked medium for this group. There was a small positive relationship with the importance of beef in the diet, increasing sensory scores by about 4% in Poland and Northern Ireland. Men also scored beef about 2% higher than women for most sensory scores in most countries.
In most countries, consumers were willing to pay between 150 and 200% more for premium beef, and there was a 50% penalty in value for unsatisfactory beef. After quality grade, by far the greatest influence on P-WTP was country of origin. Consumer age also had a small negative relationship with P-WTP. The results indicate that a single quality score could reliably describe the eating quality experienced by all consumers. In addition, if reliable quality information is delivered to consumers they will pay more for better quality beef, which would add value to the beef industry and encourage improvements in quality.
Australian abattoir workers, farmers, veterinarians and people handling animal birthing products or slaughtering animals continue to be at high risk of Q fever despite an effective vaccine being available. National Notifiable Diseases Surveillance System data were analysed for the period 1991–2014, along with enhanced risk factor data from notified cases in the states of New South Wales and Queensland, to examine changes in the epidemiology of Q fever in Australia. The national Q fever notification rate reduced by 20% [incident rate ratio (IRR) 0·82] following the end of the National Q fever Management Program in 2006, and has increased since 2009 (IRR 1·01–1·34). Highest rates were in males aged 40–59 years (5·9/100 000) and 87% of Q fever cases occurred in New South Wales and Queensland. The age of Q fever cases and proportion of females increased over the study period. Based on the enhanced risk factor data, the most frequently listed occupation for Q fever cases involved contact with livestock, followed by ‘no known risk’ occupations. More complete and comparable enhanced risk factor data, at the State/Territory and national levels, would aid in further understanding of the epidemiology of Q fever.
Quantifying consumer responses to beef across a broad range of demographics, nationalities and cooking methods is vitally important for any system evaluating beef eating quality. On the basis of previous work, it was expected that consumer scores would be highly accurate in determining quality grades for beef, thereby providing evidence that such a technique could be used to form the basis of an eating quality grading system for beef. Following the Australian MSA (Meat Standards Australia) testing protocols, over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia tasted cooked beef samples, then allocated them to a quality grade: unsatisfactory, good-every-day, better-than-every-day and premium. The consumers also scored beef samples for tenderness, juiciness, flavour-liking and overall-liking. The beef was sourced from all countries involved in the study and cooked by four different cooking methods and to three different degrees of doneness, with each experimental group in the study consisting of a single cooking doneness within a cooking method for each country. For each experimental group, and for the data set as a whole, a linear discriminant function was calculated, using the four sensory scores to predict the quality grade. This process was repeated using two composite scores derived by weighting and combining the consumer sensory scores for tenderness, juiciness, flavour-liking and overall-liking: the original meat quality 4 score (oMQ4) (0.4, 0.1, 0.2, 0.3) and the current meat quality 4 score (cMQ4) (0.3, 0.1, 0.3, 0.3). From the results of these analyses, the optimal weightings of the sensory scores to generate an ‘ideal meat quality 4 score (MQ4)’ for each country were calculated, and the MQ4 values that reflected the boundaries between the four quality grades were determined.
The oMQ4 weightings were far more accurate in categorising European meat samples than the cMQ4 weightings, highlighting that tenderness is more important than flavour to the consumer when determining quality. The accuracy of the discriminant analysis to predict the consumer scored quality grades was similar across all consumer groups, 68%, and similar to previously reported values. These results demonstrate that this technique, as used in the MSA system, could be used to predict consumer assessment of beef eating quality and therefore to underpin a commercial eating quality guarantee for all European consumers.
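The MQ4 weighting-and-grading step can be sketched as follows. The oMQ4 and cMQ4 weightings are taken from the study; the grade boundary values are hypothetical, since the fitted per-country boundaries are not reported in these abstracts.

```python
# Weightings from the study, applied to (tenderness, juiciness,
# flavour-liking, overall-liking) consumer scores on a 0-100 scale.
O_MQ4 = (0.4, 0.1, 0.2, 0.3)  # original MQ4 weighting
C_MQ4 = (0.3, 0.1, 0.3, 0.3)  # current MQ4 weighting

# Illustrative grade boundaries on the MQ4 scale (upper cut-off, grade);
# the boundary values actually fitted per country are not given here.
BOUNDARIES = [(30, "unsatisfactory"), (60, "good-every-day"),
              (80, "better-than-every-day"), (101, "premium")]

def mq4(scores, weights):
    """Weighted composite of the four consumer sensory scores."""
    return sum(w * s for w, s in zip(weights, scores))

def grade(score):
    """Map an MQ4 value to a quality grade via the boundary cut-offs."""
    for upper, label in BOUNDARIES:
        if score < upper:
            return label
    return "premium"

sample = (70, 55, 68, 72)  # tenderness, juiciness, flavour, overall
print(mq4(sample, O_MQ4), grade(mq4(sample, O_MQ4)))
```

Note how the oMQ4 weighting shifts weight from flavour-liking (0.2 vs. 0.3) to tenderness (0.4 vs. 0.3), which is the mechanical expression of the finding that tenderness mattered more than flavour for European grade allocation.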
European conformation and fat grades are a major factor determining carcass value throughout Europe. The relationships between these scores and sensory scores were investigated. A total of 3786 French, Polish and Irish consumers evaluated steaks, grilled to a medium doneness, according to protocols of the ‘Meat Standards Australia’ system, from 18 muscles representing 455 local, commercial cattle from commercial abattoirs. A mixed linear effects model was used for the analysis. There was a negative relationship between juiciness and European conformation score. For the other sensory scores, a maximum of three muscles out of a possible 18 demonstrated negative effects of conformation score on sensory scores. There was a positive effect of European fat score on three individual muscles. However, this was accounted for by marbling score. Thus, while the European carcass classification system may indicate yield, it has no consistent relationship with sensory scores at a carcass level that is suitable for use in a commercial system. The industry should consider using an additional system related to eating quality to aid in the determination of the monetary value of carcasses, rewarding eating quality in addition to yield.
Delivering beef of consistent quality to the consumer is vital for consumer satisfaction and will help to ensure demand and therefore profitability within the beef industry. In Australia, this is being tackled with Meat Standards Australia (MSA), which uses carcass traits and processing factors to deliver an individual eating quality guarantee to the consumer for 135 different ‘cut by cooking methods’ from each carcass. The carcass traits used in the MSA model, such as ossification score, carcass weight and marbling, explain the majority of the differences between breeds and sexes. Therefore, it was expected that the model would predict the eating quality of bulls and dairy breeds with good accuracy. In total, 8128 muscle samples from 482 carcasses from France, Poland, Ireland and Northern Ireland were MSA graded at slaughter then evaluated for tenderness, juiciness, flavour liking and overall liking by untrained consumers, according to MSA protocols. The scores were weighted (0.3, 0.1, 0.3, 0.3) and combined to form a global eating quality (meat quality (MQ4)) score. The carcasses were grouped into one of three breed categories: beef breeds, dairy breeds and crosses. The differences between the actual and the MSA-predicted MQ4 scores were analysed using a linear mixed effects model including fixed effects for carcass hang method, cook type, muscle type, sex, country, breed category and postmortem ageing period, and random terms for animal identification, consumer country and kill group. Bulls had lower MQ4 scores than steers and females and were predicted less accurately by the MSA model. Beef breeds had lower eating quality scores than dairy breeds and crosses for five out of the 16 muscles tested. Beef breeds were also over-predicted in comparison with the cross and dairy breeds for six out of the 16 muscles tested. Therefore, even after accounting for differences in carcass traits, bulls still differ in eating quality when compared with females and steers.
Breed also influenced eating quality beyond differences in carcass traits. However, in this case, it was only for certain muscles. This should be taken into account when estimating the eating quality of meat. In addition, the coefficients used by the Australian MSA model for some muscles, marbling score and ultimate pH do not exactly reflect the influence of these factors on eating quality in this data set, and if this system was to be applied to Europe then the coefficients for these muscles and covariates would need further investigation.
Ossification score and animal age are both used as proxies for maturity-related collagen crosslinking and the consequent decrease in beef tenderness. Ossification score is strongly influenced by the hormonal status of the animal and may therefore better reflect physiological maturity and consequently eating quality. As part of a broader cross-European study, local consumers scored 18 different muscle types cooked in three ways from 482 carcasses with ages ranging from 590 to 6135 days and ossification scores ranging from 110 to 590. The data were studied across three different maturity ranges: the complete range of maturities, a lesser range and a more mature range. The lesser maturity group consisted of carcasses having either an ossification score of 200 or less or an age of 987 days or less, with the remainder in the greater maturity group. The three different maturity ranges were analysed separately with a linear mixed effects model. Across all the data, and for the greater maturity group, animal age had a greater magnitude of effect on eating quality than ossification score. This is likely due to a loss of sensitivity in mature carcasses, where ossification approached and even reached the maximum value. In contrast, age had no relationship with eating quality for the lesser maturity group, leaving ossification score as the more appropriate measure. Therefore, ossification score is more appropriate for most commercial beef carcasses; however, it is inadequate for carcasses of greater maturity such as cull cows. Both measures may therefore be required in models to predict eating quality over populations with a wide range in maturity.
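The maturity-group split described above is a simple rule; a minimal sketch follows, using the cut-offs stated in the abstract (ossification score ≤ 200 or age ≤ 987 days) with hypothetical carcass records for illustration.

```python
# Split rule from the study: a carcass belongs to the lesser-maturity group
# if its ossification score is 200 or less OR its age is 987 days or less;
# all remaining carcasses form the greater-maturity group.
def maturity_group(ossification, age_days):
    if ossification <= 200 or age_days <= 987:
        return "lesser"
    return "greater"

# Hypothetical carcasses: (ossification score, age in days)
carcasses = [(150, 700), (250, 900), (300, 2000), (590, 6135)]
print([maturity_group(o, a) for o, a in carcasses])
# -> ['lesser', 'lesser', 'greater', 'greater']
```

The second carcass illustrates why the rule is a disjunction: despite an ossification score above 200, its young age places it in the lesser-maturity group, where the study found age carried no additional signal.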
The aims of the study were to determine the prevalence of cardiometabolic risk factors and establish the proportion of people with psychosis meeting criteria for the metabolic syndrome (MetS). The study also aimed to identify the key lifestyle behaviours associated with increased risk of the MetS and to investigate whether the MetS is associated with illness severity and degree of functional impairment.
Baseline data were collected as part of a large randomized controlled trial (IMPaCT RCT). The study took place within community mental health teams in five Mental Health NHS Trusts in urban and rural locations across England. A total of 450 randomly selected out-patients, aged 18–65 years, with an established psychotic illness were recruited. We ascertained the prevalence rates of cardiometabolic risk factors, illness severity and functional impairment and calculated rates of the MetS, using International Diabetes Federation (IDF) and National Cholesterol Education Program Third Adult Treatment Panel criteria.
High rates of cardiometabolic risk factors were found. Nearly all women and most men had waist circumference exceeding the IDF threshold for central obesity. Half the sample was obese (body mass index ≥ 30 kg/m2) and a fifth met the criteria for type 2 diabetes mellitus. Females were more likely to be obese than males (61% v. 42%, p < 0.001). Of the 308 patients with complete laboratory measures, 57% (n = 175) met the IDF criteria for the MetS.
In the UK, the prevalence of cardiometabolic risk factors in individuals with psychotic illnesses is much higher than that observed in national general population studies as well as in most international studies of patients with psychosis.
Intramuscular fat (IMF) % contributes positively to the juiciness and flavour of lamb and is therefore a useful indicator of eating quality. A rapid, non-destructive method of IMF determination like computed tomography (CT) would enable pre-sorting of carcasses based on IMF% and potential eating quality. Given the loin muscle (longissimus lumborum) is easy to sample, a single measurement at this site would be useful, provided it correlates well with other muscles. To determine the ability of CT to predict IMF%, this study used 400 animals and examined five muscles from three sections of the carcass: from the fore-section the m. supraspinatus and m. infraspinatus, from the saddle-section the m. longissimus lumborum and from the hind-section the m. semimembranosus and m. semitendinosus. The average CT pixel density of muscle was negatively associated with IMF% and can be used to predict IMF%, although precision in this study was poor. The ability of CT to predict IMF% was greatest in the m. longissimus lumborum (slope −0.07) and smallest in the m. infraspinatus (slope −0.02). The correlation coefficients of IMF% between the five muscles were variable, with the highest correlation coefficients evident between muscles of the fore section (0.67 between the m. supraspinatus and the m. infraspinatus) and the weakest correlations between muscles of the fore and hind sections. The correlation between the m. longissimus lumborum and the other muscles was fairly consistent, with values ranging between 0.34 and 0.40 (partial correlation coefficient). The correlation between the proportion of carcass fat and the IMF% of the five muscles varied and was greatest in the m. longissimus lumborum (0.41).
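The negative slopes reported above come from simple linear regressions of IMF% on mean CT pixel density. A least-squares sketch of that fit follows; all observations below are hypothetical, as the study's raw data are not reproduced here, and only the sign of the slope is expected to match the reported pattern.

```python
# Hypothetical paired observations of mean CT pixel density (arbitrary units)
# and chemically determined IMF%; the study's raw measurements are not given.
density = [55.0, 58.0, 60.0, 62.0, 65.0, 68.0]
imf = [5.2, 4.9, 4.8, 4.6, 4.4, 4.1]

# Ordinary least-squares fit of IMF% on pixel density
n = len(density)
mx = sum(density) / n
my = sum(imf) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(density, imf))
sxx = sum((x - mx) ** 2 for x in density)
slope = sxy / sxx            # negative, as CT density rises with leaner tissue
intercept = my - slope * mx

def predict_imf(pixel_density):
    """Predict IMF% from mean CT pixel density using the fitted line."""
    return intercept + slope * pixel_density

print(f"slope = {slope:.3f}, IMF% at density 63 = {predict_imf(63.0):.2f}")
```

In practice a muscle-specific calibration would be needed, since the study found the slope varied roughly three-fold between the m. longissimus lumborum and the m. infraspinatus.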