To evaluate the prevalence of food and beverage marketing on Twitch.tv (Twitch), a social media platform where individuals broadcast live audiovisual material to millions of daily users.
Observational analysis of the prevalence of 238 food and beverage brands in five distinct categories (processed snacks; food delivery services and restaurants; candies; energy drinks/coffees/teas; and sodas and other sugar-sweetened beverages) over the course of 18 months.
Twitch streamer profiles and stream titles between January 2018 and July 2019. Twitch chat room messages during July 2019.
There was a significant increase in brand exposure on Twitch both in stream titles (sodas and candies, P < 0·05) and on streamer profiles (sodas, restaurants/food delivery services, candies, and energy drinks/coffees/teas, P < 0·05) over the 18-month study period. Energy drinks, coffees and teas had the most exposure with 1·08 billion exposure hours from profiles and 83 million exposure hours from titles. Restaurants/food delivery services and sugar-sweetened beverages were the most frequently mentioned products in chat rooms with 1·24 million messages and 1·10 million messages, respectively.
This study is the first to demonstrate the extent to which food and beverage brands garner millions of hours of exposure on Twitch. Future studies should evaluate the impact that this level of exposure to nutrient-poor, energy-dense products may have on behavioural and health outcomes.
Poultry production is an important way of enhancing the livelihoods of rural populations, especially in low- and middle-income countries (LMICs). As poultry production in LMICs remains dominated by backyard systems with low inputs and low outputs, considerable yield gaps exist. Intensification can increase poultry productivity, production and income. This process is relatively recent in LMICs compared to high-income countries. The management practices and the constraints faced by smallholders trying to scale up their production, in the early stages of intensification, are poorly understood and described. We thus investigated the features of the small-scale commercial chicken sector in a rural area distant from major production centres. We surveyed 111 commercial chicken farms in Kenya in 2016. We targeted farms that sold the majority of their production and owned at least 50 chickens that were partly or wholly confined and provided with feed. We developed a typology of semi-intensive farms. Farms were found mainly to raise dual-purpose chickens of local and improved breeds in association with crops, and were not specialized in any single product or market. We identified four types of semi-intensive farms, characterized on the basis of two groups of variables related to intensification and accessibility: (i) remote, small-scale old farms, with small flocks, growing much of their own feed; (ii) medium-scale old farms with larger flocks, well located in relation to markets; and (iii) large-scale, recently established farms with large flocks, either (iii-a) well located and buying chicks from third-party providers or (iii-b) remotely located and hatching their own chicks. The semi-intensive farms we surveyed were highly heterogeneous in terms of size, age, accessibility, management, opportunities and challenges. Farm location affects market access and influences the opportunities available to farmers, resulting in further diversity in farm profiles.
The future of these semi-intensive farms could be compromised by several factors, including competition with large-scale intensive farms and with imports. Our study suggests that intensification trajectories in rural areas of LMICs are potentially complex, diverse and non-linear. A better understanding of intensification trajectories should, however, be based on longitudinal data. This could, in turn, help design interventions to support small-scale farmers.
The use of diets with increased fibre content from alternative feedstuffs less digestible for pigs is a solution considered to limit the impact of increased feed costs on pig production. This study aimed at determining the impact of an alternative diet on genetic parameters for growth, feed efficiency, carcass composition and meat quality traits. A total of 783 Large White pigs were fed a high-fibre (HF) diet and 880 of their sibs were fed a conventional (CO) cereal-based diet. Individual daily feed intake, average daily gain, feed conversion ratio and residual feed intake were recorded, as well as lean meat percentage (LMP), carcass yield (CY) and meat quality traits. Pigs fed the CO diet had better performance for growth and feed efficiency than pigs fed the HF diet. They also had lower LMP and higher CY. In addition, pigs fed the CO diet had lower loin percentage and ham percentage and higher backfat percentage. No differences were observed in meat quality traits between diets, except for a* and b* values. For all traits, the genetic variances and heritabilities were not different between diets. Genetic correlations for traits between diets ranged between 0.80 ± 0.13 and 0.99 ± not estimable, and none were significantly different from 0.99, except for LMP. Thus, traits were considered to be mainly affected by similar sets of genes in the two diets. A genetic correlation lower than 0.80 would justify redesigning the breeding scheme; however, some genetic correlations did not differ significantly from 0.80 either. Therefore, larger populations are needed for a more definitive answer regarding the design of the breeding scheme. To further evaluate selection strategies, a production index was computed within diets for the 29 sires with estimated breeding value reliability higher than 0.35. The rank correlation between indices estimated in the CO and in the HF diet was 0.72.
Altogether, we concluded that only limited interaction between feed and genetics was evidenced and, based on these results, there is no need to change pig selection schemes to adapt to the future increased use of alternative feedstuffs on production farms.
Whilst cannabis use appears to be a causal risk factor for the development of schizophrenia-related psychosis, associations with mania remain relatively unknown.
This review aimed to examine the impact of cannabis use on the incidence of manic symptoms and on their occurrence in those with pre-existing bipolar disorder.
A systematic review of the scientific literature using the PRISMA guidelines. PsycINFO, Cochrane, Scopus, Embase and MEDLINE databases were searched for prospective studies.
Six articles met inclusion criteria. These sampled 2,391 individuals who had experienced manic symptoms. The mean length of follow-up was 3.9 years. Studies support an association between cannabis use and the exacerbation of manic symptoms in those with previously diagnosed bipolar disorder. Furthermore, a meta-analysis of two studies suggests that cannabis use is associated with an approximately 3-fold (odds ratio: 2.97; 95% CI: 1.80 to 4.90) increased risk for the new onset of manic symptoms.
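As a quick internal-consistency check (not part of the original analysis), a Wald-type 95% confidence interval for an odds ratio is symmetric on the log scale, so the pooled point estimate should be close to the geometric mean of the CI bounds. A minimal sketch in Python, using only the numbers reported above:

```python
import math

# Reported pooled estimate: OR = 2.97, 95% CI 1.80 to 4.90.
lo, hi = 1.80, 4.90

# On the log scale the CI is symmetric around log(OR), so the
# point estimate implied by the bounds is their geometric mean.
implied_or = math.sqrt(lo * hi)

# The CI half-width on the log scale also implies a standard
# error for log(OR): half-width / 1.96.
se_log_or = (math.log(hi) - math.log(lo)) / (2 * 1.96)

print(round(implied_or, 2))  # close to the reported 2.97
print(round(se_log_or, 3))
```

The implied estimate matches the reported 2.97, which suggests the interval was computed on the log-odds scale in the usual way.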
Our findings, whilst tentative, suggest that cannabis use may worsen the occurrence of manic symptoms in those diagnosed with bipolar disorder and may also act as a causal risk factor in the incidence of manic symptoms. This underscores the importance of discouraging cannabis use among youth and those with bipolar disorder to help prevent chronic psychiatric morbidity. More high-quality prospective studies are required to fully elucidate how cannabis use may contribute to the development of mania over time.
Compassion and self-compassion can be protective factors against mental health difficulties, in particular depression. The cultivation of the compassionate self, associated with a range of practices such as slow and deeper breathing, compassionate voice tones and facial expressions, and compassionate focusing, is central to compassion focused therapy (Gilbert, 2010). However, no study has examined the processes of change that mediate the impact of compassionate self-cultivation practices on depressive symptoms.
The aim of this study was to investigate the impact of a brief compassionate self training (CST) intervention on depressive symptoms, and to explore the psychological processes that mediate the change at post-intervention.
Using a longitudinal design, participants (general population and college students) were randomly assigned to one of two conditions: compassionate self training (n = 56) or wait-list control (n = 37). Participants in the CST condition were instructed to practice CST exercises for 15 minutes every day, or in moments of stress, for two weeks. Self-report measures of depression, self-criticism, shame and compassion were completed at pre- and post-intervention in both conditions.
Results showed that, at post-intervention, participants in the CST condition showed decreased depression, self-criticism and shame, and increased self-compassion and openness to receiving compassion from others. Mediation analyses revealed that changes in depression from pre- to post-intervention were mediated by decreases in self-criticism and shame, and by increases in self-compassion and openness to compassion from others.
These findings support the efficacy of compassionate self training components in lessening depressive symptoms and promoting mental health.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Preliminary investigations of cross-sectional samples have linked trait mindfulness with measures related to the hypothalamic–pituitary–adrenal (HPA)-mediated stress response and to the inflammatory system, suggesting that this is one potential pathway linking mindfulness-based interventions and health. However, no previous study has explored the association between the trait mindfulness construct and markers of cellular ageing.
In the current study we examined the prospective associations between a multidimensional scale of trait mindfulness, the Five Facet Mindfulness Questionnaire (FFMQ), and telomerase activity (TA), a marker of cellular ageing and telomere homeostasis, in a sample of healthy mothers (n = 92) of a child with Autism Spectrum Disorder (i.e. women showing high levels of chronic psychological stress). Participants’ trait mindfulness and TA were assessed at baseline as well as at 9- and 18-month follow-ups.
Analysis showed that higher levels of baseline mindfulness on the FFMQ Observe and Describe subscales were related to increases in TA from baseline to 9 months (r = 0.27, P = 0.03 and r = 0.24, P = 0.04, respectively). Additionally, the FFMQ Describe subscale was related to an increase in TA from baseline to 18 months (r = 0.30, P = 0.02). Results are reported following covariate adjustment for age, BMI, ethnicity and education.
Our results showed that higher levels of baseline mindfulness are associated with greater increases in TA after 9 and 18 months; increased TA has been reported to be associated with decreased oxidative damage, increased telomere length and overall more functional cellular physiology. These findings support a role for mindfulness-related interventions in promoting general and mental health.
The psycholinguistic literature suggests that the length of a to-be-spoken phrase impacts the scope of speech planning, as reflected by different patterns of speech onset latencies. However, it is unclear whether such findings extend to first and second language (L1, L2) speech planning. Here, the same bilingual adults produced multi-phrase numerical equations (i.e., with natural break points) and single-phrase numbers (without natural break points) in their L1 and L2. For single-phrase utterances, both L1 and L2 were affected by L2 exposure. For multi-phrase utterances, L1 scope of planning was similar to what has been previously reported for monolinguals; however, L2 scope of planning exhibited variable patterns as a function of individual differences in L2 exposure. Thus, the scope of planning among bilinguals varies as a function of the complexity of their utterances, of whether they are speaking in their L1 or L2, and of their bilingual language experience.
The Asian elephant Elephas maximus is at risk of extinction as a result of anthropogenic pressures, and remaining populations are often small and fragmented remnants, occupying a fraction of the species' former range. Although once widely distributed across China, a maximum of only 245 elephants is now estimated to survive there, across seven small populations. We assessed the Asian elephant population in Nangunhe National Nature Reserve in Lincang Prefecture, China, using camera traps during May–July 2017, to estimate the population size and structure of this genetically important population. Although detection probability was low (0.31), we estimated a total population size of c. 20 individuals, and an effective density of 0.39 elephants per km². Social structure indicated a strong sex ratio bias towards females, with only one adult male detected within the population. Most of the elephants associated as one herd, but three adult females remained separate from the herd throughout the trapping period. These results highlight the fragility of remnant elephant populations such as Nangunhe, and we suggest options such as a managed metapopulation approach for their continued survival in China and more widely.
The partition of the total genetic variance into its additive and non-additive components can differ from trait to trait, and between purebred and crossbred populations. A quantification of these genetic variance components will determine the extent to which it would be of interest to account for dominance in genomic evaluations or to establish mate allocation strategies across different populations and traits. This study aims at assessing the contribution of the additive and dominance genomic variances to the phenotypic expression of several purebred Piétrain and crossbred (Piétrain × Large White) pig performances. A total of 636 purebred and 720 crossbred male piglets were phenotyped for 22 traits that can be classified into six groups: growth rate and feed efficiency, carcass composition, meat quality, behaviour, boar taint and puberty. Additive and dominance variances estimated in univariate genotypic models, including additive and dominance genotypic effects and a genomic inbreeding covariate, allowed us to retrieve the additive and dominance single nucleotide polymorphism variances for purebred and crossbred performances. These estimated variances were used, together with the allelic frequencies of the parental populations, to obtain additive and dominance variances in terms of genetic breeding values and dominance deviations. Estimates of the Piétrain and Large White allelic contributions to the crossbred variance were of about the same magnitude for all traits. Estimates of additive genetic variances were similar regardless of the inclusion of dominance. Some traits showed a relevant amount of dominance genetic variance with respect to phenotypic variance in both populations (i.e. growth rate 8%, feed conversion ratio 9% to 12%, backfat thickness 14% to 12%, purebreds-crossbreds). Other traits showed a higher amount in crossbreds (i.e. ham cut 8% to 13%, loin 7% to 16%, pH semimembranosus 13% to 18%, pH longissimus dorsi 9% to 14%, androstenone 5% to 13% and estradiol 6% to 11%, purebreds-crossbreds). No clear common pattern of dominance expression was found between groups of analysed traits or between populations. These estimates give initial hints regarding which traits could benefit from accounting for dominance, for example to improve genomic estimated breeding value accuracy in genetic evaluations or to boost the total genetic value of progeny by means of assortative mating.
Objectives: Research has shown that analyzing intrusion errors generated on verbal learning and memory measures is helpful for distinguishing between the memory disorders associated with Alzheimer’s disease (AD) and other neurological disorders, including Huntington’s disease (HD). Moreover, preliminary evidence suggests that certain clinical populations may be prone to exhibit different types of intrusion errors. Methods: We examined the prevalence of two new California Verbal Learning Test-3 (CVLT-3) intrusion subtypes – across-trial novel intrusions and across/within trial repeated intrusions – in individuals with AD or HD. We hypothesized that the encoding/storage impairment associated with medial-temporal involvement in AD would result in a greater number of novel intrusions on the delayed recall trials of the CVLT-3, whereas the executive dysfunction associated with subcortical-frontal involvement in HD would result in a greater number of repeated intrusions across trials. Results: The AD group generated significantly more across-trial novel intrusions than across/within trial repeated intrusions on the delayed cued-recall trials, whereas the HD group showed the opposite pattern on the delayed free-recall trials. Conclusions: These new intrusion subtypes, combined with traditional memory analyses (e.g., recall versus recognition performance), promise to enhance our ability to distinguish between the memory disorders associated with primarily medial-temporal versus subcortical-frontal involvement.
Major depressive disorder (MDD) is a leading cause of disease burden worldwide, with a lifetime prevalence of 17% in the United States. Here we present the results of the first prospective, large-scale, patient- and rater-blind, randomized controlled trial evaluating the clinical importance of achieving congruence between combinatorial pharmacogenomic (PGx) testing and medication selection for MDD.
1,167 outpatients diagnosed with MDD and an inadequate response to ≥1 psychotropic medications were enrolled and randomized 1:1 to a Treatment as Usual (TAU) arm or a PGx-guided care arm. Combinatorial PGx testing categorized medications into three groups based on the level of gene–drug interactions: use as directed, use with caution, or use with increased caution and more frequent monitoring. Patient assessments were performed at weeks 0 (baseline), 4, 8, 12 and 24. Patients, site raters, and central raters were blinded in both arms until after week 8. In the guided-care arm, physicians had access to the combinatorial PGx test result to guide medication selection. Primary outcomes utilized the Hamilton Depression Rating Scale (HAM-D17) and included symptom improvement (percent change in HAM-D17 from baseline), response (≥50% decrease in HAM-D17 from baseline), and remission (HAM-D17 < 7) at the fully blinded week 8 time point. The durability of patient outcomes was assessed at week 24. Medications were considered congruent with PGx test results if they were in the ‘use as directed’ or ‘use with caution’ report categories, while medications in the ‘use with increased caution and more frequent monitoring’ category were considered incongruent. Patients who started on incongruent medications were analyzed separately according to whether they changed to congruent medications by week 8.
At week 8, symptom improvement for individuals in the guided-care arm was not significantly different from TAU (27.2% versus 24.4%, p=0.11). However, individuals in the guided-care arm were more likely than those in TAU to achieve remission (15% versus 10%; p<0.01) and response (26% versus 20%; p=0.01). Remission rates, response rates, and symptom reductions continued to improve in the guided-care arm until the 24-week time point. Congruent prescribing increased to 91% in the guided-care arm by week 8. Among patients who were taking one or more incongruent medications at baseline, those who changed to congruent medications by week 8 demonstrated significantly greater symptom improvement (p<0.01), response (p=0.04), and remission rates (p<0.01) compared to those who persisted on incongruent medications.
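As an illustrative check (our assumption, not reported above: roughly equal arms of about 583 patients each, as implied by the 1:1 randomization of 1,167 patients; the analyzable samples were likely somewhat smaller), a pooled two-proportion z-test on the week-8 remission rates is consistent with the reported p<0.01:

```python
import math

# Assumed arm sizes from 1:1 randomization of 1,167 patients
# (hypothetical; actual analyzable ns were likely smaller).
n_guided, n_tau = 583, 584
p_guided, p_tau = 0.15, 0.10  # week-8 remission rates reported above

# Pooled two-proportion z-test.
pooled = (p_guided * n_guided + p_tau * n_tau) / (n_guided + n_tau)
se = math.sqrt(pooled * (1 - pooled) * (1 / n_guided + 1 / n_tau))
z = (p_guided - p_tau) / se

# Two-sided p-value via the standard normal CDF.
p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
print(round(z, 2), p_value < 0.01)
```

Under these assumed sample sizes the test statistic lands just below the 0.01 threshold, in line with the reported significance.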
Combinatorial PGx testing improves short- and long-term response and remission rates for MDD compared to standard of care. In addition, prescribing congruency with PGx-guided medication recommendations is important for achieving symptom improvement, response, and remission for MDD patients.
Funding Acknowledgements: This study was supported by Assurex Health, Inc.
Objectives: The third edition of the California Verbal Learning Test (CVLT-3) includes a new index termed List A versus Novel/Unrelated recognition discriminability (RD) on the Yes/No Recognition trial. Whereas the Total RD index incorporates false positive (FP) errors associated with all distractors (including List B and semantically related items), the new List A versus Novel/Unrelated RD index incorporates only FP errors associated with novel, semantically unrelated distractors. Thus, in minimizing levels of source and semantic interference, the List A versus Novel/Unrelated RD index may yield purer assessments of yes/no recognition memory independent of vulnerability to source memory difficulties or semantic confusion, both of which are often seen in individuals with primarily frontal-system dysfunction (e.g., early Huntington’s disease [HD]). Methods: We compared the performance of individuals with Alzheimer’s disease (AD) and HD in mild and moderate stages of dementia on CVLT-3 indices of Total RD and List A versus Novel/Unrelated RD. Results: Although AD and HD subgroups exhibited deficits on both RD indices relative to healthy comparison groups, those with HD generally outperformed those with AD, and group differences were more robust on List A versus Novel/Unrelated RD than on Total RD. Conclusions: Our findings highlight the clinical utility of the new CVLT-3 List A versus Novel/Unrelated RD index, which (a) maximally assesses yes/no recognition memory independent of source and semantic interference; and (b) provides a greater differentiation between individuals whose memory disorder is primarily at the encoding/storage level (e.g., as in AD) versus at the retrieval level (e.g., as in early HD). (JINS, 2018, 24, 833–841)
The importance of parasites as a selective force in host evolution is a topic of current interest. However, short-term ecological studies of host–parasite systems, on which such studies are usually based, provide only snapshots of what may be dynamic systems. We report here on four surveys, carried out over a period of 12 years, of helminths of spiny mice (Acomys dimidiatus), the numerically dominant rodents inhabiting dry montane wadis in the Sinai Peninsula. With host age (age-dependent effects on prevalence and abundance were prominent) and sex (female bias in abundance, in helminth diversity and in several taxa, including the Cestoda) taken into consideration, we focus on the relative importance of temporal and spatial effects on helminth infracommunities. We show that site of capture is the major determinant of the prevalence and abundance of the species (and higher taxa) contributing to helminth community structure, the only exceptions being Streptopharagus spp. and Dentostomella kuntzi. We provide evidence that most (notably the Spiruroidea, Protospirura muricola, Mastophorus muris and Gongylonema aegypti, but with exceptions among the Oxyuroidea, e.g. Syphacia minuta) show elements of temporal-site stability, with the rank order of measures among sites remaining similar over successive surveys. Hence, there are some elements of predictability in these systems.
Good education requires student experiences that deliver lessons about practice as well as theory and that encourage students to work for the public good—especially in the operation of democratic institutions (Dewey 1923; Dewey 1938). We report on an evaluation of the pedagogical value of a research project involving 23 colleges and universities across the country. Faculty trained and supervised students who observed polling places in the 2016 General Election. Our findings indicate that this was a valuable learning experience in both the short and long terms. Students found their experiences to be valuable and reported learning both generally and specifically in relation to course material. Post-election, they also felt more knowledgeable about election science topics, voting behavior, and research methods. Students reported interest in participating in similar research in the future, would recommend the experience to other students, and expressed interest in further learning and research on the topics central to their experience. Our results suggest that participants appreciated the importance of elections and their study. Collectively, the participating students are engaged and efficacious—essential qualities of citizens in a democracy.
Pigs selected for high performance may be more at risk of developing diseases. This study aimed to assess the health and performance of two pig lines divergently selected for residual feed intake (RFI) (low RFI (LRFI) v. high RFI (HRFI)) and housed in two contrasting hygiene conditions (poor v. good) using a 2×2 factorial design (n=40/group). The challenge period (Period 1) started at week zero (W0), when 12-week-old pigs were transferred to good or poor housing conditions. At week 6 (W6), half of the pigs in each group were slaughtered. During a recovery period (Period 2), from W6 to W13–W14, the remaining pigs (n=20/group) were transferred to good hygiene conditions before being slaughtered. Blood was collected every 3 weeks (Period 1) or every 2 weeks (Period 2) to assess blood indicators of immune and inflammatory responses. Pulmonary lesions at slaughter and performance traits were evaluated. At W6, pneumonia prevalence was greater for pigs housed in poor than in good conditions (51% v. 8%, respectively, P<0.001). Irrespective of hygiene conditions, lung lesion scores were lower for LRFI pigs than for HRFI pigs (P=0.03). At W3, LRFI pigs in poor conditions had the highest number of blood granulocytes (hygiene×line, P=0.03) and at W6, HRFI pigs in poor conditions had the greatest plasma haptoglobin concentrations (hygiene×line, P=0.02). During Period 1, growth rate and gain-to-feed ratio were less affected by poor hygiene in LRFI pigs than in HRFI pigs (hygiene×line, P=0.001 and P=0.02, respectively). LRFI pigs in poor conditions ate more than the other groups (hygiene×line, P=0.002). Irrespective of line, fasting plasma glucose concentrations were higher in poor conditions, whereas fasting free fatty acid concentrations were lower than in good conditions. At the end of Period 2, pneumonia prevalence was similar for both housing conditions (39% v. 38%).
During Period 2, plasma protein concentrations were greater for pigs previously housed in poor than in good conditions during Period 1. Immune traits, gain-to-feed ratio, BW gain and feed consumption did not differ during Period 2. Nevertheless, at W12, BW of HRFI pigs previously housed in poor conditions was 13.4 kg lower than that of HRFI pigs previously housed in good conditions (P<0.001). In conclusion, the health of the more feed-efficient LRFI pigs was less impaired by poor hygiene conditions. This line was able to preserve its health, growth performance and feed ingestion to a greater extent than the less efficient HRFI line.
Introduction: Decreasing readmission rates and return emergency department (ED) visits represent a major challenge for health organizations. Seniors are especially vulnerable to discharge adverse events, which can result in unplanned readmissions and loss of physical, functional and/or cognitive capacity. The ACE Collaborative is a national quality improvement initiative that aims to improve care of elderly patients. We aimed to adapt Mount Sinai’s Care Transitions program to our local context in order to decrease avoidable readmissions and ED visits among seniors. Methods: We performed a prospective pre/post implementation cohort study. We recruited frail elderly hospitalized patients (≥50 years old) discharged to home and at risk of readmission (modified LACE index score ≥ 7/12). We excluded patients being discharged to long-term nursing homes or institutions. Our intervention is based on selected strategic ACE Care Transitions best practices: a transition coach, telehealth personal response services and a structured discharge checklist. The intervention is offered to selected patients before hospital discharge. Our primary outcome is a 30-day post-discharge composite of hospital readmission and return ED visit rates. Our secondary outcomes are functional autonomy, satisfaction with the care transition, quality of life, caregiver strain and healthcare resource use at recruitment and at 30-day follow-up. Hospital-level administrative data are also collected to measure the global effect of practice changes. Results: The project is currently ongoing and preliminary results are available for the pre-implementation cohort only. Patients in this cohort (n=33) were mainly men (61%), aged 75±10 years, and presented an OARS score (an Activities of Daily Living instrument ranging from 0 to 28) of 5.6±4.9. At 30 days post-discharge, patients in our cohort had a 42.4% readmission rate (14 hospitalisations) and a 54.5% return ED visit rate (18 visits).
For the same time period, readmission and return ED visit rates for all patients in the corresponding age group at the hospital level were 14.4% and 21.9%, respectively. Further results for our post-intervention cohort will be presented at CAEP 2017. Conclusion: Our cohort of elderly patients has high readmission and return ED visit rates. Our ongoing quality improvement project aims to decrease these readmissions and ED visits.
The primary goal was to investigate the effects of l-carnitine on fuel efficiency, as an antioxidant, and for muscle recovery in Labrador retrievers. Dogs were split into two groups, with one group being supplemented with 250 mg/d of Carniking™ l-carnitine powder. Two experiments (Expt 1 and Expt 2) were performed over a 2-year period, which included running programmes, activity monitoring, body composition scans and evaluation of recovery using biomarkers. Each experiment differed slightly in dog number and design: fifty-six v. forty dogs; one endurance and two sprint runs per week v. two endurance runs; and differing blood collection time points. All dogs were fed a low-carnitine diet in which a fixed amount was offered based on maintaining the minimum starting weight. Results from Expt 1 found that the carnitine dogs produced approximately 4000 more activity points per km compared with the control group during sprint (P = 0·052) and endurance runs (P = 0·0001). Male carnitine dogs produced half the creatine phosphokinase (CPK) following exercise compared with male control dogs (P = 0·05). Carnitine dogs had lower myoglobin at 6·69 ng/ml following intensive exercise compared with controls at 24·02 ng/ml (P = 0·0295). Total antioxidant capacity (TAC) and thiobarbituric acid reactive substance (TBARS) results were not considered significant. In Expt 2, body composition scans indicated that the carnitine group gained more total tissue mass while controls lost tissue mass (P = 0·0006), and also gained lean mass while the control group lost lean mass (P < 0·0001). Carnitine dogs had lower CPK secretion at 23·06 v. control at 28·37 mU/ml 24 h post-run (P = 0·003). Myoglobin levels were lower in carnitine v. control dogs both 1 h post-run (P = 0·0157; 23·83 v. 37·91 ng/ml) and 24 h post-run (P = 0·0189; 6·25 v. 13·5 ng/ml). TAC indicated more antioxidant activity in carnitine dogs at 0·16 mm v. control at 0·13 mm (P = 0·0496).
TBARS were also lower in carnitine dogs both pre-run (P = 0·0013; 15·36 v. 23·42 µm) and 1 h post-run (P = 0·056; 16·45 v. 20·65 µm). Supplementing l-carnitine in the form of Carniking™ had positive benefits in Labrador retrievers for activity intensity, body composition, muscle recovery and oxidative capacity.
This article reports on a case study of a decade-long organizing-forms response to the need for groundbreaking innovation while maintaining existing operational performance – the explore–exploit conundrum. Employing ‘grounded research’, data were collected on the experiences of key decision-makers, innovators and entrepreneurs in the Asia-Pacific arm of a multinational professional service firm. The findings reveal a three-tiered organizing-forms response to the explore–exploit paradox, characterized by a novel combination of heavy exploitation-driven actions alongside deep exploration projects. This case suggests that one successful approach to delivering on both explore and exploit rests on a productive tension that emerges from enacting innovative organizing forms with contextual awareness. This productive tension was sufficiently powerful to impel individuals to innovate, yet sufficiently contained to avoid interfering with commercial outcomes. An explore–exploit framework conceptualizes organizational change as incorporating complexity and contradiction, without the implicit emphasis on removing or denying the existing tension.
This review summarizes the results from the INRA (Institut National de la Recherche Agronomique) divergent selection experiment on residual feed intake (RFI) in growing Large White pigs during nine generations of selection. It discusses the remaining challenges and perspectives for the improvement of feed efficiency in growing pigs. The impacts on growing pigs raised under standard conditions and in alternative situations such as heat stress, inflammatory challenges or lactation have been studied. After nine generations of selection, the divergent selection for RFI led to highly significant (P<0.001) line differences for RFI (−165 g/day in the low RFI (LRFI) line compared with the high RFI line) and daily feed intake (−270 g/day). Low responses were observed on growth rate (−12.8 g/day, P<0.05) and body composition (+0.9 mm backfat thickness, P=0.57; −2.64% lean meat content, P<0.001), with a marked response on feed conversion ratio (−0.32 kg feed/kg gain, P<0.001). Reduced ultimate pH and increased lightness of the meat (P<0.001) were observed in LRFI pigs, with minor impact on the sensory quality of the meat. These changes in meat quality were associated with changes in muscular energy metabolism. Reduced maintenance energy requirements (−10% after five generations of selection) and activity (−21% of time standing after six generations of selection) of LRFI pigs greatly contributed to the gain in energy efficiency. However, the impact of selection for RFI on the protein metabolism of the pig remains unclear. Digestibility of energy and nutrients was not affected by selection, whether pigs were fed conventional or high-fibre diets. A significant improvement of digestive efficiency could likely be achieved by selecting pigs on fibre diets. No convincing genetic or blood biomarker has been identified to explain the differences in RFI, suggesting that pigs have various ways to achieve an efficient use of feed.
No deleterious impact of the selection on sow reproduction performance was observed. The resource allocation theory states that low RFI may reduce the ability to cope with stressors, via the reduction of a buffer compartment dedicated to stress responses. None of the experiments that examined the response of pigs to stress or challenges could confirm this theory. Understanding the relationships between RFI and responses to stress and to energy-demanding processes, such as immunity and lactation, remains a major challenge for a better understanding of the underlying biological mechanisms of the trait and for reconciling the experimental results with the resource allocation theory.