No single environmental factor is a necessary or sufficient cause of mental disorder; multifactorial and transdiagnostic approaches are needed to understand the impact of the environment on the development of mental disorders across the life course.
Using linked multi-agency administrative data for 71 932 children from the New South Wales Child Development Study, we used logistic regression to examine associations between 16 environmental risk factors in early life (prenatal period to <6 years of age) and later diagnoses of mental disorder recorded in health service data (from age 6 to 13 years), both individually and summed as an environmental risk score (ERS).
The ERS was associated with all types of mental disorder diagnoses in a dose–response fashion: 2.8% of children with no exposure to any of the environmental factors (ERS = 0) had been diagnosed with any type of mental disorder up to age 13–14 years, compared with 18.3% of children exposed to eight or more factors (ERS ⩾ 8). Thirteen of the 16 environmental factors measured (including prenatal factors, neighbourhood characteristics and more proximal experiences of trauma or neglect) were positively associated with at least one category of mental disorder.
Exposure to cumulative environmental risk factors in early life is associated with an increased likelihood of presenting to health services in childhood for any kind of mental disorder. In many instances, these factors are preventable or capable of mitigation by appropriate public policy settings.
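To make the scoring approach concrete, the sketch below (illustrative only, not the study's code) builds a cumulative environmental risk score as a simple count of binary exposure indicators and relates it to a binary diagnosis outcome with logistic regression; the data, effect sizes and variable names are synthetic assumptions.

```python
# Illustrative only: cumulative environmental risk score (ERS) as a
# count of binary exposures, related to diagnosis via logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000                      # toy sample; the study linked 71,932 children
exposures = pd.DataFrame(rng.integers(0, 2, size=(n, 16)),
                         columns=[f"exp_{i}" for i in range(16)])
ers = exposures.sum(axis=1)   # ERS = simple count of exposures (0-16)

# Simulate a diagnosis whose log-odds rise with the ERS (dose-response)
p = 1 / (1 + np.exp(-(-3.5 + 0.3 * ers)))
diagnosis = rng.binomial(1, p)

model = sm.Logit(diagnosis, sm.add_constant(ers)).fit(disp=0)
print(np.exp(model.params))   # odds ratio per additional exposure
```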
Nitrogen fixation from pasture legumes is a fundamental process that contributes to the profitability and sustainability of dryland agricultural systems. The aim of this research was to determine whether well-managed pastures, based on aerial-seeding pasture legumes, could partially or wholly meet the nitrogen (N) requirements of subsequent grain crops in an annual rotation. Fifteen experiments were conducted in Western Australia with wheat, barley or canola crops grown in a rotation that included the pasture legume species French serradella (Ornithopus sativus), biserrula (Biserrula pelecinus), bladder clover (Trifolium spumosum), annual medics (Medicago spp.) and the non-aerial seeded subterranean clover (Trifolium subterraneum). After the pasture phase, five rates of inorganic N fertilizer (urea, applied at 0, 23, 46, 69 and 92 kg/ha) were applied to subsequent cereal and oilseed crops. The yields of wheat grown after serradella, biserrula and bladder clover, without the use of applied N fertilizer, were consistent with the target yields for the growing conditions of the trials (2.3 to 5.4 t/ha). Crop yields after phases of these pasture legume species were similar to or higher than those following subterranean clover or annual medics. The results of this study suggest that a single season of a legume-dominant pasture may provide sufficient organic N in the soil to grow at least one crop without the need for inorganic N fertilizer application. This has implications for reducing inorganic N requirements and the carbon footprint of cropping in dryland agricultural systems.
Post-operative oral feeding difficulties in neonates and infants with CHD are common. While pre-operative oral feeding may be normal, oral feeding challenges manifest in the post-operative period without a clearly defined aetiology. The objective of this scoping review was to examine post-operative oral feeding in full-term neonates and infants with CHD. A query of electronic databases (1 January 1975–31 May 2021), a hand-search of the reference lists of included studies, contact with experts, and a review of relevant conferences were performed to identify quantitative studies evaluating post-operative oral feeding in full-term neonates and infants with CHD. Associations with additional quantitative variables in these studies were also examined. Twenty-five studies met inclusion criteria. Eighty per cent were cohort studies that utilised retrospective chart review from a single institution. The primary variable of interest in all studies was oral feeding status upon discharge from neonatal hospitalisation. The risk factors most commonly evaluated for association with poor feeding at the time of discharge were birth weight (36% of included studies), gestational age (44%), duration of post-operative intubation (48%), cardiac diagnosis (40%), and presence of a genetic syndrome or chromosomal anomaly (36%). The most common health-related outcomes evaluated were length of hospital stay (40%) and length of ICU stay (16%). Only the health-related outcomes of length of hospital stay and length of ICU stay were consistently and significantly associated with poor post-operative oral feeding across studies in this review. A clear aetiology of poor post-operative oral feeding remains unknown.
Herbicides that inhibit very-long-chain fatty acids (VLCFAs) have been widely used for preemergence control of annual monocot and small-seeded dicot weed species, such as waterhemp, since their discovery in the 1950s. VLCFA-inhibiting herbicides are often applied in combination with active ingredients that possess residual activity on small-seeded broadleaf weeds, which can make their contribution to preemergence waterhemp control difficult to quantify. Bare-ground field experiments were designed to investigate the efficacy of eight VLCFA-inhibiting herbicides applied at their minimum and maximum labeled rates for control of Illinois waterhemp populations. Four different locations were selected, two of which contained previously characterized VLCFA inhibitor–resistant waterhemp populations in Champaign County (CHR) and McLean County (MCR). Two locations with VLCFA inhibitor–sensitive waterhemp populations included the University of Illinois South Farm in Urbana, IL, and the Orr Research Center in Perry, IL. Soils at the CHR, MCR, and Urbana locations contained greater than 3% organic matter, but less than 3% organic matter at Perry. Non-encapsulated acetochlor and alachlor controlled CHR and MCR waterhemp populations 28 d after treatment (DAT), whereas other VLCFA-inhibiting herbicides resulted in 61% and 76% control of the CHR and MCR populations, respectively. In contrast, all VLCFA-inhibiting herbicides resulted in 81% and 88% control of the Perry and Urbana waterhemp populations, respectively, 28 DAT. Waterhemp control decreased by 42 DAT, especially for the VLCFA inhibitor–resistant CHR and MCR populations. Overall, VLCFA-inhibiting herbicides remain effective for controlling sensitive waterhemp, but most are not effective for controlling VLCFA inhibitor–resistant waterhemp populations. Proper herbicide stewardship and integrated weed management practices should be implemented to maintain VLCFA-inhibiting herbicide efficacy for waterhemp management in the future.
Paramedics received training in point-of-care ultrasound (POCUS) to assess for cardiac contractility during management of medical out-of-hospital cardiac arrest (OHCA). The primary outcome was the percentage of adequate POCUS video acquisition and accurate video interpretation during OHCA resuscitations. Secondary outcomes included POCUS impact on patient management and resuscitation protocol adherence.
A prospective, observational cohort study of paramedics was performed following a four-hour training session, which included a didactic lecture and hands-on POCUS instruction. The Prehospital Echocardiogram in Cardiac Arrest (PECA) protocol was developed and integrated into the resuscitation algorithm for medical non-shockable OHCA. The ultrasound (US) images were reviewed by a single POCUS expert investigator to determine the adequacy of the POCUS video acquisition and accuracy of the video interpretation. Change in patient management and resuscitation protocol adherence data, including end-tidal carbon dioxide (EtCO2) monitoring following advanced airway placement, adrenaline administration, and compression pauses under ten seconds, were queried from the prehospital electronic health record (EHR).
Captured images were deemed adequate in 42/49 (85.7%) scans and paramedic interpretation of sonography was accurate in 43/49 (87.7%) scans. The POCUS results altered patient management in 14/49 (28.6%) cases. Paramedics adhered to EtCO2 monitoring in 36/36 (100.0%) patients with an advanced airway, adrenaline administration for 38/38 (100.0%) patients, and compression pauses under ten seconds for 36/38 (94.7%) patients.
Paramedics were able to accurately obtain and interpret cardiac POCUS videos during medical OHCA while adhering to a resuscitation protocol. These findings suggest that POCUS can be effectively integrated into paramedic protocols for medical OHCA.
The Trial Innovation Network has established an infrastructure for single IRB review in response to federal policies. The Network’s single IRBs (sIRBs) have successfully supported over 70 multisite studies via more than 800 reliance arrangements. This has generated several lessons learned that can benefit the national clinical research enterprise as we work to improve the conduct of clinical trials. These lessons include distinguishing the roles of the single IRB from institutional Human Research Protections programs, establishing a consistent sIRB review model, standardizing collection of local context and supplemental, study-specific information, and educating and empowering lead study teams to support their sites.
Sustainment refers to continued intervention delivery over time, while continuing to produce intended outcomes, often with ongoing adaptations, which are purposeful changes to the design or delivery of an intervention to improve its fit or effectiveness. The Hispanic Kidney Transplant Program (HKTP), a complex, culturally competent intervention, was implemented in two transplant programs to reduce disparities in Hispanic/Latinx living donor kidney transplant rates. This study longitudinally examined the influence of adaptations on HKTP sustainment.
Qualitative interviews, learning collaborative calls, and telephone meetings with physicians, administrators, and staff (n = 55) were conducted over three years of implementation to identify HKTP adaptations. The Framework for Reporting Adaptations and Modifications-Expanded was used to classify adaptation types and frequency, which were compared across sites over time.
Across sites, the number of adaptations was highest in the first year (n = 47) and then fell and plateaued at n = 35 in each of the two remaining years. Adaptations at Site-A were consistent across years (2017: n = 18, 2018: n = 17, 2019: n = 14), while Site-B made considerably fewer adaptations after the first year (2017: n = 29, 2018: n = 18, 2019: n = 21). Both sites proportionally made mostly skipping (32%), adding (20%), tweaking (20%), and substituting (16%) adaptation types. Skipping- and substituting-type adaptations were made due to institutional structural characteristics and lack of available resources, respectively. However, Site-A’s greater proportion of skipping-type adaptations was attributed to greater system complexity, and Site-B’s greater proportion of adding-type adaptations was attributed to its egalitarian team-based culture.
Our findings can help prepare implementers to expect certain context-specific adaptations and preemptively avoid those that hinder sustainment.
This study aimed to explore the effects of adjunctive minocycline treatment on inflammatory and neurogenesis markers in major depressive disorder (MDD). Serum samples were collected from a randomised, placebo-controlled 12-week clinical trial of minocycline (200 mg/day, added to treatment as usual) for adults (n = 71) experiencing MDD to determine changes in interleukin-6 (IL-6), lipopolysaccharide binding protein (LBP) and brain-derived neurotrophic factor (BDNF). Generalised estimating equation modelling explored moderation effects of baseline markers, and exploratory analyses investigated associations between markers and clinical outcomes. There was no difference between the adjunctive minocycline and placebo groups at baseline or week 12 in the levels of IL-6 (week 12: placebo 2.06 ± 1.35 pg/ml; minocycline 1.77 ± 0.79 pg/ml; p = 0.317), LBP (week 12: placebo 3.74 ± 0.95 µg/ml; minocycline 3.93 ± 1.33 µg/ml; p = 0.525) or BDNF (week 12: placebo 24.28 ± 6.69 ng/ml; minocycline 26.56 ± 5.45 ng/ml; p = 0.161). Higher IL-6 levels at baseline predicted greater clinical improvement. Exploratory analyses suggested that the change in IL-6 levels was significantly associated with anxiety symptom (HAMA; p = 0.021) and quality of life (Q-LES-Q-SF; p = 0.023) scale scores. No other clinical outcomes showed this effect, nor did the other markers (LBP or BDNF) moderate clinical outcomes. There were no overall changes in IL-6, LBP or BDNF following adjunctive minocycline treatment. Exploratory analyses suggest a potential role of IL-6 in mediating anxiety symptoms in MDD. Future trials with larger sample sizes may consider enriching recruitment by identifying several markers, or a panel of factors, to better represent an inflammatory phenotype in MDD.
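As a hedged illustration of the modelling approach named above, the following sketch fits a generalised estimating equation to a simulated repeated-measures biomarker using statsmodels; the column names, group coding and effect sizes are hypothetical, not the trial's data.

```python
# Hedged sketch of a GEE analysis of a repeated-measures biomarker
# (e.g. IL-6) across treatment groups; all values are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, visits = 71, 4
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_subj), visits),
    "week": np.tile([0, 4, 8, 12], n_subj),
    "group": np.repeat(rng.integers(0, 2, n_subj), visits),  # 0=placebo, 1=active
})
df["il6"] = 2.0 - 0.01 * df["week"] * df["group"] + rng.normal(0, 0.8, len(df))

# Exchangeable working correlation accounts for within-subject clustering
gee = smf.gee("il6 ~ week * group", groups="id", data=df,
              cov_struct=sm.cov_struct.Exchangeable(),
              family=sm.families.Gaussian()).fit()
print(gee.summary())
```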
Many mental disorders, including depression, bipolar disorder and schizophrenia, are associated with poor dietary quality and nutrient intake. There is, however, a deficit of research looking at the relationship between obsessive–compulsive disorder (OCD) severity, nutrient intake and dietary quality.
This study aims to explore the relationship between OCD severity, nutrient intake and dietary quality.
A post hoc regression analysis was conducted with data combined from two separate clinical trials that included 85 adults with OCD diagnosed using the Structured Clinical Interview for DSM-5. Nutrient intakes were calculated from the Dietary Questionnaire for Epidemiological Studies version 3.2, and dietary quality was scored with the Healthy Eating Index for Australian Adults – 2013.
Nutrient intake in the sample largely aligned with Australian dietary guidelines. Linear regression models adjusted for gender, age and total energy intake showed no significant associations between OCD severity, nutrient intake and dietary quality (all P > 0.05). However, OCD severity was inversely associated with caffeine (β = −15.50, 95% CI −28.88 to −2.11, P = 0.024) and magnesium (β = −6.63, 95% CI −12.72 to −0.53, P = 0.034) intake after adjusting for OCD treatment resistance.
This study showed that OCD severity had little effect on nutrient intake and dietary quality. Dietary quality scores were higher than those reported in prior studies of healthy samples, although limitations regarding comparability must be noted. Future studies employing larger sample sizes, control groups and more accurate dietary intake measures will further elucidate the relationship between nutrient intake and dietary quality in patients with OCD.
To describe the incidence of systemic overlap and typical coronavirus disease 2019 (COVID-19) symptoms in healthcare personnel (HCP) following COVID-19 vaccination, and the association of reported symptoms with diagnosis of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, in the context of public health recommendations regarding work exclusion.
This prospective cohort study was conducted between December 16, 2020, and March 14, 2021, with HCP who had received at least 1 dose of either the Pfizer-BioNTech or Moderna COVID-19 vaccine.
The study was conducted in a large healthcare system in New England.
HCP were prompted to complete a symptom survey for 3 days after each vaccination. Reported symptoms generated automated guidance regarding symptom management, SARS-CoV-2 testing requirements, and work restrictions. Overlap symptoms (ie, fever, fatigue, myalgias, arthralgias, or headache) were categorized as either lower or higher severity. Typical COVID-19 symptoms included sore throat, cough, nasal congestion or rhinorrhea, shortness of breath, ageusia and anosmia.
Among 64,187 HCP, the postvaccination electronic survey had response rates of 83% after dose 1 and 77% after dose 2. Report of ≥3 lower-severity overlap symptoms, ≥1 higher-severity overlap symptom, or at least 1 typical COVID-19 symptom after dose 1 was associated with an increased likelihood of testing positive. HCP with prior COVID-19 infection were significantly more likely to report higher-severity overlap symptoms after dose 1.
Reported overlap symptoms were common; however, only report of ≥3 lower-severity overlap symptoms, at least 1 higher-severity overlap symptom, or any typical COVID-19 symptom was associated with infection. Work-related restrictions for overlap symptoms should be reconsidered.
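The decision rule described above lends itself to a compact encoding. The sketch below is an illustrative formalisation of that triage logic, assuming symptom reports arrive as counts per category; it is not the survey system used in the study.

```python
# Illustrative formalisation of the follow-up rule; how symptoms are
# counted per category is an assumption, not the study's survey code.
def triggers_followup(n_lower_overlap: int, n_higher_overlap: int,
                      n_typical: int) -> bool:
    """True when reported symptoms warrant SARS-CoV-2 testing and
    work-restriction review: >=3 lower-severity overlap symptoms,
    >=1 higher-severity overlap symptom, or any typical COVID-19 symptom."""
    return (n_lower_overlap >= 3
            or n_higher_overlap >= 1
            or n_typical >= 1)

print(triggers_followup(2, 0, 0))   # False: only 2 lower-severity overlap symptoms
print(triggers_followup(3, 0, 0))   # True: >=3 lower-severity overlap symptoms
print(triggers_followup(0, 0, 1))   # True: any typical COVID-19 symptom
```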
Background: Children with pathogenic variants in SCN8A can present with early infantile epileptic encephalopathy-13, benign familial infantile seizures-5, or intellectual disability alone without epilepsy. In this case series, we discuss six children with variants in SCN8A managed at BC Children’s Hospital. Methods: We describe clinical and genetic results for six individuals with SCN8A variants identified via clinical or research next-generation sequencing. Functional consequences of two SCN8A variants were assessed using electrophysiological analyses in transfected cells. Results: Clinical findings ranged from normal development with well-controlled epilepsy to significant developmental delay with treatment-resistant epilepsy. Phenotypes and genotypes in our cohort are described in the table below. Functional analysis supported gain-of-function in P2 and loss-of-function in P4. Conclusions: Our cohort expands the clinical and genotypic spectrum of SCN8A-related disorders. We establish functional evidence for two missense variants in SCN8A, including a LoF variant in a patient with intellectual disability and autism spectrum disorder without seizures.
[Table P.120: per-patient phenotype–genotype summary; the table itself did not survive extraction. Recoverable fragments include the column header "Current antiseizure medication"; phenotypes including infantile spasms, LGS and hyperkinetic movements; an EEG abnormality only at 3 y; sodium valproate (discontinued); no clinical seizures; and the variant c.971G>A (p.Cys324Tyr)/LoF with a VUS in KCNQ3.]
Abbreviations: * = father with similar history; y = years; m = months; GDD = global developmental delay; LGS = Lennox–Gastaut syndrome; VUS = variant of unknown significance; LoF = loss-of-function; GoF = gain-of-function; EEG = electroencephalogram; F = female; M = male; CBD = cannabidiol.
Microscopic examination of blood smears remains the gold standard for laboratory inspection and diagnosis of malaria. Smear inspection is, however, time-consuming and dependent on trained microscopists, with results varying in accuracy. We sought to develop an automated image analysis method to improve the accuracy and standardization of smear inspection that retains capacity for expert confirmation and image archiving. Here, we present a machine learning method that achieves red blood cell (RBC) detection, differentiation between infected and uninfected cells, and parasite life stage categorization from unprocessed, heterogeneous smear images. Based on a pretrained Faster Region-Based Convolutional Neural Network (Faster R-CNN) model for RBC detection, our model performs accurately, with an average precision of 0.99 at an intersection-over-union threshold of 0.5. Application of a ResNet-50 residual neural network model to infected cells also performs accurately, with an area under the receiver operating characteristic curve of 0.98. Finally, combining our method with a regression model successfully recapitulates the intraerythrocytic developmental cycle with accurate lifecycle stage categorization. Combined with a mobile-friendly web-based interface, called PlasmoCount, our method permits rapid navigation through and review of results for quality assurance. By standardizing the assessment of Giemsa smears, our method markedly improves inspection reproducibility and presents a realistic route to both routine lab and future field-based automated malaria diagnosis.
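For readers who want a concrete starting point, the sketch below shows the general shape of the detection step, assuming the torchvision implementation of Faster R-CNN; the class list, image sizes and boxes are placeholders, and this is not PlasmoCount's code.

```python
# Minimal sketch: fine-tune a pretrained Faster R-CNN to detect red
# blood cells in smear images. Classes, sizes and boxes are placeholders.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

num_classes = 2  # background + RBC
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)

# Training step: images are float tensors; targets hold boxes and labels
model.train()
images = [torch.rand(3, 512, 512)]
targets = [{"boxes": torch.tensor([[50., 60., 120., 130.]]),
            "labels": torch.tensor([1])}]
losses = model(images, targets)        # dict of detection losses
sum(losses.values()).backward()

# Inference: returns boxes, labels and confidence scores per image
model.eval()
with torch.no_grad():
    detections = model([torch.rand(3, 512, 512)])[0]
print(detections["boxes"].shape, detections["scores"][:5])
```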
Little is known about the neural correlates of dissociative amnesia, a transdiagnostic symptom mostly present in the dissociative disorders and core characteristic of dissociative identity disorder (DID). Given the vital role of the hippocampus in memory, a prime candidate for investigation is whether total and/or subfield hippocampal volume can serve as biological markers of dissociative amnesia.
A total of 75 women, 32 with DID and 43 matched healthy controls (HC), underwent structural magnetic resonance imaging (MRI). Using FreeSurfer (version 6.0), volumes were extracted for the bilateral global hippocampus, cornu ammonis (CA) 1–4, the granule cell molecular layer of the dentate gyrus (GC-ML-DG), fimbria, hippocampal–amygdaloid transition area (HATA), parasubiculum, presubiculum and subiculum. Analyses of covariance tested for volumetric differences between DID and HC. Partial correlations examined relationships of the three dissociative experience scale factors (dissociative amnesia, absorption, depersonalisation/derealisation) and traumatisation measures with global and subfield hippocampal volumes.
Hippocampal volumes were found to be smaller in DID as compared with HC in bilateral global hippocampus and bilateral CA1, right CA4, right GC-ML-DG, and left presubiculum. Dissociative amnesia was the only dissociative symptom that correlated uniquely and significantly with reduced bilateral hippocampal CA1 subfield volumes. Regarding traumatisation, only emotional neglect correlated negatively with bilateral global hippocampus, bilateral CA1, CA4 and GC-ML-DG, and right CA3.
We propose decreased CA1 volume as a biomarker for dissociative amnesia. We also propose that traumatisation, specifically emotional neglect, is interlinked with dissociative amnesia in having a detrimental effect on hippocampal volume.
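The style of analysis described above can be illustrated with a short sketch: an ANCOVA-type group comparison of a subfield volume, and a partial correlation computed by residualising out a covariate. All data and column names below are synthetic assumptions, not the study's.

```python
# Hedged sketch: ANCOVA-style group comparison and partial correlation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 75
df = pd.DataFrame({
    "group": np.r_[np.ones(32), np.zeros(43)],   # 1 = DID, 0 = HC
    "icv": rng.normal(1500, 120, n),             # intracranial volume covariate
    "amnesia": rng.normal(0, 1, n),              # dissociative amnesia score
})
df["ca1_vol"] = 620 - 25 * df["group"] - 5 * df["amnesia"] + rng.normal(0, 20, n)

# ANCOVA: group effect on CA1 volume, adjusting for head size
print(smf.ols("ca1_vol ~ group + icv", data=df).fit().summary())

# Partial correlation of amnesia with CA1 volume, controlling for ICV:
# correlate the residuals of each variable after regressing out ICV
r1 = smf.ols("ca1_vol ~ icv", data=df).fit().resid
r2 = smf.ols("amnesia ~ icv", data=df).fit().resid
print("partial r =", np.corrcoef(r1, r2)[0, 1])
```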
Obsessive–compulsive disorder (OCD) is often challenging to treat and resistant to psychological interventions and prescribed medications. The adjunctive use of nutraceuticals with potential neuromodulatory effects on underpinning pathways such as the glutamatergic and serotonergic systems is one novel approach.
To assess the effectiveness and safety of a purpose-formulated combination of nutraceuticals in treating OCD: N-acetyl cysteine, L-theanine, zinc, magnesium, pyridoxal-5′ phosphate, and selenium.
A 20-week open-label proof-of-concept study was undertaken involving 28 participants with treatment-resistant DSM-5-diagnosed OCD, during 2017 to 2020. The primary outcome measure was the Yale–Brown Obsessive–Compulsive Scale (YBOCS), administered every 4 weeks.
An intention-to-treat analysis revealed an estimated mean reduction across time (baseline to week-20) on the YBOCS total score of −7.13 (95% confidence interval = −9.24, −5.01), with a mean reduction of −1.21 points per post-baseline visit (P ≤ .001). At 20-weeks, 23% of the participants were considered “responders” (YBOCS ≥35% reduction and “very much” or “much improved” on the Clinical Global Impression-Improvement scale). Statistically significant improvements were also revealed on all secondary outcomes (eg, mood, anxiety, and quality of life). Notably, treatment response on OCD outcome scales (eg, YBOCS) was greatest in those with lower baseline symptom levels, while response was limited in those with relatively more severe OCD.
While this pilot study lacked a placebo control, the significant time effect in this treatment-resistant OCD population is encouraging and suggests potential utility, especially for those with lower symptom levels. Our findings need to be confirmed or refuted via a follow-up placebo-controlled study.
Comparative transcriptomics can be used to translate an understanding of gene regulatory networks from model systems to less studied species. Here, we use RNA-Seq to determine and compare gene expression dynamics through the floral transition in the model species Arabidopsis thaliana and the closely related crop Brassica rapa. We find that different curve registration functions are required for different genes, indicating that there is no single common ‘developmental time’ between Arabidopsis and B. rapa. A detailed comparison between Arabidopsis and B. rapa and between two B. rapa accessions reveals different modes of regulation of the key floral integrator SOC1, and that the floral transition in the B. rapa accessions is triggered by different pathways. Our study adds to the mechanistic understanding of the regulatory network of flowering time in rapid cycling B. rapa and highlights the importance of registration methods for the comparison of developmental gene expression data.
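As a minimal illustration of curve registration, the sketch below estimates a linear time transformation (stretch and shift) that best aligns one synthetic expression time course to another; real analyses fit gene-specific registration functions, which is the point the study makes about the absence of a single common 'developmental time'.

```python
# Illustrative curve registration: find t -> a*t + b aligning a synthetic
# B. rapa-style curve to an Arabidopsis-style reference. Data are made up.
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0, 10, 40)
arabidopsis = np.exp(-((t - 4.0) ** 2) / 2.0)   # reference expression curve
brapa = np.exp(-((t - 6.0) ** 2) / 4.5)         # stretched, shifted counterpart

def loss(params):
    """Squared error after sampling the reference at warped times a*t + b."""
    a, b = params
    warped = np.interp(a * t + b, t, arabidopsis, left=np.nan, right=np.nan)
    ok = ~np.isnan(warped)
    if ok.sum() < 10:           # penalise warps that leave the observed window
        return 1e6
    return float(np.mean((warped[ok] - brapa[ok]) ** 2))

fit = minimize(loss, x0=[1.0, 0.0], method="Nelder-Mead")
print("stretch a, shift b:", fit.x)   # a different (a, b) per gene implies
                                      # no single common 'developmental time'
```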
Patients with pelvic pain are often told that their pain is “all in their head.” In many years of seeing patients for pelvic pain, I have never seen one who did not have an organic reason for pain. Patients with chronic pain, especially pelvic pain, may develop secondary depression and anxiety, but neither of these conditions alone is responsible for their pain. Patients who are unable to have intercourse because of pain fear loss of the partner and become especially anxious. Additionally, because of the very personal nature of their pain, they are often not able to discuss their condition with friends or family members. It is very important to believe that the patient’s pain is real and not to voice any doubts, especially in the presence of a partner. Treatment of coexisting psychological disorders such as anxiety or depression is very important in patients with pelvic pain.
The COVID-19 pandemic prompted the development and implementation of hundreds of clinical trials across the USA. The Trial Innovation Network (TIN), funded by the National Center for Advancing Translational Sciences, was an established clinical research network that pivoted to respond to the pandemic.
The TIN’s three Trial Innovation Centers, its Recruitment Innovation Center, and 66 Clinical and Translational Science Award Hub institutions collaborated to adapt to the pandemic’s rapidly changing landscape, playing central roles in the planning and execution of pivotal studies addressing COVID-19. Our objective was to summarize the results of these collaborations and the lessons learned.
The TIN provided 29 COVID-related consults between March 2020 and December 2020, including 6 trial participation expressions of interest and 8 community engagement studios from the Recruitment Innovation Center. Key lessons learned from these experiences include the benefits of leveraging an established infrastructure, innovations surrounding remote research activities, data harmonization and central safety reviews, and early community engagement and involvement.
Our experience highlighted the benefits and challenges of a multi-institutional approach to clinical research during a pandemic.
The distribution of genetic diversity in invasive plant populations can have important management implications. Alligatorweed [Alternanthera philoxeroides (Mart.) Griseb.] was introduced into the United States around 1900 and has since spread throughout much of the southern United States and California. A successful biological control program was initiated in the late 1960s that reduced A. philoxeroides in the southern United States, although control has varied geographically. The degree to which variation among genotypes may be responsible for variation in control efficacy has not been well studied due to a lack of genetic data. We sampled 373 plants from 90 sites across the United States and genotyped all samples at three chloroplast regions to help inform future management efforts. Consistent with clonal spread, there was high differentiation between sites, yet we found six haplotypes and high haplotype diversity (mean h = 0.48) across states, suggesting that this plant has been introduced multiple times. Two of the haplotypes correspond to previously described biotypes that differ in their susceptibility to herbicides and herbivory. The geographic distribution of the three common haplotypes varied by latitude and longitude, while the other haplotypes were widespread or localized to one or a few sites. All the haplotypes we screened were hexaploid (2n = 6x = 102), which may enhance biological control. Future studies can use these genetic data to determine whether genotypes differ in their invasiveness or respond differently to control measures. Some states, for instance, have mainly a single haplotype that may respond more uniformly to a single control strategy, whereas other states may require a variety of control strategies. These data will also provide the basis for identifying the source regions in South America, which may lead to the discovery of new biological control agents more closely matched to particular genotypes.
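The reported mean haplotype diversity (h = 0.48) is Nei's gene diversity applied to haplotype frequencies; a worked example with made-up haplotype counts is sketched below.

```python
# Worked example of Nei's haplotype diversity; the counts are made up.
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype (gene) diversity: h = n/(n-1) * (1 - sum(p_i^2))."""
    n = len(haplotypes)
    freqs = [count / n for count in Counter(haplotypes).values()]
    return n / (n - 1) * (1 - sum(p * p for p in freqs))

sample = ["H1"] * 6 + ["H2"] * 3 + ["H3"]     # 10 plants, 3 haplotypes
print(round(haplotype_diversity(sample), 3))  # 0.6
```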
Dietary protein is a prerequisite for the maintenance of skeletal muscle mass, stimulating increases in muscle protein synthesis (MPS), via essential amino acids (EAA), and attenuating muscle protein breakdown, via insulin. Muscles are receptive to the anabolic effects of dietary protein, and in particular the EAA leucine, for only a short period (i.e. about 2–3 h) in the rested state. Thereafter, MPS exhibits tachyphylaxis despite continued EAA availability and sustained mechanistic target of rapamycin complex 1 signalling. Other notable characteristics of this ‘muscle full’ phenomenon include: (i) it cannot be overcome by proximal intake of additional nutrient signals/substrates regulating MPS, meaning a refractory period exists before the next stimulation is possible; (ii) it is refractory to pharmacological/nutraceutical enhancement of muscle blood flow and thus is not induced by muscle hypo-perfusion; (iii) it manifests independently of whether protein intake occurs in a bolus or intermittent feeding pattern; and (iv) it does not appear to be dependent on protein dose per se. Instead, the main factor associated with altering the muscle-full set-point is physical activity. For instance, when coupled with protein intake, resistance exercise delays the muscle-full set-point, permitting additional use of available EAA for MPS to promote muscle remodelling/growth. In contrast, ageing is associated with blunted MPS responses to protein/exercise (anabolic resistance), while physical inactivity (e.g. immobilisation) induces a premature muscle-full state, promoting muscle atrophy. It is crucial that, in catabolic scenarios, anabolic strategies are sought to mitigate muscle decline. This review highlights regulatory interactions of protein turnover with dietary protein, exercise, ageing and physical inactivity.