Many older adults experience memory changes that can have a meaningful impact on their everyday lives, such as restrictions to lifestyle activities and negative emotions. Older adults also report a variety of positive coping responses that help them manage these changes. The purpose of this study was to determine how objective cognitive performance and self-reported memory are related to the everyday impact of memory change.
Methods:
We examined these associations in a sample of 94 older adults (age 60–89, 52% female) along a cognitive ability continuum from normal cognition to mild cognitive impairment.
Results:
Correlational analyses revealed that greater restrictions to lifestyle activities (|rs| = .36–.66), more negative emotion associated with memory change (|rs| = .27–.76), and an overall greater burden of memory change on everyday living (|rs| = .28–.61) were associated with poorer objective memory performance and lower self-reported memory ability and satisfaction. Performance on objective measures of executive attention was unrelated to the impact of memory change. Self-reported strategy use was positively related to positive coping with memory change (|r| = .26) but was also associated with more negative emotions regarding memory change (|r| = .23).
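For readers wanting to reproduce this kind of analysis, below is a minimal sketch of a single bivariate correlation between an objective memory score and an everyday-impact rating. The arrays, variable names and effect size are synthetic stand-ins, not the study data.

```python
# Illustrative Pearson correlation; synthetic data only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
memory_score = rng.normal(50, 10, 94)                                 # objective memory performance
lifestyle_restriction = 80 - 0.5 * memory_score + rng.normal(0, 8, 94)  # impact rating

r, p = pearsonr(memory_score, lifestyle_restriction)
print(f"r = {r:.2f}, p = {p:.3f}")   # expect a negative r: poorer memory, more restriction
```

In practice each impact subscale would replace the synthetic arrays, and the sign of r depends on how each scale is oriented.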
Conclusions:
Given the prevalence of memory complaints among older adults, it is important to understand the experience of memory change and its impact on everyday functioning in order to develop services that target the specific needs of this population.
Antibiotics are among the most common medications prescribed in nursing homes. The annual prevalence of antibiotic use in residents of nursing homes ranges from 47% to 79%, and more than half of antibiotic courses initiated in nursing-home settings are unnecessary or prescribed inappropriately (wrong drug, dose, or duration). Inappropriate antibiotic use is associated with a variety of negative consequences including Clostridioides difficile infection (CDI), adverse drug effects, drug–drug interactions, and antimicrobial resistance. In response to this problem, public health authorities have called for efforts to improve the quality of antibiotic prescribing in nursing homes.
In this study, we aimed to capture perspectives of healthcare workers (HCWs) on coronavirus disease 2019 (COVID-19) and infection prevention and control (IPAC) measures implemented during the early phase of the COVID-19 pandemic.
Design:
A cross-sectional survey of HCWs.
Participants:
HCWs from the Hospital for Sick Children, Toronto, Canada.
Intervention:
A self-administered survey was distributed to HCWs. We analyzed factors influencing HCW knowledge and self-reported use of personal protective equipment (PPE), concerns about contracting COVID-19 and acceptance of the recommended IPAC precautions for COVID-19.
Results:
In total, 175 HCWs completed the survey between March 6 and March 10: 35 staff physicians (20%), 24 residents or fellows (14%), 72 nurses (41%), 14 respiratory therapists (8%), 14 administration staff (8%), and 14 other employees (8%). Most of the respondents were from the emergency department (n = 58, 33%) and the intensive care unit (n = 58, 33%). Only 86 respondents (50%) identified the correct donning order and only 60 (35%) identified the correct doffing order, but the majority (n = 113, 70%) indicated the need to wash their hands immediately prior to removal of their mask and eye protection. Also, 91 (54%) respondents felt comfortable with recommendations for droplet and/or contact precautions for routine care of patients with COVID-19. HCW occupation and concerns about contracting COVID-19 outside work were associated with nonacceptance of the recommendations (P = .016 and P = .036, respectively).
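The abstract does not name the test behind the occupation comparison; a chi-square test of independence on an occupation-by-acceptance table is one standard choice for this kind of categorical association. In the sketch below the row totals follow the abstract (35, 24, 72, 14 and 28 respondents; 91 accepting overall), but the split of each occupation into accepted/not accepted is invented for illustration.

```python
# Illustrative chi-square test of occupation vs. acceptance of precautions.
# Within-row splits are invented; only the marginals mirror the abstract.
import numpy as np
from scipy.stats import chi2_contingency

# rows: physicians, residents/fellows, nurses, respiratory therapists,
#       admin + other staff; columns: [accepted, did not accept]
table = np.array([
    [25, 10],
    [14, 10],
    [35, 37],
    [ 8,  6],
    [ 9, 19],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```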
Conclusion:
As part of their pandemic response plans, healthcare institutions should have ongoing training for HCWs that focus on appropriate PPE doffing and discussions around modes of transmission of COVID-19.
In this paper, we investigate the impingement of a two-dimensional (2-D) vortex pair translating downwards onto a horizontal wall with a wavy surface. A principal purpose is to compare the vortex dynamics with the complementary case of a wavy vortex pair (deformed by the long-wavelength Crow instability) impinging onto a flat surface. The simpler case of a 2-D vortex pair descending onto a flat horizontal ground plane leads to the well known ‘rebound’ effect, wherein the primary vortex pair approaches the wall but subsequently advects vertically upwards, due to the induced velocity of secondary vorticity. In contrast, a wavy vortex pair descending onto a flat plane leads to ‘rebounding’ vorticity in the form of vortex rings. A descending 2-D vortex pair, impinging on a wavy wall, also generates ‘rebounding’ vortex rings. In this case, we observe that the vortex pair interacts first with the ‘hills’ of the wavy wall before the ‘valleys’. The resulting secondary vorticity rolls up into a concentrated vortex tube, ultimately forming a vortex loop along each valley. Each vortex loop pinches off to form a vortex ring, which advects upwards. Surprisingly, these rebounding vortex rings evolve without the strong axial flows fundamental to the wavy vortex case. The present research is relevant to wing tip trailing vortices interacting with a non-uniform ground plane. A non-flat wall is shown to accelerate the decay of the primary vortex pair. Such a passive, ground-based method to diminish the wake vortex hazard close to the ground is consistent with Stephan et al. (J. Aircraft, vol. 50 (4), 2013a, pp. 1250–1260; CEAS Aeronaut. J., vol. 5 (2), 2013b, pp. 109–125).
To evaluate the long-term safety and efficacy of adjunctive aripiprazole (ARI) to lithium (LI) or valproate (VAL) in delaying time to relapse in bipolar I disorder.
Methods
Bipolar I disorder subjects with a current manic or mixed episode received LI or VAL for at least 2 weeks; inadequate responders (YMRS score ≥ 16 and ≤35% decrease from baseline at 2 weeks) received adjunctive ARI. Subjects maintaining mood stability (YMRS and MADRS ≤ 12 for 12 consecutive weeks) were randomised 1:1 to double-blind ARI (10 to 30 mg/day) or placebo (PBO) plus LI or VAL. Relapse was monitored up to 52 weeks.
Results
In total, 337 subjects were randomised to continuation of mood stabiliser plus adjunctive ARI or PBO; 61.3% and 52.7%, respectively, completed the study. Adjunctive ARI significantly delayed the time to any relapse (hazard ratio = 0.544, 95% CI: 0.33–0.89; log-rank p = 0.014). Overall relapse rates at 52 weeks were 14.9% and 25.4% in ARI vs PBO subjects. A superior reduction in CGI-BP Mania Severity of Illness from baseline at 52 weeks was also observed (0.3 vs. 0.0, respectively, p = 0.01). Adverse events were generally consistent with the known drug and illness profiles, with no significant difference in mean change in body weight between adjunctive PBO (0.60 kg) and adjunctive ARI (1.07 kg) (p = 0.49 at week 52, LOCF).
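As a rough illustration of how a time-to-relapse comparison of this kind is typically computed, the sketch below runs a log-rank test and fits a Cox model with the lifelines package. The data frame is simulated and the column names are assumptions; it is not the trial dataset and will not reproduce the reported estimates.

```python
# Illustrative survival analysis; synthetic data, not the trial dataset.
# Requires the `lifelines` package (pip install lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
n = 337  # total randomised, split roughly 1:1 as in the study design

df = pd.DataFrame({
    "arm": rng.integers(0, 2, n),             # 1 = adjunctive ARI, 0 = PBO (assumed coding)
    "weeks": rng.uniform(1, 52, n).round(1),   # follow-up time, censored at 52 weeks
})
df["relapse"] = rng.random(n) < np.where(df["arm"] == 1, 0.15, 0.25)  # event indicator

# Log-rank test comparing the two arms.
ari, pbo = df[df.arm == 1], df[df.arm == 0]
lr = logrank_test(ari.weeks, pbo.weeks,
                  event_observed_A=ari.relapse, event_observed_B=pbo.relapse)
print("log-rank p =", lr.p_value)

# Cox model: exp(coef) for `arm` is the hazard ratio with its 95% CI.
cph = CoxPHFitter().fit(df, duration_col="weeks", event_col="relapse")
cph.print_summary()
```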
Conclusion
Continued aripiprazole treatment increased time to relapse for any mood episode compared with placebo plus LI/VAL over 1 year, indicating a long-term benefit of continuing adjunctive aripiprazole alongside a mood stabiliser after sustained remission is achieved.
Major depression is a significant problem for people with a traumatic brain injury (TBI) and its treatment remains difficult. A promising treatment for depression is mindfulness-based cognitive therapy (MBCT), a relatively new approach rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Methods:
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included: BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale). They were completed at baseline and post-intervention.
Results:
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean score on the BDI-II decreased from 25.2 (SD 9.8) at baseline to 18.2 (SD 11.7) post-intervention (p = .001). Using a PHQ-9 threshold of 10, the proportion of participants with a diagnosis of major depression was reduced by 59% at follow-up (p = .012).
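The abstract does not state which test produced each p value; a paired comparison of baseline and post-intervention scores (for example, a paired t-test) is a common choice for this pre/post design. The sketch below uses placeholder scores for 20 participants, not the study data.

```python
# Illustrative paired comparison of baseline vs. post-intervention scores.
# The arrays are placeholders, not the study data.
import numpy as np
from scipy.stats import ttest_rel

baseline = np.array([25, 31, 18, 27, 22, 35, 29, 20, 26, 24,
                     33, 19, 28, 23, 30, 21, 34, 25, 27, 22])  # e.g. BDI-II at entry
post     = np.array([18, 25, 15, 20, 16, 30, 22, 14, 19, 17,
                     28, 12, 21, 18, 24, 15, 29, 20, 22, 16])  # after the 8-week MBCT

t, p = ttest_rel(baseline, post)
print(f"mean change = {np.mean(baseline - post):.1f} points, t = {t:.2f}, p = {p:.4f}")
```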
Conclusions:
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Attentional bias is an important psychological mechanism that has been extensively explored within the anxiety literature and more recently in chronic pain. Cognitive behavioural models of chronic fatigue syndrome (CFS) and chronic pain suggest an overlap in the mechanisms of these two conditions. The current study investigated attentional bias towards health-threat stimuli in individuals with CFS compared to healthy controls. The study also examined whether individuals with CFS have impaired executive attention, and how this relates to attentional bias.
Methods:
Two participant groups, CFS (n = 27) and healthy control (n = 35), completed a Visual Probe Task measuring attentional bias towards health-threat stimuli (words and pictures) presented at 500 ms and 1250 ms, and an Attention Network Test measuring alerting, orienting and executive attention. Participants also completed a series of standard self-report measures.
Results:
When compared to the control group, the CFS group showed greater attentional bias towards threat-words, but not pictures, regardless of stimulus duration. This was not related to anxiety or depression. The CFS group was also significantly impaired on executive attention compared to the controls. Post-hoc analyses indicated that only CFS individuals with poor executive attention showed a threat-word bias when compared to controls and CFS individuals with good executive attention.
Conclusions:
The findings from this study suggest that CFS individuals show enhanced attentional biases for health-threat stimuli, which may contribute to the perpetuation of the condition. Moreover, the attentional biases in CFS are dependent on an individual's capacity to voluntarily control their attention.
Cardiovascular risk prediction tools are important for cardiovascular disease (CVD) prevention; however, it is unclear which algorithms are appropriate for people with severe mental illness (SMI).
Objectives/aims
To determine, using the net monetary benefit (NMB) approach, the cost-effectiveness of two bespoke SMI-specific risk algorithms compared with standard risk algorithms for primary CVD prevention in those with SMI, from an NHS perspective.
Methods
A microsimulation model was populated with 1000 individuals with SMI from The Health Improvement Network Database, aged 30–74 years without CVD. Four cardiovascular risk algorithms were assessed: (1) general population lipid, (2) general population BMI, (3) SMI-specific lipid and (4) SMI-specific BMI, compared against no algorithm. At baseline, each cardiovascular risk algorithm was applied; those at high risk (>10%) were assumed to be prescribed statin therapy, while others received usual care. Individuals entered the model in a ‘healthy’ (CVD-free) state and, in each year, could remain in their current health state, have a cardiovascular event (non-fatal or fatal) or die from other causes, according to transition probabilities.
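To make the model structure concrete, below is a toy state-transition microsimulation and net monetary benefit (NMB) calculation in the spirit of the design described above. All states, transition probabilities, costs, utilities and the willingness-to-pay threshold are placeholders rather than the study inputs, and a real model would also apply discounting and the treatment effect of statins.

```python
# Toy state-transition microsimulation with an NMB calculation.
# All inputs below are illustrative placeholders, not the study values.
import numpy as np

rng = np.random.default_rng(1)
N_PATIENTS, N_YEARS, WTP = 1000, 10, 20_000  # WTP in GBP per QALY (assumed)

# Annual transition probabilities from each state (rows sum to 1).
P = {
    "healthy":        {"healthy": 0.95, "post_cvd_event": 0.03, "dead": 0.02},
    "post_cvd_event": {"healthy": 0.00, "post_cvd_event": 0.90, "dead": 0.10},
    "dead":           {"healthy": 0.00, "post_cvd_event": 0.00, "dead": 1.00},
}
ANNUAL_COST = {"healthy": 100.0, "post_cvd_event": 2_500.0, "dead": 0.0}
UTILITY     = {"healthy": 0.85,  "post_cvd_event": 0.65,    "dead": 0.0}

def simulate():
    """Run one cohort and return (total cost, total QALYs)."""
    cost = qaly = 0.0
    for _ in range(N_PATIENTS):
        state = "healthy"
        for _ in range(N_YEARS):
            cost += ANNUAL_COST[state]
            qaly += UTILITY[state]
            probs = P[state]
            state = rng.choice(list(probs), p=list(probs.values()))
    return cost, qaly

cost, qalys = simulate()
nmb = WTP * qalys - cost   # net monetary benefit: value QALYs at the threshold, subtract cost
print(f"cost = £{cost:,.0f}, QALYs = {qalys:,.1f}, NMB = £{nmb:,.0f}")
```

Comparing strategies would mean running a cohort under each algorithm (with its own treated subgroup) and taking the difference in NMB, i.e. the incremental NMB.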
Results
The SMI-specific BMI and general population lipid algorithms had the highest NMB of the four algorithms, resulting in 12 additional QALYs and a cost saving of approximately £37,000 (US$58,000) per 1000 patients with SMI over 10 years.
Conclusions
The general population lipid and SMI-specific BMI algorithms performed equally well. The ease and acceptability of use of a SMI-specific BMI algorithm (blood tests not required) makes it an attractive algorithm to implement in clinical settings.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Nutritional ketosis, induced via either the classical ketogenic diet or the use of emulsified medium-chain triglycerides, is an established treatment for pharmacoresistant epilepsy in children and more recently in adults. In addition, the use of oral ketogenic compounds, fractionated coconut oil, very low carbohydrate intake, or ketone monoester supplementation has been reported to be potentially helpful in mild cognitive impairment, Parkinson’s disease, schizophrenia, bipolar disorder, and autistic spectrum disorder. In these and other neurodegenerative and neuroprogressive disorders, there are detrimental effects of oxidative stress, mitochondrial dysfunction, and neuroinflammation on neuronal function. However, they also adversely impact on neurone–glia interactions, disrupting the role of microglia and astrocytes in central nervous system (CNS) homeostasis. Astrocytes are the main site of CNS fatty acid oxidation; the resulting ketone bodies constitute an important source of oxidative fuel for neurones in an environment of glucose restriction. Importantly, the lactate shuttle between astrocytes and neurones is dependent on glycogenolysis and glycolysis, resulting from the fact that the astrocytic filopodia responsible for lactate release are too narrow to accommodate mitochondria. The entry into the CNS of ketone bodies and fatty acids, as a result of nutritional ketosis, has effects on the astrocytic glutamate–glutamine cycle, glutamate synthase activity, and on the function of vesicular glutamate transporters, EAAT, Na+, K+-ATPase, Kir4.1, aquaporin-4, Cx34 and KATP channels, as well as on astrogliosis. These mechanisms are detailed and it is suggested that they would tend to mitigate the changes seen in many neurodegenerative and neuroprogressive disorders. Hence, it is hypothesized that nutritional ketosis may have therapeutic applications in such disorders.
Nearly half of care home residents with advanced dementia have clinically significant agitation. Little is known about costs associated with these symptoms toward the end of life. We calculated monetary costs associated with agitation from UK National Health Service, personal social services, and societal perspectives.
Design:
Prospective cohort study.
Setting:
Thirteen nursing homes in London and the southeast of England.
Participants:
Seventy-nine people with advanced dementia (Functional Assessment Staging Tool grade 6e and above) residing in nursing homes, and thirty-five of their informal carers.
Measurements:
Data were collected at study entry and monthly for up to 9 months, and extrapolated for expression per annum. Agitation was assessed using the Cohen-Mansfield Agitation Inventory (CMAI). Health and social care costs of residing in care homes, and costs of contacts with health and social care services, were calculated from national unit costs; for a societal perspective, costs of providing informal care were estimated using the Resource Utilization in Dementia (RUD)-Lite scale.
Results:
After adjustment, health and social care costs, and costs of providing informal care varied significantly by level of agitation as death approached, from £23,000 over a 1-year period with no agitation symptoms (CMAI agitation score 0–10) to £45,000 at the most severe level (CMAI agitation score >100). On average, agitation accounted for 30% of health and social care costs. Informal care costs were substantial, constituting 29% of total costs.
Conclusions:
With the increasing prevalence of dementia, costs of care will impact on healthcare and social services systems, as well as informal carers. Agitation is a key driver of these costs in people with advanced dementia, presenting complex challenges for symptom management, service planners, and providers.
Congenital renal and urinary tract anomalies are common, accounting for up to 21% of all congenital abnormalities [1]. The reported incidence is approximately 1:250–1:1000 pregnancies [2] and the routine use of prenatal ultrasonography allows relatively early detection, particularly for the obstructive uropathies, which account for the majority. According to the latest UK renal registry report in 2015, ‘obstructive uropathy’ was the second leading cause (19%) of chronic renal failure in children under 16 years of age after renal dysplasia +/− reflux [3]. The obstructions may occur within the upper or lower urinary tract, and their prognosis varies significantly, with obstructions at the level of the bladder neck being associated with the majority of neonatal mortality and renal failure. In untreated cases, perinatal mortality is high (up to 45%, often because of associated severe oligohydramnios and pulmonary hypoplasia) [4], and 30% of the survivors suffer from end-stage renal failure (ESRF) requiring dialysis and renal transplantation before the age of 5 [5]. The overall chance of survival in childhood is lowest if renal support therapy or transplantation is commenced before 2 years old when compared with starting at 12–16 years old (hazard ratio [HR] of 4.1, 95% confidence interval [CI] 1.7–9.9, P = 0.002) [3]. Therefore, in utero intervention, by the insertion of a vesicoamniotic shunt, or therapeutic treatment by fetal cystoscopy and valvular ablation, has been attempted to attenuate in utero progression of these pathologies (and their consequences) and to alter the natural history of congenital bladder neck obstruction in childhood. In this chapter, we discuss the etiology, pathophysiology, prenatal presentation and diagnosis of congenital bladder neck obstruction. Suggested algorithms for screening and the prenatal prognostic evaluation in selecting candidates for in utero therapy will be discussed.
Introduction: Transcutaneous cardiac pacing (TCP) is recommended for the treatment of symptomatic bradycardia, a life-threatening condition. Although TCP is taught in ACLS (advanced cardiac life support) courses, it is a difficult skill for junior residents to master. The main objective of this study was to measure the impact of having access to a checklist (CL) on successful TCP implementation. Our hypothesis was that the availability of a CL would improve the performance of junior residents in the management of symptomatic bradycardia by facilitating TCP. Methods: We conducted a prospective, randomized, single-site study. First-year residents entering postgraduate programs and taking a mandatory ACLS course were enrolled. Students had didactic sessions on the management of symptomatic bradycardia followed by hands-on teaching on a low-fidelity manikin (ALS® simulator, Laerdal) using a CL conceived for this project as a teaching tool. Study participants were then assessed with a simulation scenario requiring TCP. Participants were randomly assigned to groups with and without CL accessibility. Performances were graded on six critical tasks. The primary outcome was the successful use of TCP, defined as having completed all tasks. Participants then completed a post-test questionnaire. Sample size estimation was based on a previous project (Ranger et al., 2018). Accepting an alpha error of 0.05 and a power of 80%, 45 participants in each group would permit the detection of a 26.5% gain in performance. Results: Of 250 residents completing the ACLS course in 2017, 85 voluntary participants were randomized to a control group (no CL available during testing, n = 42) or an experimental group (CL available during testing, n = 43). Six participants in the experimental group adequately used TCP compared to five participants in the control group (p = 0.81, chi-squared test). Of the 43 participants who had access to the CL, only 2 (5%) used it. The reasons the CL was infrequently used were stated as follows: 24 participants (56%) mentioned not realizing it was available, 8 (19%) considered it of little to no utility and 5 (19%) forgot a CL existed. Conclusion: Availability of a checklist previously used during simulation teaching did not increase junior residents’ capacity to correctly apply TCP. Non-recognition of CL availability and decreased perceived need for it were the main reasons for marginal use. Our results suggest that there are many limiting factors to CL effectiveness.
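The sample-size reasoning quoted above (alpha 0.05, 80% power, 45 per group to detect a 26.5-point difference in proportions) can be approximated with statsmodels, as sketched below. The baseline success rate without the checklist is not reported in the abstract, so the value used here is an assumption, and the required n per group varies with it.

```python
# Approximate two-proportion sample-size calculation for the design above.
# p_control is an assumption for illustration, not a figure from the abstract.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

p_control = 0.15                  # assumed success rate without the checklist
p_checklist = p_control + 0.265   # the 26.5-point gain the study aimed to detect

h = proportion_effectsize(p_checklist, p_control)   # Cohen's h
n_per_group = NormalIndPower().solve_power(effect_size=h, alpha=0.05,
                                            power=0.80, alternative="two-sided")
print(f"Cohen's h = {h:.2f}, required n per group ≈ {n_per_group:.0f}")
```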
Introduction: Individualizing the risk of stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 +/− 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68-0.92]); High (probability 2.6%, iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4]. Conclusion: The Canadian TIA Score accurately identifies TIA patients’ risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
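For clarity on what an interval likelihood ratio (iLR) is: for each risk stratum, it is the proportion of patients with the outcome who fall in that stratum divided by the proportion without the outcome who fall in the same stratum. The sketch below computes iLRs from a hypothetical stratified table; the counts are invented for illustration and will not reproduce the study's figures.

```python
# Interval likelihood ratios for a three-level risk score.
# Counts are invented; they are not the study data.
counts = {
    "low":      (5, 3000),    # (outcome within 7 days, no outcome)
    "moderate": (40, 3500),
    "high":     (80, 1500),
}

total_pos = sum(pos for pos, _ in counts.values())
total_neg = sum(neg for _, neg in counts.values())

for stratum, (pos, neg) in counts.items():
    ilr = (pos / total_pos) / (neg / total_neg)   # P(stratum | outcome) / P(stratum | no outcome)
    risk = pos / (pos + neg)
    print(f"{stratum:>8}: observed risk = {risk:.1%}, iLR = {ilr:.2f}")
```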
Important ear problems can affect the outer ear, the middle ear and the inner ear. Globally, the greatest burden of disease is due to ear conditions that are associated with otorrhoea and hearing loss.
Methods
This study reviewed the literature on the prevention and treatment of common ear conditions that are most relevant to settings with high rates of ear disease and limited resources. The Grading of Recommendations Assessment, Development and Evaluation (‘GRADE’) approach was utilised to assess interventions.
Results
Accurate diagnosis of ear disease is challenging. Much of the preventable burden of ear disease is associated with otitis media. Nine otitis media interventions for which there is moderate to high certainty of effect were identified. While most interventions only provide modest benefit, the impact of treatment is more substantial in children with acute otitis media with perforation and chronic suppurative otitis media.
Conclusion
Disease prevention through good hygiene practices, breastfeeding, reducing smoke exposure, immunisation and limiting noise exposure is recommended. Children with acute otitis media with perforation, chronic suppurative otitis media, complications of otitis media, and significant hearing loss should be prioritised for medical treatment.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
The longstanding association between the major histocompatibility complex (MHC) locus and schizophrenia (SZ) risk has recently been partially accounted for by structural variation at the complement component 4 (C4) gene. This structural variation generates varying levels of C4 RNA expression, and genetic information from the MHC region can now be used to predict C4 RNA expression in the brain. Increased predicted C4A RNA expression is associated with the risk of SZ, and C4 is reported to influence synaptic pruning in animal models.
Methods
Based on our previous studies associating MHC SZ risk variants with poorer memory performance, we tested whether increased predicted C4A RNA expression was associated with reduced memory function in a large (n = 1238) dataset of psychosis cases and healthy participants, and with altered task-dependent cortical activation in a subset of these samples.
Results
We observed that increased predicted C4A RNA expression was associated with poorer performance on measures of memory recall (p = 0.016, corrected). Furthermore, in healthy participants, increased predicted C4A RNA expression was associated with a pattern of reduced cortical activity in the middle temporal cortex during a measure of visual processing (p < 0.05, corrected).
Conclusions
These data suggest that the effects of C4 on cognition were observable at both a cortical and behavioural level, and may represent one mechanism by which illness risk is mediated. As such, deficits in learning and memory may represent a therapeutic target for new molecular developments aimed at altering C4’s developmental role.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
To assess relationships between mothers’ feeding practices (food as a reward, food for emotion regulation, modelling of healthy eating) and mothers’ willingness to purchase child-marketed foods and fruits/vegetables (F&V) requested by their children during grocery co-shopping.
Design
Cross-sectional. Mothers completed an online survey that included questions about feeding practices and willingness (i.e. intentions) to purchase child-requested foods during grocery co-shopping. Feeding practices scores were dichotomized at the median. Foods were grouped as nutrient-poor or nutrient-dense (F&V) based on national nutrition guidelines. Regression models compared mothers with above-the-median v. at-or-below-the-median feeding practices scores on their willingness to purchase child-requested food groupings, adjusting for demographic covariates.
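Below is a minimal sketch of the regression framework described here: an ordinary least squares model comparing mothers above versus at-or-below the median on a feeding practice, adjusting for covariates. The column names, covariate set and synthetic data are assumptions for illustration only, not the study's model specification.

```python
# Illustrative OLS regression with a dichotomized feeding-practice predictor.
# Column names, covariates and data are assumptions; not the study analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 318
df = pd.DataFrame({
    # 1 = above-the-median use of food as a reward, 0 = at or below the median
    "reward_high": rng.integers(0, 2, n),
    "mother_age": rng.normal(35, 5, n).round(),
    "child_age": rng.integers(2, 8, n),
})
# Synthetic outcome: willingness to purchase nutrient-poor foods.
df["willing_poor"] = (2.5 + 0.6 * df.reward_high + rng.normal(0, 1, n)).round(2)

model = smf.ols("willing_poor ~ reward_high + mother_age + child_age", data=df).fit()
print(model.params["reward_high"], model.pvalues["reward_high"])
```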
Setting
Participants completed an online survey generated at a public university in the USA.
Subjects
Mothers (n 318) of 2- to 7-year-old children.
Results
Mothers who scored above-the-median on using food as a reward were more willing to purchase nutrient-poor foods (β=0·60, P<0·0001), mothers who scored above-the-median on use of food for emotion regulation were more willing to purchase nutrient-poor foods (β=0·29, P<0·0031) and mothers who scored above-the-median on modelling of healthy eating were more willing to purchase nutrient-dense foods (β=0·22, P<0·001) than were mothers with at-or-below-the-median scores, adjusting for demographic covariates.
Conclusions
Mothers who reported using food to control children’s behaviour were more willing to purchase child-requested, nutrient-poor foods. Parental feeding practices may facilitate or limit children’s food requests in grocery stores. Parent–child food consumer behaviours should be investigated as a route that may contribute to children’s eating patterns.