Poor diet quality (DQ) is associated with poor cognition and increased neurodegeneration, including Alzheimer’s disease (AD). We examined the role of DQ in cognitive functioning, by sex and increasing genetic risk for AD, in a sample of African American (AA) middle-aged adults. We analysed a sub-group of participants (about 55 % women; mean follow-up time of about 4·7 years) from the Healthy Aging in Neighborhoods of Diversity across the Life Span study with a genetic risk score for AD (hAlzScore). The Healthy Eating Index-2010, the Dietary Approaches to Stop Hypertension score and the mean adequacy ratio, computed at baseline (2004–2009) and follow-up visits (2009–2013), were used to assess initial DQ and change over time. Linear mixed-effects regression models were utilised, adjusting for select covariates, selection bias and multiple testing. DQ change (ΔDQ) was associated with California Verbal Learning Test-List A scores, overall (0·15 (se 0·06), P = 0·008) and in women (0·21 (se 0·08), P = 0·006) at the highest AD risk, indicating protective effects over time. Greater AD risk was longitudinally associated with poorer Clock Command Test scores in men. Poor DQ was positively and cross-sectionally associated with Trails B scores, in women only. Better-quality diet was associated with a slower decline in verbal memory among AA women at greater AD risk. Insufficient clinical evidence and/or mixed findings dictate that more studies are needed to investigate brain morphology and volume changes in relation to DQ, over time, in a population at risk for AD.
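The mean adequacy ratio used as one of the DQ measures above is conventionally the average of per-nutrient adequacy ratios truncated at 100 %. A minimal sketch of that formula, assuming illustrative nutrient names and recommended amounts rather than the study's actual inputs:

```python
def nutrient_adequacy_ratio(intake, recommended):
    # NAR: a nutrient's intake as a fraction of its recommended amount,
    # truncated at 1 so overconsumption does not inflate the score
    return min(intake / recommended, 1.0)

def mean_adequacy_ratio(intakes, recommendations):
    # MAR: the average truncated NAR across all nutrients, scaled to 0-100
    nars = [nutrient_adequacy_ratio(intakes[n], recommendations[n])
            for n in recommendations]
    return 100.0 * sum(nars) / len(nars)

# Hypothetical two-nutrient example: vitamin C meets its recommendation,
# calcium reaches half of it, so MAR = (1.0 + 0.5) / 2 * 100 = 75
intakes = {'vitamin_c_mg': 90, 'calcium_mg': 500}
recommendations = {'vitamin_c_mg': 75, 'calcium_mg': 1000}
print(mean_adequacy_ratio(intakes, recommendations))  # 75.0
```

The truncation is the point of the measure: a surplus of one nutrient cannot mask an inadequacy in another.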
Longitudinal studies of first episode of psychosis (FEP) patients are critical to understanding the dynamic clinical factors influencing functional outcomes; negative symptoms and verbal memory (VM) deficits are two such factors that remain a therapeutic challenge. This study uses white-gray matter contrast at the inner edge of the cortex, in addition to cortical thickness, to probe changes in microstructure and their relation with negative symptoms and possible intersections with verbal memory.
T1-weighted images and clinical data were collected longitudinally for patients (N = 88) over a two-year period. Cognitive data were also collected at baseline. Relationships between baseline VM (immediate/delayed recall) and rate of change in two negative symptom dimensions, amotivation and expressivity, were assessed at the behavioral level, as well as at the level of brain structure.
VM, particularly immediate recall, was significantly and positively associated with a steeper rate of expressivity symptom decline (r = 0.32, q = 0.012). Significant interaction effects between baseline delayed recall and change in expressivity were uncovered in somatomotor regions bilaterally for both white-gray matter contrast and cortical thickness. Furthermore, interaction effects between immediate recall and change in expressivity on cortical thickness rates were uncovered across higher-order regions of the language processing network.
This study shows common neural correlates in language-related brain areas underlying expressivity and VM in FEP, suggesting that deficits in these domains may be more closely linked to speech production than to general cognitive capacity. Together, white-gray matter contrast and cortical thickness may optimally inform clinical investigations aiming to capture peri-cortical microstructural changes.
The Comprehensive Assessment of Neurodegeneration and Dementia (COMPASS-ND) cohort study of the Canadian Consortium on Neurodegeneration in Aging (CCNA) is a national initiative to catalyze research on dementia, set up to support the research agendas of CCNA teams. This cross-country longitudinal cohort of 2310 deeply phenotyped subjects with various forms of dementia and mild memory loss or concerns, along with cognitively intact elderly subjects, will test hypotheses generated by these teams.
The authors drew on the COMPASS-ND protocol, the initial grant proposal for funding, the fifth semi-annual CCNA Progress Report (submitted to the Canadian Institutes of Health Research in December 2017), and other documents, supplemented by modifications made and lessons learned after implementation, to create the description of the study provided here.
The CCNA COMPASS-ND cohort includes participants from across Canada with various cognitive conditions associated with, or at risk of, neurodegenerative diseases. They will undergo a wide range of experimental, clinical, imaging, and genetic investigations to specifically address the causes, diagnosis, treatment, and prevention of these conditions in the aging population. Data derived from clinical and cognitive assessments, biospecimens, brain imaging, genetics, and brain donations will be used to test hypotheses generated by CCNA research teams and other Canadian researchers. The study is the most comprehensive and ambitious Canadian study of dementia. Initial data posting occurred in 2018, with the full cohort to be accrued by 2020.
Availability of data from the COMPASS-ND study will provide a major stimulus for dementia research in Canada in the coming years.
The Flat Rocks locality in the Wonthaggi Formation (Strzelecki Group) of the Gippsland Basin, southeastern Australia, hosts fossils of a late Barremian vertebrate fauna that inhabited the ancient rift between Australia and Antarctica. Known from its dentary, Qantassaurus intrepidus Rich and Vickers-Rich, 1999 has been the only dinosaur named from this locality. However, the plethora of vertebrate fossils collected from Flat Rocks suggests that further dinosaurs await discovery. From this locality, we name a new small-bodied ornithopod, Galleonosaurus dorisae n. gen. n. sp. from craniodental remains. Five ornithopodan genera are now named from Victoria. Galleonosaurus dorisae n. gen. n. sp. is known from five maxillae, from which the first description of jaw growth in an Australian dinosaur is provided. The holotype of Galleonosaurus dorisae n. gen. n. sp. is the most complete dinosaur maxilla known from Victoria. Micro-CT imagery of the holotype reveals the complex internal anatomy of the neurovascular tract and antorbital fossa. We confirm that Q. intrepidus is uniquely characterized by a deep foreshortened dentary. Two dentaries originally referred to Q. intrepidus are reassigned to Q. ?intrepidus and a further maxilla is referred to cf. Atlascopcosaurus loadsi Rich and Rich, 1989. A further ornithopod dentary morphotype is identified, more elongate than those of Q. intrepidus and Q. ?intrepidus and with three more tooth positions. This dentary might pertain to Galleonosaurus dorisae n. gen. n. sp. Phylogenetic analysis recovered Cretaceous Victorian and Argentinian nonstyracosternan ornithopods within the exclusively Gondwanan clade Elasmaria. However, the large-bodied taxon Muttaburrasaurus langdoni Bartholomai and Molnar, 1981 is hypothesised as a basal iguanodontian with closer affinities to dryomorphans than to rhabdodontids.
Serum uric acid (SUA), a causative agent for gout, is linked to dietary factors, perhaps differentially by race. Cross-sectional (SUAbase, i.e. baseline SUA) and longitudinal (SUArate, i.e. annual rate of change in SUA) associations of SUA with diet were evaluated across race and sex–race groups, in a large prospective cohort study of urban adults. Of 3720 African American (AA) and White urban adults participating in the Healthy Aging in Neighborhoods of Diversity across the Life Span study, longitudinal data (2004–2013; k = 1·7 repeats; mean follow-up 4·64 (sd 0·93) years) on n 2138 participants were used. The main outcome consisted of up to two repeated measures on SUA. Exposures included dietary factors such as ‘added sugar’, ‘alcoholic beverages’, ‘red meat’, ‘total fish’, ‘legumes’, ‘total dairy product’, ‘caffeine’, ‘vitamin C’ and a composite measure termed ‘dietary urate index’. Mixed-effects linear regression models were conducted, stratifying by race and by race×sex. A positive association between legume intake and SUArate was restricted to AA, whereas alcohol intake was positively associated with SUAbase overall without racial differences. Added sugars were directly related to SUAbase among White men (P<0·05 for race×sex interaction), whereas dairy product intake was linked with slower SUArate among AA women, unlike among White women. Nevertheless, dairy product intake was associated with a lower SUAbase among Whites. Finally, the dietary urate index was positively associated with both SUAbase and SUArate, particularly among AA. In sum, race and sex interactions with dietary intakes of added sugars, dairy products and legumes were detected in determining SUA. Similar studies are needed to replicate these findings.
The role of dairy foods and related nutrients in cardiometabolic health aetiology is poorly understood. We investigated longitudinal associations between the metabolic syndrome (MetS) and its components with key dairy product exposures. We used prospective data from a bi-racial cohort of urban adults (30–64 years at baseline (n 1371)), the Healthy Aging in Neighborhoods of Diversity across the Life Span (HANDLS), in Baltimore City, MD (2004–2013). The average of two 24-h dietary recalls measured 4–10 d apart was computed at baseline (V1) and follow-up (V2) waves. Annual rates of change (Δ) in dairy foods and key nutrients were estimated. Incident obesity, central obesity and the MetS were determined. Among key findings, in the overall urban adult population, both cheese and yogurt (V1 and Δ) were associated with an increased risk of central obesity (hazard ratio (HR) 1·13; 95 % CI 1·05, 1·23 per oz equivalent of cheese (V1); HR 1·21; 95 % CI 1·01, 1·44 per fl oz equivalent of yogurt (V1)). Baseline fluid milk intake (V1 in cup equivalents) was inversely related to the MetS (HR 0·86; 95 % CI 0·78, 0·94), specifically to dyslipidaemia–TAG (HR 0·89; 95 % CI 0·81, 0·99), although it was directly associated with dyslipidaemia–HDL-cholesterol (HR 1·10; 95 % CI 1·01, 1·21). Furthermore, ΔCa and ΔP were inversely related to dyslipidaemia–HDL and MetS incidence, respectively, whereas Δdairy product fat was positively associated with incident TAG–dyslipidaemia and HDL-cholesterol–dyslipidaemia and the MetS. A few of those associations were sex and race specific. In sum, various dairy product exposures had differential associations with metabolic disturbances. Future intervention studies should uncover how changes in dairy product components over time may affect metabolic disorders.
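Two conventions in this abstract are worth making concrete: the annual rate of change (Δ) of an exposure between visits, and hazard ratios reported per unit of intake, which multiply across units. A hedged sketch with invented intake values, not the study's data:

```python
def annual_rate_of_change(value_v1, value_v2, years_between):
    # Delta: change in an exposure between baseline (V1) and follow-up (V2),
    # expressed per year of follow-up
    return (value_v2 - value_v1) / years_between

# Invented participant: cheese intake falls from 2.0 to 1.5 oz equivalents
# over 4.6 years of follow-up
delta_cheese = annual_rate_of_change(2.0, 1.5, 4.6)  # about -0.109 oz/year

# Per-unit hazard ratios multiply across units: under HR 1.13 per oz
# equivalent, a 2 oz difference corresponds to 1.13 ** 2
hr_per_oz = 1.13
hr_for_2_oz = hr_per_oz ** 2  # about 1.28
```

So, assuming the usual log-linear hazard model, a two-ounce difference in cheese intake would correspond to roughly a 28 % higher hazard of central obesity.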
The factors associated with opioid poisoning death are poorly understood. We performed a retrospective autopsy study of decedents (a term used for people who are deceased) of opioid poisoning in Wales in 2015. Using anonymized linked data, we describe demographic characteristics, patterns of emergency service utilization, and clinical presentation prior to death.
Decedents of opioid poisoning in Wales in 2015 were identified from the Office of National Statistics (ONS) mortality dataset. Records were linked with the Emergency Department Dataset (EDDS) by the National Welsh Informatics Service (NWIS); and held in the Secure Anonymized Information Linkage (SAIL) databank. The data were accessed and analyzed in the SAIL gateway.
Age at death ranged from eighteen to seventy-eight years, with a mean of forty-two years (forty-one for men and forty-four and a half for women). Seventy-three percent of decedents were men (n = 228/312). Eighty-seven percent of decedents (n = 281/312) attended the emergency department in the three years prior to death. In total, 2081 attendances were made, forty-one percent of which involved conveyance by ambulance. Attendances per individual ranged from one to 114, with over half of decedents attending more than three times. Diagnostic codes were mostly missing or non-specific: only seven and a half percent of attendances (representing eighty-two decedents) were coded as drug-related. Treatment codes were also mostly missing or non-specific, with sixteen percent of attendances (representing 148 attendees) attributed a treatment code. Thirty-nine percent of attendances (n = 822) ended in treatment and discharge, whilst twenty-seven percent (n = 562) led to hospital admission.
Consistent with previously published data, we found that fatal opioid poisoning is preceded by a period of high emergency health service utilization. On average, decedents were in their fifth decade and more likely to be male than female. Attendance patterns varied widely, with men less likely to attend than women.
Serum uric acid (SUA), a causative agent for gout among other conditions, is affected by both genetic and dietary factors, perhaps differentially by sex. We evaluated cross-sectional (SUAbase) and longitudinal (SUArate) associations of SUA with a genetic risk score (GRS), diet and sex. We then tested the interactive effect of GRS, diet and sex on SUA. Longitudinal data on 766 African-American urban adults participating in the Healthy Aging in Neighborhoods of Diversity across the Life Span study were used. In all, three GRS for SUA were created from known SUA-associated SNP (GRSbase (n 12 SNP), GRSrate (n 3 SNP) and GRStotal (n 15 SNP)). Dietary factors included added sugar, total alcohol, red meat, total fish, legumes, dairy products, caffeine and vitamin C. Mixed-effects linear regression models were conducted. SUAbase was higher among men than among women, and increased with GRStotal tertiles. SUArate was positively associated with legume intake in women (γ=+0·14; 95 % CI +0·06, +0·22, P=0·001) and inversely related to dairy product intake in both sexes combined (γ=−0·042; 95 % CI −0·075, −0·009, P=0·010). SUAbase was directly linked to alcohol consumption among women (γ=+0·154; 95 % CI +0·046, +0·262, P=0·005). GRSrate was linearly related to SUArate only among men. Legume consumption was also positively associated with SUArate within the GRStotal’s lowest tertile. Among women, a synergistic interaction was observed between GRSrate and red meat intake in association with SUArate. Among men, a synergistic interaction between low vitamin C and genetic risk was found. In sum, sex–diet, sex–gene and gene–diet interactions were detected in determining SUA. Further similar studies are needed to replicate our findings.
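A genetic risk score such as GRStotal is conventionally a (weighted) sum of risk-allele counts across the selected SNP. A minimal sketch, with invented dosages and per-allele weights standing in for the study's actual SNP set and effect estimates:

```python
def genetic_risk_score(risk_allele_counts, weights):
    # GRS: sum of per-SNP risk-allele dosages (0, 1 or 2), each scaled by a
    # per-allele effect-size weight; setting all weights to 1 gives the
    # unweighted allele-count score
    return sum(d * w for d, w in zip(risk_allele_counts, weights))

# Hypothetical 3-SNP score; the weights stand in for per-allele effect
# estimates from prior association studies
dosages = [0, 1, 2]
weights = [0.10, 0.20, 0.30]
print(genetic_risk_score(dosages, weights))  # 0.8
```

Tertiles of such a score (as used for GRStotal here) are then just cut-points at the 33rd and 67th percentiles of the cohort's score distribution.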
Patients who experience a Transient Ischaemic Attack (TIA) should be assessed and treated in a specialist clinic to reduce the risk of further TIA or stroke, but referrals are often delayed. We aimed to identify published studies describing pathways for emergency assessment and referral of patients with suspected TIA at first medical contact: primary care, ambulance services, and the emergency department.
We conducted a scoping literature review. We searched four databases (PubMed, CINAHL, Web of Science, Scopus). We screened studies for eligibility. We extracted and analysed data to describe setting, assessment and referral processes reported in primary research on referral of suspected TIA patients directly to specialist outpatient services.
We identified eight studies reported in nine papers from five countries: one of the nine was a randomized trial, six were before-and-after designs, and two were descriptive accounts. Five pathways were used by family doctors and three by Emergency Department (ED) physicians; none were used by paramedics. Clinicians identified TIA patients using a checklist incorporating the ABCD2 tool to describe risk of further stroke, an online decision support tool, or clinical judgement. They referred patients to a specialist clinic, either directly or via a telephone helpline. Anti-platelet medication was often given, usually aspirin unless contraindicated. Some patients underwent neurological and blood tests before referral and discharge. Five studies reported a reduced incidence of stroke at 90 days, from a predicted rate of 6–10 percent to an actual rate of 1.2–2.1 percent. Between 44 percent and 83 percent of suspected TIA cases in these studies were directly referred to stroke clinics through the pathways.
Research literature has focused on assessment and referral by family doctors and ED physicians to reduce hospitalization of TIA patients. No pathways for paramedic use were reported. Since many suspected TIA patients present to ambulance services, effective pre-hospital assessment and referral pathways are needed. We will use review results to develop a paramedic referral pathway to test in a feasibility trial.
Adequate pain relief at the scene of injury and during transport to hospital is a major challenge in all acute traumas, especially for those with hip fractures, whose injuries are difficult to immobilize and whose long-term outcomes may be adversely affected by administration of opiate analgesics. Fascia Iliaca Compartment Block (FICB) is a procedure routinely undertaken by clinicians in emergency departments for hip fracture patients, but its use by paramedics at the scene of emergency calls has not yet been evaluated (1).
We undertook a randomized controlled feasibility trial using novel audited scratchcard randomization to allocate eligible patients to FICB or usual care. Paramedics are recruited and trained to assess patients for hip fracture and carry out FICB. We will follow up patients to assess accuracy of paramedic diagnosis, acceptability to patients and paramedics, compliance of paramedics and also measures of pain, side effects, time in hospital and quality of life in order to plan a full trial if appropriate. The primary outcome measure is health related quality of life, measured using Short Form (SF)-12 at 1 and 6 months. Interviews and focus groups will be used to understand acceptability of FICB to patients and paramedics. This study was funded by Health and Care Research Wales (1003).
We have developed:
• a paramedic pathway to assess patients for hip fracture and administer FICB
• a paramedic training package, delivered by a Consultant Anaesthetist
To date we have recruited nineteen paramedics; ten are fully trained and recruiting patients, and the remainder are being trained. Fifty-four patients have been randomized and thirty-five have consented to follow-up. Thirteen 1-month and five 6-month follow-up questionnaires have been received.
This study will enable us to recommend whether to undertake a definitive multi-centre randomized controlled trial of FICB by paramedics for hip fracture to determine if the procedure is effective for patients and worthwhile for the National Health Service.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
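The outcome here, emergency admissions per participant per year at risk, is a person-time rate. A minimal sketch with hypothetical aggregate counts (the trial's delta = .011 was model-adjusted; this crude difference only illustrates the scale of the quantity):

```python
def admission_rate(admissions, person_years):
    # Crude rate: emergency admissions per participant per year at risk
    return admissions / person_years

# Invented counts for the two phases of a stepped wedge design; each
# participant contributes their observed time at risk to the denominator
control_rate = admission_rate(18_000, 200_000)       # 0.09 per person-year
intervention_rate = admission_rate(20_200, 200_000)  # 0.101 per person-year
crude_difference = intervention_rate - control_rate  # about 0.011
```

In the trial itself this difference was estimated from a regression model adjusting for the stepped wedge's time and cluster structure, not from raw counts as above.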
Primary care activity increased in the intervention phase overall (delta = .011; 95 percent CI .007, .014), except for the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
There are multiple recent reports of an association between anxious/depressed (A/D) symptomatology and the rate of cerebral cortical thickness maturation in typically developing youths. We investigated the degree to which anxious/depressed symptoms are tied to age-related microstructural changes in cerebral fiber pathways. The participants were part of the NIH MRI Study of Normal Brain Development. Child Behavior Checklist A/D scores and diffusion imaging were available for 175 youths (84 males, 91 females; 241 MRI scans) at up to three visits. The participants ranged from 5.7 to 18.4 years of age at the time of the scan. Alignment of fractional anisotropy data was implemented using FSL/Tract-Based Spatial Statistics, and linear mixed model regression was carried out using SPSS. Child Behavior Checklist A/D was associated with the rate of microstructural development in several white matter pathways, including the bilateral anterior thalamic radiation, bilateral inferior longitudinal fasciculus, left superior longitudinal fasciculus, and right cingulum. Across these pathways, greater age-related fractional anisotropy increases were observed at lower levels of A/D. The results suggest that subclinical A/D symptoms are associated with the rate of microstructural development within several white matter pathways that have been implicated in affect regulation, as well as mood and anxiety psychopathology.
Poor diet quality contributes to morbidity, including poor brain health outcomes such as cognitive decline and dementia. African Americans and individuals living in poverty may be at greater risk for cognitive decrements from poor diet quality.
The study was set in Baltimore, MD, USA.
Participants were 2090 African Americans and Whites (57 % female, mean age=47·9 years) who completed two 24 h dietary recalls. We examined cognitive performance and potential interactions of diet quality with race and poverty status using baseline data from the Healthy Aging in Neighborhoods of Diversity across the Life Span (HANDLS) study. Healthy Eating Index-2010 (HEI-2010) scores were calculated and interpreted using federal guidelines. A neurocognitive test battery was administered to evaluate cognitive function over several domains.
Linear regression analyses showed that lower HEI-2010 scores were associated with poorer verbal learning and memory (P<0·05) after adjustment for covariates. Diet quality within the sample was poor. Significant interactions of HEI-2010 and poverty status (all P<0·05) indicated that higher diet quality was associated with higher performance on tests of attention and cognitive flexibility, visuospatial ability and perceptual speed among those below the poverty line. No significant race interactions emerged. Higher diet quality was associated with better performance on two measures of verbal learning and memory, irrespective of race and poverty status.
Findings suggest that diet quality and cognitive function are likely related at the population level. Future research is needed to determine whether the association is clinically significant.
Analysing dietary data to capture how individuals typically consume foods depends on the coding variables used. Individual foods consumed simultaneously, like coffee with milk, are given codes to identify these combinations. Our literature review revealed a lack of discussion about using combination codes in analysis. The present study identified foods consumed at mealtimes and by race when combination codes were or were not utilized.
Duplicate analysis methods were performed on separate data sets. The original data set consisted of all foods reported, with each food coded as if it were consumed individually. The revised data set was derived from the original by first separating foods consumed as individual items from those consumed simultaneously and assigning a code to designate a combination. Foods assigned a combination code, like pancakes with syrup, were aggregated and associated with a food group defined by the major food component (i.e. pancakes), and then appended to the individually coded foods.
Data were drawn from the Healthy Aging in Neighborhoods of Diversity across the Life Span study. Participants were African-American and White adults who completed two dietary recalls (n 2177).
Differences existed in the lists of foods most frequently consumed, by mealtime and race, when comparing results from the original and revised data sets. African Americans reported consumption of sausage/luncheon meat and poultry, whereas Whites reported ready-to-eat cereals and cakes/doughnuts/pastries.
Use of combination codes provided more accurate representation of how foods were consumed by populations. This information is beneficial when creating interventions and exploring diet–health relationships.