Introduction: The primary objective of this study was to determine the incidence of clinically significant traumatic intracranial haemorrhage (T-ICH) following minor head trauma in older adults. The secondary objective was to investigate the impact of anticoagulant and antiplatelet therapies on T-ICH incidence. Methods: This retrospective cohort study extracted data from electronic patient records. The cohort consisted of patients who presented to one of five EDs after a fall and/or head injury between 1 March 2010 and 31 July 2017. Inclusion criteria were age ≥ 65 years and minor head trauma, defined as an impact to the head that did not fulfil the criteria for traumatic brain injury. Results: Of the 1,000 electronic medical records evaluated, 311 cases were included. The mean age was 80.1 (SD 7.9) years. One hundred and eighty-nine patients (60.8%) were on an anticoagulant (n = 69), an antiplatelet (n = 130) or both (n = 16). Twenty patients (6.4%) developed a clinically significant T-ICH. Anticoagulant and/or antiplatelet therapy was not associated with an increased risk of clinically significant T-ICH in this cohort (odds ratio (OR) 2.7, 95% CI 0.9-8.3). Conclusion: In this cohort of older adults presenting to the ED following minor head trauma, the incidence of clinically significant T-ICH was 6.4%.
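An odds ratio and its Woolf (log-normal) confidence interval can be computed directly from a 2×2 table. The abstract does not report the exact split of the 20 T-ICH events between the 189 exposed and 122 unexposed patients, so the counts below are an assumed split chosen only to be consistent with the reported OR of 2.7 (95% CI 0.9-8.3); this is an illustrative sketch, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-normal) 95% CI from a 2x2 table:
    a/b = events/non-events among exposed, c/d = among unexposed."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical split: 16 events among 189 exposed, 4 among 122 unexposed
# (assumed for illustration; not taken from the paper).
or_, lo, hi = odds_ratio_ci(16, 189 - 16, 4, 122 - 4)
print(f"OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

Note how a point estimate near 2.7 can still be non-significant: the interval spans 1 because the unexposed event count is small.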
Innovation Concept: A major barrier to the development of a national simulation case repository and multi-site simulation research is the lack of a standardized national case template. This issue was recently identified as a priority research topic for Canadian simulation-based education (SBE) research in emergency medicine (EM). We partnered with the EM Simulation Education Researchers Collaborative (EM-SERC) to develop a national simulation template. Methods: The EM Sim Cases template was chosen as the starting point for the consensus process. We generated feedback on the template using a three-phase modified nominal group technique. Members of the EM-SERC mailing list were consulted, which included 20 EM simulation educators from every Canadian medical school except the Northern Ontario School of Medicine and Memorial University. When comments conflicted, the sentiment with more comments in favour was incorporated. Curriculum, Tool or Material: In phase one, we sought free-text feedback on the EM Sim Cases template via email. We received 65 comments from 11 respondents. An inductive thematic analysis identified four major themes (formatting, objectives, debriefing, and assessment tools). In phase two, we sought free-text feedback on the revised template via email. A second thematic analysis of 40 comments from 12 respondents identified three broad themes (formatting, objectives, and debriefing). In phase three, we sought feedback on the penultimate template via focus groups with simulation educators and technologists at multiple Canadian universities. This phase generated 98 specific comments, which were grouped according to the section of the template being discussed and used to develop the final template (posted on emsimcases.com). Conclusion: We describe a national consensus-building process which resulted in a simulation case template endorsed by simulation educators from across Canada. This template has the potential to: 1. 
Reduce the replication of effort across sites by facilitating the sharing of simulation cases. 2. Enable national collaboration on the development of both simulation cases and curricula. 3. Facilitate multicentre simulation-based research by removing confounders related to the local adoption of an unfamiliar case template; this could improve the rigour and validity of such studies by reducing inter-site variability. 4. Increase the validity of any simulation scenarios developed for use in national high-stakes assessment.
This study describes the social, demographic and clinical characteristics of all new referrals in a mental health catchment area, and aims to compare Irish and non-Irish service users in terms of their mental health needs and service utilization.
Case notes were reviewed retrospectively to investigate demographic, clinical and service utilization parameters among new referrals to the psychiatric services in Galway, Ireland over a six-month period.
One hundred and fifty-four new referrals, of whom 41 were non-Irish, presented over a six-month period. Results showed no difference between Irish and non-Irish patients in terms of sociodemographic variables. Alcohol problems and the subsequent need for detoxification and counseling were significantly increased among service users from the new EU accession states, with a significant impact on the duration of their hospital stay and the need for intensive psychiatric care.
There is an urgent need for enhanced resources for the delivery of mental healthcare to immigrants. Service utilization and mental health needs are not explained merely by illness-related aspects in immigrant service users. Social and cultural factors have to be recognized in order to prevent disadvantages in psychiatric care.
The application of recovery principles within everyday mental health services is understudied.
Objectives and aims
We studied the implementation of a programme of intensive case management (ICM) emphasizing recovery principles in an Irish community mental health service.
Eighty service attenders with severe and enduring illness were randomized into groups
(1) receiving a programme of ICM,
(2) receiving treatment as usual (TAU).
Groups were compared before/after the programme for general psychopathology using the Brief Psychiatric Rating Scale [BPRS] (clinician rated) and the How are You? Scale (self-rated). The Functional Analysis of Care Environments [FACE] scale provided assessment of functional domains.
The overall group [mean age 44.5 ± 13.2 years; 60% male] had mean total Health of the Nation Outcome Scale [HoNOS] scale scores 10.5 ± 4.6 with impaired social functioning especially prominent (mean social subscale score 5.0 ± 2.7). The ICM group were younger (p < 0.01) with higher baseline scores on the HoNOS social subscale and BPRS (p < 0.05). An analysis of covariance, controlling for these baseline differences, indicated greater improvement in BPRS scores (p = 0.001), How are You? scores (p = 0.02) and FACE domains for cognition, symptoms and interpersonal relationships (all p < 0.001) in the ICM group. The ICM group underwent greater changes in structured daily activities that were linked to improved BPRS scores (p = 0.01).
A programme of ICM emphasizing recovery principles allowed significant improvement across psychopathological and functional domains. Improvements were linked to enhanced engagement with structured daily activities. Recovery-oriented practices can be integrated into existing mental health services and provided alongside traditional models of care.
Aging is associated with numerous stressors that negatively impact older adults’ well-being. Resilience improves ability to cope with stressors and can be enhanced in older adults. Senior housing communities are promising settings to deliver positive psychiatry interventions due to rising resident populations and potential impact of delivering interventions directly in the community. However, few intervention studies have been conducted in these communities. We present a pragmatic stepped-wedge trial of a novel psychological group intervention intended to improve resilience among older adults in senior housing communities.
A pragmatic modified stepped-wedge trial design.
Five senior housing communities in three states in the US.
Eighty-nine adults over age 60 years residing in the independent living sector of senior housing communities.
Raise Your Resilience, a manualized 1-month group intervention that incorporated savoring, gratitude, and engagement in value-based activities, administered by unlicensed residential staff trained by researchers. There was a 1-month control period and a 3-month post-intervention follow-up.
Validated self-report measures of resilience, perceived stress, well-being, and wisdom collected at months 0 (baseline), 1 (pre-intervention), 2 (post-intervention), and 5 (follow-up).
Treatment adherence and satisfaction were high. Compared to the control period, perceived stress and wisdom improved from pre-intervention to post-intervention, while resilience improved from pre-intervention to follow-up. Effect sizes were small in this sample, which had relatively high baseline resilience. Physical and mental well-being did not improve significantly, and no significant moderators of change in resilience were identified.
This study demonstrates feasibility of conducting pragmatic intervention trials in senior housing communities. The intervention resulted in significant improvement in several measures despite ceiling effects. The study included several features that suggest high potential for its implementation and dissemination across similar communities nationally. Future studies are warranted, particularly in samples with lower baseline resilience or in assisted living facilities.
Europe’s roadmap to a low-carbon economy aims to cut greenhouse gas (GHG) emissions 80% below 1990 levels by 2050. Beef production is an important source of GHG emissions and is expected to increase as the world population grows. LIFE BEEF CARBON is a voluntary European initiative that aims to reduce GHG emissions per unit of beef (carbon footprint) by 15% over a 10-year period on 2172 farms in four large beef-producing countries. Changes in farms’ beef carbon footprints are normally estimated via simulation modelling, but the methods current models apply differ. Thus, our initial goal was to develop a common modelling framework to estimate beef farms’ carbon footprints. The framework was developed for a diverse set of Western European farms located in Ireland, Spain, Italy and France. Whole-farm and life cycle assessment (LCA) models were selected to quantify emissions for the different production contexts and harmonized. Carbon Audit was chosen for Ireland, Bovid-CO2 for Spain and CAP’2ER for France and Italy. All models were tested using 20 case study farms (5 per country) and quantified GHG emissions associated with on-farm live weight gain. The comparison showed that the ranking of beef systems’ gross carbon footprints was consistent across the three models. Suckler-to-weaning or store systems generally had the highest carbon footprint, followed by suckler-to-beef systems and fattening beef systems. When applied to the same farm, Carbon Audit’s footprint estimates were slightly lower than CAP’2ER’s, but marginally higher than Bovid-CO2’s. These differences occurred because the models were adapted to a specific region’s production circumstances, which meant their emission factors for key sources (i.e., methane from enteric fermentation and GHG emissions from concentrates) were less accurate when used outside their target region. 
Thus, for the common modelling framework, region-specific LCA models were chosen to estimate beef carbon footprints instead of a single generic model. Additionally, the Carbon Audit and Bovid-CO2 models were updated to include carbon removal by soil and other environmental metrics included in CAP’2ER, for example, acidification. This allows all models to assess the effect carbon mitigation strategies have on other potential pollutants. Several options were identified to reduce beef farms’ carbon footprints, for example, improving genetic merit. These options were assessed for beef systems, and a mitigation plan was created by each nation. The cumulative mitigation effect of the LIFE BEEF CARBON plan was estimated to exceed the project’s reduction target (−15%).
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting that other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
Introduction: Simulation has assumed an integral role in the Canadian healthcare system, with applications in quality improvement, systems development, and medical education. High-quality simulation-based research (SBR) is required to ensure the effective and efficient use of this tool. This study sought to establish national SBR priorities and describe the barriers and facilitators of SBR in Emergency Medicine (EM) in Canada. Methods: Simulation leads (SLs) from all fourteen Canadian Departments or Divisions of EM associated with an adult FRCP-EM training program were invited to participate in three surveys and a final consensus meeting. The first survey documented active EM SBR projects. Rounds two and three established and ranked priorities for SBR and identified the perceived barriers and facilitators to SBR at each site. Surveys were completed by SLs at each participating institution, and priority research themes were reviewed by senior faculty for broad input and review. Results: Twenty SLs representing all 14 invited institutions participated in all three rounds of the study. Sixty active SBR projects were identified, an average of 4.3 per institution (range 0-17). Forty-nine priorities for SBR in Canada were defined and summarized into seven priority research themes. An additional theme was identified by the senior reviewing faculty. Forty-one barriers and 34 facilitators of SBR were identified and grouped by theme. Fourteen SLs representing 12 institutions attended the consensus meeting and vetted the final list of eight priority research themes for SBR in Canada: simulation in CBME, simulation for interdisciplinary and inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology. 
Conclusion: This study has summarized the current SBR activity in EM in Canada, as well as its perceived barriers and facilitators. We also provide a consensus on priority research themes in SBR in EM from the perspective of Canadian simulation leaders. This group of SLs has formed a national simulation-based research group which aims to address these identified priorities with multicenter collaborative studies.
The commercially available collar device MooMonitor+ was evaluated with regard to its accuracy and application potential for measuring grazing behavior. Such automated measurements are valuable because cows’ feed intake behavior at pasture is an important parameter of animal performance, health and welfare, as well as an indicator of feed availability. Compared to laborious and time-consuming visual observation, continuous and automated measurement of grazing behavior may support and improve the grazing management of dairy cows on pasture. Therefore, two experiments and a literature analysis were conducted to evaluate the MooMonitor+ under grazing conditions. The first experiment compared the automated measurement of the sensor against visual observation. In a second experiment, the MooMonitor+ was compared to a noseband sensor (RumiWatch), which also allows continuous measurement of grazing behavior. The first experiment, on n = 12 cows, revealed that the automated sensor MooMonitor+ and visual observation were highly correlated, with a Spearman’s rank correlation coefficient (rs) of 0.94 and a concordance correlation coefficient (CCC) of 0.97 for grazing time. An rs of 0.97 and a CCC of 0.98 were observed for rumination time. In the second experiment, with n = 12 cows over 24-h periods, a high correlation between the MooMonitor+ and the RumiWatch was observed for grazing time (rs = 0.91, CCC = 0.97). Similarly, a high correlation was observed for rumination time (rs = 0.96, CCC = 0.99). While a higher level of agreement between the MooMonitor+ and both visual observation and RumiWatch was observed for rumination time than for grazing time, the overall results showed a high level of accuracy of the collar device in measuring grazing and rumination times. Therefore, the collar device can be applied to monitor cow behavior at pasture on farms. 
With regard to the application potential of the collar device, it may not only be used on commercial farms but can also be applied to research questions where a data resolution of 15 min is sufficient. Thus, at farm level, the farmer can obtain an accurate and continuous measurement of the grazing behavior of each individual cow and may then use those data for decision-making to optimize animal management.
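The two agreement statistics used above differ in what they measure: Spearman's rs captures monotonic association, while Lin's CCC also penalizes systematic bias between methods. A minimal sketch of both, using invented grazing times rather than the study's data:

```python
import math

def _rank(xs):
    """Average 1-based ranks (handles ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def spearman(x, y):
    # Spearman's rs is Pearson's r computed on the ranks.
    return pearson(_rank(x), _rank(y))

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

# Hypothetical daily grazing times (min) for five cows from two methods.
sensor = [412, 365, 498, 440, 380]
observer = [420, 360, 505, 446, 377]
print(spearman(sensor, observer), lin_ccc(sensor, observer))
```

Because the two hypothetical series are perfectly rank-concordant, rs is 1.0 while the CCC stays just below 1 due to small absolute disagreements, illustrating why both statistics are reported.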
Introduction: Head injury is a common presentation to all emergency departments. Previous research has shown that such injuries may be complicated by delayed intracranial hemorrhage (D-ICH) after the initial scan is negative. Exposure to anticoagulant or anti-platelet medications (ACAP) may be a risk factor for D-ICH. We conducted a systematic review and meta-analysis to determine the incidence of delayed traumatic intracranial hemorrhage in patients taking anticoagulants, anti-platelets or both. Methods: The literature search was conducted in March 2017 with an update in April 2017. Keyword and MeSH terms were used to search OVID Medline, Embase and the Cochrane database, as well as grey literature sources. All cohort and experimental studies were eligible for selection. Inclusion criteria included pre-injury exposure to oral anticoagulant and/or anti-platelet medication and a negative initial CT scan of the brain (CT1). The primary outcome was delayed intracranial hemorrhage present on repeat CT scan (CT2) within 48 hours of presentation. Only patients who were rescanned or, at a minimum, observed were included. Clinically significant D-ICH were those that required neurosurgery, caused death or necessitated a change in management strategy, such as admission. Results: Fifteen primary studies were ultimately identified, comprising a total of 3801 patients. Of this number, 2111 had a control CT scan. Thirty-nine cases of D-ICH were identified, with the incidence of D-ICH calculated to be 1.31% (95% CI [0.56, 2.27]). No more than 12 of these patients had a clinically significant D-ICH, representing 0.09% (95% CI [0.00, 0.31]). Ten were on warfarin and two on aspirin. Three deaths were recorded and three patients needed neurosurgery. Conclusion: The relatively low incidence suggests that repeat CT should not be mandatory for patients without ICH on first CT. This is further supported by the negligibly low rate of clinically significant D-ICH. 
Evidence-based assessments should be utilised to indicate the appropriate discharge plan, with further research required to guide the balance between clinical observation and repeat CT.
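A confidence interval for an incidence proportion like the one above can be sketched with the Wilson score method. Note the abstract's 1.31% is a meta-analytic pooled estimate across studies, so an interval computed from the raw counts (39/2111) will not reproduce it exactly; this is only an illustration of the arithmetic.

```python
import math

def wilson_ci(events, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = events / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half) / denom, (centre + half) / denom

# Crude (unpooled) incidence among the 2111 rescanned patients: 39 events.
lo, hi = wilson_ci(39, 2111)
print(f"{39/2111:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```

The crude interval sits higher than the pooled 1.31% because random-effects pooling down-weights large studies with many events.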
The objective of this experiment was to establish the effect of low-concentrate (LC) and high-concentrate (HC) supplementation in the early and late periods of lactation on milk production and cow traffic in a pasture-based automatic milking (AM) system. In total, 40 cows (10 primiparous and 30 multiparous) were randomly assigned to one of the two treatments. The experimental periods for the early and late lactation trials extended from 23 February to 12 April 2015 and 31 August to 18 October 2015, respectively (49 days in each trial period). The early lactation supplement levels were 2.3 and 4.4 kg/cow per day for LC and HC, respectively, whereas the late lactation supplement levels were 0.5 and 2.7 kg/cow per day for LC and HC, respectively. Variables measured included milking frequency, milking interval, milking outcome and milking characteristics, milk yield/visit and per day, wait time/visit and per day, return time/visit and the distribution of gate passes. As the herd was seasonal (spring) calving, the experimental periods could not run concurrently and as a result no statistical comparison between the periods was conducted. There was no significant effect of treatment in the early lactation period on any of the milk production, milking characteristics or cow traffic variables. However, treatment did significantly affect the distribution of gate passes, with the HC cows recording significantly more gate passes in the hours preceding the gate time change such as hours 7 (P<0.01), 15 (P<0.05), 20, 21 (P<0.001), and 22 (P<0.05), whereas the LC treatment recorded significantly more gate passes in the hours succeeding the gate time change, such as time points 2 (P<0.01) and 10 (P<0.05). There was a significant effect of treatment in late lactation, with HC having a greater milk yield (P<0.01), milking duration and activity/day (P<0.05), while also having a significantly shorter milking interval (P<0.05) and return time/visit (P<0.01). 
The distribution of gate passes was similar to that in the early lactation period, with HC also recording a significantly greater number of gate passes during the early morning period (P<0.01), when visitations were at their lowest. Any decision regarding the supplementation of dairy cows with concentrates needs to be examined from an economic perspective, to establish whether the milk production and cow traffic benefits displayed in late lactation outweigh the cost of the concentrate, thereby ensuring that the decision to supplement is financially prudent.
Identifying the transmission sources and reservoirs of Streptococcus pneumoniae (SP) is a long-standing question for pneumococcal epidemiology, transmission dynamics, and vaccine policy. Here we use serotype to identify SP transmission and examine acquisitions (in the same household, local community, and county, or of unidentified origin) in a longitudinal cohort of children and adults from the Navajo Nation and the White Mountain Apache American Indian Tribes. We found that adults acquire SP relatively more in the household than other age groups, and children 2–8 years old typically acquire in their own or surrounding communities. Age-specific transmission probability matrices show that transmissions within household were mostly seen from older to younger siblings. Outside the household, children most often transmit to other children in the same age group, showing age-assortative mixing behavior. We find toddlers and older children to be most involved in SP transmission and acquisition, indicating their role as key drivers of SP epidemiology. Although infants have high carriage prevalence, they do not play a central role in transmission of SP compared with toddlers and older children. Our results are relevant to inform alternative pneumococcal conjugate vaccine dosing strategies and analytic efforts to inform optimization of vaccine programs, as well as assessing the transmission dynamics of pathogens transmitted by close contact in general.
Introduction: In the rural setting, Point-of-Care Ultrasound (POCUS) can dramatically impact acute care. In Saskatchewan, many rural clinicians have undertaken POCUS training, but widespread integration into rural emergency care remains elusive. We aimed to explore the obstacles limiting adoption, and their possible solutions, to inform the development of a robust and innovative rural POCUS program in Saskatchewan. Methods: We conducted a mixed-methods Participatory Action Research (PAR) study using surveys and focus groups. Our rural co-investigators identified 4 key realms relating to rural POCUS use: equipment, access to training, quality assurance (QA), and research. These guided the design of an online survey sent to rural clinicians throughout Saskatchewan. Results of the survey informed the development of three approaches (centralized, hub-and-spoke, and decentralized) to training, QA, and research, which were discussed at focus group sessions held at Saskatchewan’s Emergency Medicine Annual Conference (Regina, SK, 2016). The focus groups were facilitated by the study investigators. Responses were analyzed using a simple thematic analysis to identify relevant themes and subthemes. Results: Thirty-four rural clinicians responded to the online survey. There was general agreement that POCUS is valuable in rural acute care, that training is difficult to access and should be standardized, and that QA and research are desired but impractical in the current environment. Eleven rural clinicians attended the focus groups. Analysis of the focus groups yielded seven distinct themes/needs: infrastructure needs, peer networks, common standards, both local and regional training opportunities, academic support, access to resources, and culture change. Seventeen sub-themes were identified and noted as having either a positive or negative and direct or indirect effect on the above themes. 
Broadly speaking, participants supported a “spoke-hub” model in which training, research and QA occur within distributed regional hubs with support from academic sites. Conclusion: The adoption of POCUS for emergency care in rural Saskatchewan presents significant opportunities and obstacles. There is interest on the part of rural clinicians in overcoming these challenges to improve patient care.
Introduction: Burnout is well documented in residents and emergency physicians. Wellness initiatives are becoming increasingly prevalent, but there is a lack of data supporting their efficacy. In some populations, a relationship between sleep, exercise and wellness has been documented, but this relationship has not been established in emergency medicine (EM) residents or physicians. We aimed to determine whether exercise and sleep quality and quantity, as measured by a Fitbit, are associated with greater perceived wellness in EM residents. Methods: Fifteen EM residents from two training sites wore a Fitbit during a 4-week EM rotation. The Fitbit recorded data on sleep quantity (minutes sleeping)/quality (sleep disruptions) and exercise quantity (daily step count)/quality (daily active minutes performing activity of 3-6 and >6 metabolic equivalents). Participants completed an end-of-rotation Perceived Wellness Survey (PWS), which provided information on six domains of personal wellness (psychological, emotional, social, physical, spiritual and intellectual). Associations between PWS scores and the Fitbit markers were evaluated using the Mann-Whitney U test. Results: Preliminary results indicate that residents at or above the 50th percentile for sleep quantity had significantly higher PWS scores than those below it (median PWS 17.0 vs 13.0, respectively, p=0.04). There was no significant correlation between PWS scores and sleep interruptions, daily step count or average daily active minutes. PGY1 and PGY2-5 residents reported median PWS scores of 13.9 and 17.2, respectively. Conclusion: To our knowledge, this is the first study to objectively measure the quality and quantity of sleep as well as the exercise habits of EM residents using a Fitbit device. Our data indicate a significant relationship between greater sleep quantity and higher wellness scores in this population. We aim to enroll 30 residents in order to obtain a more robust data set. 
A larger sample size will increase statistical power and allow us to more extensively evaluate the use of exercise and sleep monitoring devices in the efficacy assessment of wellness initiatives.
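The Mann-Whitney U comparison used above can be sketched in a few lines. The scores below are invented for illustration (two well-separated groups of four); the sketch assumes no tied values and uses the normal approximation for the p-value, which the full test would refine for small samples.

```python
import math

def mann_whitney_u(x, y):
    """Mann-Whitney U with a two-sided normal-approximation p-value.
    Assumes no ties, so this is a sketch, not a full implementation."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks
    r1 = sum(rank[v] for v in x)                     # rank sum of group 1
    u = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    # two-sided p from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Hypothetical end-of-rotation PWS scores for residents above vs below
# the median sleep quantity (values invented for illustration).
high_sleep = [17, 18, 16, 19]
low_sleep = [12, 13, 11, 14]
u, p = mann_whitney_u(high_sleep, low_sleep)
print(u, p)
```

With every high-sleep score exceeding every low-sleep score, U reaches its maximum (n1·n2 = 16) and the approximate p-value falls below 0.05.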
Volumetric atrophy and microstructural alterations in diffusion tensor imaging (DTI) measures of the hippocampus have been reported in people with Alzheimer's disease (AD) and mild cognitive impairment (MCI). However, no study to date has jointly investigated concomitant microstructural and volumetric changes of the hippocampus in dementia with Lewy bodies (DLB).
A total of 84 subjects (23 MCI, 17 DLB, 14 AD, and 30 healthy controls) were recruited for a multi-modal imaging (3T MRI and DTI) study that included neuropsychological evaluation. Freesurfer was used to segment the total hippocampus and delineate its subfields. The hippocampal segmentations were co-registered to the mean diffusivity (MD) and fractional anisotropy (FA) maps obtained from the DTI images.
Both AD and MCI groups showed significantly smaller hippocampal volumes compared to DLB and controls, predominantly in the CA1 and subiculum subfields. Compared to controls, hippocampal MD was elevated in AD, but not in MCI. DLB was characterized by both volumetric and microstructural preservation of the hippocampus. In MCI, higher hippocampal MD was associated with greater atrophy of the hippocampus and CA1 region. Hippocampal volume was a stronger predictor of memory scores compared to MD within the MCI group.
Through a multi-modal integration, we report novel evidence that the hippocampus in DLB is characterized by both macrostructural and microstructural preservation. Contrary to recent suggestions, our findings do not support the view that DTI measurements of the hippocampus are superior to volumetric changes in characterizing group differences, particularly between MCI and controls.
We studied neuroinflammation in individuals with late-life depression, as a
risk factor for dementia, using [11C]PK11195 positron emission
tomography (PET). Five older participants with major depression and 13
controls underwent PET and multimodal 3T magnetic resonance imaging (MRI),
with blood taken to measure C-reactive protein (CRP). We found significantly
higher CRP levels in those with late-life depression and raised
[11C]PK11195 binding compared with controls in brain regions
associated with depression, including subgenual anterior cingulate cortex,
and significant hippocampal subfield atrophy in cornu ammonis 1 and
subiculum. Our findings suggest neuroinflammation requires further
investigation in late-life depression, both as a possible aetiological
factor and a potential therapeutic target.
To measure the trends in traditional marine food intake and serum vitamin D levels in Alaska Native women of childbearing age (20–29 years old) from the 1960s to the present.
We measured a biomarker of traditional food intake, the δ15N value, and vitamin D level, as 25-hydroxycholecalciferol (25(OH)D3) concentration, in 100 serum samples from 20–29-year-old women archived in the Alaska Area Specimen Bank, selecting twenty-five per decade from the 1960s to the 1990s. We compared these with measurements of red-blood-cell δ15N values and serum 25(OH)D3 concentrations from 20–29-year-old women from the same region collected during the 2000s and 2010s in a Center for Alaska Native Health Research study.
The Yukon Kuskokwim Delta region of south-west Alaska.
Alaska Native women (n 319) aged 20–29 years at the time of specimen collection.
Intake of traditional marine foods, as measured by serum δ15N values, decreased significantly each decade from the 1960s through the 1990s, then remained constant from the 1990s through the present (F5,306=77·4, P<0·0001). Serum vitamin D concentrations also decreased from the 1960s to the present (F4,162=26·1, P<0·0001).
Consumption of traditional marine foods by young Alaska Native women dropped significantly between the 1960s and the 1990s and was associated with a significant decline in serum vitamin D concentrations. Studies are needed to evaluate the promotion of traditional marine foods and routine vitamin D supplementation during pregnancy for this population.
FFQ data can be used to characterise dietary patterns for diet–disease association studies. In the present study, we evaluated three previously defined dietary patterns – ‘subsistence foods’, market-based ‘processed foods’ and ‘fruits and vegetables’ – among a sample of Yup'ik people from Southwest Alaska. We tested the reproducibility and reliability of the dietary patterns, as well as the associations of these patterns with dietary biomarkers and participant characteristics. We analysed data from adult study participants who completed at least one FFQ with the Center for Alaska Native Health Research 9/2009–5/2013. To test the reproducibility of the dietary patterns, we conducted a confirmatory factor analysis (CFA) of a hypothesised model using eighteen food items to measure the dietary patterns (n 272). To test the reliability of the dietary patterns, we used the CFA to measure composite reliability (n 272) and intra-class correlation coefficients for test–retest reliability (n 113). Finally, to test the associations, we used linear regression (n 637). All factor loadings, except one, in CFA indicated acceptable correlations between foods and dietary patterns (r>0·40), and model-fit criteria were >0·90. Composite and test–retest reliability of the dietary patterns were, respectively, 0·56 and 0·34 for ‘subsistence foods’, 0·73 and 0·66 for ‘processed foods’, and 0·72 and 0·54 for ‘fruits and vegetables’. In the multi-predictor analysis, the dietary patterns were significantly associated with dietary biomarkers, community location, age, sex and self-reported lifestyle. This analysis confirmed the reproducibility and reliability of the dietary patterns in the present study population. These dietary patterns can be used for future research and development of dietary interventions in this underserved population.
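Composite reliability of a CFA factor, as reported above, can be computed from standardized loadings as (Σλ)² / ((Σλ)² + Σ(1−λ²)). The loadings below are hypothetical (the study's actual loadings are not given in the abstract); this only illustrates the formula.

```python
def composite_reliability(loadings):
    """Composite reliability from standardized CFA factor loadings:
    (sum L)^2 / ((sum L)^2 + sum(1 - L^2))."""
    s = sum(loadings)
    error_var = sum(1 - l**2 for l in loadings)  # residual variances
    return s**2 / (s**2 + error_var)

# Hypothetical standardized loadings for a three-item dietary-pattern
# factor (invented for illustration).
print(round(composite_reliability([0.7, 0.8, 0.6]), 2))
```

Values near 0.7 or above, like those reported for the 'processed foods' and 'fruits and vegetables' patterns, are conventionally taken as acceptable composite reliability.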