The European conquest and colonization of the Caribbean precipitated massive changes in indigenous cultures and societies of the region. One of the earliest changes was the introduction of new plant and animal foods and culinary traditions. This study presents the first archaeological reconstruction of indigenous diets and foodways in the Caribbean spanning the historical divide of 1492. We use multiple isotope datasets to reconstruct these diets and investigate the potential relationships between dietary and mobility patterns at multiple scales. Dietary patterns are assessed by isotope analyses of different skeletal elements from the archaeological skeletal population of El Chorro de Maíta, Cuba. This approach integrates carbon and nitrogen isotope analyses of bone and dentine collagen with carbon and oxygen isotope analyses of bone and enamel apatite. The isotope results document extreme intrapopulation dietary heterogeneity but few systematic differences in diet between demographic/social groups. Comparisons with published isotope data from other precolonial and colonial period populations in the Caribbean indicate distinct dietary and subsistence practices at El Chorro de Maíta. The majority of the local population consumed more animal protein resources than other indigenous populations in the Caribbean, and their overall dietary patterns are more similar to colonial period enslaved populations than to indigenous ones.
Campylobacteriosis is the most common notifiable disease in New Zealand. While the risk of campylobacteriosis has been found to be strongly associated with the consumption of undercooked poultry, other risk factors include rainwater-sourced drinking water, contact with animals and consumption of raw dairy products. Despite this, there has been little investigation of raw milk as a risk factor for campylobacteriosis. Recent increases in demand for untreated or ‘raw’ milk have also raised concerns that this exposure may become a more important source of disease in the future. This study describes the cases of notified campylobacteriosis from a sentinel surveillance site. Previously collected data from notified cases of raw milk-associated campylobacteriosis were examined and compared with campylobacteriosis cases that did not report raw milk consumption. Raw milk campylobacteriosis cases differed from non-raw milk cases in age and occupation demographics: raw milk cases were more likely to be younger and to be categorised as children or students by occupation. Raw milk cases were also more likely to be associated with outbreaks than non-raw milk cases. Study-suggested motivations for raw milk consumption (health reasons, natural product, produced on farm, inexpensive or to support locals) were not strongly supported by cases. More information about the raw milk consumption habits of New Zealanders would help to better understand the risks of this disease, especially given the increased disease risk observed in younger people. Further discussion with raw milk consumers about their motivations may also help to find common ground between public health concerns and consumer preferences as efforts continue to manage this ongoing public health issue.
To develop and validate the Discrepancy-based Evidence for Loss of Thinking Abilities (DELTA) score. The DELTA score characterizes the strength of evidence for cognitive decline on a continuous spectrum using well-established psychometric principles for improving detection of cognitive changes.
DELTA score development used neuropsychological test scores from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort (two tests each from Memory, Executive Function, and Language domains). We derived regression-based normative reference scores using age, gender, years of education, and word-reading ability from robust cognitively normal ADNI participants. Discrepancies between predicted and observed scores were used for calculating the DELTA score (range 0–15). We validated DELTA scores primarily against longitudinal Clinical Dementia Rating-Sum of Boxes (CDR-SOB) and Functional Activities Questionnaire (FAQ) scores (baseline assessment through Year 3) using linear mixed models and secondarily against cross-sectional Alzheimer’s biomarkers.
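The regression-based discrepancy logic described above can be sketched in a few lines. Everything below (the synthetic normative sample, the two demographic predictors, the z-score cut-offs and the 0–2 point mapping) is an illustrative assumption, not the published DELTA algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "robust cognitively normal" reference sample:
# demographics plus one memory test score.
n = 500
age = rng.uniform(55, 90, n)
edu = rng.uniform(8, 20, n)
memory = 50.0 - 0.3 * (age - 70.0) + 0.8 * (edu - 12.0) + rng.normal(0.0, 5.0, n)

# Normative model: predict the expected test score from demographics.
X = np.column_stack([np.ones(n), age, edu])
beta, *_ = np.linalg.lstsq(X, memory, rcond=None)
resid_sd = np.std(memory - X @ beta, ddof=X.shape[1])

def discrepancy_z(observed, age, edu):
    """Standardized discrepancy between observed and predicted score."""
    predicted = float(beta @ np.array([1.0, age, edu]))
    return (observed - predicted) / resid_sd

def evidence_points(z):
    """Map one test's discrepancy to 0-2 points (illustrative cut-offs)."""
    if z <= -2.0:
        return 2  # strong evidence of decline
    if z <= -1.0:
        return 1  # moderate evidence
    return 0      # within normal limits

# A total score would sum such points across the six-test battery;
# here a single test is scored for a 70-year-old with 12 years of education.
print(evidence_points(discrepancy_z(35.0, 70.0, 12.0)))
```

The key design choice is that the reference point for "decline" is the score *predicted* from the individual's own demographics (and, in the actual instrument, word-reading ability), rather than a single population mean, so the same raw score can yield different evidence strength for different people.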
There were 1359 ADNI participants with calculable baseline DELTA scores (age 73.7 ± 7.1 years, 55.4% female, 100% white/Caucasian). Higher baseline DELTA scores (stronger evidence of cognitive decline) predicted higher baseline CDR-SOB (ΔR2 = .318) and faster rates of CDR-SOB increase over time (ΔR2 = .209). Longitudinal changes in DELTA scores tracked closely and in the same direction as CDR-SOB scores (fixed and random effects of mean + mean-centered DELTA, ΔR2 > .7). Results were similar for FAQ scores. Higher DELTA scores predicted higher PET-Aβ SUVr (ρ = .324), higher CSF-pTau/CSF-Aβ ratio (ρ = .460), and demonstrated PPV > .9 for positive Alzheimer’s disease biomarker classification.
Data support initial development and validation of the DELTA score through its associations with longitudinal functional changes and Alzheimer’s biomarkers. We provide several considerations for future research and include an automated scoring program for clinical use.
Adolescent association with deviant and delinquent friends was examined for its roots in coercive parent–teen interactions and its links to functional difficulties extending beyond delinquent behavior and into adulthood. A community sample of 184 adolescents was followed from age 13 to age 27, with collateral data obtained from close friends, classmates, and parents. Even after accounting for adolescent levels of delinquent and deviant behavior, association with deviant friends was predicted by coercive parent–teen interactions and then linked to declining functioning with peers during adolescence and greater internalizing and externalizing symptoms and poorer overall adjustment in adulthood. Results are interpreted as suggesting that association with deviant friends may disrupt a core developmental task—establishing positive relationships with peers—with implications that extend well beyond deviancy-training effects.
Anecdotal evidence suggests the use of bolus tube feeding is increasing among long-term home enteral tube feeding (HETF) patients. A cross-sectional survey was undertaken to assess the prevalence of bolus tube feeding and to characterise these patients. Dietitians from ten centres across the UK collected data on all adult HETF patients on the dietetic caseload receiving bolus tube feeding (n 604, 60 % male, age 58 years). Demographic data, reasons for tube and bolus feeding, tube and equipment types, feeding method and patients’ complete tube feeding regimens were recorded. Over a third of patients receiving HETF used bolus feeding (37 %). Patients were long-term tube fed (4·1 years tube feeding, 3·5 years bolus tube feeding), living at home (71 %) and sedentary (70 %). The largest diagnostic group, head and neck cancer patients (22 %), were significantly more active (79 %) and more likely to live at home (97 %), while those with cerebral palsy (12 %) were typically younger (age 31 years) but sedentary (94 %). Bolus feeding was most often used as the sole feeding method (46 %), because it was quick and easy to use, as a top-up to oral diet or to mimic mealtimes. Importantly, oral nutritional supplements (ONS) were used for bolus feeding in 85 % of patients, with 51 % of these being compact-style ONS (2·4 kcal (10·0 kJ)/ml, 125 ml). This survey shows that bolus tube feeding is common among UK HETF patients, is used by a wide variety of patient groups and can be adapted to meet the needs of a variety of patients, clinical conditions, nutritional requirements and lifestyles.
The rocky shores of the north-east Atlantic have long been studied. Our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and to more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts from domination by fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or restricted to estuarine refuges, owing to greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, thereby affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Withdrawal reactions when coming off antidepressants have long been neglected or minimised. It took almost two decades after the selective serotonin reuptake inhibitors (SSRIs) entered the market for the first systematic review to be published. More reviews have followed, demonstrating that the dominant and long-held view that withdrawal is mostly mild, affects only a small minority and resolves spontaneously within 1–2 weeks, was at odds with the sparse but growing evidence base. What the scientific literature reveals is in close agreement with the thousands of service user testimonies available online in large forums. It suggests that withdrawal reactions are quite common, that they may last from a few weeks to several months or even longer, and that they are often severe. These findings are now increasingly acknowledged by official professional bodies and societies.
To understand how exposure to victimization during adolescence and the presence of comorbid psychological conditions influence substance use treatment entry and substance use disorder diagnosis from 14 to 25 years old among serious juvenile offenders, this study included 1,354 serious juvenile offenders who were prospectively followed over 7 years. Growth mixture modeling was used to assess profiles of early victimization during adolescence (14–17 years). Discrete time survival mixture analysis was used to assess time to treatment entry and substance use disorder diagnosis. Posttraumatic stress disorder (PTSD) and major depressive disorder (MDD) were used as predictors of survival time. Mixture models revealed three profiles of victimization: sustained poly-victimization, moderate/decreasing victimization, and low victimization. Youth in the sustained poly-victimization class were more likely to enter treatment earlier and have a substance use diagnosis earlier than other classes. PTSD was a significant predictor of treatment entry for youth in the sustained poly-victimization class, and MDD was a significant predictor of substance use disorder diagnosis for youth in the moderate/decreasing victimization class. Therefore, substance use prevention programming targeted at youth experiencing poly-victimization in early adolescence—especially those who have PTSD or MDD—is needed.
Higher health literacy is associated with higher cognitive function and better health. Despite its wide use in medical research, no study has investigated the genetic contributions to health literacy. Using 5783 English Longitudinal Study of Ageing (ELSA) participants (mean age = 65.49, SD = 9.55) who had genotyping data and had completed a health literacy test at wave 2 (2004–2005), we carried out a genome-wide association study (GWAS) of health literacy. We estimated the proportion of variance in health literacy explained by all common single nucleotide polymorphisms (SNPs). Polygenic profile scores were calculated using summary statistics from GWAS of 21 cognitive and health measures. Logistic regression was used to test whether polygenic scores for cognitive and health-related traits were associated with having adequate, compared to limited, health literacy. No SNPs achieved genome-wide significance for association with health literacy. The proportion of variance in health literacy accounted for by common SNPs was 8.5% (SE = 7.2%). Greater odds of having adequate health literacy were associated with a 1 standard deviation higher polygenic score for general cognitive ability [OR = 1.34, 95% CI (1.26, 1.42)], verbal-numerical reasoning [OR = 1.30, 95% CI (1.23, 1.39)], and years of schooling [OR = 1.29, 95% CI (1.21, 1.36)]. Reduced odds of having adequate health literacy were associated with higher polygenic profiles for poorer self-rated health [OR = 0.92, 95% CI (0.87, 0.98)] and schizophrenia [OR = 0.91, 95% CI (0.85, 0.96)]. The well-documented associations between health literacy, cognitive function and health may partly be due to shared genetic etiology. Larger studies are required to obtain accurate estimates of SNP-based heritability and to discover specific health literacy-associated genetic variants.
A power MOSFET-based push–pull configuration nanosecond-pulse generator has been designed, constructed, and characterized to permeabilize cells for biological and medical applications. The generator can deliver pulses with durations ranging from 80 ns up to 1 µs and pulse amplitudes up to 1.4 kV. The unit has been tested for in vitro experiments on a medulloblastoma cell line. Following the exposure of cells to 100, 200, and 300 ns electric field pulses, permeabilization tests were carried out, and viability tests were conducted to verify the performance of the generator. The maximum temperature rise of the biological load was also calculated based on Joule heating energy conservation and experimental validation. Our results indicate that the developed device has good capabilities to achieve well-controlled electro-manipulation in vitro.
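The worst-case temperature rise from Joule heating can be estimated with a simple adiabatic energy balance, ΔT = N·V²·τ/(R·m·c_p). The parameter values below (load resistance, sample volume, pulse count) are illustrative assumptions, not the paper's measured conditions:

```python
# Adiabatic (worst-case) temperature rise of the biological load from
# Joule heating: energy per pulse E = V^2 * tau / R, all of it retained
# as heat in the sample. Parameter values are illustrative assumptions.

V = 1.4e3          # pulse amplitude, V (maximum reported by the generator)
R = 100.0          # assumed load resistance, ohm
tau = 300e-9       # pulse duration, s (longest exposure condition)
n_pulses = 100     # assumed number of pulses delivered

volume_m3 = 100e-9     # assumed 100 uL sample, in m^3
density = 1000.0       # kg/m^3, water-like medium
c_p = 4186.0           # J/(kg K), specific heat of water

energy_per_pulse = V**2 * tau / R           # J dissipated per pulse
total_energy = n_pulses * energy_per_pulse  # J over the full train
mass = density * volume_m3                  # kg of sample

delta_T = total_energy / (mass * c_p)       # K, ignoring any heat loss
print(f"Energy per pulse: {energy_per_pulse * 1e3:.2f} mJ")
print(f"Temperature rise after {n_pulses} pulses: {delta_T:.2f} K")
```

With these assumed numbers the rise is on the order of 1 K, which illustrates why nanosecond pulsing can permeabilize cells with minimal thermal stress; the actual study validated its estimate experimentally.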
Although the UK is the largest lamb meat producer in Europe, there are limited data available on sheep flock performance and on how sheep farmers manage their flocks. The aims of this study were to gather evidence on the types of disease control practices implemented in sheep flocks, and to explore husbandry factors associated with flock productivity. A questionnaire focusing on farm characteristics, general husbandry and flock health management was carried out on 648 farms located in the UK over summer 2016. Abattoir sales data (lamb sales over 12 months) were compared with the number of breeding ewes on farm to estimate flock productivity (number of lambs sold for meat per 100 ewes per farm per year). Results of a multivariable linear regression model, conducted on 615 farms with complete data, indicated that farms vaccinating ewes against abortion and clostridial agents and administering a group 4/5 anthelmintic to ewes (as recommended by the Sustainable Control of Parasites in Sheep Initiative) during quarantine had greater flock productivity than farms not implementing these actions (P<0.01 and 0.02, respectively). Flocks with maternal breed types had higher productivity indices compared with flocks with either pure hill or terminal breeds (P<0.01). Farms weighing lambs during lactation had greater productivity than those not weighing (P<0.01). Importantly, these actions were associated with other disease control practices, for example, treating individual lame ewes with an antibiotic injection, weaning lambs between 13 and 15 weeks of age and carrying out faecal egg counts, suggesting that an increase in productivity may be associated with the combined effect of these factors. This study provides new evidence on the positive relationship between sheep flock performance and disease control measures and demonstrates that lamb sales data can be used as a baseline source of information on flock performance and for farm benchmarking. Further research is needed to explore additional drivers of flock performance.
This study tested whether the association between interparental conflict and adolescent externalizing symptoms was moderated by a polygenic composite indexing low dopamine activity (i.e., 7-repeat allele of DRD4; Val alleles of COMT; 10-repeat variants of DAT1) in a sample of seventh-grade adolescents (Mean age = 13.0 years) and their parents. Using a longitudinal, autoregressive design, observational assessments of interparental conflict at Wave 1 predicted increases in a multi-informant measurement of youth externalizing symptoms 2 years later at Wave 3 only for children who were high on the hypodopaminergic composite. Moderation was expressed in a “for better” or “for worse” form hypothesized by differential susceptibility theory. Thus, children high on the dopaminergic composite experienced more externalizing problems than their peers when faced with more destructive conflicts but also fewer externalizing problems when exposed to more constructive interparental conflicts. Mediated moderation findings indicated that adolescent reports of their emotional insecurity in the interparental relationship partially explained the greater genetic susceptibility experienced by these children. More specifically, the dopamine composite moderated the association between Wave 1 interparental conflict and emotional insecurity 1 year later at Wave 2 in the same “for better” or “for worse” pattern as externalizing symptoms. Adolescent insecurity at Wave 2, in turn, predicted their greater externalizing symptoms 1 year later at Wave 3. Post hoc analyses further revealed that the 7-repeat allele of the dopamine receptor D4 (DRD4) gene was the primary source of plasticity in the polygenic composite. Results are discussed as to how they advance process-oriented Gene x Environment models of emotion regulation.
Introduction: One in nine (11.7%) people in Saskatchewan identifies as First Nations. In Canada, First Nations people experience a higher burden of cardiovascular disease when compared to the general population, but it is unknown whether they have different outcomes in out-of-hospital cardiac arrest (OHCA). Methods: We reviewed pre-hospital and inpatient records of patients sustaining an OHCA between January 1st, 2015 and December 31st, 2017. The population consisted of patients aged 18 years or older with OHCA of presumed cardiac origin occurring in the catchment area of Saskatoon's EMS service. Variables of interest included age, gender, First Nations status (as identified by treaty number), EMS response times, bystander CPR, and shockable rhythm. Outcomes of interest included return of spontaneous circulation (ROSC), survival to hospital admission, and survival to hospital discharge. Results: In all, 372 patients sustained OHCA, of whom 27 were identified as First Nations. First Nations patients with OHCA tended to be significantly younger (mean age 46 years vs. 65 years, p < 0.0001) and had shorter EMS response times (median times 5.3 minutes vs. 6.2 minutes, p = 0.01). There were no differences between First Nations and non-First Nations patients in terms of incidence of shockable rhythms (24% vs. 26%, p = 0.80), ROSC (42% vs. 41%, p = 0.87), survival to admission (27% vs. 33%, p = 0.53), and survival to hospital discharge (15% vs. 12%, p = 0.54). Conclusion: In Saskatoon, First Nations patients sustaining OHCA appear to have similar survival rates when compared with non-First Nations patients, suggesting similar baseline care. Interestingly, First Nations patients sustaining OHCA were significantly younger than their non-First Nations counterparts. This may reflect a higher burden of cardiovascular disease, suggesting a need for improved prevention strategies.
Introduction: Evidence suggests that prehospital point of care ultrasound (POCUS) may improve outcomes. It serves as an aid in physical examination, triage, diagnosis, and patient disposition. The rate of adoption of POCUS among aeromedical services (AMS) throughout Canada is unknown. The objective of this study was to describe current POCUS use among Canadian AMS providers. Methods: This is a cross-sectional observational study. A survey was emailed to directors of government-funded AMS bases in Canada. Data were analyzed using descriptive statistics. Results: The response rate was 88.2% (15/17 AMS directors), accounting for 42 of 46 individual bases. POCUS is used by AMS in British Columbia, Alberta, Saskatchewan, and Manitoba. New Brunswick, Nova Scotia, Prince Edward Island, and Yukon are planning to introduce POCUS within the next year. Ontario, Quebec, and Newfoundland are not utilizing POCUS and are not planning to introduce it. BC is the only province currently using POCUS on fixed-wing aircraft. POCUS is used in <25% of missions, most frequently at the sending hospital and in flight. The most useful applications were assessment for pneumothorax, free abdominal fluid, and cardiac standstill. The most common barrier to POCUS use was the cost of training and maintenance of competence. Conclusion: Prehospital POCUS is available in Western Canada, with one third of the Canadian population having access to AMS utilizing ultrasound. The Maritimes and the Yukon Territory will further extend POCUS use on fixed-wing aircraft. While there are barriers to POCUS use, the bases that have adopted POCUS consider it valuable.
Introduction: Prior Canadian Emergency Department (ED) studies have demonstrated variable benefits of an initial assessment physician (IAP) to rapidly assess and initiate care of ED patients after triage. These studies have been conducted primarily in academic teaching and large urban hospitals. It is not clear if such an IAP role could be beneficial in a small community hospital. Our pilot study hypothesized that instituting a supported IAP role can reduce physician initial assessment (PIA) time, total ED length of stay (LOS), and left-without-being-seen (LWBS) rates. Methods: This was a pre-post interrupted time-series observational study at a community ED in Niagara Health Systems (Welland, Ontario; 4 MD shifts, 36 hrs total coverage, 30,000 annual visits). In July 2017, an ED shift (with a separate assessment/treatment area) was re-purposed as an IAP shift, with nursing support, to reduce the initial time to MD assessment after triage. For lower acuity cases, the IAP MD generally completed full case management and disposition. Higher acuity complex cases were initiated by the IAP and transferred into the main ED care areas for “inside” MD management. Administrative data were accessed for 6 months prior to the intervention and for the 4 months available post-intervention. Descriptive statistics were calculated for collected data. Results: A modest improvement in different administrative ED performance metrics was observed. The following changes were noted pre and post IAP intervention: PIA time was reduced from 3.6 hrs to 3.2 hrs, total ED LOS from 19.2 hrs to 13.8 hrs, and daily LWBS rate from 4.2% to 3.7%. This pilot study demonstrated improvement trends in ED performance metrics, although there are insufficient data to show statistical significance. Aggregate data were not subgrouped based on CTAS categories. This pilot was not intended to collect patient or staff satisfaction data or adverse events, nor was it designed to demonstrate cost-effectiveness. Conclusion: Introducing an IAP shift in a small community ED has shown improvement trends for various ED throughput measures, such as PIA time, total LOS and LWBS rates. Further research is required to determine the statistical significance of time reductions, satisfaction (patients, staff), resource utilization impact and CTAS subgroup performance. This improvement demonstrates potential impact system-wide across the Niagara region.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and of the types of drugs used, in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad-spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids, as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from other provinces (n = 301) suggest regional variation in the prevalence of drivers testing positive for THC (10% - 27%), alcohol (17% - 29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre and post cannabis legalization. The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported according to province, driver sex, age, single vs. multi-vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will provide patterns of drug and alcohol impairment in 4 Canadian provinces pre and post cannabis legalization. The significance of these findings and implications for impaired driving policy and prevention programs in Canada will be discussed.
Post-disaster archaeological investigations at Jaffna Fort have revealed material demonstrating pre-colonial contact, shedding new light on the importance of the site in Indian Ocean trade and communications networks before European occupation.
OBJECTIVES/SPECIFIC AIMS: To examine rural-urban disparities in the prevalence of diagnosed diabetes in veterans receiving care at the VA and to determine the extent to which demographic factors and obesity levels contribute to identified disparities. METHODS/STUDY POPULATION: A retrospective serial cross-sectional analysis was employed. A stratified weighted random sample of veterans who received care at a VA facility was selected for each year from 2007 through 2012. Rurality was classified using Rural-Urban Commuting Area (RUCA) codes based on residential zip code. Diabetes was defined by two or more primary or secondary ICD-9 codes for diabetes (250.xx) within a 12-month period. Data were analyzed using complex survey-specific procedures. RESULTS/ANTICIPATED RESULTS: From 2007 to 2012, diabetes prevalence was lowest in urban (20.5%-21.0%), followed by highly rural (21.1%-22.1%) and rural (22.3%-23.0%) areas, with the prevalence being significantly higher on the insular islands (31.0%-32.4%). In 2012, 41% of urban, 43% of rural and highly rural, and 30% of insular island veterans were obese. Relative to urban areas, the odds ratio for prevalent diabetes was 1.10 (95% CI: 1.08, 1.12) for rural veterans, 1.19 (95% CI: 1.16, 1.23) for insular island veterans, and 1.00 (95% CI: 0.98, 1.02) for highly rural veterans. DISCUSSION/SIGNIFICANCE OF IMPACT: The prevalence of diagnosed diabetes is high in veterans residing in rural, highly rural and urban areas, but markedly higher on the insular islands. Understanding the burden of disease and the factors driving disparities provides the information required to develop targeted interventions.
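Odds ratios with confidence intervals of the kind reported above follow the standard 2×2-table calculation with a Woolf (log-scale) interval. The sketch below uses hypothetical counts, not the study's data, which were analyzed with complex survey procedures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-based) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for rural (exposed) vs urban (referent) veterans:
or_, lo, hi = odds_ratio_ci(a=2300, b=7700, c=2050, d=7950)
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
```

The study's reported intervals would additionally incorporate the stratified sampling weights, which this unweighted sketch ignores.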
We investigated whether older adults are more likely than younger adults to violate a foundational property of rational decision making, the axiom of transitive preference. Our experiment consisted of two groups, older (ages 60-75; 21 participants) and younger (ages 18-30; 20 participants) adults. We used Bayesian model selection to investigate whether individuals were better described via (transitive) weak order-based decision strategies or (possibly intransitive) lexicographic semiorder decision strategies. We found weak evidence for the hypothesis that older adults violate transitivity at a higher rate than younger adults. At the same time, a hierarchical Bayesian analysis suggests that, in this study, the distribution of decision strategies across individuals is similar for both older and younger adults.