Although hepcidin synthesis is stimulated by inflammation and inhibited by Fe deficiency, the strength of their opposing effects on serum hepcidin (SHep) in humans remains unclear. It was recently shown that an inflammatory stimulus in anaemic women did not increase SHep or decrease Fe absorption. The enhancing effect of ascorbic acid on Fe absorption may not be effective during inflammation because of increased SHep. Our study aim was to test whether reducing inflammation in Fe-depleted overweight (OW) women with low-grade inflammation would lower SHep and improve Fe absorption with and without ascorbic acid, compared with normal-weight (NW) women without inflammation. Before and after 14 d of anti-inflammatory treatment (3 × 600 mg ibuprofen daily) in OW and NW women (n 36; 19–46 years of age), we measured SHep and fractional Fe absorption (FIA) (erythrocyte Fe incorporation) from 57Fe- and 58Fe-labelled test meals with and without ascorbic acid. There were significant group effects on IL-6, C-reactive protein, serum ferritin and SHep (for all, P < 0·05). There was a significant treatment effect on SHep (P < 0·05): in OW women, treatment decreased IL-6 by approximately 30 % and SHep by approximately 45 %. However, there were no significant treatment or group effects on FIA. Body Fe stores (BIS) were a significant positive predictor of SHep before and after treatment (P < 0·001), but IL-6 was not. Reducing chronic inflammation in OW women halved SHep but did not affect Fe absorption with or without ascorbic acid, and the main predictor of Fe absorption was BIS.
To describe the pattern of emergency department (ED) consultations in children with cerebral palsy (CP) compared to controls and factors predictive of ED consultations.
This retrospective cohort study linked data from the Registre de la paralysie cérébrale du Québec (REPACQ) and provincial administrative health databases. The CP cohort comprised children enrolled in REPACQ born between 1999 and 2002. REPACQ covers 6 of 17 Quebec health administrative regions. Region-, age-, and gender-matched controls were identified from administrative health databases in a 20:1 ratio. The primary outcome was high use of ED services (≥4 ED visits during the study period). Relative risks (RR) and 95% confidence intervals (CI) were calculated.
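The relative risk and its 95% confidence interval described above can be computed from a 2×2 table using a log-scale Wald interval; a minimal sketch, with counts that are hypothetical back-calculations chosen only to be consistent with the reported 92% vs. 74% visit rates:

```python
import math

def relative_risk(a, b, c, d):
    """Relative risk of an outcome in the exposed group (a events, b non-events)
    vs. the control group (c events, d non-events), with a Wald-type 95% CI
    computed on the log scale."""
    p_exposed = a / (a + b)
    p_control = c / (c + d)
    rr = p_exposed / p_control
    # Standard error of log(RR) for a 2x2 table
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 277/301 CP children and 4470/6040 controls with >=1 ED visit
rr, lo, hi = relative_risk(277, 24, 4470, 1570)
```

With these illustrative counts the point estimate lands near the reported RR of 1.24, with a CI close to the published 1.19–1.28.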
In total, 301 children with CP were linked to administrative data and 6040 peer controls were selected. Ninety-two percent of the CP cohort had at least one ED visit in the study period, compared to 74% among controls (RR 1.24, 95% CI 1.19–1.28). Children with CP were more likely than their peers to have high ED use (RR 1.40; 95% CI 1.30–1.52). Factors predictive of high ED use were comorbid epilepsy (RR 1.23; 95% CI 1.04–1.46) and severity of motor impairment (RR 1.14; 95% CI 0.95–1.37).
Children with CP are more likely to present to the ED than their peers, resulting in increased use of ED services. Coordinated care with improved access to same-day evaluations could decrease ED use. Health system factors and barriers should be investigated to ensure optimal and appropriate use of ED services.
Many studies document cognitive decline following specific types of acute illness hospitalizations (AIH) such as surgery, critical care, or those complicated by delirium. However, cognitive decline may be a complication following all types of AIH. This systematic review will summarize longitudinal observational studies documenting cognitive changes following AIH in the majority admitted population and conduct meta-analysis (MA) to assess the quantitative effect of AIH on post-hospitalization cognitive decline (PHCD).
We followed Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines. Selection criteria were defined to identify studies of older adults exposed to AIH with cognitive measures. A total of 6566 titles were screened and 46 reports were reviewed qualitatively, of which seven contributed data to the MA. Risk of bias was assessed using the Newcastle–Ottawa Scale.
The qualitative review suggested increased cognitive decline following AIH, but several reports were particularly vulnerable to bias. Domain-specific outcomes following AIH included declines in memory and processing speed. Increasing age and the severity of illness were the most consistent risk factors for PHCD. PHCD was supported by MA of seven eligible studies with 41,453 participants (Cohen’s d = −0.25, 95% CI [−0.49, −0.02], I² = 35%).
There is preliminary evidence that AIH exposure accelerates or triggers cognitive decline in older patients. PHCD reported in specific contexts could represent subsets of a larger phenomenon caused by overlapping mechanisms. Future research must clarify the trajectory, clinical significance, and etiology of PHCD: a priority in the face of an aging population with increasing rates of both cognitive impairment and hospitalization.
To compare hospitalizations among children with cerebral palsy (CP) and healthy controls and to identify factors associated with hospitalizations in children with CP.
This retrospective cohort study linked data from a provincial CP Registry and administrative health databases. The CP cohort comprised children born from 1999 to 2002. Age-, sex-, and region-matched controls were identified from administrative health databases. Mean differences, relative risks (RR), and 95% confidence intervals (CIs) were calculated.
A total of 301 children with CP were linked to administrative health data and matched to 6040 controls. Mean hospitalizations per child during the study period were higher in children with CP compared to controls (raw mean difference (RMD) 5.0, 95% CI 4.7 to 5.2), with longer length of stay (RMD 2.8, 95% CI 1.8 to 3.8) and more diagnoses per hospitalization (RMD 1.6, 95% CI 1.4 to 1.8). Increased risk of hospitalization was observed in non-ambulant children with CP (RR 1.12, 95% CI 1.01 to 1.22) compared to ambulant children and among those with spastic tri/quadriplegic CP compared to other CP subtypes (RR 1.15, 95% CI 1.05 to 1.27). Feeding difficulties (RR 1.20, 95% CI 1.13 to 1.27), cortical visual (RR 1.22, 95% CI 1.13 to 1.32), cognitive (RR 1.16, 95% CI 1.04 to 1.30), and communication impairment (RR 1.26, 95% CI 1.10 to 1.44) were associated with increased hospitalizations.
Children with CP experience more frequent and longer hospital stays than their peers, especially those with a more severe CP profile. Coordinated interdisciplinary care is needed for school-aged children with CP and medical complexity.
In conceptualizing lifestyle approaches to promote health and well-being, most attention is paid toward physical rituals, including physical activity, dietary modification, and cessation of drugs and alcohol. Less discussed are the fundamental aspects of work, play, and love that have the potential to promote – or erode – overall health and wellness. These factors may, in certain cases, play an even more profound role in wellness than traditional physical practices. “Joie de vivre” is a term used to describe overall enjoyment of life, and in a sense, encompasses all three domains.
The Minimal Data Set comprises demographic and tobacco use questions asked during enrollment at many quitlines. We tested whether these questions can be used to predict program engagement and success, and to evaluate whether findings can inform the tailoring of protocols to disparate populations. We analyzed 7,920 Arizona Smokers' Helpline treatment records to test a Structural Equation Model of the mediating effects of quitline services and short-term cessation outcomes on the relationship between intake questions and 7-month quit rate. Education (b = 0.05), gender (b = 0.03), Medicaid (b = −0.09), longest length of previous quit attempt (b = 0.05), confidence in quitting for 24 h (b = 0.04), environmental risk (b = −0.05), and life stress (b = 0.04) all significantly (P < 0.05) predicted engagement in quitline services. Program engagement had a direct effect on an in-program cessation outcomes construct (b = 0.47) and 7-month quit rate (b = 0.44). This in-program cessation outcomes construct had a significant direct effect on 7-month quit rate (b = −0.12). This model showing the relationship between program engagement and outcomes suggests that tailoring protocols can focus on engaging clients who have historically not taken full advantage of quitline services.
Consumer willingness to pay (WTP) for yogurt attributes was evaluated using a survey targeted to be nationally representative within the United States. A novel approach allowed respondents to self-select into the choice experiment for one of the two commonly purchased types of yogurt, Greek or traditional, based on what they purchase. Respondents were willing to pay a positive amount for requiring pasture access and for not permitting dehorning/disbudding (the removal of horns or horn buds) for both traditional and Greek yogurt. Respondents also had positive WTP for Greek yogurt labeled free of high-fructose corn syrup and a higher WTP for low-fat yogurt compared with nonfat for both yogurt types.
Replicated evidence indicates that children with attention-deficit/hyperactivity disorder (ADHD) show disproportionate increases in hyperactivity/physical movement when their underdeveloped executive functions are taxed. However, our understanding of hyperactivity’s relation with set shifting is limited, which is surprising given set shifting’s importance as the third core executive function alongside working memory and inhibition. The aim of this study was to experimentally examine the effect of imposing set shifting and inhibition demands on objectively measured activity level in children with and without ADHD.
The current study used a validated experimental manipulation to differentially evoke set shifting, inhibition, and general cognitive demands in a carefully phenotyped sample of children aged 8–13 years with ADHD (n = 43) and without ADHD (n = 34). Activity level was sampled during each task using multiple, high-precision actigraphs; total hyperactivity scores (THS) were calculated.
Results of the 2 × 5 Bayesian ANOVA for hyperactivity revealed strong support for a main effect of task (BF10 = 1.79 × 10¹⁸, p < .001, ω² = .20), such that children upregulated their physical movement in response to general cognitive demands and set shifting demands specifically, but not in response to increased inhibition demands. Importantly, however, this manipulation did not disproportionately increase hyperactivity in ADHD, as demonstrated by significant evidence against the task × group interaction (BF01 = 18.21, p = .48, ω² = .002).
Inhibition demands do not cause children to upregulate their physical activity. Set shifting produces reliable increases in children’s physical movement/hyperactivity over and above the effects of general cognitive demands but cannot specifically explain hyperactivity in children with ADHD.
Leukocyte telomere length (LTL) is a widely hypothesized biomarker of biological aging. Persons with shorter LTL may have a greater likelihood of developing dementia. We investigate whether LTL is associated with cognitive function, and whether this association differs between individuals without cognitive impairment and individuals with dementia or incipient dementia.
Enrolled subjects belong to the Long Life Family Study (LLFS), a multi-generational cohort study, where enrollment was predicated upon exceptional family longevity. Included subjects had valid cognitive and telomere data at baseline. Exclusion criteria were age ≤ 60 years, outlying LTL, and missing sociodemographic/clinical information. Analyses were performed using linear regression with generalized estimating equations, adjusting for sex, age, education, country, generation, and lymphocyte percentage.
Older age and male gender were associated with shorter LTL, and LTL was significantly longer in family members than in spouse controls (p < 0.005). LTL was not associated with working or episodic memory, semantic processing, or information processing speed, either for the 1613 cognitively unimpaired individuals or for the 597 individuals with dementia or incipient dementia, who scored significantly lower on all cognitive domains (p < 0.005).
Within this unique LLFS cohort, a group of families assembled on the basis of exceptional survival, LTL is unrelated to cognitive ability for individuals with and without cognitive impairment. LTL does not change in the context of degenerative disease for these individuals who are biologically younger than the general population.
Asking psychiatric in-patients about their drug consumption is unlikely to yield reliable results, particularly where alcohol and illicit drug use is involved. The main aim of this study was to compare spontaneous self-reports of drug use in hospitalized psychiatric patients with biological measures of the same. A secondary aim was to determine which personal factors were associated with the use of tobacco, alcohol, and illicit drugs as indicated by these biological measures.
The consumption of substances was investigated using biological measures (urine cotinine, cannabis, opiates, cocaine, amphetamines and barbiturates; blood carbohydrate-deficient transferrin [CDT] and gamma-glutamyl transferase [GGT]) in 486 consecutively admitted psychiatric patients, one day after their hospitalization. Patients’ self-reports of alcohol, tobacco and illicit drug consumption were recorded. Socio-professional and familial data were also recorded.
The results show a low correlation between biological measures and self-reported consumption of alcohol and illicit drugs. Fifty-two percent of the patients under-reported their consumption of illicit drugs (kappa = .47). Patients with schizophrenia and personality disorders were more likely to disclose their illicit drug consumption than patients suffering from mood disorders and alcohol dependence. Fifty-six percent of patients under-reported alcohol use, as evaluated by CDT (kappa = .2), and 37% under-reported when the CDT + GGT measure was used as an indicator. Smoking appeared to be reported adequately. We also observed a strong negative correlation between cannabis use and age, a strong correlation between tobacco and cannabis use, and correlations between tobacco, cannabis and alcohol consumption.
This study is the first to compare self-reports and biological measures of alcohol, tobacco and illicit drug use in a large sample of inpatients suffering from various categories of psychiatric illness, allowing for cross-diagnosis comparisons.
The opioid epidemic has led to the widespread distribution of naloxone to emergency personnel and to the general public. Recommended storage conditions based on prescribing information are between 15°C and 25°C (59°F and 77°F), with excursions permitted between 4°C and 40°C (39°F and 104°F). Actual storage likely varies widely, with potential exposures to extreme temperatures outside of these ranges. These potentially prolonged extreme temperatures may alter the volume of naloxone dispensed from the nasal spray device, which could result in suboptimal efficacy.
The aim of this study was to assess the naloxone volume deployed following nasal spray device storage at extreme temperatures over an extended period of time.
Naloxone nasal spray devices were exposed for 10 hours to storage temperatures of −29°C (−20°F), 20°C (68°F; control), and 71°C (160°F) to simulate extreme temperatures. First, the density was measured under each temperature condition. Then the mass of naloxone dispensed from each nasal spray device at each temperature was captured and used to calculate volume: calculated volume (microliter, µl) = spray mass (mg, converted to g)/mean density (g/mL), converted to µl. Measurements and calculations are reported as means with standard deviation and standard error, and a one-way ANOVA was used to evaluate mean dispensed volume differences at the different temperatures.
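The volume calculation above is simple unit arithmetic (mass in g divided by density in g/mL yields mL, then scaled to µl); a minimal sketch, with an illustrative mass and density rather than the study's raw data:

```python
def dispensed_volume_ul(spray_mass_mg, mean_density_g_per_ml):
    """Convert a dispensed spray mass (mg) to a volume (µl):
    mass (g) / density (g/mL) gives mL; 1 mL = 1000 µl."""
    mass_g = spray_mass_mg / 1000.0
    volume_ml = mass_g / mean_density_g_per_ml
    return volume_ml * 1000.0

# Illustrative values: a 104 mg spray at a mean density of 1.025 g/mL
vol = dispensed_volume_ul(104.0, 1.025)  # ~101.5 µl
```

A target device dose of roughly 100 µl is consistent with the mean volumes reported in the results.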
There was no difference in the mean volume deployed at −29°C (−20°F), 20°C (68°F), and 71°C (160°F); measurements were 101.44 µl (SD = 9.56; SE = 5.52), 99.01 µl (SD = 6.31; SE = 3.64), and 108.28 µl (SD = 2.04; SE = 1.18), respectively (P = .289; F = 1.535).
The results of this study suggest that naloxone nasal spray devices will dispense the appropriate volume, even when stored at extreme temperatures outside of the manufacturer’s recommended range.
High-resolution analysis of the ice core from Colle Gnifetti, Switzerland, allows yearly and sub-annual measurement of pollution for the period of highest lead production in the European Middle Ages, c. AD 1170–1220. Here, the authors use atmospheric circulation analysis and other geoarchaeological records to establish that Britain was the principal source of that lead pollution. The comparison of annual lead deposition at Colle Gnifetti displays a strong similarity to trends in lead production documented in the English historical accounts. This research provides unique new insight into the yearly political economy and environmental impact of the Angevin Empire of Kings Henry II, Richard the Lionheart and John.
Different manufacturers recommend different levels of disinfection for oxygen nipple and nut adaptors, also known as Christmas-tree adaptors (CTAs). We aimed to determine the bacterial contamination rates of CTAs before and after clinical use and whether disinfection wipes effectively eliminate bacteria from CTAs.
CTAs were swabbed for bacteria directly from the shipment box or after use in a medical intensive care unit to determine levels of contamination. CTAs were also inoculated in the laboratory with a variety of bacteria and disinfected with either 0.5% hydrogen peroxide (Oxivir 1) or 0.25% tetra-ammonium chloride with 44.50% isopropyl alcohol (Super Sani-Cloth), and the effectiveness of each wipe was determined by comparing the bacterial recovery before and after disinfection.
CTAs exhibit low levels of bacterial burden before and after clinical use. Both disinfecting wipes were effective at removing bacteria from the CTAs.
Low-level disinfection of CTAs is appropriate prior to redeployment in the clinical setting.
Obesity is characterized by chronic low-grade inflammation. Visceral adipose tissue (VAT) is heavily infiltrated by macrophages producing pro-inflammatory cytokines (IL-6), therefore VAT predicts greater systemic inflammation compared to peripheral fat. Thus, central adiposity may cause increased serum hepcidin (SHep) and may affect iron metabolism more than peripheral adiposity. Although increased total body fat (BF) is linked to disordered iron homeostasis, the potential effects of body fat distribution on iron metabolism have not been studied. Therefore, the aim of this study was to assess the effect of BF distribution on iron and inflammation parameters, SHep and iron metabolism.
We enrolled 37 normal-weight women and 81 overweight/obese women in this cross-sectional study. Body composition was assessed using DXA, and iron and inflammation parameters and SHep were measured. The overweight/obese women were assigned to a peripheral (n = 54) or a central (n = 27) fat deposit group, according to their android fat percentage. All women received 100 mg oral iron as ferrous citrate, and the change in serum iron was assessed after 2 h to determine iron absorption.
The three groups differed significantly in body weight, BMI, waist circumference, android fat, gynoid fat, total fat, android/gynoid ratio and VAT (for all p < 0.001). Hemoglobin, serum ferritin, body iron stores (BIS), serum iron and transferrin saturation (TSAT) were lowest, and transferrin receptor was highest, in central obesity. CRP was higher in central obesity compared with both peripheral obesity (p < 0.05) and normal weight (p < 0.001). SHep was higher in central and peripheral obesity compared with normal weight (both p < 0.01), with no difference between the two overweight/obese groups. Δserum iron was ≈30% and ≈20% lower in central obesity compared with normal weight and peripheral obesity, respectively. We performed linear regression analysis on SHep, CRP, TSAT and Δserum iron: android fat and BIS were positive predictors of SHep (p < 0.05, p < 0.001), android fat was a positive predictor of CRP (p < 0.001), BIS was a positive and android fat a negative predictor of TSAT (p < 0.001, p < 0.05), and TSAT and android fat were both negative predictors of Δserum iron (p < 0.001, p < 0.05).
After controlling for iron status, inflammation and SHep are increased in women with central obesity and predict lower iron absorption and hypoferremia compared with women with a more peripheral fat distribution. Thus, women with central fat distribution may be at increased risk of iron deficiency and anemia.
Overweight/obesity (owob) causes low-grade systemic inflammation and thereby an up-regulation of hepcidin and a reduction in fractional iron absorption (FIA), even with low iron stores. Pregnancy increases iron needs because of the expansion of maternal blood volume and fetal needs. It is unclear to what extent owob in pregnancy influences FIA, the iron supply of the fetus and the risk of iron deficiency. Therefore, the main aim of this study was to determine the effect of maternal owob on iron absorption during pregnancy and on iron transfer to the fetus. Secondary objectives were to investigate the development of hepcidin, plasma ferritin and inflammatory markers over the course of pregnancy depending on weight status. In this multicenter case-control study we included 44 normal weight (nw) and 36 owob women around pregnancy week (PW) 12. We administered 57Fe- or 58Fe-labeled FeSO4 to the women during the 2nd and 3rd trimester of pregnancy. We measured FIA by determining erythrocyte incorporation of the iron stable isotopes 14 days after administration. From PW 12 until PW 36, iron, inflammation and hepcidin parameters were monitored. Iron transfer to the fetus was determined as the iron stable isotope concentration in cord blood. Sample analysis is currently ongoing; all results will be available in October. Subject characteristics in PW 12 for the nw (n = 26) and owob (n = 10) groups were: mean BMI 21.4 ± 2.2 and 36.7 ± 6.8 kg/m2, mean hemoglobin 12.4 ± 1.2 and 12.4 ± 0.9 g/dL, and median plasma ferritin 41.3 (29.6–83.6) and 61.6 (24.3–119.0) μg/L. Preliminary results indicate FIA increased 2.4-fold in the nw and 1.3-fold in the owob women between the 2nd and 3rd trimester of pregnancy. Iron stores decreased in both groups over the course of pregnancy. Hepcidin was still significantly higher in the owob women in the 3rd trimester. Inflammation tended to be higher in owob women throughout pregnancy. Iron isotopes were readily detectable in cord blood.
The 58Fe:57Fe ratio determined in cord blood corresponded to the 58Fe:57Fe ratio determined in the mother in the 3rd trimester. Thus, in owob women, the increase in FIA throughout pregnancy to support the iron needs of mother and fetus is blunted compared with nw women. This is consistent with elevated hepcidin in the 3rd trimester and higher inflammation throughout pregnancy. In conclusion, even though iron demands are strongly increased, owob may impede an adequate iron supply to the expectant mother and the fetus due to persistent subclinical inflammation.
Simulation plays an integral role in the Canadian healthcare system with applications in quality improvement, systems development, and medical education. High-quality, simulation-based research will ensure its effective use. This study sought to summarize simulation-based research activity and its facilitators and barriers, as well as establish priorities for simulation-based research in Canadian emergency medicine (EM).
Simulation leads from Canadian departments or divisions of EM associated with a general FRCP-EM training program surveyed and documented active EM simulation-based research at their institutions and identified the perceived facilitators and barriers. Priorities for simulation-based research were generated by the simulation leads via a second survey; these were grouped into themes and finally endorsed by consensus during an in-person meeting of simulation leads. Priority themes were also reviewed by senior simulation educators.
Twenty simulation leads representing all 14 invited institutions participated in the study between February and May 2018. Sixty-two active simulation-based research projects were identified (median per institution = 4.5, IQR 4), as well as six common facilitators and five barriers. Forty-nine priorities for simulation-based research were reported and summarized into eight themes: simulation in competency-based medical education, simulation for inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology.
This study summarized simulation-based research activity in EM in Canada, identified its perceived facilitators and barriers, and built national consensus on priority research themes. This represents the first step in the development of a simulation-based research agenda specific to Canadian EM.
Tranexamic acid (TXA) is an antifibrinolytic agent shown to reduce morbidity and mortality in hemorrhagic shock. It has potential use in prehospital and wilderness medicine; however, in these environments, TXA is likely to be exposed to fluctuating and extreme temperatures. If TXA degrades under these conditions, this may reduce antifibrinolytic effects.
This study sought to determine if repetitive temperature derangement causes degradation of TXA.
Experimental samples underwent seven days of either freeze/thaw or heating cycles and were then analyzed via mass spectrometry for degradation of TXA. An internal standard was used for comparison between experimental samples and controls. These samples were compared to room-temperature controls to determine whether fluctuating extreme temperatures cause degradation of TXA.
The coefficient of variation of the ratios of TXA to internal standard within each group (room temperature, freeze, and heated) was less than five percent. Independent t-tests were performed on freeze/thaw versus control samples (t = 2.77; P = .17) and on heated versus control samples (t = 2.77; P = .722), demonstrating no difference between the groups.
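The comparison above is a standard independent-samples t-test on the TXA:internal-standard ratios; a minimal sketch with a pooled-variance statistic, using hypothetical ratio values rather than the study's measurements:

```python
from statistics import mean, stdev
import math

def independent_t(x, y):
    """Equal-variance independent-samples t statistic (pooled SD)."""
    nx, ny = len(x), len(y)
    # Pooled variance across the two samples
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Hypothetical TXA:internal-standard ratios for heated vs. control samples
heated = [1.02, 0.99, 1.01, 1.00]
control = [1.00, 1.01, 0.99, 1.02]
t_stat = independent_t(heated, control)
```

With identical group means, as in this hypothetical data, the statistic is zero; a non-significant P value, as reported, indicates the group means are statistically indistinguishable.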
These results suggest that TXA remains stable despite repeated exposure to extreme temperatures and does not significantly degrade. These findings support the stability of TXA and its use in extreme environments.
The mammal family Tenrecidae (Afrotheria: Afrosoricida) is endemic to Madagascar. Here we present the conservation priorities for the 31 species of tenrec that were assessed or reassessed in 2015–2016 for the IUCN Red List of Threatened Species. Six species (19.4%) were found to be threatened (4 Vulnerable, 2 Endangered) and one species was categorized as Data Deficient. The primary threat to tenrecs is habitat loss, mostly as a result of slash-and-burn agriculture, but some species are also threatened by hunting and incidental capture in fishing traps. In the longer term, climate change is expected to alter tenrec habitats and ranges. However, the lack of data for most tenrecs on population size, ecology and distribution, together with frequent changes in taxonomy (with many cryptic species being discovered based on genetic analyses) and the poorly understood impact of bushmeat hunting on spiny species (Tenrecinae), hinders conservation planning. Priority conservation actions are presented for Madagascar's tenrecs for the first time since 1990 and focus on conserving forest habitat (especially through improved management of protected areas) and filling essential knowledge gaps. Tenrec research, monitoring and conservation should be integrated into broader sustainable development objectives and programmes targeting higher profile species, such as lemurs, if we are to see an improvement in the conservation status of tenrecs in the near future.
The seventh-century AD switch from gold to silver currencies transformed the socio-economic landscape of North-west Europe. The source of silver, however, has proven elusive. Recent research, integrating ice-core data from the Colle Gnifetti drill site in the Swiss Alps, geoarchaeological records and numismatic and historical data, has provided new evidence for this transformation. Annual ice-core resolution data are combined with lead pollution analysis to demonstrate that significant new silver mining facilitated the change to silver coinage, and dates the introduction of such coinage to c. AD 660. Archaeological evidence and atmospheric modelling of lead pollution locates the probable source of the silver to mines at Melle, in France.