Selenium (Se) is an essential element for human health. However, our knowledge of the prevalence of Se deficiency is less complete than for other micronutrients of public health concern such as iodine, iron and zinc, especially in sub-Saharan Africa (SSA). Studies of food systems in SSA, in particular in Malawi, have revealed that human Se deficiency risks are widespread and influenced strongly by geography. Direct evidence of Se deficiency risks includes nationally representative data of Se concentrations in blood plasma and urine as population biomarkers of Se status. Long-range geospatial variation in Se deficiency risks has been linked to soil characteristics and their effects on the Se concentration of food crops. Selenium deficiency risks are also linked to socio-economic status, including access to animal source foods. This review highlights the need for geospatially resolved data on the movement of Se and other micronutrients in food systems that span agriculture–nutrition–health disciplinary domains (defined as a GeoNutrition approach). Given that similar drivers of deficiency risks for Se and other micronutrients are likely to occur in other countries in SSA and elsewhere, micronutrient surveillance programmes should be designed accordingly.
Evidence on whether nutritional supplementation affects physical activity (PA) during early childhood is limited. We examined the long-term effects of lipid-based nutrient supplements (LNS) on total PA, moderate-to-vigorous PA (MVPA) and sedentary behaviour (SB) of children at 4–6 years of age, measured with an accelerometer worn for 1 week. Their mothers were enrolled in the International Lipid-based Nutrient Supplement-DYAD randomised controlled trial in Ghana and assigned to daily LNS or multiple micronutrients (MMN) during pregnancy through 6 months postpartum, or to Fe and folic acid (IFA) during pregnancy and placebo for 6 months postpartum. From 6 to 18 months, children in the LNS group received LNS; the other two groups received no supplements. Analysis was done on an intention-to-treat basis comparing two groups: LNS v. non-LNS (MMN + IFA). Of the sub-sample of 375 children fitted with accelerometers, 353 provided sufficient data. Median vector magnitude (VM) count was 1374 (interquartile range (IQR) 309), and percentages of time in MVPA and SB were 4·8 (IQR 2) and 31 (IQR 8) %, respectively. The LNS group (n 129) had lower VM (difference in means −73 (95 % CI −20, −126), P = 0·007) and spent more time in SB (LNS v. non-LNS: 32·3 v. 30·5 %, P = 0·020) than the non-LNS group (n 224) but did not differ in MVPA (4·4 v. 4·7 %, P = 0·198). Contrary to expectations, provision of LNS in early life slightly reduced total PA and increased time in SB but did not affect time in MVPA. Given the reduced social-emotional difficulties, including hyperactivity, previously reported in the LNS group, one possible explanation is less restless movement in the LNS group.
Introduction: Telephone Triage Services (TTS) manage phone calls from the public regarding general medical problems and provide telephone advice. This telephone-based care can overlap with care provided by Poison Centres. Our objective was to examine the impact of a provincial 811 TTS on the IWK Regional Poison Centre (RPC). Methods: This is a retrospective descriptive study using interrupted time series methodology. We compared monthly IWK RPC call volume in the pre-811 era (January 2007–July 2009) and the post-811 era (September 2009–December 2017). We summarized the characteristics of callers who accessed the IWK RPC in terms of client age, sex, intentionality, time of day, call disposition and outcome. Caller characteristics were compared between the pre- and post-811 eras using the chi-square test for categorical variables. We used segmented regression analysis to evaluate changes in slope of call volume between the pre- and post-811 eras. The Durbin–Watson statistic was used to test for serial correlation and the Dickey–Fuller test to investigate seasonality. Results: The dataset included 82 683 calls to the IWK RPC: 27 028 pre-811 and 55 655 post-811. Overall, 55% of calls were for female clients, and the largest age group was children aged 0–5 years (37%). Most calls originated from home (47%), followed by a health care facility (23%). Most calls were managed at home (65%). Less than 3% of calls resulted in a major effect or death. The Durbin–Watson statistic was not statistically significant (p = 0.94). The Dickey–Fuller test indicated series stationarity (p = 0.001). There was no statistically significant change in call volume to the IWK RPC attributable to the introduction of 811 (p = 0.39). There was no significant variation by time of day, day of week or month, with most calls occurring in the evening. There were significantly more calls regarding intentional ingestions in the post-811 era (23% vs. 19% pre-811, p < 0.001).
Outcomes in the pre- and post-811 eras were as follows: minor/no effect/non-toxic/minimal 80% vs. 78%; moderate 7% vs. 10%; and major/death 1.7% vs. 2.0%. Conclusion: The introduction of a TTS did not change call volumes at our RPC. The increase in the percentage of calls about intentional ingestions may reflect an increase in call acuity, as the 811 TTS likely manages calls about minor/non-toxic ingestions without consulting the RPC. Our future research will examine the nature of poison-related calls to the 811 TTS.
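The segmented regression approach described in the Methods can be sketched as follows. This is an illustrative model fitted to synthetic monthly counts, not the IWK dataset; the interruption point, slopes and noise level are invented for demonstration.

```python
import numpy as np

def segmented_regression(t, y, t0):
    """Fit the interrupted time series model
    y = b0 + b1*t + b2*post + b3*(t - t0)*post,
    where post = 1 at or after the interruption time t0.
    Returns (b0, b1, b2, b3): baseline level, baseline slope,
    level change at t0 and slope change after t0."""
    t = np.asarray(t, dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta

# Synthetic series: 120 months with a steady trend and no real change
# at month 60, mimicking the null finding reported above.
rng = np.random.default_rng(0)
t = np.arange(120, dtype=float)
y = 300 + 1.5 * t + rng.normal(0, 5, size=120)
b0, b1, b2, b3 = segmented_regression(t, y, t0=60)
```

In this framework, a non-significant level change (b2) and slope change (b3) correspond to the study's conclusion that 811 did not alter call volume.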
Firefighters are routinely exposed to various traumatic events and often experience a range of trauma-related symptoms. Although these repeated traumatic exposures rarely progress to the development of post-traumatic stress disorder, firefighters are still considered to be a vulnerable population with regard to trauma.
To investigate how the human brain responds to or compensates for the repeated experience of traumatic stress.
We included 98 healthy firefighters with repeated traumatic experiences but without any diagnosis of mental illness and 98 non-firefighter healthy individuals without any history of trauma. Functional connectivity within the fear circuitry, which consists of the dorsal anterior cingulate cortex, insula, amygdala, hippocampus and ventromedial prefrontal cortex (vmPFC), was examined using resting-state functional magnetic resonance imaging. Trauma-related symptoms were evaluated using the Impact of Event Scale – Revised.
The firefighter group had greater functional connectivity between the insula and several regions of the fear circuitry including the bilateral amygdalae, bilateral hippocampi and vmPFC as compared with healthy individuals. In the firefighter group, stronger insula–amygdala connectivity was associated with greater severity of trauma-related symptoms (β = 0.36, P = 0.005), whereas higher insula–vmPFC connectivity was related to milder symptoms in response to repeated trauma (β = −0.28, P = 0.01).
The current findings suggest an active involvement of insular functional connectivity in response to repeated traumatic stress. Functional connectivity of the insula in relation to the amygdala and vmPFC may be potential pathways that underlie the risk for and resilience to repeated traumatic stress, respectively.
Background Attention-deficit/hyperactivity disorder (ADHD) is among the most common psychiatric disorders of childhood and often persists into adulthood and old age. Yet ADHD is currently underdiagnosed and undertreated in many European countries, leading to chronicity of symptoms and impairment due to a lack of, or ineffective, treatment, and to higher costs of illness.
Methods The European Network Adult ADHD and the Section for Neurodevelopmental Disorders Across the Lifespan (NDAL) of the European Psychiatric Association (EPA), aim to increase awareness and knowledge of adult ADHD in and outside Europe. This Updated European Consensus Statement aims to support clinicians with research evidence and clinical experience from 63 experts of European and other countries in which ADHD in adults is recognized and treated.
Results Besides reviewing the latest research on prevalence, persistence, genetics and neurobiology of ADHD, three major questions are addressed: (1) What is the clinical picture of ADHD in adults? (2) How should ADHD be properly diagnosed in adults? (3) How should adult ADHD be effectively treated?
Conclusions ADHD often presents as a lifelong impairing condition. The stigma surrounding ADHD, mainly due to lack of knowledge, increases the suffering of patients. Education on the lifespan perspective, diagnostic assessment, and treatment of ADHD must increase for students of general and mental health, and for psychiatry professionals. Instruments for screening and diagnosis of ADHD in adults are available, as are effective evidence-based treatments for ADHD and its negative outcomes. More research is needed on gender differences, and in older adults with ADHD.
The illegal wildlife trade is driving declines in populations of a number of large, charismatic animal species but also many lesser known and restricted-range species, some of which are now facing extinction as a result. The ploughshare tortoise Astrochelys yniphora, endemic to the Baly Bay National Park of north-western Madagascar, is affected by poaching for the international illegal pet trade. To quantify this, we estimated population trends during 2006–2015, using distance sampling surveys along line transects, and recorded national and international confiscations of trafficked tortoises for 2002–2016. The results suggest the ploughshare tortoise population declined > 50% during this period, to c. 500 adults and subadults in 2014–2015. Prior to 2006 very few tortoises were seized either in Madagascar or internationally but confiscations increased sharply from 2010. Since 2015 poaching has intensified, with field reports suggesting that two of the four subpopulations are extinct, leaving an unknown but almost certainly perilously low number of adult tortoises in the wild. This study has produced the first reliable population estimate of the ploughshare tortoise and shows that the species has declined rapidly because of poaching for the international pet trade. There is an urgent need for increased action both in Madagascar and along international trade routes if the extinction of the ploughshare tortoise in the wild is to be prevented.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Anti-retroviral therapy (ART) regimes for HIV are associated with raised levels of circulating triglycerides (TGs) in western populations. However, there are limited data on the impact of ART on cardiometabolic risk in sub-Saharan African (SSA) populations.
We conducted pooled analyses of 14 studies comprising 21 023 individuals in SSA, on whom relevant cardiometabolic risk factors (including TG) and HIV and ART status were assessed between 2003 and 2014. The association between ART and raised TG (>2.3 mmol/L) was analysed using regression models.
Among 10 615 individuals, ART was associated with a two-fold higher probability of raised TG (RR 2.05, 95% CI 1.51–2.77, I2 = 45.2%). The associations between ART and raised blood pressure, glucose, HbA1c, and other lipids were inconsistent across studies.
Evidence from this study confirms the association of ART with raised TG in SSA populations. Given the possible causal effect of raised TG on cardiovascular disease (CVD), the evidence highlights the need for prospective studies to clarify the impact of long term ART on CVD outcomes in SSA.
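The abstract above reports a risk ratio estimated from regression models; as a simpler illustration of what an unadjusted RR and its 95% CI represent, the standard 2×2-table calculation can be sketched as follows. The counts below are hypothetical and are not the study data.

```python
import math

def risk_ratio(a, b, c, d):
    """Unadjusted risk ratio and 95% CI from a 2x2 table:
    exposed:   a with the outcome, b without
    unexposed: c with the outcome, d without
    CI via the usual log-RR normal approximation."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 80/400 on ART with raised TG v. 100/1000 not on
# ART, giving a two-fold unadjusted risk ratio.
rr, lo, hi = risk_ratio(80, 320, 100, 900)
```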
People with pancreatic cancer have poor survival, and management is challenging. Pancreatic cancer patients' perceptions of their care coordination and its association with their outcomes have not been well-studied. Our objective was to determine if perception of care coordination is associated with patient-reported outcomes or survival.
People with pancreatic cancer who were 1–8 months postdiagnosis (52 with completed resection and 58 with no resection) completed a patient-reported questionnaire that assessed their perceptions of care coordination, quality of life, anxiety, and depression using validated instruments. Mean scores for 15 care-coordination items were calculated and then ranked from highest (best experience) to lowest (worst experience). Associations between care-coordination scores (including communication and navigation domains) and patient-reported outcomes and survival were investigated using general linear regression and Cox regression, respectively. All analyses were stratified by whether or not the tumor had been resected.
In both groups, the highest-ranked care-coordination items were: knowing who was responsible for coordinating care, health professionals being informed about their history, and waiting times. The worst-ranked items related to: how often patients were asked about visits with other health professionals and how well they and their family were coping, knowing the symptoms they should monitor, having sufficient emotional help from staff, and access to additional specialist services. For people who had a resection, better communication and navigation scores were significantly associated with higher quality of life and less anxiety and depression. However, these associations were not statistically significant for those with no resection. Perception of cancer care coordination was not associated with survival in either group.
Significance of results:
Our results suggest that, while many core clinical aspects of care are perceived to be done well for pancreatic cancer patients, improvements in emotional support, referral to specialist services, and self-management education may improve patient-reported outcomes.
Fruit and vegetable (FV) intake is associated with reduced risk of a number of non-communicable diseases. Research tends to focus on antioxidants, flavonoids and polyphenols contained in FV as the main beneficial components to health; however, increasing FV may also alter the overall diet profile. Extra FV may be substituted for foods thought to be less healthy, thereby altering the overall macronutrient and/or micronutrient content of the diet. This analysis merged dietary data from four intervention studies in participants with varying health conditions and examined the effect of increased FV consumption on diet profile. Dietary intake was assessed by either diet diaries or diet histories used in four FV randomised intervention studies. All food and drink intake recorded was analysed using WISP version 3.0, and FV portions were manually counted using household measures. Regression analysis revealed significant increases in intakes of energy (+172 kJ (+41 kcal)), carbohydrate (+3·9 g/4184 kJ (1000 kcal)), total sugars (+6·0 g/4184 kJ (1000 kcal)) and fibre (+0·8 g/4184 kJ (1000 kcal)) and significant decreases in intakes of total fat (−1·4 g/4184 kJ (1000 kcal)), SFA (−0·6 g/4184 kJ (1000 kcal)), MUFA (−0·6 g/4184 kJ (1000 kcal)), PUFA (−0·1 g/4184 kJ (1000 kcal)) and starch (−2·1 g/4184 kJ (1000 kcal)) per one-portion increase in FV. Significant percentage increases were also observed in vitamin C (+24 %) and β-carotene (+20 %) intake per one-portion increase in FV. In conclusion, pooled analysis of four FV intervention studies that used similar approaches to achieving dietary change, in participants with varying health conditions, demonstrated an increase in energy, total carbohydrate, sugars and fibre intake, and a decrease in fat intake, alongside an expected increase in micronutrient intake.
A significant minority of people presenting with a major depressive episode (MDE) experience co-occurring subsyndromal hypo/manic symptoms. As this presentation may have important prognostic and treatment implications, the DSM–5 codified a new nosological entity, the “mixed features specifier,” referring to individuals meeting threshold criteria for an MDE and subthreshold symptoms of (hypo)mania, or to individuals with syndromal mania and subthreshold depressive symptoms. The mixed features specifier adds to a growing list of monikers that have been put forward to describe phenotypes characterized by the admixture of depressive and hypomanic symptoms (e.g., mixed depression, depression with mixed features, or depressive mixed states [DMX]). Current treatment guidelines, regulatory approvals, as well as the current evidentiary base provide insufficient decision support to practitioners who provide care to individuals presenting with an MDE with mixed features. In addition, all existing psychotropic agents evaluated in mixed patients have largely been confined to patient populations meeting the DSM–IV definition of “mixed states,” wherein the co-occurrence of threshold-level mania and threshold-level MDE was required. Toward the aim of assisting clinicians providing care to adults with MDE and mixed features, we have assembled a panel of experts on mood disorders to develop these guidelines on the recognition and treatment of mixed depression, based on the few studies that have focused specifically on DMX as well as decades of cumulated clinical experience.
Critical to the development of improved HIV elimination efforts is a greater understanding of how social networks and their dynamics are related to HIV risk and prevention. In this paper, we examine network stability of confidant and sexual networks among young black men who have sex with men (YBMSM). We use data from uConnect (2013–2016), a population-based, longitudinal cohort study. We use an innovative approach to measure both sexual and confidant network stability at three time points, and examine the relationship between each type of stability and HIV risk and prevention behaviors. This approach is consistent with a co-evolutionary perspective in which behavior is not only affected by static properties of an individual's network, but may also be associated with changes in the topology of his or her egocentric network. Our results indicate that although confidant and sexual network stability are moderately correlated, their dynamics are distinct with different predictors and differing associations with behavior. Both types of stability are associated with lower rates of risk behaviors, and both are reduced among those who have spent time in jail. Public health awareness and engagement with both types of networks may provide new opportunities for HIV prevention interventions.
Antimicrobial resistance (AMR) is a global public health threat. Emergence of AMR occurs naturally, but can also be selected for by antimicrobial exposure in clinical and veterinary medicine. Despite growing worldwide attention to AMR, there are substantial limitations in our understanding of the burden, distribution and determinants of AMR at the population level. We highlight the importance of population-based approaches to assess the association between antimicrobial use and AMR in humans and animals. Such approaches are needed to improve our understanding of the development and spread of AMR in order to inform strategies for the prevention, detection and management of AMR, and to support the sustainable use of antimicrobials in healthcare.
Many adults with autism spectrum disorder (ASD) remain undiagnosed. Specialist assessment clinics enable the detection of these cases, but such services are often overstretched. It has been proposed that unnecessary referrals to these services could be reduced by prioritizing individuals who score highly on the Autism-Spectrum Quotient (AQ), a self-report questionnaire measure of autistic traits. However, the ability of the AQ to predict who will go on to receive a diagnosis of ASD in adults is unclear.
We studied 476 adults, seen consecutively at a national ASD diagnostic referral service for suspected ASD. We tested AQ scores as predictors of ASD diagnosis made by expert clinicians according to International Classification of Diseases (ICD)-10 criteria, informed by the Autism Diagnostic Observation Schedule-Generic (ADOS-G) and Autism Diagnostic Interview-Revised (ADI-R) assessments.
Of the participants, 73% received a clinical diagnosis of ASD. Self-report AQ scores did not significantly predict receipt of a diagnosis. While AQ scores provided high sensitivity of 0.77 [95% confidence interval (CI) 0.72–0.82] and positive predictive value of 0.76 (95% CI 0.70–0.80), the specificity of 0.29 (95% CI 0.20–0.38) and negative predictive value of 0.36 (95% CI 0.22–0.40) were low. Thus, 64% of those who scored below the AQ cut-off were ‘false negatives’ who did in fact have ASD. Co-morbidity data revealed that generalized anxiety disorder may ‘mimic’ ASD and inflate AQ scores, leading to false positives.
The AQ's utility for screening referrals was limited in this sample. Recommendations supporting the AQ's role in the assessment of adult ASD, e.g. UK NICE guidelines, may need to be reconsidered.
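The screening properties reported above follow directly from a 2×2 table of AQ result against clinical diagnosis. A minimal sketch, using hypothetical counts chosen only to mirror the reported pattern (high sensitivity and PPV, low specificity and NPV), not the study's actual table:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive
    value from screening counts (tp = above cut-off and diagnosed,
    tn = below cut-off and not diagnosed, etc.)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

# Hypothetical counts for illustration only.
sens, spec, ppv, npv = screening_metrics(tp=268, fp=91, fn=80, tn=37)
```

With counts like these, most people scoring below the cut-off are nonetheless diagnosed (low NPV), which is the 'false negative' problem the abstract describes.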
The burden and aetiology of type 2 diabetes (T2D) and its microvascular complications may be influenced by varying behavioural and lifestyle environments as well as by genetic susceptibility. These aspects of the epidemiology of T2D have not been reliably clarified in sub-Saharan Africa (SSA), highlighting the need for context-specific epidemiological studies with the statistical resolution to inform potential preventative and therapeutic strategies. Therefore, as part of the Human Heredity and Health in Africa (H3Africa) initiative, we designed a multi-site study comprising case collections and population-based surveys at 11 sites in eight countries across SSA. The goal is to recruit up to 6000 T2D participants and 6000 control participants. We will collect questionnaire data, biophysical measurements and biological samples for chronic disease traits, risk factors and genetic data on all study participants. Through integrating epidemiological and genomic techniques, the study provides a framework for assessing the burden, spectrum and environmental and genetic risk factors for T2D and its complications across SSA. With established mechanisms for fieldwork, data and sample collection and management, data-sharing and consent for re-approaching participants, the study will be a resource for future research studies, including longitudinal studies, prospective case ascertainment of incident disease and interventional studies.
With the changing distribution of infectious diseases, and an increase in the burden of non-communicable diseases, low- and middle-income countries, including those in Africa, will need to expand their health care capacities to effectively respond to these epidemiological transitions. The interrelated risk factors for chronic infectious and non-communicable diseases and the need for long-term disease management, argue for combined strategies to understand their underlying causes and to design strategies for effective prevention and long-term care. Through multidisciplinary research and implementation partnerships, we advocate an integrated approach for research and healthcare for chronic diseases in Africa.
Satellite altimetric time series allow high-precision monitoring of ice-sheet mass balance. Understanding elevation changes in these regions is important because outlet glaciers along ice-sheet margins are critical in controlling flow of inland ice. Here we discuss a new airborne altimetry dataset collected as part of the ICECAP (International Collaborative Exploration of the Cryosphere by Airborne Profiling) project over East Antarctica. Using the ALAMO (Airborne Laser Altimeter with Mapping Optics) system of a scanning photon-counting lidar combined with a laser altimeter, we extend the 2003–09 surface elevation record of NASA’s ICESat satellite, by determining cross-track slope and thus independently correcting for ICESat’s cross-track pointing errors. In areas of high slope, cross-track errors result in measured elevation change that combines surface slope and the actual Δz/Δt signal. Slope corrections are particularly important in coastal ice streams, which often exhibit both rapidly changing elevations and high surface slopes. As a test case (assuming that surface slopes do not change significantly) we observe a lack of ice dynamic change at Cook Ice Shelf, while significant thinning occurred at Totten and Denman Glaciers during 2003–09.
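The cross-track slope correction described above can be reduced to a toy calculation. This sketch assumes a single linear cross-track slope and uses invented numbers; it only illustrates why raw repeat-pass differences mix real elevation change with slope.

```python
def corrected_dz(z_t1, z_t2, cross_track_offset, cross_track_slope):
    """Elevation change between two altimeter passes, corrected for
    cross-track pointing: if pass 2 lies cross_track_offset metres to
    the side of pass 1 on a surface with cross_track_slope (m/m), the
    raw difference includes a slope-induced term that is not real
    thinning or thickening."""
    raw_dz = z_t2 - z_t1
    return raw_dz - cross_track_slope * cross_track_offset

# Toy numbers: the repeat pass is 100 m off-track on a 2% cross-track
# slope, so 2 m of the apparent change is slope, not ice dynamics.
dz = corrected_dz(z_t1=1500.0, z_t2=1497.0,
                  cross_track_offset=100.0, cross_track_slope=0.02)
```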