The term habitable is used to describe planets that can harbour life. Debate exists as to the specific conditions that allow for habitability, but the use of the term as a planetary variable has become ubiquitous. This paper poses a meta-level question: What type of variable is habitability? Is it akin to temperature, in that it is something that characterizes a planet, or is it something that flows through a planet, akin to heat? That is, is habitability a state or a process variable? Forthcoming observations can be used to discriminate between these end-member hypotheses. Each has different implications for the factors that lead to differences between planets (e.g. the differences between Earth and Venus). Observational tests can proceed independently of any new modelling of planetary habitability. However, the viability of habitability as a process can influence future modelling. We discuss a specific modelling framework based on anticipating observations that can discriminate between different views of habitability.
It is unclear whether mild-to-moderate dehydration affects mood independently of confounders such as heat exposure or exercise. This study examined the acute effect of cellular dehydration on mood. Forty-nine adults (55 % female, age 39 (sd 8) years) were assigned to counterbalanced, crossover trials. Intracellular dehydration was induced with a 2-h (0·1 ml/kg per min) infusion of 3 % hypertonic saline (HYPER) or 0·9 % isotonic saline (ISO) as a control. Plasma osmolality increased in HYPER (pre 285 (sd 3), post 305 (sd 4) mmol/kg; P < 0·05) but remained unchanged in ISO (pre 285 (sd 3), post 288 (sd 3) mmol/kg; P > 0·05). Mood was assessed with the short version of the Profile of Mood States questionnaire (POMS). POMS sub-scale scores (confusion-bewilderment, depression-dejection, fatigue-inertia) increased in HYPER compared with ISO (P < 0·05). The total mood disturbance score (TMD) assessed by POMS increased from 10·3 (sd 0·9) to 16·6 (sd 1·7) in HYPER (P < 0·01), but not in ISO (P > 0·05). When TMD was stratified by sex, the increase in the HYPER trial was significant in females (P < 0·01) but not in males (P > 0·05). Following infusion, thirst and copeptin (a surrogate for vasopressin) were also higher in females than in males (21·3 (sd 2·0) v. 14·1 (sd 1·4) pmol/l; P < 0·01) during HYPER. In conclusion, cellular dehydration acutely degraded specific aspects of mood, mainly in women. The mechanisms underlying these sex differences may be related to elevated thirst and vasopressin.
There is considerable interest in the therapeutic potential of cannabidiol (CBD), the second most abundant component of cannabis. While delta-9-THC, the main psychoactive ingredient of cannabis, impairs memory, induces anxiety and psychotic symptoms acutely and increases the risk of psychotic disorders in regular cannabis users, CBD does not impair memory and may have anxiolytic and possibly antipsychotic effects. Hence, we directly compared the acute neural effects of these two active ingredients of cannabis by combining pharmacological challenge with fMRI. Using a double-blind, repeated-measures design and oral challenge with 10 mg of delta-9-THC, 600 mg of CBD or placebo in 15 healthy volunteers, we examined whether delta-9-THC and CBD have opposing effects on the neural substrates of verbal memory and fear processing, and whether they also have opposing effects on the neural substrates of the anxiety and psychotic symptoms induced by delta-9-THC. Delta-9-THC induced anxiety and psychotic symptoms acutely, while there was a trend towards a reduction in anxiety but no change in psychotic symptoms with CBD. During the memory task, delta-9-THC attenuated and CBD increased activation in the striatum bilaterally. The effect of delta-9-THC on striatal activation was inversely correlated with the psychotic symptoms it concomitantly induced. During the processing of fearful faces, delta-9-THC increased and CBD attenuated activation in the amygdala, and these effects correlated with their anxiogenic and anxiolytic effects respectively. These opposing effects of CBD on the key neural substrates of the psychotic symptoms and anxiety induced by delta-9-THC suggest a possible therapeutic role for CBD in countering these symptoms.
The study sought to examine the neurophysiological effects of cannabidiol (CBD) on emotional processing using functional magnetic resonance imaging (fMRI).
Method
Fifteen healthy male participants (age range 18-35 years) with a lifetime exposure to cannabis of 15 times or fewer were recruited into a double-blind, event-related fMRI design. Prior to each scanning session, participants were given an oral dose of either 600 mg of CBD or a placebo. Drug blood levels were monitored via an intravenous line, while systolic and diastolic blood pressure and heart rate (beats per minute) were recorded manually. During the scan, subjects were presented with 10 different facial identities, each expressing 50% or 100% intensities of fear or a neutral expression. Neuropsychological performance and symptom ratings were recorded at baseline, immediately before scanning (1 hr), immediately after scanning (2 hr) and one hour post scanning (3 hr).
Results
CBD had no significant effect on the gender discrimination task. Reaction times were significantly faster when processing 100% fearful faces than when processing 50% fearful and neutral faces. CBD had a significant effect on brain activation in response to faces with emotional expressions, decreasing activation in the right posterior cingulate gyrus and in the right cerebellum compared with placebo. Furthermore, a significant interaction effect was observed: in the right cingulate gyrus, CBD attenuated activation during the processing of intensely fearful faces but had no effect on the neural response to neutral or mildly fearful faces.
Conclusion
CBD significantly modulates the neurophysiological response associated with anxiety.
To assess differences in cognitive function and gross brain structure in children seven years after an episode of severe acute malnutrition (SAM), compared with other Malawian children.
Design
Prospective longitudinal cohort study assessing the school grade achieved and the results of five computer-based (CANTAB) tests covering three cognitive domains. A subset underwent brain MRI scans, which were reviewed using a standardized checklist of gross abnormalities and compared with a reference population of Malawian children.
Setting
Blantyre, Malawi.
Participants
Children discharged from SAM treatment in 2006 and 2007 (n 320; median age 9·3 years) were compared with controls: siblings closest in age to the SAM survivors and age/sex-matched community children.
Results
SAM survivors were significantly more likely to be in a lower grade at school than controls (adjusted OR = 0·4; 95 % CI 0·3, 0·6; P < 0·0001) and had consistently poorer scores in all CANTAB cognitive tests. Adjusting for HIV and socio-economic status attenuated the statistical significance of these differences. There were no significant differences in the odds of brain abnormalities or sinusitis between SAM survivors (n 49) and reference children (OR = 1·11; 95 % CI 0·61, 2·03; P = 0·73).
Conclusions
Despite apparent preservation of gross brain structure, persistently impaired school achievement is likely to be detrimental to individual attainment and economic well-being. Understanding the multifactorial causes of lower school achievement is therefore needed to design interventions that enable SAM survivors to thrive in adulthood. The cognitive and potential economic implications of SAM need further emphasis to better advocate for SAM prevention and early treatment.
The evolutionary-aided design process is a method for finding solutions to design and optimisation problems. Evolutionary algorithms (EAs) search a solution space by evolving a population of candidate solutions over several generations. EAs have found applications in many areas of robotics. This paper covers efforts to determine the body morphology of robots through evolution, and to determine body morphology together with the controller of robots or similar creatures through co-evolution. The works are reviewed from the perspective of how different algorithms are applied, with a brief explanation of how they are implemented.
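As an illustration of the generation-by-generation search this kind of review describes, the following is a minimal, hypothetical sketch of an evolutionary loop in Python (truncation selection plus Gaussian mutation). The genome, fitness function and parameter values are invented for illustration; real morphology-evolution work typically evaluates fitness by simulating or physically testing the robot.

```python
import random

# Minimal sketch of a generic evolutionary loop (illustrative only; the
# surveyed works use many different EA variants and fitness functions).
# Each "morphology" here is just a list of parameters, and fitness is a
# made-up placeholder rewarding values near a hypothetical optimum.

POP_SIZE = 20
GENOME_LEN = 4        # e.g. four morphological parameters
GENERATIONS = 50
MUTATION_STD = 0.1

def fitness(genome):
    # Hypothetical fitness: in real work this would come from simulating
    # the robot (e.g. distance walked), not a closed-form expression.
    return -sum((g - 0.7) ** 2 for g in genome)

def mutate(genome):
    return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

population = [[random.random() for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    # Evaluate and keep the better half (truncation selection).
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]
    # Refill the population with mutated copies of the parents.
    population = parents + [mutate(random.choice(parents))
                            for _ in range(POP_SIZE - len(parents))]

best = max(population, key=fitness)
print("best morphology parameters:", [round(g, 3) for g in best])
```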
OBJECTIVES/SPECIFIC AIMS: Clinical guidelines recommend using predicted atherosclerotic cardiovascular disease (ASCVD) risk to inform treatment decisions. The objective was to compare the contribution of changes in modifiable risk factors versus aging to the development of high 10-year predicted ASCVD risk. METHODS/STUDY POPULATION: Prospective follow-up of the Jackson Heart Study, an exclusively African-American cohort, at visit 1 (2000–2004) and visit 3 (2009–2012). Analyses included 1115 African-American participants without a high 10-year predicted ASCVD risk (<7.5%), hypertension, diabetes, or ASCVD at visit 1. We used the Pooled Cohort equations to calculate the incidence of high (≥7.5%) 10-year predicted ASCVD risk at visit 3. We recalculated the percentage with a high 10-year predicted ASCVD risk at visit 3 assuming each risk factor [age, systolic blood pressure (SBP), antihypertensive medication use, diabetes, smoking, total and high-density lipoprotein cholesterol], one at a time, did not change from visit 1. RESULTS/ANTICIPATED RESULTS: The mean age at visit 1 was 45.2±9.5 years. Overall, 30.9% (95% CI 28.3%–33.4%) of participants developed high 10-year predicted ASCVD risk. Aging accounted for 59.7% (95% CI 54.2%–65.1%) of the development of high 10-year predicted ASCVD risk compared with 32.8% (95% CI 27.0%–38.2%) for increases in SBP or antihypertensive medication initiation and 12.8% (95% CI 9.6%–16.5%) for incident diabetes. Among participants <50 years, the contribution of increases in SBP or antihypertensive medication initiation was similar to that of aging. DISCUSSION/SIGNIFICANCE OF IMPACT: Increases in SBP and antihypertensive medication initiation are major contributors to the development of high 10-year predicted ASCVD risk in African Americans, particularly among younger adults.
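To make the "hold one factor at its visit-1 value" attribution concrete, here is a minimal, hypothetical Python sketch. The toy_risk function and the two example participants are invented stand-ins; the study itself used the published Pooled Cohort equations applied to Jackson Heart Study data.

```python
# Illustrative sketch (not the study's code) of attributing the development of
# high 10-year predicted ASCVD risk to individual risk-factor changes by
# recomputing risk with one factor held at its visit-1 value. toy_risk is a
# toy stand-in for the Pooled Cohort equations.

HIGH_RISK = 0.075  # 7.5% 10-year predicted risk threshold

def toy_risk(age, sbp, diabetes):
    # Toy monotone risk score in [0, 1]; a real analysis would use the
    # published Pooled Cohort equation coefficients.
    score = 0.003 * (age - 40) + 0.001 * (sbp - 110) + (0.05 if diabetes else 0.0)
    return min(max(score, 0.0), 1.0)

def share_high_risk(visit1, visit3, hold_fixed=None):
    """Fraction of participants with high risk at visit 3, optionally holding
    one factor (e.g. 'age' or 'sbp') at its visit-1 value."""
    n_high = 0
    for p1, p3 in zip(visit1, visit3):
        factors = dict(p3)
        if hold_fixed:
            factors[hold_fixed] = p1[hold_fixed]
        if toy_risk(**factors) >= HIGH_RISK:
            n_high += 1
    return n_high / len(visit3)

# Hypothetical two-participant example: comparing the observed share with the
# share obtained when age is held at visit 1 isolates the contribution of aging.
v1 = [dict(age=45, sbp=118, diabetes=False), dict(age=48, sbp=126, diabetes=False)]
v3 = [dict(age=54, sbp=125, diabetes=False), dict(age=57, sbp=140, diabetes=False)]
print(share_high_risk(v1, v3))                    # observed share at visit 3
print(share_high_risk(v1, v3, hold_fixed="age"))  # share had age not changed
```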
Code phase Global Navigation Satellite System (GNSS) positioning performance is often described by the Geometric or Position Dilution of Precision (GDOP or PDOP), functions of the number of satellites employed in the solution and their geometry. This paper develops lower bounds to both metrics solely as functions of the number of satellites, effectively removing the added complexity caused by their locations in the sky, to allow users to assess how well their receivers are performing with respect to the best possible performance. Such bounds will be useful as receivers sub-select from the plethora of satellites available with multiple GNSS constellations. The bounds are initially developed for one constellation assuming that the satellites are at or above the horizon. Satellite constellations that essentially achieve the bounds are discussed, again with value toward the problem of satellite selection. The bounds are then extended to a non-zero mask angle and to multiple constellations.
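For context, GDOP and PDOP are conventionally computed from the geometry matrix of unit line-of-sight vectors to the satellites. The sketch below shows these standard definitions in Python (not the paper's lower bounds), using a hypothetical five-satellite geometry.

```python
import numpy as np

# Minimal sketch of the standard GDOP/PDOP definitions, not the paper's
# bounds. Satellite directions are given as (azimuth, elevation) in degrees;
# the example constellation below is hypothetical.

def dops(az_el_deg):
    """Return (GDOP, PDOP) for satellites at the given azimuths/elevations."""
    rows = []
    for az, el in az_el_deg:
        az, el = np.radians(az), np.radians(el)
        # Unit line-of-sight vector (east, north, up) plus the clock-bias column.
        rows.append([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el),
                     1.0])
    H = np.array(rows)
    Q = np.linalg.inv(H.T @ H)           # covariance shape matrix
    gdop = np.sqrt(np.trace(Q))          # geometric DOP: position + clock
    pdop = np.sqrt(np.trace(Q[:3, :3]))  # position DOP only
    return gdop, pdop

# Hypothetical 5-satellite geometry: four low satellites spread in azimuth
# plus one near the zenith.
sats = [(0, 10), (90, 10), (180, 10), (270, 10), (45, 85)]
print(dops(sats))
```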
Observational evidence suggests that increased whole grain (WG) intake reduces the risks of many non-communicable diseases, such as CVD, type 2 diabetes, obesity and certain cancers. More recently, studies have shown that WG intake lowers all-cause and cause-specific mortality. Much of the reported evidence on risk reduction comes from US and Scandinavian populations, where there are tangible WG dietary recommendations. At present there is no quantity-specific WG dietary recommendation in the UK; instead, consumers are advised to choose WG or higher-fibre versions. Despite recognition of WG as an important component of a healthy diet, monitoring of WG intake in the UK has been poor, with the latest intake assessments based on data collected in 2000–2001 for adults and in 1997 for children. To update this information, we examined WG intake in the National Diet and Nutrition Survey rolling programme 2008–2011 after developing our database of WG food composition, a key resource for determining WG intake accurately. The results showed that median WG intakes remain low in both adults and children and below those of countries with quantity-specific guidance. We also found a reduction in C-reactive protein concentrations and leucocyte counts with increased WG intake, although no association with other markers of cardio-metabolic health. The recent recommendations by the UK Scientific Advisory Committee on Nutrition to increase dietary fibre intake will require a greater emphasis on consuming more WG. Specific recommendations on WG intake in the UK are warranted, as is the development of public health policy to promote consumption of these important foods.
Immigrants and their children who return to their country of origin to visit friends and relatives (VFR) are at increased risk of acquiring infectious diseases compared with other travellers. VFR travel is an important disease control issue, as one quarter of Australia's population is foreign-born and one quarter of departing Australian international travellers are visiting friends and relatives. We conducted a 1-year prospective enhanced surveillance study in New South Wales and Victoria, Australia to determine the contribution of VFR travel to notifiable diseases associated with travel, including typhoid, paratyphoid, measles, hepatitis A, hepatitis E, malaria and chikungunya. Additional data on the characteristics of international travel were collected. Recent international travel was reported by 180/222 (81%) enhanced surveillance cases, including all malaria, chikungunya and paratyphoid cases. The majority of cases who acquired infections during travel were immigrant Australians (96, 53%) or their Australian-born children (43, 24%). VFR travel was reported by 117 (65%) travel-associated cases, and was highest for typhoid (31/32, 97%). Child cases (aged <18 years) were more frequently VFR travellers (86%) than adult cases (57%; P < 0·001). VFR travel is an important contributor to imported disease in Australia. Communicable disease control strategies targeting these travellers, such as targeted health promotion, are likely to have an impact on the importation of these travel-related infections.
Public health bodies in many countries are attempting to increase population-wide habitual consumption of whole grains. Limited data exist on the dietary habits of Singaporean children. The present study therefore aimed to assess whole grain consumption patterns in Singaporean children and compare these with dietary intake, physical activity and health parameters. Dietary intake (assessed by duplicate, multipass, 24-h food recalls), physical activity (by questionnaire) and anthropometric measurements were collected from a cross-section of 561 Singaporean children aged 6–12 years. Intake of whole grains was evaluated using estimates of portion size and international food composition data. Only 38·3 % of participants reported consuming whole grains during the dietary data collection days. Median intake of whole grains in consumers was 15·3 (interquartile range 5·4–34·8) g/d. The most commonly consumed whole-grain food groups were rice (29·5 %), wholemeal bread (28·9 %) and ready-to-eat breakfast cereals (18·8 %). A significantly lower proportion of Malay children (seven out of fifty-eight; P < 0·0001) consumed whole grains than children of other ethnicities. Only 6 % of all children consumed the amount of whole grains most commonly associated with improved health outcomes (48 g/d). There was no relationship between whole grain consumption patterns and BMI, waist circumference or physical activity, but higher whole grain intake was associated with increased fruit, vegetable and dairy product consumption (P < 0·001). These findings demonstrate that consumption of whole grain foods is low at a population level and infrequent in Singaporean children. Future drives to increase whole-grain food consumption in this population are likely to require input from multiple stakeholders.
A number of socio-economic, biological and lifestyle characteristics change with advancing age and place very old adults at increased risk of micronutrient deficiencies. The aim of this study was to assess vitamin and mineral intakes and their respective food sources in 793 85-year-olds (302 men and 491 women) in the North-East of England, participating in the Newcastle 85+ Study. Micronutrient intakes were estimated using a multiple-pass recall tool (2×24 h recalls). Determinants of micronutrient intake were assessed with multinomial logistic regression. Median vitamin D, Ca and Mg intakes were 2·0 (interquartile range (IQR) 1·2–6·5) µg/d, 731 (IQR 554–916) mg/d and 215 (IQR 166–266) mg/d, respectively. Fe intake was 8·7 (IQR 6·7–11·6) mg/d, and Se intake was 39·0 (IQR 27·3–55·5) µg/d. Cereals and cereal products were the top contributors to intakes of folate (31·5 %), Fe (49·2 %) and Se (46·7 %) and the second highest contributors to intakes of vitamin D (23·8 %), Ca (27·5 %) and K (15·8 %). More than 95 % (n 756) of the participants had vitamin D intakes below the UK’s Reference Nutrient Intake (10 µg/d). In all, >20 % of the participants were below the Lower Reference Nutrient Intake for Mg (n 175), K (n 238) and Se (n 418) (comparisons with dietary reference values (DRV) do not include supplements). As most DRV are not age-specific and have been extrapolated from younger populations, results should be interpreted with caution. Participants who had higher education, were from a higher social class and were more physically active had more nutrient-dense diets. More studies are needed to inform the development of age-specific DRV for micronutrients for the very old.
Homelessness is present in most societies and represents a situation in which the basic needs for survival, including food, are often limited. It is logical to surmise that a homeless person’s diet is likely to be nutritionally deficient, yet there is a relative paucity of research on this issue, with studies varying in both their methodology and the homeless populations sampled. Despite these differences, the diets of the homeless are frequently characterised as high in saturated fat and deficient in fibre and certain micronutrients, all of which can have negative implications for the homeless individual’s health and/or mental state. The conclusion from intervention studies is that there is no consensus as to the most effective method for assessing dietary intake. In order to address this, the present review aims to provide a greater understanding of the existing literature surrounding nutrition and the homeless and to act as a foundation from which further research can be conducted. An evaluation of the main findings and challenges surrounding the assessment of the nutritional status of the homeless will be provided, followed by a review of the physical and mental consequences of the homeless diet. Current and potential interventions aimed at increasing the nutritional quality of food consumed by the homeless will be addressed, with a focus on the role of the nutritional science community in assisting in this endeavour.
Very old people (defined as those aged 85 years and over) are the fastest-growing age segment of many Western societies, owing to the steady rise in life expectancy and the decrease in later-life mortality. In the UK, there are now more than 1·5 million very old people (2·5 % of the total population) and the number is projected to rise to 3·3 million, or 5 %, over the next 20 years. Reduced mobility and independence, financial constraints, higher rates of hospitalisation, chronic diseases and disabilities, and changes in body composition, taste perception, digestion and absorption of food all potentially influence either nutrient intake or needs at this stage of life. The nutritional needs of the very old have been identified as a research priority by the British Nutrition Foundation's Task Force report, Healthy Ageing: The Role of Nutrition and Lifestyle. However, very little is known about the dietary habits and nutritional status of the very old. The Newcastle 85+ Study, a cohort of more than 1000 85-year-olds from the North East of England, and the Life and Living in Advanced Age study (New Zealand), a bicultural cohort study of advanced ageing of more than 900 participants from the Bay of Plenty and Rotorua regions of New Zealand, are two unique cohort studies of ageing that aim to assess the spectrum of health in the very old as well as examine the associations of health trajectories and outcomes with biological, clinical and social factors as each cohort ages. The nutrition domain included in both studies will help to fill the evidence gap by identifying eating patterns and measures of nutritional status associated with better, or worse, health and wellbeing. This review will explore some of this ongoing work.
Food and nutrient intake data are scarce for very old adults (85 years and older) – one of the fastest-growing age segments of Western societies, including the UK. Our primary objective was to assess energy and macronutrient intakes and their respective food sources in 793 85-year-olds (302 men and 491 women) living in North-East England and participating in the Newcastle 85+ Study. Dietary information was collected using a repeated multiple-pass recall (2×24 h recalls). Energy, macronutrient and NSP intakes were estimated, and the contribution (%) of food groups to nutrient intake was calculated. The median energy intake was 6·65 (interquartile range (IQR) 5·49–8·16) MJ/d – 46·8 % from carbohydrates, 36·8 % from fats and 15·7 % from proteins. NSP intake was 10·2 g/d (IQR 7·3–13·7) and was higher in 85-year-olds who were non-institutionalised, more educated, from a higher social class and more physically active. Cereals and cereal products were the top contributors to intakes of energy and most macronutrients (carbohydrates, non-milk extrinsic sugars, NSP and fat), followed by meat and meat products. The median intakes of energy and NSP were much lower than the estimated average requirement for energy (9·6 MJ/d for men and 7·7 MJ/d for women) and the dietary reference value (DRV) for NSP (≥18 g/d). The median SFA intake was higher than the DRV (≤11 % of dietary energy). This study highlights the paucity of data on dietary intake and the uncertainties about DRV for this age group.
Increased whole-grain (WG) consumption reduces the risk of CVD, type 2 diabetes and some cancers, and is related to reduced body weight and weight gain and to improved intestinal health. Definitions of ‘WG’ and ‘WG food’ have been proposed and used in some countries but are not consistent. Many countries promote WG consumption, but the emphasis given and the messages used vary. We surveyed the dietary recommendations of fifty-three countries for mentions of WG to assess the extent, rationale and diversity in emphasis and wording of any recommendations. Where present, recommendations were classified as either ‘primary’, where the recommendation was specific to WG, or ‘secondary’, where recommendations were made in order to achieve another (primary) target, most often dietary fibre intake. In total, 127 organisations were screened, including governments, non-governmental organisations, charities and professional bodies, the WHO and the European Food Safety Authority, of which forty-nine, including the WHO, provide a WG intake recommendation. Recommendations ranged from ‘specific’, with specified target amounts (e.g. x g WG/d), through ‘semi-quantitative’, where intake was linked to the intake of cereal/carbohydrate foods with suggested proportions of WG (e.g. x servings of cereals, of which y servings should be WG), to ‘non-specific’, based on ‘eating more’ WG or ‘choosing WG where possible’. This lack of a harmonised message may result in confusion for the consumer, lessen the impact of public health messages and pose barriers to trade in the food industry. A science-based consensus or expert opinion on WG recommendations is needed, with a global reach to guide public health decision making and increase WG consumption globally.