To assess differences in cognitive function and gross brain structure in children seven years after an episode of severe acute malnutrition (SAM), compared with other Malawian children.
Prospective longitudinal cohort assessing school grade achieved and results of five computer-based (CANTAB) tests, covering three cognitive domains. A subset underwent brain MRI scans which were reviewed using a standardized checklist of gross abnormalities and compared with a reference population of Malawian children.
Children discharged from SAM treatment in 2006 and 2007 (n 320; median age 9·3 years) were compared with controls: siblings closest in age to the SAM survivors and age/sex-matched community children.
SAM survivors were significantly more likely to be in a lower grade at school than controls (adjusted OR = 0·4; 95 % CI 0·3, 0·6; P < 0·0001) and had consistently poorer scores in all CANTAB cognitive tests. Adjusting for HIV infection and socio-economic status attenuated the statistical significance of these differences. There were no significant differences in the odds of brain abnormalities or sinusitis between SAM survivors (n 49) and reference children (OR = 1·11; 95 % CI 0·61, 2·03; P = 0·73).
Despite apparent preservation in gross brain structure, persistent impaired school achievement is likely to be detrimental to individual attainment and economic well-being. Understanding the multifactorial causes of lower school achievement is therefore needed to design interventions for SAM survivors to thrive in adulthood. The cognitive and potential economic implications of SAM need further emphasis to better advocate for SAM prevention and early treatment.
OBJECTIVES/SPECIFIC AIMS: Clinical guidelines recommend using predicted atherosclerotic cardiovascular disease (ASCVD) risk to inform treatment decisions. The objective was to compare the contribution of changes in modifiable risk factors versus aging to the development of high 10-year predicted ASCVD risk. METHODS/STUDY POPULATION: Prospective follow-up of the Jackson Heart Study, an exclusively African-American cohort, at visit 1 (2000–2004) and visit 3 (2009–2012). Analyses included 1115 African-American participants without a high 10-year predicted ASCVD risk (<7.5%), hypertension, diabetes, or ASCVD at visit 1. We used the Pooled Cohort equations to calculate the incidence of high (≥7.5%) 10-year predicted ASCVD risk at visit 3. We recalculated the percentage with a high 10-year predicted ASCVD risk at visit 3 assuming each risk factor [age, systolic blood pressure (SBP), antihypertensive medication use, diabetes, smoking, total and high-density lipoprotein cholesterol], one at a time, did not change from visit 1. RESULTS/ANTICIPATED RESULTS: The mean age at visit 1 was 45.2±9.5 years. Overall, 30.9% (95% CI 28.3%–33.4%) of participants developed high 10-year predicted ASCVD risk. Aging accounted for 59.7% (95% CI 54.2%–65.1%) of the development of high 10-year predicted ASCVD risk compared with 32.8% (95% CI 27.0%–38.2%) for increases in SBP or antihypertensive medication initiation and 12.8% (95% CI 9.6%–16.5%) for incident diabetes. Among participants <50 years, the contribution of increases in SBP or antihypertensive medication initiation was similar to aging. DISCUSSION/SIGNIFICANCE OF IMPACT: Increases in SBP and antihypertensive medication initiation are major contributors to the development of high 10-year predicted ASCVD risk in African Americans, particularly among younger adults.
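The attribution method in the abstract (recalculating predicted risk with each factor held at its visit-1 value, one at a time) can be sketched in a few lines. The `risk()` function below is a hypothetical toy logistic model, not the actual Pooled Cohort equations, and the two-person cohort is invented purely for illustration:

```python
import math

def risk(age, sbp, diabetes):
    # Toy logistic model with illustrative coefficients only;
    # NOT the Pooled Cohort equations.
    x = -12.0 + 0.13 * age + 0.02 * sbp + 0.8 * diabetes
    return 1.0 / (1.0 + math.exp(-x))

def contribution(cohort, factor):
    """Share of incident high risk (>=7.5% at visit 3) removed when
    `factor` is held at its visit-1 value for every participant."""
    high_at_v3 = [p for p in cohort if risk(**p["v3"]) >= 0.075]
    still_high = 0
    for p in high_at_v3:
        counterfactual = dict(p["v3"])
        counterfactual[factor] = p["v1"][factor]  # no change since visit 1
        if risk(**counterfactual) >= 0.075:
            still_high += 1
    return 1.0 - still_high / len(high_at_v3)

# Invented two-person cohort: neither participant is high risk at visit 1.
cohort = [
    {"v1": {"age": 45, "sbp": 118, "diabetes": 0},
     "v3": {"age": 54, "sbp": 142, "diabetes": 0}},
    {"v1": {"age": 48, "sbp": 124, "diabetes": 0},
     "v3": {"age": 57, "sbp": 126, "diabetes": 1}},
]
aging_share = contribution(cohort, "age")  # 0.5 in this toy cohort
sbp_share = contribution(cohort, "sbp")    # 0.5 in this toy cohort
```

As in the abstract, the shares attributed to different factors need not sum to 100 %, since holding any one factor at baseline can be sufficient to keep a participant below the threshold.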
Observational evidence suggests that increased whole grain (WG) intake reduces the risks of many non-communicable diseases, such as CVD, type 2 diabetes, obesity and certain cancers. More recently, studies have shown that WG intake lowers all-cause and cause-specific mortality. Much of the reported evidence on risk reduction is from US and Scandinavian populations, where there are tangible WG dietary recommendations. At present there is no quantity-specific WG dietary recommendation in the UK; instead, consumers are advised to choose WG or higher-fibre versions. Despite recognition of WG as an important component of a healthy diet, monitoring of WG intake in the UK has been poor, with the most recent intake assessments based on data collected in 2000–2001 for adults and in 1997 for children. To update this information we examined WG intake in the National Diet and Nutrition Survey rolling programme 2008–2011, after developing our database of WG food composition, a key resource for determining WG intake accurately. The results showed that median WG intakes remain low in both adults and children, and below those of countries with quantity-specific guidance. We also found a reduction in C-reactive protein concentrations and leucocyte counts with increased WG intake, although no association with other markers of cardio-metabolic health. The recent recommendation by the UK Scientific Advisory Committee on Nutrition to increase dietary fibre intake will require a greater emphasis on consuming more WG. Specific recommendations on WG intake in the UK are warranted, as is the development of public health policy to promote consumption of these important foods.
Debilitating patient-related non-cardiac co-morbidity cumulatively increases the risk of congenital heart surgery. At our emerging programme, flexible surgical strategies were used in high-risk neonates and infants generally considered inoperable, in an attempt to make them surgical candidates and achieve excellent outcomes.
Materials and methods
Between April 2010 and November 2013, all referred neonates (142) and infants (300) (average scores: RACHS 2.8 and STAT 3.0) underwent 442 primary cardiac operations: patients with bi-ventricular lesions underwent standard (n=294) or alternative (n=19) repair/staging strategies, such as pulmonary artery banding(s), ductal stenting and right outflow patching. Patients with uni-ventricular hearts followed standard (n=96) or alternative hybrid (n=34) staging. The impact of major pre-operative risk factors (37%), standard or alternative surgical strategy, prematurity (50%), gestational age, low birth weight, genetic syndromes (23%), and major non-cardiac co-morbidity requiring same-admission surgery (27%) on the need for extracorporeal membrane oxygenation, mortality, length of intubation, and ICU and hospital lengths of stay was analysed.
The need for extracorporeal membrane oxygenation (8%) and hospital survival (94%) varied significantly between surgical strategy groups (p=0.0083 and 0.028, respectively). In high-risk patients, alternative bi- and uni-ventricular strategies minimised mortality, but were associated with prolonged intubation and ICU stay. Major pre-operative risk factors and lower weight at surgery significantly correlated with prolonged intubation, hospital length of stay, and mortality.
In our emerging programme, flexible surgical strategies were offered to 53/442 high-risk neonates and infants with complex CHDs and significant non-cardiac co-morbidity, in order to buffer risk and achieve patient survival, although at the cost of increased resource utilisation.
Increased whole grain intake has been shown to reduce the risk of many non-communicable diseases. Countries including the USA, Canada, Denmark and Australia have specific dietary guidelines on whole grain intake, but others, including the UK, do not. Data from 1986/87 and 2000/01 have shown that whole grain intake is low and declining in British adults. The aim of the present study was to describe whole grain intakes in the most current dietary assessment of UK households, using data from the National Diet and Nutrition Survey rolling programme 2008–11. In the present study, 4 d diet diaries were completed by 3073 individuals between 2008 and 2011, along with details of socio-economic status (SES). The median daily whole grain intake, calculated for each individual on a dry weight basis, was 20 g/d for adults and 13 g/d for children/teenagers. The corresponding energy-adjusted whole grain intakes were 27 g/10 MJ per d for adults and 20 g/10 MJ per d for children/teenagers. Whole grain intake (absolute and energy-adjusted) increased with age, and was lowest in teenagers (13–17 years) and younger adults up to the age of 34 years. Of the total study population, 18 % of adults and 15 % of children/teenagers did not consume any whole-grain foods. Individuals from lower SES groups had a significantly lower whole grain intake than those from more advantaged classifications. Whole grain intake in the UK, although higher than in 2000/01, remains low and below the US and Danish recommendations in all age groups. Favourable pricing, increased availability of whole-grain foods and education may help to increase whole grain intake in countries without whole-grain recommendations. Teenagers and younger adults may need targeting to help increase whole grain consumption.
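The energy adjustment reported above (expressing intake as grams per 10 MJ of energy intake) is simple arithmetic; a minimal sketch follows, using illustrative figures rather than actual NDNS records:

```python
def energy_adjusted(intake_g_per_d, energy_mj_per_d):
    """Whole grain intake expressed as g per 10 MJ of energy intake."""
    return intake_g_per_d * 10.0 / energy_mj_per_d

# Illustrative figures (not NDNS data): 20 g/d at 8 MJ/d of energy
# corresponds to 25 g/10 MJ per d.
adult = energy_adjusted(20.0, 8.0)
```

Adjusting for energy in this way makes intakes comparable across groups with very different total food consumption, such as children and adults.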
Epidemiological evidence suggests an inverse association between whole grain consumption and the risk of non-communicable diseases, such as CVD, type 2 diabetes, obesity and some cancers. A recent analysis of the National Diet and Nutrition Survey rolling programme (NDNS-RP) has shown lower intake of whole grain in the UK. It is important to understand whether the health benefits associated with whole grain intake are present at low levels of consumption. The present study aimed to investigate the association of whole grain intake with intakes of other foods, nutrients and markers of health (anthropometric and blood measures) in the NDNS-RP 2008–11, a representative dietary survey of UK households. A 4-d diet diary was completed by 3073 individuals. Anthropometric measures, blood pressure levels, and blood and urine samples were collected after diary completion. Individual whole grain intake was calculated, with consumers categorised into tertiles of intake. Higher intake of whole grain was associated with significantly decreased leucocyte counts. Significantly higher concentrations of C-reactive protein were seen in adults in the lowest tertile of whole grain intake. No associations with the remaining health markers were seen, after adjustments for sex and age. Over 70 % of this population did not consume the minimum recommended intake associated with disease risk reduction, which may explain the small variation across health markers. Nutrient intakes in consumers compared with non-consumers were closer to dietary reference values, such as higher intakes of fibre, Mg and Fe, and lower intakes of Na, suggesting that higher intake of whole grain is associated with improved diet quality.
There is an increasing interest in pasture-based dairy systems in Europe, mainly because of increasing production costs for intensive dairying. Milk is a matrix of compounds that influence nutritional and manufacturing properties, many dependent on husbandry linked to pasture-based systems (increase in pasture intake, forage : concentrate ratio, clover inclusion in swards/silages and use of alternative dairy breeds). The present study investigated the impact of three grazing-based dairy systems with contrasting feeding intensity or reliance on pasture intakes (conventional high-intensity, low pasture intake [CH], organic medium-intensity, medium pasture intake [OM], conventional low-intensity, high pasture intake [CL]) on milk fatty acid (FA) profiles, protein composition and α-tocopherol and antioxidant concentrations. The proportion of animals of alternative breeds (e.g. Jersey) and crossbred cows in the herd increased with decreasing production intensity (CH < OM < CL). Milk constituents known to be beneficial for human health, such as vaccenic acid, rumenic acid, monounsaturated FA, polyunsaturated FA, antioxidants and caseins, were elevated with decreasing production intensity (CH < OM < CL), while less desirable saturated FA were lower, although not all differences between OM and CL were significant. Omega-3 FA were maximised under OM practices, primarily as a result of higher clover intake. Increases in pasture intake may explain the higher concentrations of desirable FA, while increased use of crossbred cows is likely to be responsible for the higher total protein and casein content of milk; a combination of these two factors may explain increased antioxidant levels. The higher concentrations of vaccenic acid, rumenic acid, omega-3 FA, lutein, zeaxanthin, protein and casein in OM and CL milk were found over most sampling months and in both years, reinforcing the higher nutritional quality and manufacturing properties associated with milk from these systems.
A switch to pasture-based dairy products would increase the intake of milk's beneficial compounds and reduce consumption of less desirable saturated FA.
Control over place of death is deemed important, not only in providing a “good death,” but also in offering person-centered palliative care. Although the wish to die at home is endorsed by many, few achieve it. The present study aimed to explore the reasons why this wish is not fulfilled by examining the stories of ten individuals who lost a loved one to cancer.
We adopted a narrative approach, with stories synthesized to create one metastory depicting plot similarities and differences.
Stories were divided into four chapters: (1) the cancer diagnosis, (2) the terminal stage and advancement of death, (3) death itself, and (4) reflections on the whole experience. Additionally, several reasons for cessation of home care were uncovered, including the need to consider children's welfare, exhaustion, and admission of the loved one by professionals due to a medical emergency. Some participants described adverse effects as a result of being unable to continue to support their loved one's wish to remain at home.
Significance of Results:
Reflections upon the accounts are provided with a discussion around potential clinical implications.
Increasingly, family or friends are providing care to those with cancer. However, the majority of those assuming the caring role have no prior knowledge related to the provision of care. The present study aimed to explore the experiences of informal carers with respect to their role, thus determining ways that services may support transition to this role.
In order to obtain an in-depth view of such experience, a qualitative meta-synthesis was employed to review the findings of 17 studies.
Out of this synthesis, three main concepts were developed: (1) identity and adopting the caring role, (2) the perception of care tasks, and (3) relationship dynamic changes as a result of caring.
Significance of Results:
The implications for professional practice are discussed.
Cannabis can induce transient psychotic symptoms, but not all users experience these adverse effects. We compared the neural response to Δ9-tetrahydrocannabinol (THC) in healthy volunteers in whom the drug did or did not induce acute psychotic symptoms.
In a double-blind, placebo-controlled, pseudorandomized design, 21 healthy men with minimal experience of cannabis were given either 10 mg THC or placebo, orally. Behavioural and functional magnetic resonance imaging measures were then recorded whilst they performed a go/no-go task.
The sample was subdivided on the basis of the Positive and Negative Syndrome Scale positive score following administration of THC into transiently psychotic (TP; n = 11) and non-psychotic (NP; n = 10) groups. During the THC condition, TP subjects made more frequent inhibition errors than the NP group and showed differential activation relative to the NP group in the left parahippocampal gyrus, the left and right middle temporal gyri and in the right cerebellum. In these regions, THC had opposite effects on activation relative to placebo in the two groups. The TP group also showed less activation than the NP group in the right middle temporal gyrus and cerebellum, independent of the effects of THC.
In this first demonstration of inter-subject variability in sensitivity to the psychotogenic effects of THC, we found that the presence of acute psychotic symptoms was associated with a differential effect of THC on activation in the ventral and medial temporal cortex and cerebellum, suggesting that these regions mediate the effects of the drug on psychotic symptoms.
The most widely used pharmacological therapies for obesity and weight management are based on inhibition of gastrointestinal lipases, resulting in a reduced energy yield of ingested foods by reducing dietary lipid absorption. Colipase-dependent pancreatic lipase is believed to be the major gastrointestinal enzyme involved in catalysis of lipid ester bonds. There is scant literature on the action of pancreatic lipase under the range of physiological conditions that occur within the human small intestine, and the literature that does exist is often contradictory. Given the importance of pancreatic lipase activity to nutrition and weight management, the present review aims to assess the current body of knowledge with regard to the physiology behind the action of this unique gastrointestinal enzyme system. Existing data suggest that pancreatic lipase activity is affected by intestinal pH and the presence of colipase and bile salts, but not by the physiological range of Ca ion concentration (as is commonly assumed). The control of secretion of pancreatic lipase and its associated factors appears to be driven by gastrointestinal luminal content, particularly the presence of acid or digested proteins and fats in the duodenal lumen. Secretion of colipase, bile acids and pancreatic lipase is driven by cholecystokinin and secretin release.
Concentrations of air-borne bacteria and particles have been measured in turbulently ventilated operating theatres in full flow, half flow and zero flow conditions. Increased air-borne challenge produced by human activity and by mechanical cleaning procedures is demonstrated: die-away of this contamination is shown to be related to the ventilation rate. Ventilation can be reduced or turned off at night and during weekends, and cleaning can also be carried out, without increased risk of infection if full flow is restored one hour prior to preparation for surgery. Areas surrounding the theatres should remain at positive pressure with regard to the general hospital environment during low or no flow periods. The implementation of such energy-saving policies will substantially reduce theatre running costs without introducing infection hazards.
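The relationship between die-away and ventilation rate noted above can be illustrated with the standard perfect-mixing decay model (a textbook assumption, not the measurement method of the study itself), in which concentration falls exponentially at the air-change rate:

```python
import math

def concentration(c0, ach, hours):
    """Airborne concentration after `hours` of decay at `ach` air
    changes per hour, assuming perfect mixing and no new generation."""
    return c0 * math.exp(-ach * hours)

# At a nominal 20 ACH (an assumed, illustrative rate), one hour of full
# flow leaves a fraction e**-20 (about 2e-9) of the initial
# contamination, which is consistent with restoring full flow one hour
# before preparation for surgery.
residual_fraction = concentration(1.0, 20.0, 1.0)
```

Under this model, the residual contamination after a fixed purge time depends only on the number of air changes delivered, which is why the die-away is governed by the ventilation rate.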
A disinfectant (2,4,4′-trichloro-2′-hydroxydiphenyl ether: Irgasan, Ciba-Geigy) was incorporated into plastic washers fabricated from ethylvinyl acetate (EVA), polyethylene, polypropylene or TPX. Plastics containing 0·2 and 2% Irgasan gave zones of inhibition on nutrient and blood agar plates seeded with micro-organisms (Staphylococcus aureus, Staph. epidermidis, Escherichia coli, Proteus mirabilis or Candida albicans) even after thorough washing. Exceptionally, C. albicans was inhibited only by 2% Irgasan, and EVA gave good inhibition only against the staphylococci. Similar washers of each plastic were implanted subcutaneously into the flanks of rabbits; before insertion each was washed, had thread woven into it and was surrounded by a plasma clot containing 2 × 10⁸ Staph. aureus. All the plastics without Irgasan gave rise to abscesses; none of the plastics impregnated with 2% Irgasan did, although small numbers of Staph. aureus were isolated at post mortem from 2 out of 12 sites. Using either clinical or bacteriological criteria, the results were highly significant (P < 0·00001 and P < 0·001, respectively), demonstrating the effectiveness of this technique in preventing plastic-associated infection.
The investigation, epidemiology, and effectiveness of control procedures during an outbreak of Legionnaires' disease involving three immunosuppressed patients are described. The source of infection appeared to be a network of fire hydrant spurs connected directly to the incoming hospital mains water supply. Removal of these hydrants considerably reduced, but failed to eliminate, contamination of water storage facilities. As an emergency control procedure the incoming mains water was chlorinated continuously. Additional modifications to improve temperature regulation and reduce stagnation also failed to eliminate the legionellae.
A Perspex test-rig was constructed to model the pre-existing hospital water supply and storage system. This showed that, through the hydraulic mechanism known as ‘temperature buoyancy’, contaminated water could be efficiently and quickly exchanged between a stagnant spur pipe and its mains supply. Contamination of hospital storage tanks from such sources has not previously been considered a risk factor for Legionnaires' disease. We recommend that hospital water storage tanks be supplied by a dedicated mains pipe without spurs.