Birth weight and early growth have been associated with later blood pressure. However, not all studies consistently find a significant reduction in blood pressure with an increase in birth weight. In addition, the relative importance of birth weight and of other lifestyle and environmental factors is often overlooked, and the association is rarely studied in adolescents. We investigated early life predictors, including birth weight, of adolescent blood pressure in the Gateshead Millennium Study (GMS). The GMS is a cohort of 1029 individuals born in 1999–2000 in Gateshead in Northern England. Throughout infancy and early childhood, detailed information was collected, including birth weight and measures of height and weight. Assessments of 491 returning participants at age 12 years included measures of body mass and blood pressure. Linear regression and path analysis were used to determine predictors and their relative importance on blood pressure. Birth weight was not directly associated with blood pressure at the age of 12. However, after adjustment for contemporaneous body mass index (BMI), an inverse association of standardized birth weight with systolic blood pressure was significant. The relative importance of birth weight on later systolic blood pressure was smaller than that of contemporaneous body measures (height and BMI). No independent association of birth weight with blood pressure was seen in this adolescent population; contemporaneous body measures have an important role to play. Interventions directed at early prevention of hypertension should therefore target lifestyle factors that influence body mass or size, such as diet and physical activity.
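The "standardized birth weight" used in such analyses is typically a z-score: each raw weight expressed in standard-deviation units. A minimal within-sample sketch in Python (illustrative only; the GMS analysis may have standardized against external growth references instead):

```python
def standardize(values):
    """Convert raw measurements (e.g. birth weights in kg) to z-scores."""
    mean = sum(values) / len(values)
    # sample standard deviation (n - 1 denominator)
    sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return [(v - mean) / sd for v in values]
```

A z-score of −1 then means a birth weight one standard deviation below the sample mean, which puts coefficients for birth weight and BMI on comparable scales.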
Current understanding of climate change impacts, adaptation and vulnerability among Inuit in the Arctic is relatively static, rooted in the community and time that case studies were conducted. This paper captures the dynamism of Inuit–climate relationships by applying a longitudinal approach to assessing vulnerability to climate change among Inuit in Ulukhaktok, Northwest Territories, Canada. Data were collected in 2005 and 2016 following a consistent methodology and analytical framework. Findings from the studies are analysed comparatively together with longitudinal datasets. The data reveal that many of the climatic changes recorded in 2005 that adversely affected hunting activities have been observed to be persisting or progressing, such as decreasing sea ice thickness and extent, and stronger and more consistent summer winds. Inuit are responding by altering travel routes and equipment, taking greater pre-trip precautions, and concentrating their efforts on more efficient and accessible hunts. Increasing living and subsistence costs and time constraints, changes in the generation and transmission of environmental knowledge and land skills, and the concentration of country food sharing networks were identified as key constraints to adaptation. The findings indicate that the connections between subsistence activities and the wage economy are central to understanding how Inuit experience and respond to climate change.
Mental health and wellbeing, including addressing the impacts of historical trauma and substance use among young people, have been identified as key priorities by Indigenous communities and leaders across Canada and globally. Yet research to understand mental health among young Indigenous people who have used drugs is limited.
To examine longitudinal risk and strengths-based factors associated with psychological distress among young Indigenous people who use drugs.
The Cedar Project is an ongoing cohort study involving young Indigenous people who use drugs in Vancouver, Prince George, and Chase, British Columbia, Canada. This study included participants who completed the Symptom Checklist-90-Revised, returned for follow-up between 2010 and 2012, and completed the Childhood Trauma Questionnaire. Adjusted linear mixed-effects models estimated effects of study variables on changes in area T-scores of psychological distress.
Of 202 eligible participants, 53% were women and the mean age was 28 years. Among men, childhood maltreatment (emotional abuse, physical abuse, sexual abuse, physical neglect), any drug use, blackouts from drinking, and sex work were associated with increased distress. Among women, childhood maltreatment (emotional abuse, physical abuse, physical neglect), blackouts from drinking, and sexual assault were associated with increased distress, while having attempted to quit using drugs was associated with reduced distress. Among men, marginal associations with lower distress were observed for speaking one's traditional language and living by traditional culture.
Culturally safe mental wellness interventions are urgently needed to address childhood trauma and harmful coping strategies that exacerbate distress among young Indigenous people who use drugs.
A crop's ability to both suppress weed growth and tolerate weed competition is a key consideration when taking an agroecological approach to weed management. Amongst cereals, oats are widely considered to have superior weed competitiveness, yet studies examining the competitive ability of oat varieties are rare. We investigated the ability of oats to suppress weeds and to yield in the presence of competition from weeds in trials involving five husked and three naked oat varieties at an organic site in the east of England over four trial years (2009–13). We identified a number of key traits that were important for weed suppression, including establishment rate, tillering ability and early leaf area index (LAI), which highlights the importance of rapid early growth rate. Furthermore, taller varieties tended to be more weed tolerant but not necessarily more suppressive. Trade-offs between competitive traits and yield were not found in this study. Crop tillering ability was highlighted as an important trait for selection due to its beneficial effects on weed suppression as well as grain yield, and also its high heritability.
The Numeniini is a tribe of 13 wader species (Scolopacidae, Charadriiformes) of which seven are Near Threatened or globally threatened, including two Critically Endangered. To help inform conservation management and policy responses, we present the results of an expert assessment of the threats that members of this taxonomic group face across migratory flyways. Most threats are increasing in intensity, particularly in non-breeding areas, where habitat loss resulting from residential and commercial development, aquaculture, mining, transport, disturbance, problematic invasive species, pollution and climate change were regarded as having the greatest detrimental impact. Fewer threats (mining, disturbance, problematic native species and climate change) were identified as widely affecting breeding areas. Numeniini populations face the greatest number of non-breeding threats in the East Asian–Australasian Flyway, especially those associated with coastal reclamation; related threats were also identified across the Central and Atlantic Americas, and East Atlantic flyways. Threats on the breeding grounds were greatest in the Central and Atlantic Americas, East Atlantic and West Asian flyways. Three priority actions were associated with monitoring and research: to monitor breeding population trends (which for species breeding in remote areas may best be achieved through surveys at key non-breeding sites), to deploy tracking technologies to identify migratory connectivity, and to monitor land-cover change across breeding and non-breeding areas. Two priority actions were focused on conservation and policy responses: to identify and effectively protect key non-breeding sites across all flyways (particularly in the East Asian–Australasian Flyway), and to implement successful conservation interventions at a sufficient scale across human-dominated landscapes for species' recovery to be achieved.
If implemented urgently, these measures in combination have the potential to arrest the current population declines of many Numeniini species and provide a template for the conservation of other groups of threatened species.
Observational evidence suggests that increased whole grain (WG) intake reduces the risks of many non-communicable diseases, such as CVD, type 2 diabetes, obesity and certain cancers. More recently, studies have shown that WG intake lowers all-cause and cause-specific mortality. Much of the reported evidence on risk reduction is from US and Scandinavian populations, where there are tangible WG dietary recommendations. At present there is no quantity-specific WG dietary recommendation in the UK; instead, we are advised to choose WG or higher-fibre versions. Despite recognition of WG as an important component of a healthy diet, monitoring of WG intake in the UK has been poor, with the latest intake assessment from data collected in 2000–2001 for adults and in 1997 for children. To update this information we examined WG intake in the National Diet and Nutrition Survey rolling programme 2008–2011 after developing our database of WG food composition, a key resource in determining WG intake accurately. The results showed that median WG intakes remain low in both adults and children, and below those of countries with quantity-specific guidance. We also found a reduction in C-reactive protein concentrations and leucocyte counts with increased WG intake, although no association with other markers of cardio-metabolic health. The recent recommendations by the UK Scientific Advisory Committee on Nutrition to increase dietary fibre intake will require a greater emphasis on consuming more WG. Specific recommendations on WG intake in the UK are warranted, as is the development of public health policy to promote consumption of these important foods.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations that have been made to the traditional calibration and imaging procedures in order to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA's performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science that it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final ASKAP telescope.
Worldwide, dating rock art is difficult to achieve because of the frequent lack of datable material and the difficulty of removing contamination from samples. Our research aimed to select the paints most likely to be successfully radiocarbon dated and to estimate the quantity of paint needed depending on the nature of the paint and the weathering and alteration products associated with it. To achieve this aim, a two-step sampling strategy, coupled with multi-instrument characterization (including SEM-EDS, Raman spectroscopy, and FTIR spectroscopy analysis) and a modified acid-base-acid (ABA) pretreatment, was created. In total, 41 samples were dated from 14 sites in three separate regions of southern Africa. These novel protocols ensured that the 14C chronology produced was robust; they could also be applied subsequently to different regions with possible variations in paint preparation, geology, weathering conditions, and contaminants.
Geological disposal facilities (GDF) are intended to isolate and contain radioactive waste within multiple protective barriers, deep underground, to ensure that no harmful quantities of radioactivity reach the surface environment. The last line of defence in a multi-barrier GDF is the geosphere, where iron is present in the host rock mineralogy as either Fe(II) or Fe(III), and in groundwater as Fe(II) under reducing conditions. The mobility of risk-driving radionuclides, including uranium and technetium, in the environment is affected significantly by their valence state. Due to its low redox potential, Fe(II) can mediate reduction of these radionuclides from their oxidized, highly mobile, soluble state to their reduced, insoluble state, preventing them from reaching the biosphere. Here a study of five types of potential host rocks, two granitoids, an andesite, a mudstone and a clay-rich carbonate, is reported. The bulk rocks and their minerals were analysed for iron content, Fe(II/III) ratio, and for the speciation and fine-grained nature of alteration product minerals that might have important controls on groundwater interaction. Total iron content varies from 0.9% in the clays to 5.6% in the andesite. X-ray absorption spectroscopy reveals that Fe in the granitoids and andesite is predominantly Fe(II), and in the mudstones, argillaceous limestone and terrestrial sandstone is predominantly Fe(III). The redox reactivity of the potential host rocks, both in the presence and absence of Fe(II)-containing 'model' groundwater, was investigated using an azo dye as a probe molecule. Reduction rates as determined by reactivity with the azo dye were correlated with the ability of the rocks to take up Fe(II) from groundwater rather than with initial Fe(II) content. Potential GDF host rocks must be characterized in terms of mineralogy, texture, grain size and bulk geochemistry to assess how they might interact with groundwater.
This study highlights the importance of redox reactivity, not just total iron and Fe(II)/(III) ratio, when considering a host rock's performance as a barrier material to limit transport of radionuclides from the GDF.
The response of mineral phases to the radiation fields experienced in a geological disposal facility (GDF) is currently poorly constrained. Prolonged ion irradiation has the potential to affect both the physical integrity and oxidation state of materials and therefore may alter a structure's ability to react with radionuclides. Radiohalos (spheres of radiation damage in minerals surrounding radioactive (α-emitting) inclusions) provide useful analogues for studying long-term α-particle damage accumulation. In this study, silicate minerals adjacent to Th- and U-rich monazite and zircon were probed for redox changes and long/short range disorder using microfocus X-ray absorption spectroscopy (XAS) and high resolution X-ray diffraction (XRD) at Beamline I18, Diamond Light Source. Fe3+ → Fe2+ reduction has been demonstrated in an amphibole sample containing structural OH– groups – a trend not observed in anhydrous phases such as garnet. Consistent with the findings of Pattrick et al. (2013), the radiolytic breakdown of OH– groups is postulated to liberate electrons that reduce Fe3+. Across all samples, high point defect densities and minor lattice aberrations are apparent adjacent to the radioactive inclusion, as demonstrated by micro-XRD.
The dynamic model Nitrogen Dynamics in Crop rotations in Ecological Agriculture (NDICEA) was used to assess the nitrogen (N), phosphorus (P) and potassium (K) balance of long-term organic cropping trials and typical organic crop rotations on a range of soil types and rainfall zones in the UK. The measurements of soil N taken at each of the organic trial sites were also used to assess the performance of NDICEA. The modeled outputs compared well to recorded soil N levels, with relatively small error margins. NDICEA therefore seems to be a useful tool for UK organic farmers. The modeling of typical organic rotations has shown that positive N balances can be achieved, although negative N balances can occur under high rainfall conditions and on lighter soil types as a result of leaching. The analysis and modeling also showed that some organic cropping systems rely on imported sources of P and K to maintain an adequate balance, and large deficits of both nutrients are apparent in stockless systems. Although the K deficits could be addressed through the buffering capacity of minerals, the amount available for crop uptake will depend on the type and amount of minerals present, current cropping and fertilization practices, and the climatic environment. A P deficit represents a more fundamental problem for the maintenance of crop yields, and the organic sector currently relies on mined sources of P, which represents a fundamental conflict with the International Federation of Organic Agriculture Movements' organic principles.
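At their core, the balances NDICEA reports reduce to inputs minus outputs for each nutrient. A toy sketch of that bookkeeping is shown below; this is not NDICEA's actual model, which simulates mineralization and weather-driven leaching dynamically, and all figures here are hypothetical:

```python
def nutrient_balance(inputs_kg_ha, crop_offtake_kg_ha, leaching_kg_ha=0.0):
    """Field-gate nutrient balance in kg/ha: positive = surplus, negative = deficit."""
    return inputs_kg_ha - crop_offtake_kg_ha - leaching_kg_ha

# hypothetical rotation year: N fixation plus imports vs. harvest offtake,
# with heavy leaching losses under high rainfall on a light soil
n_balance = nutrient_balance(inputs_kg_ha=150.0, crop_offtake_kg_ha=110.0,
                             leaching_kg_ha=55.0)  # negative: a deficit year
```

The same arithmetic with zero leaching shows why a rotation can flip from surplus to deficit purely on rainfall-driven losses.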
The purpose of this study was to examine the hypothesis that excess maternal glucocorticoids in response to maternal undernutrition program the expression of extracellular matrix (ECM) components, potentially via miR-29c. We measured the expression of mRNA (qRT-PCR) and protein (Western blot) for collagen 3A1, collagen 4A5 and matrix metalloproteinase 2 (MMP2) in offspring carotid arteries from three groups of dams: 50% food-restricted in the latter half of gestation [maternal undernutrition (MUN)], MUN dams who received metyrapone (MET) (500 mg/ml) in drinking water from day 10 of gestation to term, and control dams fed an ad libitum diet. The expression of miR-29c was significantly decreased at 3 weeks, 3 months and 9 months in MUN carotid arteries, and these decreases in expression were partially blocked by treatment of dams with MET. The expression pattern of ECM genes that are targets of miR-29c correlated with miR-29c expression. mRNA expression was increased for elastin (ELN) and MMP2 in 3-week MUN carotids; in 9-month carotids there were also significant increases in expression of Col3A1 and Col4A5. These changes in mRNA expression of ECM genes at 3 weeks and 9 months were blocked by MET treatment. Similarly, the expression of ELN and MMP2 proteins at 3 weeks was increased in MUN carotids, and by 9 months there were also increases in expression of Col3A1 and Col4A5, which were blocked by MET in MUN carotids. Overall, the results demonstrate a close correlation between expression of miR-29c and the ECM proteins that are its targets, thus supporting our central hypothesis.
Growing awareness of, and concern for, the increasing frequency of incidents involving hazardous materials (HazMat) across a broad spectrum of contaminants from chemical, biological, radiological, and nuclear (CBRN) sources indicates a clear need to refine the capability to respond successfully to mass-casualty contamination incidents. Best results for decontamination from a chemical agent will be achieved if it is performed within minutes of exposure, and delays in decontamination increase the length of time a casualty is in contact with the contaminant. The findings presented in this report indicate that casualties involved in a HazMat/CBRN mass-casualty incident (MCI) in a typical community would not receive sufficient on-scene care because of operational delays that are integral to a standard HazMat/CBRN first response. This delay in response will mean that casualty care will shift away from the incident scene into already over-tasked health care facilities as casualties seek aid on their own. The self-care decontamination protocols recommended here present a viable option to ensure decontamination is completed in the field, at the incident scene, and that casualties are cared for more quickly and less traumatically than they would be otherwise. Introducing self-care decontamination procedures as a standard first response within the response community will improve the level of care significantly and provide essential, self-care decontamination to casualties. The process involves three distinct stages which should not be delayed; these are summarized by the acronym MADE: Move/Assist, Disrobe/Decontaminate, Evaluate/Evacuate.
Monteith RG, Pearce LDR. Self-care Decontamination within a Chemical Exposure Mass-casualty Incident. Prehosp Disaster Med. 2015;30(3):1–9.
Increased whole grain intake has been shown to reduce the risk of many non-communicable diseases. Countries including the USA, Canada, Denmark and Australia have specific dietary guidelines on whole grain intake, but others, including the UK, do not. Data from 1986/87 and 2000/01 have shown that whole grain intake is low and declining in British adults. The aim of the present study was to describe whole grain intakes in the most current dietary assessment of UK households using data from the National Diet and Nutrition Survey rolling programme 2008–11. In the present study, 4 d diet diaries were completed by 3073 individuals between 2008 and 2011, along with details of socio-economic status (SES). The median daily whole grain intake, calculated for each individual on a dry weight basis, was 20 g/d for adults and 13 g/d for children/teenagers. The corresponding energy-adjusted whole grain intake was 27 g/10 MJ per d for adults and 20 g/10 MJ per d for children/teenagers. Whole grain intake (absolute and energy-adjusted) increased with age, but was lowest in teenagers (13–17 years) and younger adults up to the age of 34 years. Of the total study population, 18 % of adults and 15 % of children/teenagers did not consume any whole-grain foods. Individuals from lower SES groups had a significantly lower whole grain intake than those from more advantaged classifications. Whole grain intake in the UK, although higher than in 2000/01, remains low and below the US and Danish recommendations in all age classes. Favourable pricing, increased availability of whole-grain foods and education may help to increase whole grain intake in countries without whole-grain recommendations. Teenagers and younger adults may need targeting to help increase whole grain consumption.
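Energy adjustment here means rescaling each individual's intake to a common reference energy of 10 MJ/d, so that people with different total energy intakes are comparable. A minimal sketch of the arithmetic (the 7.4 MJ/d energy intake below is a hypothetical example, not a value from the survey):

```python
def energy_adjusted_intake(intake_g_per_day, energy_mj_per_day):
    """Rescale a daily intake to grams per 10 MJ of total daily energy."""
    return intake_g_per_day * 10.0 / energy_mj_per_day

# e.g. 20 g/d of whole grain for someone with a 7.4 MJ/d energy intake
adjusted = energy_adjusted_intake(20.0, 7.4)  # roughly 27 g/10 MJ per d
```

Note how an absolute intake of 20 g/d maps onto an energy-adjusted value near the adult median reported above when total energy intake is below 10 MJ/d.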
Epidemiological evidence suggests an inverse association between whole grain consumption and the risk of non-communicable diseases, such as CVD, type 2 diabetes, obesity and some cancers. A recent analysis of the National Diet and Nutrition Survey rolling programme (NDNS-RP) has shown low intake of whole grain in the UK. It is important to understand whether the health benefits associated with whole grain intake are present at low levels of consumption. The present study aimed to investigate the association of whole grain intake with intakes of other foods, nutrients and markers of health (anthropometric and blood measures) in the NDNS-RP 2008–11, a representative dietary survey of UK households. A 4 d diet diary was completed by 3073 individuals. Anthropometric measures, blood pressure levels, and blood and urine samples were collected after diary completion. Individual whole grain intake was calculated, with consumers categorised into tertiles of intake. Higher intake of whole grain was associated with significantly decreased leucocyte counts. Significantly higher concentrations of C-reactive protein were seen in adults in the lowest tertile of whole grain intake. No associations with the remaining health markers were seen after adjustment for sex and age. Over 70 % of this population did not consume the minimum recommended intake associated with disease risk reduction, which may explain the small variation across health markers. Nutrient intakes in consumers compared with non-consumers were closer to dietary reference values, with higher intakes of fibre, Mg and Fe, and lower intakes of Na, suggesting that higher intake of whole grain is associated with improved diet quality.
This paper describes the system architecture of a newly constructed radio telescope – the Boolardy Engineering Test Array, which is a prototype of the Australian Square Kilometre Array Pathfinder telescope. Phased array feed technology is used to form multiple simultaneous beams per antenna, providing astronomers with unprecedented survey speed. The test array described here is a six-antenna interferometer, fitted with prototype signal processing hardware capable of forming at least nine dual-polarisation beams simultaneously, allowing several square degrees to be imaged in a single pointed observation. The main purpose of the test array is to develop beamforming and wide-field calibration methods for use with the full telescope, but it will also be capable of limited early science demonstrations.
Post-traumatic stress disorder (PTSD) is typically associated with high-risk population groups, but the risk of PTSD associated with trauma experienced in the community, and the effect of changes in diagnostic criteria in DSM-5 on prevalence in the general population, are unknown.
Cross-sectional analysis of population-based data from 4558 adults aged 25–83 years resident in Caerphilly county borough, Wales, UK. Exposure to different traumatic events was assessed using categorisation of free-text descriptions of trauma. PTSD caseness was determined using items assessing Diagnostic and Statistical Manual IV (DSM-IV) and DSM-5 A criteria and the Trauma Screening Questionnaire.
Of the 4558 participants, 1971 (47.0%) reported a traumatic event. The most common DSM-IV A1 qualifying trauma was life-threatening illnesses and injuries (13.6%). The highest risk of PTSD was associated with assaultive violence (34.1%). The prevalence of PTSD using DSM-IV A criteria was 14.3% (95% confidence interval [CI] = 12.8, 15.9%). Using DSM-5 A criteria reduced the prevalence to 8.0% (95% CI = 6.9, 9.4%), primarily due to the exclusion of DSM-IV A1 qualifying events, such as life-threatening illnesses.
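For context, an interval of this form can be approximated with a normal (Wald) confidence interval for a proportion. The sketch below assumes the denominator is the 1971 trauma-exposed participants; the authors' exact method is not stated, so this is illustrative only:

```python
import math

def wald_ci(p, n, z=1.96):
    """Normal-approximation 95% CI for a proportion p observed in n subjects."""
    se = math.sqrt(p * (1.0 - p) / n)
    return p - z * se, p + z * se

lo, hi = wald_ci(0.143, 1971)  # roughly 0.128 to 0.158
```

Under that denominator assumption, the approximation lands close to the reported 12.8–15.9% interval; small differences would reflect rounding or a different interval method.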
Nearly one-half of a general community sample had experienced a traumatic event, and of these around one in seven was a DSM-IV case of PTSD. Although the majority of research has concentrated on combat, rape and assaultive violence, life-threatening illness is a more common cause of PTSD in the community. Removal of this traumatic event in DSM-5 could reduce the number of cases of PTSD by around 6.0%.