Objective: To evaluate the association of ultra-processed food (UPF) consumption with gains in weight and waist circumference, and incident overweight/obesity, in the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil) cohort.
Design: We applied an FFQ at baseline and categorized energy intake by degree of processing using the NOVA classification. Height, weight and waist circumference were measured at baseline and after a mean 3·8-year follow-up. We assessed associations, through Poisson regression with robust variance, of UPF consumption with large weight gain (≥1·68 kg/year) and large waist gain (≥2·42 cm/year), both defined as ≥90th percentile in the cohort, and with incident overweight/obesity.
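The "Poisson regression with robust variance" approach mentioned above (often called modified Poisson regression) estimates relative risks for a binary outcome by fitting a log-link Poisson model and replacing the model-based variance with a robust sandwich estimator. The sketch below is illustrative only, using simulated data and invented variable names rather than the study's analysis code.

```python
# A minimal sketch (not the authors' code) of modified Poisson regression:
# log-link Poisson model with a robust (HC0) variance to estimate relative
# risks for a binary outcome such as "large weight gain" (>= 90th percentile).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "upf_pct": rng.normal(24.6, 9.6, n),   # % energy from UPF (simulated)
    "smoker": rng.integers(0, 2, n),
    "phys_act": rng.normal(0, 1, n),
})
# Categorize the exposure into quartiles, as in the NOVA-based analysis
df["upf_q"] = pd.qcut(df["upf_pct"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
# Simulate a binary outcome whose risk rises with UPF intake
p = 0.08 + 0.002 * (df["upf_pct"] - df["upf_pct"].mean())
df["large_gain"] = rng.binomial(1, p.clip(0.01, 0.99))

# Modified Poisson regression: log link + robust sandwich (HC0) variance
model = smf.glm("large_gain ~ C(upf_q) + smoker + phys_act",
                data=df, family=sm.families.Poisson())
fit = model.fit(cov_type="HC0")
rr = np.exp(fit.params)       # exponentiated coefficients; C(upf_q) rows are RRs vs. Q1
ci = np.exp(fit.conf_int())   # 95% confidence intervals on the RR scale
print(pd.concat([rr.rename("RR"), ci], axis=1))
```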
Setting and participants: Civil servants of Brazilian public academic institutions in six cities (n 11 827), aged 35–74 years at baseline (2008–2010).
Results: UPF provided a mean 24·6 (sd 9·6) % of ingested energy. After adjustment for smoking, physical activity, adiposity and other factors, fourth (>30·8 %) v. first (<17·8 %) quartile of UPF consumption was associated (relative risk (95 % CI)) with 27 and 33 % greater risk of large weight and waist gains (1·27 (1·07, 1·50) and 1·33 (1·12, 1·58)), respectively. Similarly, those in the fourth consumption quartile presented 20 % greater risk (1·20 (1·03, 1·40)) of incident overweight/obesity and 2 % greater risk (1·02 (0·85, 1·21)) of incident obesity. Approximately 15 % of cases of large weight and waist gains and of incident overweight/obesity could be attributed to consumption of >17·8 % of energy as UPF.
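The ~15 % attributable fraction reported above can be roughly sanity-checked with Levin's population attributable fraction formula. The sketch below is purely illustrative: it assumes the "exposed" group is everyone above the first consumption quartile (prevalence 0·75 by construction) and crudely applies the fourth-quartile relative risk of 1·27 to all exposed participants.

```python
# Rough, illustrative check (not the authors' calculation) using Levin's formula:
# PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1))
p_exposed = 0.75   # proportion above the first quartile (>17.8% of energy as UPF)
rr = 1.27          # Q4 vs Q1 relative risk for large weight gain, used as a crude average
paf = p_exposed * (rr - 1) / (1 + p_exposed * (rr - 1))
print(f"Approximate PAF: {paf:.0%}")   # ~17%, the same order as the ~15% reported
```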
Conclusions: Greater UPF consumption predicts large gains in overall and central adiposity and may contribute to the inexorable rise in obesity seen worldwide.
Forty years ago, Knut Fladmark (1979) argued that the Pacific Coast offered a viable alternative to the ice-free corridor model for the initial peopling of the Americas, making him one of the first to support a “coastal migration theory” that remained marginal for decades. Today, the pre-Clovis occupation at the Monte Verde site is widely accepted, several other pre-Clovis sites are well documented, investigations of terminal Pleistocene subaerial and submerged Pacific Coast landscapes have increased, and multiple lines of evidence are helping decode the nature of early human dispersals into the Americas. Misconceptions remain, however, about the state of knowledge, productivity, and deglaciation chronology of Pleistocene coastlines and possible technological connections around the Pacific Rim. We review current evidence for several significant clusters of early Pacific Coast archaeological sites in North and South America that include sites as old as or older than Clovis. We argue that stemmed points, foliate points, and crescents (lunates) found around the Pacific Rim may corroborate genomic studies that support an early Pacific Coast dispersal route into the Americas. Still, much remains to be learned about the Pleistocene colonization of the Americas, and multiple working hypotheses are warranted.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
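As a concrete illustration of how twin data yield heritability estimates of the kind discussed above, the sketch below applies Falconer's classical formula to intraclass correlations. This is not the CODATwins analysis pipeline, which fits full variance-component (ACE) models to individual-level data; the correlations used here are hypothetical.

```python
# A minimal illustration (not the CODATwins analysis) of a crude ACE decomposition
# from MZ and DZ twin correlations using Falconer's formula:
#   A (heritability) = 2 * (r_MZ - r_DZ)
#   C (shared environment) = 2 * r_DZ - r_MZ
#   E (non-shared environment + error) = 1 - r_MZ
def falconer_ace(r_mz: float, r_dz: float) -> dict:
    a2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return {"A": a2, "C": c2, "E": e2}

# Hypothetical correlations for adult height in a single cohort
print(falconer_ace(r_mz=0.90, r_dz=0.50))   # {'A': 0.8, 'C': 0.1, 'E': 0.1}
```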
Introduction: Trauma and injury play a significant role in the population's burden of disease. Limited research exists evaluating the role of trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, allowing paramedics to bypass a closer hospital and directly transport to a trauma centre (TC) provided transport times were within 30 minutes. Methods: We conducted a 12-month multi-centred health record review of paramedic call reports and emergency department health records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness, step 2: anatomical injury, step 3: mechanism and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported urgently to hospital who met one of the 4 steps of the FTT standard, which would allow for a bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admission to an intensive care unit, non-orthopedic operation, or death. We report descriptive and univariate analyses where appropriate. Results: 570 adult patients were included with the following characteristics: mean age 48.8, male 68.9%, attended by an Advanced Care Paramedic 71.8%, mechanisms of injury: MVC 20.2%, falls 29.6%, stab wounds 10.5%, median initial GCS 14, mean initial BP 132, prehospital fluid administered 26.8%, prehospital intubation 3.5%, transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital prior to TC arrival. Of those that bypassed a closer hospital, 136 (44.2%) were determined to be “appropriate triage to TC”. Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). An appropriate triage to TC occurred in 104 (55.9%) patients who had met step 1 or 2 and 32 (26.2%) patients meeting step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC. However, this comes at the cost of potentially burdening the system with poor sensitivity. More work is needed to develop an FTT standard that will assist paramedics in appropriately identifying patients who require a trauma centre.
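The study's primary-outcome definition lends itself to a simple classification rule. The sketch below only restates that definition in code; the field names are invented for the example rather than taken from the study database.

```python
# A minimal sketch of the primary-outcome definition above: a transport counts
# as an "appropriate triage to a trauma centre" if any one of the four criteria
# is met (ISS >= 12, ICU admission, non-orthopedic operation, or death).
from dataclasses import dataclass

@dataclass
class TraumaCase:
    injury_severity_score: int
    admitted_to_icu: bool
    non_orthopedic_operation: bool
    died: bool

def appropriate_triage(case: TraumaCase) -> bool:
    """Return True if the case meets the study's primary-outcome definition."""
    return (case.injury_severity_score >= 12
            or case.admitted_to_icu
            or case.non_orthopedic_operation
            or case.died)

# Example: an ISS-9 patient admitted to ICU counts as appropriately triaged
print(appropriate_triage(TraumaCase(9, True, False, False)))   # True
```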
When people make decisions with a pre-selected choice option – a ‘default’ – they are more likely to select that option. Because defaults are easy to implement, they constitute one of the most widely employed tools in the choice architecture toolbox. However, to decide when defaults should be used instead of other choice architecture tools, policy-makers must know how effective defaults are and when and why their effectiveness varies. To answer these questions, we conduct a literature search and meta-analysis of the 58 default studies (pooled n = 73,675) that fit our criteria. While our analysis reveals a considerable influence of defaults (d = 0.68, 95% confidence interval = 0.53–0.83), we also discover substantial variation: the majority of default studies find positive effects, but several do not find a significant effect, and two even demonstrate negative effects. To explain this variability, we draw on existing theoretical frameworks to examine the drivers of disparity in effectiveness. Our analysis reveals two factors that partially account for the variability in defaults’ effectiveness. First, we find that defaults are more effective in consumer domains and less effective in environmental domains. Second, we find that defaults are more effective when they operate through endorsement (defaults that are seen as conveying what the choice architect thinks the decision-maker should do) or endowment (defaults that are seen as reflecting the status quo). We end with a discussion of possible directions for a future research program on defaults, including potential additional moderators, and implications for policy-makers interested in the implementation and evaluation of defaults.
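For readers unfamiliar with how per-study effect sizes such as Cohen's d are combined into a single pooled estimate with a confidence interval, the sketch below shows a standard DerSimonian-Laird random-effects pooling. It is not the authors' analysis code, and the effect sizes and within-study variances are made up for illustration.

```python
# A minimal sketch of DerSimonian-Laird random-effects meta-analysis: pool
# per-study effect sizes d with within-study variances v while allowing the
# true effect to vary between studies (between-study variance tau^2).
import numpy as np

def random_effects_pool(d, v):
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect weights
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)            # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)       # between-study variance estimate
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)

d_pooled, ci = random_effects_pool(d=[0.2, 0.9, 0.6, 1.1, -0.1],
                                   v=[0.05, 0.04, 0.06, 0.03, 0.05])
print(f"pooled d = {d_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```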
Soldier operational performance is determined by their fitness, nutritional status, quality of rest/recovery, and remaining injury/illness free. Understanding large fluctuations in nutritional status during operations is critical to safeguarding health and well-being. There are limited data world-wide describing the effect of extreme climate change on nutrient profiles. This study investigated the effect of hot-dry deployments on vitamin D status (assessed from 25-hydroxyvitamin D (25(OH)D) concentration) of young, male, military volunteers. Two data sets are presented (pilot study, n 37; main study, n 98), examining serum 25(OH)D concentrations before and during 6-month summer operational deployments to Afghanistan (March to October/November). Body mass, percentage of body fat, dietary intake and serum 25(OH)D concentrations were measured. In addition, parathyroid hormone (PTH), adjusted Ca and albumin concentrations were measured in the main study to better understand 25(OH)D fluctuations. Body mass and fat mass (FM) losses were greater for early (pre- to mid-) deployment compared with late (mid- to post-) deployment (P<0·05). Dietary intake was well-maintained despite high rates of energy expenditure. A pronounced increase in 25(OH)D was observed between pre- (March) and mid-deployment (June) (pilot study: 51 (sd 20) v. 212 (sd 85) nmol/l, P<0·05; main study: 55 (sd 22) v. 167 (sd 71) nmol/l, P<0·05) and remained elevated post-deployment (October/November). In contrast, PTH was highest pre-deployment, decreasing thereafter (main study: 4·45 (sd 2·20) v. 3·79 (sd 1·50) pmol/l, P<0·05). The typical seasonal cycling of vitamin D appeared exaggerated in this active male population undertaking an arduous summer deployment. Further research is warranted, where such large seasonal vitamin D fluctuations may be detrimental to bone health in the longer-term.
The role of herbivorous livestock in supporting the sustainability of the farming systems in which they are found is complex and sometimes conflicting. In Sub-Saharan Africa (SSA), the integration of livestock into farming systems is important for sustainable agriculture as the recycling of nutrients for crop production through returns of animal manure is a central element of the dominant mixed crop-livestock systems. Sustainable agriculture has been widely advocated as the main practical pathway to address the challenge of meeting the food needs of the rapidly growing population in SSA while safeguarding the needs of future generations. The objective of this paper is to review the state of knowledge of the role of herbivores in sustainable intensification of key farming systems in SSA. The pathways to sustainable agriculture in SSA include intensification of production and livelihood diversification. Sustainable agricultural practices in SSA have focused on intensification practices which aim to increase the output : input ratio through increasing use of inputs, introduction of new inputs or use of existing inputs in a new way. Intensification of livestock production can occur through increased and improved fodder availability, genetic production gains, improved crop residue use and better nutrient recycling of manure. Livestock deliver many ‘goods’ in smallholder farming systems in SSA including improving food and nutrition security, increased recycling of organic matter and nutrients and the associated soil fertility amendments, adding value to crop residues by turning them into nutrient-rich foods, income generation and animal traction. Narratives on livestock ‘bads’ or negative environmental consequences have been largely shaped by the production conditions in the Global North, but livestock production in SSA is a different story. In SSA, livestock are an integral component of mixed farming systems and they play key roles in supporting the livelihoods of much of the rural population. Nonetheless, the environmental consequences of livestock production on the continent cannot be ignored. To enhance agricultural sustainability in SSA, the challenge is to optimize livestock’s role in the farming systems by maximizing livestock ‘goods’ while minimizing the ‘bads’. This can be achieved through better integration of livestock into the farming systems, efficient nutrient management systems, and provision of necessary policy and institutional support.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool to identify patients with sepsis and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis that are transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g., SBP < 90, HR > 100, RR > 24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or documented fever of 38.3°C (101°F) who were transported to hospital. In-hospital management and outcomes were obtained and descriptive, t-test, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those that did not, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 44 (18.6%) patients met the RPPEO sepsis notification tool criteria and of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
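The reported predictive values follow directly from the 2×2 table implied by the counts in the abstract (236 patients, 34 with an ED sepsis diagnosis, 44 tool positives, 12 of whom had ED sepsis). The short sketch below reconstructs that arithmetic.

```python
# Reconstruction of the 2x2 table implied by the abstract's counts and the
# resulting predictive values of the RPPEO sepsis notification tool.
n_total, n_disease, n_positive, true_pos = 236, 34, 44, 12

false_pos = n_positive - true_pos            # 32: tool positive, no ED sepsis
false_neg = n_disease - true_pos             # 22: tool negative, ED sepsis
true_neg = n_total - n_disease - false_pos   # 170: tool negative, no ED sepsis

sensitivity = true_pos / n_disease                   # 12/34   ~ 35.3%
specificity = true_neg / (n_total - n_disease)       # 170/202 ~ 84.2%
ppv = true_pos / n_positive                          # 12/44   ~ 27.3%
npv = true_neg / (true_neg + false_neg)              # 170/192 ~ 88.5%
print(f"Sens {sensitivity:.1%}, Spec {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```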
The impact of a deep-water plunging breaker on a finite-height two-dimensional structure with a vertical front face is studied experimentally. The structure is located at a fixed horizontal position relative to a wave maker and the structure’s bottom surface is located at a range of vertical positions close to the undisturbed water surface. Measurements of the water surface profile history and the pressure distribution on the front surface of the structure are performed. As the vertical position of the structure’s bottom surface, z_b (the z axis is positive up and z = 0 is the mean water level), is varied from one experimental run to another, the water surface evolution during impact can be categorized into three classes of behaviour. In class I, with z_b in a range of values a small fraction of the nominal breaker wavelength λ_0 below the mean water level, the behaviour of the water surface is similar to the flip-through phenomenon first described in studies with shallow water and a structure mounted on the sea bed. In the present work, it is found that the water surface between the front face of the structure and the wave crest is well fitted by arcs of circles with a decreasing radius and downward-moving centre as the impact proceeds. A spatially and temporally localized high-pressure region was found on the impact surface of the structure, and existing theory is used to explore the physics of this phenomenon. In class II, with z_b in a range of values near the mean water level, the bottom of the structure exits and re-enters the water phase at least once during the impact process. These air–water transitions generate large-amplitude ripple packets that propagate to the wave crest and modify its behaviour significantly. At one value of z_b in this range, all sensors submerged during the impact record a nearly in-phase high-frequency pressure oscillation, indicating possible air entrainment. In class III, with z_b in a range of values above the mean water level, the bottom of the structure remains in air until the main crest hits its bottom corner. The subsequent free-surface behaviour is strongly influenced by the instantaneous momentum of the local flow just before impact, and the highest wall pressures of all experimental conditions are found in this class.
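The circle-arc description of the free surface in class I suggests a simple fitting procedure for measured profile points. The sketch below shows one common algebraic (Kasa) least-squares circle fit applied to synthetic data; it illustrates the idea only and is not the authors' method.

```python
# A minimal sketch of an algebraic (Kasa) least-squares circle fit of the kind
# that could be used to fit arcs to measured free-surface profile points.
import numpy as np

def fit_circle(x, y):
    """Return centre (xc, yc) and radius r of the best-fit circle."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    # Linearized circle equation: 2*xc*x + 2*yc*y + (r^2 - xc^2 - yc^2) = x^2 + y^2
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + xc ** 2 + yc ** 2)
    return xc, yc, r

# Synthetic arc of a circle centred at (0.5, -0.1) with radius 0.3, plus noise
theta = np.linspace(0.2, 1.2, 50)
x = 0.5 + 0.3 * np.cos(theta) + np.random.default_rng(1).normal(0, 1e-3, 50)
y = -0.1 + 0.3 * np.sin(theta)
print(fit_circle(x, y))   # approximately (0.5, -0.1, 0.3)
```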
This study determines the prevalence of inadequate micronutrient intakes consumed by long-term care (LTC) residents. This cross-sectional study was completed in thirty-two LTC homes in four Canadian provinces. Weighed and estimated food and beverage intake were collected over 3 non-consecutive days from 632 randomly selected residents. Nutrient intakes were adjusted for intra-individual variation and compared with the Dietary Reference Intakes. Proportions of participants, stratified by sex and use of modified (MTF) or regular texture foods, with intakes below the Estimated Average Requirement (EAR) or Adequate Intake (AI) were identified. The number of participants that met these adequacy values with use of micronutrient supplements was determined. Mean age of males (n 197) was 85·2 (sd 7·6) years and of females (n 435) was 87·4 (sd 7·8) years. In all, 33 % consumed MTF; 78·2 % (males) and 76·1 % (females) took at least one micronutrient pill. Participants on a MTF had lower intake for some nutrients (males = 4; females = 8), but also consumed a few nutrients in larger amounts than regular texture consumers (males = 4; females = 1). More than 50 % of participants in both sexes and texture groups consumed inadequate amounts of folate, vitamin B6, Ca, Mg and Zn (males only), with >90 % consuming amounts below the EAR/AI for vitamins D, E and K, Mg (males only) and potassium. Vitamin D supplements resolved inadequate intakes for 50–70 % of participants. High proportions of LTC residents have intakes below the EAR or AI for nine of the twenty nutrients examined. Strategies to improve intake specific to these nutrients are needed.
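The assessment approach described above follows the usual EAR cut-point logic: after adjusting intakes for day-to-day (intra-individual) variation, the prevalence of inadequacy is estimated as the proportion of usual intakes below the EAR. The sketch below illustrates only that final step; the intake distribution and EAR value are invented, not study data.

```python
# A minimal sketch (not the study's analysis) of the EAR cut-point step:
# prevalence of inadequacy = share of adjusted usual intakes below the EAR.
import numpy as np

rng = np.random.default_rng(2)
usual_intake_mg = rng.normal(260, 60, 632)   # illustrative adjusted usual intakes (e.g. Mg)
ear_mg = 350                                  # illustrative EAR for the group
prevalence_inadequate = np.mean(usual_intake_mg < ear_mg)
print(f"Estimated prevalence of inadequate intake: {prevalence_inadequate:.0%}")
```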
The catchments of Pine Island Glacier and Thwaites Glacier in the Amundsen Sea Embayment are two of the largest, most rapidly changing, and potentially unstable sectors of the West Antarctic Ice Sheet. They are also neighboring outlets, separated by the topographically unconfined eastern shear margin of Thwaites Glacier and the southwest tributary of Pine Island Glacier. This tributary begins just downstream of the eastern shear margin and flows into the Pine Island ice shelf. As a result, it is a potential locus of interaction between the two glaciers and could result in cross-catchment feedback during the retreat of either. Here, we analyze relative basal reflectivity profiles from three radar sounding survey lines collected using the UTIG HiCARS radar system in 2004 and CReSIS MCoRDS radar system in 2012 and 2014 to investigate the extent and character of ocean access beneath the southwest tributary. These profiles provide evidence of ocean access ~12 km inland of the 1992–2011 InSAR-derived grounding line by 2014, suggesting either retreat since 2011 or the intrusion of ocean water kilometers inland of the grounding line.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Particle Image Velocimetry (PIV) has been used to study the complex flowfield created by simulated battle damage to a two-dimensional wing. Computational Fluid Dynamics (CFD) predictions have also been used for validation of the internal cavity flow. Two damage cases were selected for the study; both cases were simulated using a single hole with diameters equal to 20% and 40% of the chord, located at the wing half-chord. Wind-tunnel tests were conducted at a Reynolds number of 500,000 over a range of incidences from 0° to 10°, with two-component PIV measurements made on three chordwise and three spanwise planes. The PIV data were analysed and compared to CFD data of the same damage cases. The PIV data have shown lower velocity ratios and lower vorticity in the jet compared with past jet-in-cross-flow experiments, and CFD was used to describe the flow features inside the cavity of the wing. It was seen that the wing cavity has large effects on the external flow features, particularly for the 20% damage case. Finally, the flow-field data have been related to force-balance data: at higher incidence angles, the larger force-coefficient increments in both lift and drag can be attributed to the larger wakes and higher jet strengths.
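Out-of-plane vorticity of the kind compared above is typically obtained from two-component PIV fields by finite-differencing the velocity components on the regular measurement grid. The sketch below illustrates this on a synthetic solid-body rotation; it is an illustration of the standard calculation, not the study's processing chain.

```python
# A minimal sketch of computing out-of-plane vorticity from 2-component PIV
# velocity fields on a regular grid: omega_z = dv/dx - du/dy (central differences).
import numpy as np

def vorticity_z(u, v, dx, dy):
    """omega_z = dv/dx - du/dy for u(y, x), v(y, x) sampled on a regular grid."""
    dvdx = np.gradient(v, dx, axis=1)   # d(v)/dx along the x (column) direction
    dudy = np.gradient(u, dy, axis=0)   # d(u)/dy along the y (row) direction
    return dvdx - dudy

dx = dy = 1e-3                           # 1 mm vector spacing (illustrative)
y, x = np.mgrid[0:64, 0:64] * dx
u = -(y - y.mean())                      # solid-body rotation, unit angular velocity
v = (x - x.mean())                       # so the true vorticity is omega_z = 2
print(vorticity_z(u, v, dx, dy).mean())  # ~2.0
```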
Extinctions have altered island ecosystems throughout the late Quaternary. Here, we review the main historic drivers of extinctions on islands, patterns in extinction chronologies between islands, and the potential for restoring ecosystems through reintroducing extirpated species. While some extinctions have been caused by climatic and environmental change, most have been caused by anthropogenic impacts. We propose a general model to describe patterns in these anthropogenic island extinctions. Hunting, habitat loss and the introduction of invasive predators accompanied prehistoric settlement and caused declines of endemic island species. Later settlement by European colonists brought further land development, a different suite of predators and new drivers, leading to more extinctions. Extinctions alter ecological networks, causing ripple effects for islands through the loss of ecosystem processes, functions and interactions between species. Reintroduction of extirpated species can help restore ecosystem function and processes, and can be guided by palaeoecology. However, reintroduction projects must also consider the cultural, social and economic needs of humans now inhabiting the islands and ensure resilience against future environmental and climate change.