The aim of this study was to test the hypotheses that differences in residual feed intake (RFI) of beef steers are related to diet sorting, diet nutrient composition, energy intake and apparent digestibility. To phenotype steers for RFI, 69 weaned Angus × Hereford steers were fed individually for 56 days. A finishing diet was fed twice daily on an ad libitum basis to maintain approximately 0.5 to 1.0 kg refusals. Diet offered and refused was measured daily, and DM intakes (DMI) were calculated by difference. Body weights were recorded at 14-day intervals following an 18-h solid feed withdrawal. Residual feed intake was determined as the residual of the regression of DMI on mid-test metabolic BW (BW^0.75) and average daily gain (ADG). Particle size distributions of diet and refusals were determined using the Penn State Particle Separator to quantify diet sorting. Sampling of diet, refusals and feces was repeated in four sampling periods, which occurred during weeks 2, 4, 6 and 8 of the study. Particle size distributions of refusals and diet were analyzed in weeks 2, 4 and 6, and sampling for chemical analysis of refusals and feces occurred in all four periods. Indigestible neutral detergent fiber (288 h in situ) was used as an internal marker of apparent digestibility. We conclude that intakes of particles >19 mm and of particles 4 to 8 mm were negatively correlated with RFI and ADG, respectively. Although steers did sort to consume a different diet composition than offered, diet sorting did not affect energy intake, digestible energy or DM digestibility.
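The RFI computation described above (the residual of a least-squares regression of DMI on mid-test metabolic BW and ADG) can be sketched as follows. The steer data below are invented for illustration and are not from the study:

```python
import numpy as np

# Hypothetical data for 6 steers: DMI (kg/d), mid-test BW (kg), ADG (kg/d).
# Values are illustrative only, not from the study.
dmi = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.0])
bw  = np.array([420.0, 455.0, 410.0, 470.0, 440.0, 450.0])
adg = np.array([1.40, 1.55, 1.35, 1.60, 1.45, 1.50])

# Design matrix: intercept, metabolic BW (BW^0.75), ADG.
X = np.column_stack([np.ones_like(dmi), bw ** 0.75, adg])

# Least-squares fit of DMI on mid-test metabolic BW and ADG.
beta, *_ = np.linalg.lstsq(X, dmi, rcond=None)

# RFI is the residual: observed DMI minus expected DMI.
rfi = dmi - X @ beta

# Residuals of an OLS fit with an intercept sum to ~0 by construction,
# so RFI ranks animals relative to the cohort average.
print(np.round(rfi, 3))
```

A negative RFI identifies a steer that ate less than predicted for its size and gain, i.e. a more feed-efficient animal.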
Provision of critical care and resuscitation was not practical during early missions into space. Given likely advancements in commercial spaceflight and increased human presence in low Earth orbit (LEO) in the coming decades, development of these capabilities should be considered as the likelihood of emergent medical evacuation increases.
PubMed, Web of Science, Google Scholar, National Aeronautics and Space Administration (NASA) Technical Server, and Defense Technical Information Center were searched from inception to December 2018. Articles specifically addressing critical care and resuscitation during emergency medical evacuation from LEO were selected. Evidence was graded using Oxford Centre for Evidence-Based Medicine guidelines.
The search resulted in 109 articles included in the review with a total of 2,177 subjects. There were two Level I systematic reviews, 33 Level II prospective studies with 647 subjects, seven Level III retrospective studies with 1,455 subjects, and two Level IV case series with four subjects. There were two Level V case reports and 63 pertinent review articles.
The development of a medical evacuation capability is an important consideration for future missions. This review revealed potential hurdles in the design of a dedicated LEO evacuation spacecraft. The ability to provide critical care and resuscitation during transport is likely to be limited by mass, volume, cost, and re-entry forces. Stabilization and treatment of the patient should be performed prior to departure, if possible, and emphasis should be on a rapid and safe return to Earth for definitive care.
Objectives: To examine factors that influence decision-making, preferences, and plans related to advance care planning (ACP) and end-of-life care among persons with dementia and their caregivers, and to examine how these may differ by race.
Setting: 13 geographically dispersed Alzheimer’s Disease Centers across the United States.
Participants: 431 racially diverse caregivers of persons with dementia.
Measurements: Survey on “Care Planning for Individuals with Dementia.”
The respondents were knowledgeable about dementia and hospice care, indicated the person with dementia would want comfort care at the end stage of illness, and reported high levels of both legal ACP (e.g., living will; 87%) and informal ACP discussions (79%) for the person with dementia. However, notable racial differences were present. Relative to white persons with dementia, African American persons with dementia were reported to have a lower preference for comfort care (81% vs. 58%) and lower rates of completion of legal ACP (89% vs. 73%). Racial differences in ACP and care preferences were also reflected in geographic differences. Additionally, African American study partners had a lower level of knowledge about dementia and reported a greater influence of religious/spiritual beliefs on the desired types of medical treatments. Notably, all respondents indicated that more information about the stages of dementia and end-of-life health care options would be helpful.
Educational programs may be useful in reducing racial differences in attitudes towards ACP. These programs could focus on the clinical course of dementia and issues related to end-of-life care, including the importance of ACP.
At GE Research, we are combining “physics” with artificial intelligence and machine learning to advance manufacturing design, processing, and inspection, turning innovative technologies into real products and solutions across our industrial portfolio. This article provides a snapshot of how this physical plus digital transformation is evolving at GE.
Shiga toxin-producing Escherichia coli (STEC) infection can cause serious illness including haemolytic uraemic syndrome. The role of socio-economic status (SES) in differential clinical presentation and exposure to potential risk factors amongst STEC cases has not previously been reported in England. We conducted an observational study using a dataset of all STEC cases identified in England, 2010–2015. Odds ratios for clinical characteristics of cases and foodborne, waterborne and environmental risk factors were estimated using logistic regression, stratified by SES, adjusting for baseline demographic factors. Incidence was higher in the highest SES group compared to the lowest (RR 1.54, 95% CI 1.19–2.00). Odds of Accident and Emergency attendance (OR 1.35, 95% CI 1.10–1.75) and hospitalisation (OR 1.71, 95% CI 1.36–2.15) because of illness were higher in the most disadvantaged compared to the least, suggesting potential lower ascertainment of milder cases or delayed care-seeking behaviour in disadvantaged groups. Advantaged individuals were significantly more likely to report salad/fruit/vegetable/herb consumption (OR 1.59, 95% CI 1.16–2.17), non-UK or UK travel (OR 1.76, 95% CI 1.40–2.27; OR 1.85, 95% CI 1.35–2.56) and environmental exposures (walking in a paddock, OR 1.82, 95% CI 1.22–2.70; soil contact, OR 1.52, 95% CI 1.09–2.13), suggesting other unmeasured risks, such as person-to-person transmission, could be more important in the most disadvantaged group.
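The odds ratios above come from logistic regression adjusted for demographic factors. As a simplified illustration of the unadjusted calculation behind such estimates, an odds ratio and Wald 95% CI can be computed from a 2×2 exposure table; the counts below are hypothetical, not from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under the Wald approximation.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10 exposed cases, 90 exposed non-cases,
# 5 unexposed cases, 95 unexposed non-cases.
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Unlike this crude calculation, the study's logistic regression additionally adjusts for covariates and stratifies by SES.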
In benefit-cost analysis, fatality risk reductions are usually valued based on estimates of adults’ willingness to pay for changes in their own risks, regardless of whether the risk reduction accrues to adults or children. This approach reflects the relatively large number of valuation studies that address adults; however, the literature on children is growing. We review these studies, focusing on those that estimate values for both adults and children using a consistent approach to limit the effects of between-study variability. We rely on explicit selection criteria to identify studies that measure reasonably comparable outcomes and are candidates for application to analyses of U.S. policies. The ratio of values for children to values for adults ranges from 0.6 to 2.9; however, most estimates are greater than 1.5. Although some studies suggest that the divergence between child and adult values decreases as the child ages, this finding is not universal. We conclude that analysts should test the sensitivity of their results to the use of higher values for children than adults. Additional empirical research is needed to support more precise estimates of the variation in values by age that can be featured in the primary analysis.
Recent infection testing algorithms (RITA) for HIV combine serological assays with epidemiological data to determine likely recent infections, indicators of ongoing transmission. In 2016, we integrated RITA into national HIV surveillance in Ireland to better inform HIV prevention interventions. We determined the avidity index (AI) of new HIV diagnoses and linked the results with data captured in the national infectious disease reporting system. RITA classified a diagnosis as recent based on an AI < 1.5, unless epidemiological criteria (CD4 count <200 cells/mm3; viral load <400 copies/ml; the presence of AIDS-defining illness; prior antiretroviral therapy use) indicated a potential false-recent result. Of 508 diagnoses in 2016, we linked 448 (88.1%) to an avidity test result. RITA classified 12.5% of diagnoses as recent, with the highest proportion (26.3%) amongst people who inject drugs. On multivariable logistic regression, recent infection was more likely with a concurrent sexually transmitted infection (aOR 2.59; 95% CI 1.04–6.45). Data were incomplete for at least one RITA criterion in 48% of cases. The study demonstrated the feasibility of integrating RITA into routine surveillance and showed some ongoing HIV transmission. To improve the interpretation of RITA, further efforts are needed to increase the completeness of the required epidemiological data.
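The RITA decision rule described above (classify as recent when AI < 1.5, unless an epidemiological criterion flags a potential false-recent result) can be sketched as a small function. The function name and the use of `None` for missing data are illustrative, not from the surveillance system:

```python
def classify_rita(avidity_index, cd4=None, viral_load=None,
                  aids_illness=False, prior_art=False):
    """Classify an HIV diagnosis as 'recent' or 'long-standing' using the
    RITA rule described above. cd4 in cells/mm3, viral_load in copies/ml;
    None means the datum was not captured."""
    if avidity_index >= 1.5:
        return "long-standing"
    # Low avidity, but epidemiological criteria indicating a potential
    # false-recent result override the serological classification.
    false_recent = (
        (cd4 is not None and cd4 < 200)
        or (viral_load is not None and viral_load < 400)
        or aids_illness
        or prior_art
    )
    return "long-standing" if false_recent else "recent"

print(classify_rita(1.2))           # low avidity, no overriding criteria
print(classify_rita(1.2, cd4=150))  # low avidity but low CD4 count
print(classify_rita(2.0))           # high avidity
```

Note that with 48% of cases missing at least one criterion, a production implementation would also need an explicit policy for incomplete records rather than treating missing data as absent.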
Investing in global health and development requires making difficult choices about what policies to pursue and what level of resources to devote to different initiatives. Methods of economic evaluation are well established and widely used to quantify and compare the impacts of alternative investments. However, if not well conducted and clearly reported, these evaluations can lead to erroneous conclusions. Differences in analytic methods and assumptions can obscure important differences in impacts. To increase the comparability of these evaluations, improve their quality, and expand their use, this special issue includes a series of papers developed to support reference case guidance for benefit-cost analysis. In this introductory article, we discuss the background and context for this work, summarize the process we are following, describe the overall framework, and introduce the articles that follow.
The estimates used to value mortality risk reductions are a major determinant of the benefits of many public health and environmental policies. These estimates (typically expressed as the value per statistical life, VSL) describe the willingness of those affected by a policy to exchange their own income for the risk reductions they experience. While these values are relatively well studied in high-income countries, less is known about the values held by lower-income populations. We identify 26 studies conducted in the 172 countries considered low- or middle-income in any of the past 20 years; several have significant limitations. Thus there are few or no direct estimates of VSL for most such countries. Instead, analysts typically extrapolate values from wealthier countries, adjusting only for income differences. This extrapolation requires selecting a base value and an income elasticity that summarizes the rate at which VSL changes with income. Because any such approach depends on assumptions of uncertain validity, we recommend that analysts conduct a standardized sensitivity analysis to assess the extent to which their conclusions change depending on these estimates. In the longer term, more research on the value of mortality risk reductions in low- and middle-income countries is essential.
Increased use of dicamba and/or glyphosate in dicamba/glyphosate-tolerant soybean might affect many sensitive crops, including potato. The objective of this study was to determine the growth and yield of ‘Russet Burbank’ potato grown from seed tubers (generation 2) from mother plants (generation 1) treated with dicamba (4, 20, and 99 g ae ha−1), glyphosate (8, 40, and 197 g ae ha−1), or a combination of dicamba and glyphosate during tuber initiation. Generation 2 tubers were planted near Oakes and Inkster, ND, in 2016 and 2017, at the same research farm where the generation 1 tubers were grown the previous year. Treatment with 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate reduced emergence of generation 2 plants by up to 84%, 86%, and 87%, respectively, at 5 wk after planting. Total tuber yield of generation 2 was reduced by up to 67%, 55%, and 68% when 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate, respectively, was applied to generation 1 plants. In each site-year, 197 g ha−1 glyphosate reduced total yield and marketable yield, while 99 g ha−1 dicamba reduced total yield and marketable yield in some site-years. This study confirms that exposure of potato grown for seed tubers to glyphosate and dicamba can negatively affect the growth and yield potential of the subsequently grown daughter generation.
Dicamba may be an efficacious option for the control of glyphosate-resistant (GR) horseweed in glyphosate/dicamba-resistant soybean; research is needed to optimize the application rate based on horseweed height at the time of application. The purpose of this study was to determine the effect of glyphosate/dicamba rate and application timing on the control of GR horseweed. Glyphosate/dicamba was applied at three rates (900, 1,350, and 1,800 g ae ha−1) at three application timings based on horseweed height (5, 15, and 25 cm) in a factorial design. There was no interaction between glyphosate/dicamba rate and timing for GR horseweed control or soybean yield; however, there was an interaction for GR horseweed density and biomass. At 2 and 4 wk after application (WAA), GR horseweed control decreased as horseweed height at the time of application increased. At 4 WAA, the application of glyphosate/dicamba to GR horseweed that was 5-, 15-, and 25-cm tall provided 87%, 76%, and 62% control, respectively. There was no impact of glyphosate/dicamba application timing on soybean yield. At 2, 4, and 8 WAA, GR horseweed control increased as the rate of glyphosate/dicamba increased. At 8 WAA, glyphosate/dicamba applied at 900, 1,350, and 1,800 g ae ha−1 controlled GR horseweed by 76%, 87%, and 92%, respectively. Earlier application timings and higher rates of glyphosate/dicamba caused the greatest reductions in GR horseweed density and biomass. Reduced GR horseweed competition resulted in a 100% to 144% increase in soybean yield, but there was no difference in soybean yield among the glyphosate/dicamba rates tested.
Sexual minority youth have elevated suicidal ideation and self-harm compared with heterosexual young people; however, evidence for mediating mechanisms is predominantly cross-sectional. Using a longitudinal design, we investigated self-esteem and depressive symptoms as mediators of increased rates of suicidal ideation or self-harm (SISH) among sexual minority youth, and the roles of childhood gender nonconformity (CGN) and sex as moderators of these relationships.
In total, 4274 youth from the Avon Longitudinal Study of Parents and Children (ALSPAC) cohort reported sexual orientation at age 15 years, and past-year SISH at age 20 years. Self-esteem and depressive symptoms were assessed at ages 17 and 18 years, respectively. CGN was measured at 30–57 months. Covariates included sociodemographic variables and earlier measures of mediator and outcome variables. Mediation pathways were assessed using structural equation modelling.
Sexual minority youth (almost 12% of the sample) were three times more likely than heterosexual youth to report past-year SISH (95% confidence interval 2.43–3.64) at 20 years. Two mediation pathways were identified: a single mediator pathway involving self-esteem and a multiple-mediated pathway involving self-esteem and depressive symptoms. Although CGN was associated with past-year SISH, it did not moderate any mediation pathways and there was no evidence for moderation by sex.
Lower self-esteem and increased depressive symptoms partly explain the increased risk for later suicidal ideation and self-harm in sexual minority youth. Preventive strategies could include self-esteem-enhancing or protecting interventions, especially in female sexual minority youth, and treatment of depression.
Determining bilingual status has been complicated by varying interpretations of what it means to be bilingual and how to quantify bilingual experience. We examined multiple indices of language dominance (self-reported proficiency, self-reported exposure, expressive language knowledge, receptive language knowledge, and a hybrid), and whether these profiles related to performance on linguistic and cognitive tasks. Participants were administered receptive and expressive vocabulary tasks in English and Spanish, and a nonlinguistic spatial Stroop task. Analyses revealed a relation between dominance profiles and cognate and nonlinguistic Stroop effects, with somewhat different patterns emerging across measures of language dominance and variable type (continuous, categorical). Only a hybrid definition of language dominance accounted for cognate effects in the dominant language, as well as nonlinguistic spatial Stroop effects. Findings suggest that nuanced effects, such as cross-linguistic cognate effects in a dominant language and cognitive control abilities, may be particularly sensitive to operational definitions of language status.
There is an increasing incidence of overweight/obesity and mental health disorders in young adults, and the two conditions often coexist. We aimed to investigate the influence of antenatal and postnatal factors that may underlie this association, with a focus on maternal prenatal smoking, socio-economic status and gender. Data from the Western Australian Pregnancy Cohort (Raine) Study (women enrolled 1989–1991) including 1056 offspring aged 20 years (cohort recalled 2010–2012) were analyzed (2015–2016) using multivariable models for associations between offspring depression scores (DASS-21 Depression-scale) and body mass index (BMI), adjusting for pregnancy and early life factors and offspring behaviours. There was a significant positive relationship between offspring depression-score and BMI independent of gender and other psychosocial covariates. There was a significant interaction between maternal prenatal smoking and depression-score (interaction coefficient=0.096; 95% CI: 0.006, 0.19, P=0.037), indicating the relationship between depression-score and BMI differed according to maternal prenatal smoking status. In offspring of maternal prenatal smokers, a positive association between BMI and depression-score (coefficient=0.133; 95% CI: 0.05, 0.21, P=0.001) equated to a 1.1 kg/m2 increase in BMI for every 1 standard deviation (8 units) increase in depression-score. Substituting low family income during pregnancy for maternal prenatal smoking in the interaction (interaction coefficient=0.091; 95% CI: 0.01, 0.17, P=0.027) showed a positive association between BMI and depression-score only among offspring of mothers with a low family income during pregnancy (coefficient=0.118; 95% CI: 0.06, 0.18, P<0.001). There were no significant effects of gender on these associations.
Whilst further studies are needed to determine whether these associations are supported in other populations, they suggest potentially important maternal behavioural and socio-economic factors that identify individuals vulnerable to the coexistence of obesity and depression in early adulthood.
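As a quick arithmetic check of the effect size reported above: the coefficient of 0.133 kg/m2 of BMI per unit of depression-score, scaled by 8 units (1 standard deviation), reproduces the quoted 1.1 kg/m2 figure:

```python
# Coefficient from the abstract: BMI change (kg/m2) per 1-unit
# increase in DASS-21 depression-score, in offspring of smokers.
coef_per_unit = 0.133
# 1 standard deviation of depression-score = 8 units (per the abstract).
sd_units = 8

bmi_change_per_sd = coef_per_unit * sd_units
print(round(bmi_change_per_sd, 1))  # rounds to the reported 1.1 kg/m2
```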
Background: Neurogenic orthostatic hypotension (NOH) is characterized by a reduction in systolic blood pressure of ≥20 mmHg or diastolic blood pressure of ≥10 mmHg within three minutes of upright posture. NOH is prevalent in the elderly, a population at increased risk for cognitive decline; it is therefore important to investigate whether there is a relationship between NOH and impaired cognition. Methods: Currently, 9 control subjects and 4 NOH patients have been recruited. Cognitive function is assessed using the symbol digit modalities test (SDMT), which assesses information processing speed, and the Stroop test, which measures response inhibition. The SDMT and Stroop test are administered with the tilt table supine and during tilt. Results: NOH patients scored significantly worse on the SDMT when lying (p=0.018) and standing (p=0.004) compared to the control group. Control subjects performed significantly better when standing for both the SDMT (p=0.008) and Stroop (p=0.026), whereas NOH patients had similar scores when lying and standing for both tests. Conclusions: Preliminary results show that information processing speed is slower in NOH patients than controls in both the supine and standing positions. NOH patients have a more difficult time inhibiting unwanted responses compared to controls when standing, which is reflected in a greater interference score in NOH patients.
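The blood-pressure criterion in the Background (a fall of ≥20 mmHg systolic or ≥10 mmHg diastolic within three minutes of upright posture) reduces to a simple threshold check. This sketch is illustrative only; it assumes readings taken at the correct times and omits the distinction between neurogenic and non-neurogenic causes:

```python
def meets_oh_criterion(supine_sbp, supine_dbp, upright_sbp, upright_dbp):
    """True if the drop from supine to upright blood pressure (mmHg),
    measured within three minutes of standing or head-up tilt, meets the
    orthostatic hypotension threshold described above."""
    sbp_fall = supine_sbp - upright_sbp
    dbp_fall = supine_dbp - upright_dbp
    return sbp_fall >= 20 or dbp_fall >= 10

# Hypothetical readings (SBP/DBP): 140/85 supine dropping to 115/80
# upright meets the systolic criterion; 120/80 to 110/74 meets neither.
print(meets_oh_criterion(140, 85, 115, 80))
print(meets_oh_criterion(120, 80, 110, 74))
```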
Sheep are seasonally polyoestrous, short-day breeders. Although the domesticated breeds have longer breeding seasons than the feral breeds, their maximum ovulation rates are achieved only over a relatively short period. The effect of these seasonal shifts in ovarian response on the success of ovum recovery for genetic improvement or breed conservation is unknown. The aim of the present study was to assess the efficiency of ovum recovery procedures for genetic conservation outwith the normal breeding season.
We present first results from a coordinated multiwavelength study of the neutron star low-mass X-ray binary EXO 0748-676. Fast UV, X-ray, and optical data were obtained, including both spectral and timing information. We discuss how this study allows us to probe the temperature distribution within the binary and hence the geometry and efficiency of X-ray irradiation.
Objectives: This study investigated the relationship between close proximity to detonated blast munitions and cognitive functioning in OEF/OIF/OND Veterans. Methods: A total of 333 participants completed a comprehensive evaluation that included assessment of neuropsychological functions, psychiatric diagnoses and history of military and non-military brain injury. Participants were assigned to a Close-Range Blast Exposure (CBE) or Non-Close-Range Blast Exposure (nonCBE) group based on whether they had reported being exposed to at least one blast within 10 meters. Results: Groups were compared on principal component scores representing the domains of memory, verbal fluency, and complex attention (empirically derived from a battery of standardized cognitive tests), after adjusting for age, education, PTSD diagnosis, sleep quality, substance abuse disorder, and pain. The CBE group showed poorer performance on the memory component. Rates of clinical impairment were significantly higher in the CBE group on select CVLT-II indices. Exploratory analyses examined the effects of concussion and multiple blasts on test performance and revealed that number of lifetime concussions did not contribute to memory performance. However, accumulating blast exposures at distances greater than 10 meters did contribute to poorer performance. Conclusions: Close proximity to detonated blast munitions may impact memory, and Veterans exposed to close-range blast are more likely to demonstrate clinically meaningful deficits. These findings were observed after statistically adjusting for comorbid factors. Results suggest that proximity to blast should be considered when assessing for memory deficits in returning Veterans. Comorbid psychiatric factors may not entirely account for cognitive difficulties. (JINS, 2018, 24, 466–475)
Individuals of lower socioeconomic status (SES) display increased attentiveness to others and greater prosocial behavior compared to individuals of higher SES. We situate these effects within Pepper & Nettle's contextually appropriate response framework of SES. We argue that increased prosocial behavior is a contextually adaptive response for lower-SES individuals that serves to increase control over their more threatening social environments.