Objectives:
Higher inflammation has been linked to poor physical and mental health outcomes and mortality, but few studies have rigorously examined whether changes in perceived stress and depressive symptoms are associated with increased inflammation within family caregivers and non-caregivers in a longitudinal design.
Design:
Longitudinal Study.
Setting:
REasons for Geographic And Racial Differences in Stroke cohort study.
Participants:
Participants included 239 individuals who were not caregivers at baseline but transitioned to providing substantial and sustained caregiving over time. They were initially matched to 241 non-caregiver comparisons on age, sex, race, education, marital status, self-rated health, and history of cardiovascular disease. Blood was drawn from both groups at baseline and again at follow-up approximately 9.3 years later.
Measurements:
Perceived Stress Scale, Center for Epidemiologic Studies Depression Scale, and inflammatory biomarkers, including high-sensitivity C-reactive protein, D-dimer, tumor necrosis factor alpha receptor 1, interleukin (IL)-2, IL-6, and IL-10, measured at baseline and follow-up.
Results:
Although caregivers showed significantly greater worsening in perceived stress and depressive symptoms at follow-up compared to non-caregivers, there were few significant associations between depressive symptoms or perceived stress and inflammation for either group. Inflammation, however, was associated with multiple demographic and health variables, including age, race, obesity, and use of medications for hypertension and diabetes, for both caregivers and non-caregivers.
Conclusions:
These findings illustrate the complexity of studying the associations between stress, depressive symptoms, and inflammation in older adults, where these associations may depend on demographic, disease, and medication effects. Future studies should examine whether resilience factors may prevent increased inflammation in older caregivers.
End-of-life care (EOLC) communication is beneficial but underutilized, particularly in conditions with a variable course such as chronic obstructive pulmonary disease (COPD) and congestive heart failure (CHF). Physicians’ emotional distress intolerance has been identified as a barrier to EOLC communication. However, studies of emotional distress intolerance in EOLC have largely relied on anecdotal reports, qualitative data, or observational studies of physician–patient communication. A free-standing measure of multiple dimensions of distress tolerance is warranted to enable the identification of individuals experiencing distress intolerance and to facilitate the effective targeting of interventions to improve distress tolerance.
Objectives
This study provides preliminary data on the reliability and validity of the Physician Distress Intolerance (PDI) scale. We examine potential subdimensions of emotional distress intolerance.
Method
Family medicine and internal medicine physicians completed the PDI, read vignettes describing patients with COPD or CHF, and indicated whether they initiated or delayed EOLC communication with their patients with similar conditions.
Results
Exploratory and confirmatory factor analyses were performed on separate samples. Confirmatory factor analysis confirmed that a three-factor solution was superior to a two- or one-factor solution. Three subscales were created: Anticipating Negative Emotions, Intolerance of Uncertainty, and Iatrogenic Harm. The full scale and subscales had adequate internal consistency and demonstrated evidence of validity. Higher scores on the PDI, indicating greater distress intolerance, were negatively associated with initiation and positively associated with delay of EOLC communication. Subscales provided unique information.
Significance of results
The PDI can contribute to research investigating and addressing emotional barriers to EOLC communication.
Diets varying in SFA and MUFA content can impact glycaemic control; however, whether underlying differences in genetic make-up can influence blood glucose responses to these dietary fatty acids is unknown. We examined the impact of dietary oils varying in SFA/MUFA content on changes in blood glucose levels (primary outcome) and whether these changes were modified by variants in the stearoyl-CoA desaturase (SCD) gene (secondary outcome). Obese men and women participating in the randomised, crossover, isoenergetic, controlled-feeding Canola Oil Multicenter Intervention Trial II consumed three dietary oils for 6 weeks, with washout periods of ~6 weeks between each treatment. Diets studied included a high SFA/low MUFA Control oil (36·6 % SFA/28·2 % MUFA), a conventional canola oil (6·2 % SFA/63·1 % MUFA) and a high-oleic acid canola oil (5·8 % SFA/74·7 % MUFA). No differences in fasting blood glucose were observed following the consumption of the dietary oils. However, when stratified by SCD genotypes, significant SNP-by-treatment interactions on blood glucose response were found with additive models for rs1502593 (P = 0·01), rs3071 (P = 0·02) and rs522951 (P = 0·03). The interaction for rs3071 remained significant (P = 0·005) when analysed with a recessive model, where individuals carrying the CC genotype showed an increase (0·14 (sem 0·09) mmol/l) in blood glucose levels with the Control oil diet, but reductions in blood glucose with both MUFA oil diets. Individuals carrying the AA and AC genotypes experienced reductions in blood glucose in response to all three oils. These findings identify a potential new target for personalised nutrition approaches aimed at improving glycaemic control.
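The additive and recessive models mentioned above refer to how a SNP genotype is numerically coded before testing the SNP-by-treatment interaction. A minimal sketch of the two codings, using the A/C alleles of rs3071 as an illustration (the allele labels and function names here are assumptions, not taken from the study):

```python
# Hypothetical genotype coding for an A/C SNP such as rs3071.
# Additive model: count copies of the minor allele (AA=0, AC=1, CC=2).
# Recessive model: 1 only for minor-allele homozygotes (CC), else 0.

def additive_code(genotype: str, minor_allele: str = "C") -> int:
    """Count copies of the minor allele in a two-letter genotype string."""
    return sum(1 for allele in genotype if allele == minor_allele)

def recessive_code(genotype: str, minor_allele: str = "C") -> int:
    """Return 1 if the genotype is homozygous for the minor allele."""
    return 1 if genotype == minor_allele * 2 else 0
```

In practice the coded value would enter a regression alongside a treatment indicator and their product term, and the interaction P-value tests whether the treatment effect differs by genotype.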
Maltreatment during development is associated with epigenetic changes to the genome. Enhancing caregiving may mitigate these effects. Attachment and Biobehavioral Catch-Up (ABC) is an intervention that has been shown to improve parent–child relationships and a variety of biological and behavioral outcomes among children who are involved in Child Protective Services. This preliminary study, using a small sample size, explored whether children who received ABC exhibit different methylation patterns than those who received a control intervention. The participants included 23 children aged 6–21 months who were randomized to receive ABC (n = 12) or a control intervention (n = 11). While the children displayed similar methylation patterns preintervention, DNA methylation varied between the ABC and control groups at 14,828 sites postintervention. Functional pathway analyses indicated that these differences were associated with gene pathways that are involved in cell signaling, metabolism, and neuronal development. This study is one of the first to explore parenting intervention effects on children's DNA methylation at the whole genome level in infancy. These preliminary findings provide a basis for hypothesis generation in further research with larger-scale studies regarding the malleability of epigenetic states that are associated with maltreatment.
While the Kadi affair has attracted a lot of attention, this Article approaches it from a rarely used contextual theoretical perspective of resolving institutional conflicts through reflexive sincere cooperation. The argument is short and simple: The institutional relationship between the EU judiciary and the UN Security Council should have been conducted not in strategic-pragmatic terms motivated by institutional power-plays, but rather by genuine pluralist institutional cooperation. The argument is preceded by an in-depth analysis of the theoretical and concrete practical shortcomings stemming from the lack of institutional cooperation between the UN and the EU in the Kadi affair. These shortcomings were not inevitable, as the EU and the UN legal and political systems are already connected with a whole set of bridging mechanisms. These should be, however, strengthened and their use should be made more common. In order to achieve that, the Article suggests an amendment to the Statute of the Court of Justice of the EU and further improvement of the safeguards in the UN Security Council sanctioning mechanisms procedures. There is no dilemma: Enhanced institutional cooperation between the institutions of the two systems will work to their mutual advantage as well as, most importantly, maintain the rights and liberties of individuals like Kadi.
We reviewed all patients who were supported with extracorporeal membrane oxygenation and/or ventricular assist device at our institution in order to describe diagnostic characteristics and assess mortality.
Methods
A retrospective cohort study was performed including all patients supported with extracorporeal membrane oxygenation and/or ventricular assist device from our first case (8 October, 1998) through 25 July, 2016. The primary outcome of interest was mortality, which was modelled by the Kaplan–Meier method.
Results
A total of 223 patients underwent 241 extracorporeal membrane oxygenation runs. Median support time was 4.0 days, ranging from 0.04 to 55.8 days, with a mean of 6.4±7.0 days. Mean (±SD) age at initiation was 727.4 days (±146.9 days). Indications for extracorporeal membrane oxygenation were stratified by primary indication: cardiac extracorporeal membrane oxygenation (n=175; 72.6%) or respiratory extracorporeal membrane oxygenation (n=66; 27.4%). The most frequent diagnosis for cardiac extracorporeal membrane oxygenation patients was hypoplastic left heart syndrome or hypoplastic left heart syndrome-related malformation (n=55 patients with HLHS who underwent 64 extracorporeal membrane oxygenation runs). For respiratory extracorporeal membrane oxygenation, the most frequent diagnosis was congenital diaphragmatic hernia (n=22). A total of 24 patients underwent 26 ventricular assist device runs. Median support time was 7 days, ranging from 0 to 75 days, with a mean of 15.3±18.8 days. Mean age at initiation of ventricular assist device was 2530.8±660.2 days (6.93±1.81 years). Cardiomyopathy/myocarditis was the most frequent indication for ventricular assist device placement (n=14; 53.8%). Survival to discharge was 42.2% for extracorporeal membrane oxygenation patients and 54.2% for ventricular assist device patients. Kaplan–Meier 1-year survival was as follows: all patients, 41.0%; extracorporeal membrane oxygenation patients, 41.0%; and ventricular assist device patients, 43.2%. Kaplan–Meier 5-year survival was as follows: all patients, 39.7%; extracorporeal membrane oxygenation patients, 39.7%; and ventricular assist device patients, 43.2%.
Conclusions
This single-institutional 18-year review documents the differential probability of survival for various sub-groups of patients who require support with extracorporeal membrane oxygenation or ventricular assist device. The indication for mechanical circulatory support, underlying diagnosis, age, and setting in which cannulation occurs may affect survival after extracorporeal membrane oxygenation and ventricular assist device. The Kaplan–Meier analyses in this study demonstrate that patients who survive to hospital discharge have an excellent chance of longer-term survival.
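The survival figures above come from the Kaplan–Meier product-limit method, which multiplies conditional survival probabilities at each observed death time while removing censored patients from the risk set. A minimal sketch of the estimator (the example observations are illustrative, not data from this study):

```python
# Minimal Kaplan-Meier product-limit estimator.
# Each observation is a (time, event) pair: event=1 means death at that
# time, event=0 means the patient was censored (e.g., alive at last
# follow-up). Times and events below are made-up illustrations.

def kaplan_meier(observations):
    """Return a list of (time, survival_probability) at each death time."""
    data = sorted(observations)
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        # Group all observations sharing this time point.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
    return curve
```

Because censored patients leave the risk set without forcing the curve down, the estimate uses all follow-up information rather than only patients observed to the end of the study.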
OBJECTIVES/SPECIFIC AIMS: Traditional hospice focuses on symptoms and quality of life (QOL) at the very end of life. Clinical symptoms and QOL in the last 1–2 years of life are also important and may be affected by dementia. Our objective was to characterize how symptoms differ between people with and without dementia in the last years before death and whether symptoms impact social dimensions of QOL. METHODS/STUDY POPULATION: We studied 1270 community-dwelling participants who died between 2011 and 2015 in the National Health and Aging Trends Study, a nationally representative cohort of older adults. From the last interview before death, we examined sensory (vision; hearing), physical (pain; problems with breathing, chewing/swallowing, speaking, upper or lower extremity strength/movement, and balance/coordination), and psychiatric (depression; anxiety; insomnia) symptoms by dementia status. We examined associations between symptoms and participation restrictions (visiting family/friends, attending religious services, participating in clubs/activities, going out for enjoyment, and engaging in favorite activity). RESULTS/ANTICIPATED RESULTS: Low energy (69%), pain (59%), and lower extremity strength/movement problems (56%) were most common. People with dementia (37.3% of decedents) had higher prevalence of all symptoms (p≤0.01), except pain, breathing problems, and insomnia. Dementia and greater symptom burden were independently associated with greater odds of participation restrictions (p<0.05). Problems speaking were significantly associated with limitations in all activities except for attending religious services. Balance/coordination, energy, and strength/movement problems were associated with limitations in 3 activities. DISCUSSION/SIGNIFICANCE OF IMPACT: Sensory, physical, and psychiatric symptoms are common in the year before death, with greater symptom prevalence in people with dementia. 
Both dementia and symptoms are associated with restrictions in participation. Older patients may benefit not only from earlier emphasis on palliative care but also programs and assistive devices that accommodate physical impairments.
In recent years, satellite imagery, previously restricted to the defence and intelligence communities, has been made available to a range of non-state actors as well. Non-governmental organisations, journalists, and celebrities such as George Clooney now use remote sensing data like digital Sherlock Holmeses to investigate and reveal human rights abuses, political violence, environmental destruction, and eco-crimes from a distance. It is often said that the increasing availability and applicability of remote sensing technologies has contributed to the rise of what can be called ‘satellite-based activism’ empowering non-state groups to challenge state practices of seeing and showing. In this article we argue that NGO activism is not challenging the sovereign gaze of the state but, on the contrary, actually reinforcing it. We will bolster our arguments in this regard in two prominent fields of non-governmental remote sensing: human rights and environmental governance.
This article is concerned with the debate about interdisciplinary methods in international law, in particular the turn to International Relations. It finds the historical critique of Martti Koskenniemi grounded in a more methodological issue: the turn toward a redefinition of norm properties impedes the critical discursive quality of law. Shaping this historical critique into a research question that allows for meaningful engagement, the article discusses Koskenniemi’s charges drawing on recent constructivist scholarship. Giving an account of what it means to be ‘obliged’ to obey the law, this article defends the coherence of Koskenniemi’s position and suggests that we should take the critique of the interdisciplinary project between law and International Relations seriously. While it agrees that a significant part of the discourse fails to appreciate the particularities of the law, it suggests that understanding legal obligations requires taking the institutional autonomy of the law into account. Respecting this autonomy, in turn, points to a multi- instead of an interdisciplinary project. The reflexive formalist conception of the law that this article advocates captures the obligating nature of the law, independent of the normative content of particular rules.
A postemergence (POST) timing study was conducted on established populations of burcucumber (Sicyos angulatus) in corn (Zea mays), and a second study examined the residual activity of several herbicides for burcucumber control under greenhouse conditions. In the field study, flumiclorac, halosulfuron, primisulfuron, CGA 152005, and CGA 152005 + primisulfuron (45, 71, 40, 40, and 20 + 20 g ai/ha, respectively) were applied at two POST timings. CGA 152005, primisulfuron, and the combination provided greater than 85% control of burcucumber 14 wk after planting (WAP). Flumiclorac and halosulfuron provided 60% control or less by 8 WAP. Timing of the POST applications did not influence burcucumber control by 11 WAP with any herbicide. In the greenhouse, germinated burcucumber seeds were placed in soil treated with atrazine, chlorimuron, primisulfuron, or CGA 152005 at normal field use rates. All treatments provided similar residual control early; however, by 4 wk after treatment (WAT), control from atrazine was less than 10% compared to 69% for chlorimuron and about 50% for primisulfuron and CGA 152005. This research suggests that CGA 152005 and primisulfuron can both be effective for managing burcucumber in corn, whereas flumiclorac and halosulfuron proved ineffective.
A 2-yr experiment evaluated the effect of spring soil disturbance on the periodicity of weed emergence. At four locations across the northeastern United States, emerged weeds, by species, were monitored every 2 wk in both undisturbed plots and plots tilled in the spring with a rotary cultivator. Eight weed species, including large crabgrass, giant and yellow foxtail, common lambsquarters, smooth pigweed, eastern black nightshade, common ragweed, and velvetleaf, occurred at three or more site-years. Spring soil disturbance either had no effect or reduced total seedling emergence compared with undisturbed soils. Total seedling emergence for large crabgrass, giant foxtail, smooth pigweed, and common ragweed was, on average, 1.4 to 2.6 times lower with spring soil disturbance, whereas eastern black nightshade and velvetleaf were mostly unaffected by the soil disturbance. The influence of soil disturbance on yellow foxtail and common lambsquarters emergence varied between seasons and locations. Although the total number of emerged seedlings was often affected by the soil disturbance, with the exception of yellow foxtail and common ragweed, the periodicity of emergence was similar across disturbed and undisturbed treatments.
A 2-yr experiment repeated at five locations across the northeastern United States evaluated the effect of weed density and time of glyphosate application on weed control and corn grain yield using a single postemergence (POST) application. Three weed densities, designed to reduce corn yields by 10, 25, and 50%, were established across the locations, using forage sorghum as a surrogate weed. At each weed density, a single application of glyphosate at 1.12 kg ai/ha was applied to glyphosate-resistant corn at the V2, V4, V6, and V8 growth stages. At low and medium weed densities, the V4 through V8 applications provided nearly complete weed control and yields equivalent to the weed-free treatment. Weed biomass and the potential for weed seed production from subsequent weed emergence made the V2 timing less effective. At high weed densities, the V4 followed by the V6 timing provided the most effective weed control, while maintaining corn yield. Weed competition from subsequent weed emergence in the V2 application and the duration of weed competition in the V8 timing reduced yield on average by 12 and 15%, respectively. This research shows that single POST applications can be successful but weed density and herbicide timing are key elements.
A 2-yr experiment assessed the potential for using soil degree days (DD) to predict cumulative weed emergence. Emerged weeds, by species, were monitored every 2 wk in undisturbed plots. Soil DD were calculated at each location using a base temperature of 9 C. Weed emergence was fit with logistic regression for common ragweed, common lambsquarters, velvetleaf, giant foxtail, yellow foxtail, large crabgrass, smooth pigweed, and eastern black nightshade. Coefficients of determination for the logistic models fit to the field data ranged between 0.90 and 0.95 for the eight weed species. Common ragweed and common lambsquarters were among the earliest species to emerge, reaching 10% emergence before 150 DD. Velvetleaf, giant foxtail, and yellow foxtail were next, completing 10% emergence by 180 DD. The last weeds to emerge were large crabgrass, smooth pigweed, and eastern black nightshade, which emerged after 280 DD. The developed models were verified by predicting cumulative weed emergence in adjacent plots. The coefficients of determination for the model verification plots ranged from 0.66 to 0.99 and averaged 0.90 across all eight weed species. These results suggest that soil DD are good predictors for weed emergence. Forecasting weed emergence will help growers make better crop and weed management decisions.
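The degree-day approach above accumulates daily heat above a base temperature (9 C in this study) and maps the running total onto a logistic emergence curve. A minimal sketch of both steps; the logistic parameters below are illustrative placeholders, not the fitted values from the study:

```python
# Sketch of a degree-day emergence predictor, assuming a base
# temperature of 9 C as in the study. The logistic curve parameters
# (midpoint, rate) are hypothetical, not the authors' fitted values.
import math

def degree_days(daily_mean_temps_c, base_c=9.0):
    """Accumulate (mean daily temp - base), ignoring days below the base."""
    return sum(max(t - base_c, 0.0) for t in daily_mean_temps_c)

def logistic_emergence(dd, upper=100.0, midpoint=250.0, rate=0.02):
    """Percent cumulative emergence predicted at dd accumulated degree days."""
    return upper / (1.0 + math.exp(-rate * (dd - midpoint)))
```

In practice the midpoint and rate for each species would be estimated by fitting the logistic model to observed cumulative emergence, then checked against independent plots as the authors did for model verification.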
This essay argues that categories of corruption and reform, so often used by historians to assess the Gilded Age, are themselves the ideological products of the period's struggles for political, economic, and social power. It does so by exploring fierce disputes over how to value sugar, a crucial commodity in the political economy of the late nineteenth-century United States. Confronted with evidence of massive fraud, the Treasury hoped that chemical techniques would rationalize the collection of sugar tariffs. Instead their introduction enabled the rise of the notorious Sugar Trust, by making it more difficult to distinguish corrupt influence from the legitimate exercise of expert judgment.
Sugar exemplifies how Gilded Age battles over corruption should be seen in the broader and longer context of the history of capitalism, in which self-proclaimed reformers have used charges of fraud and adulteration to discredit the knowledge of artisans and workers while mantling themselves in claims to objectivity and reason. Scientific knowledge, far from being the inevitable ally of accountability and good governance, could just as easily be deployed to obfuscate and confuse, and thereby to wrest control of social and economic power.
This article shows how John Maynard Keynes's lifelong commitment to eugenics was deeply embedded in his political, economic, and philosophical work. At the turn of the century, eugenics seemed poised to grant industrial nations unprecedented control over their own future, but that potential depended on contested understandings of the biological mechanisms of inheritance. Early in his career, Keynes helped William Bateson, Britain's chief proponent of Mendelian genetics, analyze problems in human heredity. Simultaneously, Keynes publicly opposed the efforts by Francis Galton and Karl Pearson to study inheritance through statistical biometry. For Keynes, this conflict was morally laden: Mendelism incorporated the only ethical theory of uncertainty, while biometry rested on false and dangerous concepts. This early study of heredity shaped Keynes's visions of industrial democracy after 1918. Liberals looked for a system of societal and economic management to engineer an escape from the postwar Malthusian trap. Britain's economic plight, Keynes argued, was rooted in the hereditary weaknesses of its leadership. Successful technocratic liberalism would depend on control over the quality as well as quantity of human beings. Ultimately, in his essay “Economic Possibilities for Our Grandchildren,” Keynes predicted that effective eugenic management would bring about capitalism's end.
Exertional heat illness is a classification of disease with clinical presentations that are not always diagnosed easily. Exertional heat stroke is a significant cause of death in competitive sports, and the increasing popularity of marathon races and ultra-endurance competitions will make treating heat illnesses more common for Emergency Medical Services (EMS) providers. Although evidence is available primarily from case series and healthy volunteer studies, the consensus for treating exertional heat illness, coupled with altered mental status, is whole body rapid cooling. Cold or ice water immersion remains the most effective treatment to achieve this goal. External thermometry is unreliable in the context of heat stress, and direct internal temperature measurement by rectal or esophageal probes must be used when diagnosing heat illness and during cooling. With rapid recognition and implementation of effective cooling, most patients suffering from exertional heat stroke will recover quickly and can be discharged home with instructions to rest and to avoid heat stress and exercise for a minimum of 48 hours; although, further research pertaining to return to activity is warranted.
Pryor RR, Roth RN, Suyama J, Hostler D. Exertional Heat Illness: Emerging Concepts and Advances in Prehospital Care. Prehosp Disaster Med. 2015;30(3):19.