Cognitive and functional impairment after stroke are common, but the relation between cognitive and functional decline after stroke is not well studied.
Methods:
We used the comprehensive cohort in the Canadian Longitudinal Study on Aging to identify those with prior stroke, and we calculated reliable cognitive change scores from baseline to follow-up for the memory and executive domains. Functional decline was defined as an increase in the number of dependent daily activities. Using formal mediation analysis, we tested the presence and degree of mediation of the association between stroke and functional decline by cognitive decline.
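For readers unfamiliar with formal mediation analysis, the "proportion mediated" reported below can be illustrated with a product-of-coefficients sketch. The abstract does not name the software or estimator, and a binary outcome would in practice call for counterfactual-based methods, so the following minimal Python sketch with hypothetical column names (`stroke`, `cog_change`, `func_decline`) is purely illustrative:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def proportion_mediated(df: pd.DataFrame) -> float:
    """Product-of-coefficients mediation with linear working models.

    Hypothetical columns: 'stroke' (0/1 exposure), 'cog_change'
    (mediator), 'func_decline' (outcome).
    """
    # Path a: exposure -> mediator
    a = smf.ols("cog_change ~ stroke", data=df).fit().params["stroke"]
    # Path b (mediator -> outcome) and direct effect c', fit jointly
    fit = smf.ols("func_decline ~ stroke + cog_change", data=df).fit()
    b, c_prime = fit.params["cog_change"], fit.params["stroke"]
    indirect = a * b              # effect transmitted through cognition
    total = indirect + c_prime    # indirect + direct effect
    return indirect / total

def bootstrap_ci(df: pd.DataFrame, n_boot: int = 2000, seed: int = 0):
    """Percentile bootstrap CI for the proportion mediated."""
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_boot):
        resampled = df.iloc[rng.integers(0, len(df), len(df))]
        stats.append(proportion_mediated(resampled))
    return np.percentile(stats, [2.5, 97.5])
```

A percentile bootstrap such as `bootstrap_ci` is a common way to obtain confidence intervals for indirect effects, since their sampling distribution is generally non-normal.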
Results:
There were 22,648 individuals with memory change scores (325 with stroke) and 17,613 individuals with executive change scores (241 with stroke). History of stroke was significantly associated with memory decline (−0.26 standard deviations, 95% CI −0.33 to −0.19), executive decline (−0.22, 95% CI −0.36 to −0.09), and new functional impairment (adjusted odds ratio 2.31, 95% CI 1.80–2.97) over a median follow-up of 3 years. Cognitive decline was a significant mediator of functional decline: memory decline mediated only 5% of the relationship, whereas executive and overall cognitive decline mediated 13% and 22%, respectively.
Conclusion:
Cognitive decline is a mediator of the association between prior stroke and functional decline; consequently, strategies to delay, attenuate, or prevent cognitive decline after stroke may be important to preserving long-term functional status.
Variation exists in the timing of surgery for balanced complete atrioventricular septal defect repair. We sought to explore associations between timing of repair and resource utilisation and clinical outcomes in the first year of life.
Methods:
In this retrospective single-centre cohort study, we included patients who underwent complete atrioventricular septal defect repair between 2005 and 2019. Patients with left or right ventricular outflow tract obstruction and major non-cardiac comorbidities (except trisomy 21) were excluded. The primary outcome was days alive and out of the hospital in the first year of life.
Results:
We included 79 infants, divided into tertiles based on age at surgery (1st: 46–137 days; 2nd: 140–176 days; 3rd: 178–316 days). There were no significant differences among age tertiles in days alive and out of the hospital in the first year of life by univariable analysis (tertile 1, median 351 days; tertile 2, 348 days; tertile 3, 354 days; p = 0.22). No patients died. Fewer post-operative ICU days were used in the oldest tertile relative to the youngest, but days of mechanical ventilation and hospitalisation were similar. Other clinical outcomes and measures of resource utilisation in the first year of life, including unplanned cardiac reinterventions, outpatient cardiology clinic visits, and weight-for-age z-score at 1 year, were also similar across tertiles.
Conclusions:
Age at complete atrioventricular septal defect repair is not associated with important differences in clinical outcomes or resource utilisation in the first year of life.
Cultivation of lowbush blueberry (Vaccinium angustifolium Aiton), an important crop in eastern North America, is unique in that it is carried out over two consecutive growing seasons. Pest management, particularly weed management, is shaped by this biennial cultural practice. The range of weed control methods is narrow, and the system relies heavily on herbicides. Few unique herbicide active ingredients are available, and those that are available are used repeatedly, so the risk of developing resistance is acute. Hair fescue (Festuca filiformis Pourr.), a perennial grass weed, has evolved resistance to hexazinone, a photosystem II inhibitor frequently used in lowbush blueberry production. We show that a phenylalanine-to-isoleucine substitution at position 255 is responsible for a 6.12-fold decrease in sensitivity to hexazinone. Early diagnosis of resistance based on detection of this mutation will alert growers to use alternative control methods and thus help increase the sustainability of the cropping system.
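The abstract does not describe the diagnostic assay itself; as a hypothetical illustration of mutation-based early diagnosis, a sequenced coding region of the photosystem II target gene could be checked for the position-255 substitution roughly as follows (the codon coordinates and the minimal codon table are illustrative assumptions, not the study's protocol):

```python
# Hypothetical sketch: flag the Phe-255-Ile substitution from a coding
# sequence. Codon numbering and assay design are illustrative only.
CODON_TABLE = {
    "TTT": "F", "TTC": "F",              # phenylalanine (wild type)
    "ATT": "I", "ATC": "I", "ATA": "I",  # isoleucine (resistant)
}

def residue_at(cds: str, position: int) -> str:
    """Translate the codon at a 1-based amino-acid position."""
    codon = cds[(position - 1) * 3 : position * 3].upper()
    return CODON_TABLE.get(codon, "X")  # 'X' = neither F nor I

def hexazinone_resistant(cds: str) -> bool:
    """True if position 255 carries isoleucine instead of phenylalanine."""
    return residue_at(cds, 255) == "I"
```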
Infectious disease modeling plays an important role in the response to infectious disease outbreaks, perhaps most notably during the coronavirus disease 2019 (COVID-19) pandemic. In our experience working with state and local governments during COVID-19 and previous public health crises, we have observed that, while the scientific literature focuses on models’ accuracy and underlying assumptions, an important limitation on the effective application of modeling to public health decision-making is the ability of decision-makers and modelers to work together productively. We therefore propose a set of guiding principles, informed by our experience, for working relationships between decision-makers and modelers. We hypothesize that these guidelines will improve the utility of infectious disease modeling for public health decision-making, irrespective of the particular outbreak in question and of the precise modeling approaches being used.
Nest boxes have been used for many decades as tools for conservation and for studying avian population dynamics. Plastic is increasingly used as a nest-box material, but its effects have not previously been investigated. Two consecutive studies were conducted to investigate effects of the nest-box environment on nidicolous parasites, bacteria and fungi, as well as nest success, in blue tits Cyanistes caeruleus and great tits Parus major. The first compared microclimate and parasite and pathogen load in plastic and wooden nest boxes. The second tested the nest protection hypothesis – that birds naturally incorporate aromatic herbs into nests to decrease nest parasites and pathogens – by comparing parasite and pathogen load in plastic nest boxes to which aromatic or non-aromatic plant material was added. No significant difference in nest-box temperature or relative humidity was found between plastic and wooden boxes. Wooden boxes, however, contained 30-fold higher numbers of fleas and a higher total bacterial load on chicks. Fledging success for blue tit broods was significantly higher in wooden boxes. Inclusion of aromatic herbs did not decrease parasite or bacterial loads. The results strengthen the evidence base for nest-box design in support of plastic, which can provide an appropriate alternative to wood, with apparently no difference in microclimate and no increase in the load of measured parasites and pathogens.
Excess sleep is associated with a higher risk of stroke, but it is not well understood whether the risk is modified by age or whether it remains elevated after accounting for the competing risk of death.
Methods:
We used nine years of the Canadian Community Health Survey between 2000 and 2016 to obtain self-reported sleep duration and created a cohort of individuals without prior stroke, heart disease, or cancer. We linked to hospital records to determine subsequent admissions or emergency department visits for acute stroke until December 31, 2017. We used Cox proportional hazards models to determine the association between sleep duration and risk of stroke, assessing for modification by age and sex and adjusting for demographic, vascular, and social factors. We obtained the cumulative incidence of stroke accounting for the competing risk of death.
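As an illustration of the two survival analyses named above (the Cox proportional hazards model and the competing-risk cumulative incidence), a minimal sketch with the Python lifelines library might look like the following; the toy data and column names are hypothetical, not the study's actual variables:

```python
import pandas as pd
from lifelines import CoxPHFitter, AalenJohansenFitter

# Toy data; event codes: 0 = censored, 1 = stroke, 2 = death (competing).
df = pd.DataFrame({
    "years_followup": [5.0, 3.2, 8.1, 2.4, 6.7, 7.5, 1.9, 4.4],
    "event":          [0,   1,   0,   2,   1,   0,   2,   1],
    "sleep_10h":      [0,   1,   0,   1,   0,   0,   0,   1],
    "age":            [55,  62,  48,  71,  66,  59,  74,  63],
})

# Cox proportional hazards model for stroke (death treated as censoring)
cox = CoxPHFitter()
cox.fit(
    df.assign(stroke=(df["event"] == 1).astype(int)).drop(columns="event"),
    duration_col="years_followup",
    event_col="stroke",
)
print(cox.summary["exp(coef)"])  # hazard ratios for sleep_10h and age

# Cumulative incidence of stroke accounting for the competing risk of death
ajf = AalenJohansenFitter()
ajf.fit(df["years_followup"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```

The Aalen-Johansen estimator is the standard non-parametric cumulative incidence estimator when a competing event (here, death) can preclude the event of interest.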
Results:
There were 82,795 individuals in our cohort who met inclusion criteria and had self-reported sleep duration, with 1705 stroke events during follow-up. There was an association between excess sleep (≥10 h/night) and risk of stroke in those <70 years of age (fully adjusted hazard ratio 2.29, 95% CI 1.04–5.06), but not in those ≥70 years, with a similar association after accounting for the competing risk of death.
Conclusion:
Sleep duration ≥10 h/night is associated with increased risk of stroke in those <70 years of age. The findings support current guidelines for 7–9 h of sleep per night. Further research is needed to elucidate the relationship between sleep and cerebrovascular disease.
Mass vaccination campaigns have been used effectively to limit the impact of communicable disease on public health. However, the scale of the coronavirus disease (COVID-19) vaccination campaign is unprecedented. Mass vaccination sites consolidate resources and experience into a single entity and are essential to achieving community ("herd") immunity rapidly, efficiently, and equitably. Health care systems, local and regional public health entities, emergency medical services, and private organizations can rapidly come together to solve problems and achieve success. As medical directors at several mass vaccination sites across the United States, we describe key mass vaccination site concepts, including site selection, operational models, patient flow, inventory management, staffing, technology, reporting, medical oversight, communication, and equity. Lessons learned from operating a diverse group of mass vaccination sites will not only help inform sites operating during the current pandemic but may also serve as a blueprint for future outbreaks of highly infectious communicable disease.
Health utility instruments are increasingly being used to measure impairment in health-related quality of life (HRQoL) after stroke. Population-based studies of HRQoL after stroke and assessment of differences by age and functional domain are needed.
Methods:
We used the Canadian Community Health Survey linked with administrative databases to determine HRQoL using the Health Utilities Index Mark 3 (HUI3) among those with prior hospitalization or emergency department visit for stroke and compared to controls without stroke. We used multivariable linear regression to determine the difference in HUI3 between those with stroke and controls for the global index and individual attributes, with assessment for modification by age (<60, 60–74, and 75+ years) and sex, and we combined estimates across survey years using random effects meta-analysis.
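The random-effects pooling across survey years can be illustrated with the DerSimonian-Laird estimator, a standard choice; the abstract does not specify the estimator or software, so the following Python sketch with invented inputs is only illustrative:

```python
import numpy as np

def dersimonian_laird(estimates, std_errors):
    """Random-effects pooled estimate via DerSimonian-Laird.

    estimates, std_errors: per-survey-year effect estimates (e.g., the
    adjusted stroke-vs-control difference in HUI3) and their SEs.
    Returns the pooled estimate and its 95% confidence interval.
    """
    y = np.asarray(estimates, dtype=float)
    v = np.asarray(std_errors, dtype=float) ** 2
    w = 1.0 / v                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-year variance
    w_re = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented per-year HUI3 differences and standard errors:
est, ci = dersimonian_laird([-0.18, -0.22, -0.20], [0.03, 0.04, 0.035])
```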
Results:
Our cohort contained 1240 stroke survivors and 123,765 controls and was weighted to be representative of the Canadian household population. Mean health utility was 0.63 (95% confidence interval [CI] 0.58, 0.68) for those with stroke and 0.83 (95% CI 0.82, 0.84) for controls. There was significant modification by age, but not sex, with the greatest adjusted reduction in HUI3 among stroke respondents aged 60–74 years. Individual HUI3 attributes with the largest reductions in utility among stroke survivors compared to controls were mobility, cognition, emotion, and pain.
Conclusions:
In this population-based study, the reduction in HUI3 among stroke survivors compared to controls was greatest among respondents aged 60–74, and in attributes of mobility, cognition, emotion, and pain. These results highlight the persistent impairment of HRQoL in the chronic phase of stroke and potential targets for community support.
To understand the long-term climate and glaciological evolution of the ice sheet in the region bordering the Weddell Sea, the British Antarctic Survey has undertaken a series of successful ice core projects drilling to bedrock on Berkner Island, James Ross Island and the Fletcher Promontory. A new project, WACSWAIN, seeks to extend this knowledge by drilling to bedrock on two further ice rises in this region. In a single-season project, an ice core was recovered to bedrock at 651 m on Skytrain Ice Rise using an ice core drill in a fluid-filled borehole. In a second season, a rapid-access drill was used to recover ice chips to 323 m on Sherman Island in a dry borehole, though it failed to reach bedrock, which was at an estimated depth of 428 m.
There is substantial evidence that voters’ choices are shaped by assessments of the state of the economy and that these assessments, in turn, are influenced by the news. But how does the economic news track the welfare of different income groups in an era of rising inequality? Whose economy does the news cover? Drawing on a large new dataset of US news content, we demonstrate that the tone of the economic news strongly and disproportionately tracks the fortunes of the richest households, with little sensitivity to income changes among the non-rich. Further, we present evidence that this pro-rich bias emerges not from pro-rich journalistic preferences but, rather, from the interaction of the media’s focus on economic aggregates with structural features of the relationship between economic growth and distribution. The findings yield a novel explanation of distributionally perverse electoral patterns and demonstrate how distributional biases in the economy condition economic accountability.
ABSTRACT IMPACT: This work will help to understand a novel therapeutic approach to a common type of acute myeloid leukemia. OBJECTIVES/GOALS: FMS-like tyrosine kinase 3 (FLT3) mutations occur in ~30% of acute myeloid leukemia (AML) cases. FLT3 tyrosine kinase domain (TKD) mutations are particularly important in relapsed/refractory FLT3-mutant AML, which portends a poor prognosis. This study describes a therapeutic approach to overcoming resistance conferred by FLT3-TKD mutations. METHODS/STUDY POPULATION: To understand the efficacy of a novel type 1 FLT3 inhibitor (NCGC1481), as a monotherapy and in combination therapy, several assays were used to interrogate the functionality of these therapies. Cell lines and patient samples containing aspartate-835-to-tyrosine (D835Y, the most common TKD alteration) and phenylalanine-691-to-leucine (F691L) mutations were used to examine the effects of NCGC1481 with and without other targeted therapies such as MEK inhibitors. Specifically, assays measuring viability, cell death by flow cytometry, in vitro clonogenicity, cellular signaling, and xenograft survival were examined in these FLT3-TKD AML models. Synergy was measured using well-described methods, which also allowed appropriate dose finding for drug combination experiments. RESULTS/ANTICIPATED RESULTS: Our novel type 1 FLT3 inhibitor (NCGC1481) was particularly effective against the most common FLT3-TKD mutant, D835Y. NCGC1481 reduced viability and cell signaling, while also inducing cell death and prolonging xenograft survival in the FLT3-D835Y model system. In contrast, clinically approved FLT3 inhibitors were less effective at suppressing AML cells expressing FLT3-D835Y. In the case of FLT3-F691L, most of the FLT3 inhibitors tested, including NCGC1481, suppressed canonical FLT3 signaling but did not significantly reduce viability or leukemic clonogenicity. However, when NCGC1481 was combined with other targeted agents such as MEK inhibitors at synergistic doses, eradication of the FLT3-F691L AML clone was substantially increased. DISCUSSION/SIGNIFICANCE OF FINDINGS: In AML, response to FLT3 inhibitor therapy is often short-lived, with resistance sometimes arising via FLT3-TKD mutations. Given the dismal prognosis of relapsed FLT3-mutant AML, novel therapies are needed. This study describes the efficacy of a novel FLT3 inhibitor, along with its synergistic activity when combined with other targeted agents.
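The abstract cites well-described synergy methods without naming one; one widely used formulation, not necessarily the one applied here, is Bliss independence, sketched below with hypothetical fractional-inhibition values:

```python
import numpy as np

def bliss_excess(fa_a: np.ndarray, fa_b: np.ndarray,
                 fa_ab: np.ndarray) -> np.ndarray:
    """Bliss-independence synergy: observed minus expected inhibition.

    fa_a, fa_b: fractional inhibition (0-1) of each single agent
    (e.g., a FLT3 inhibitor alone, a MEK inhibitor alone) at matched
    doses; fa_ab: fractional inhibition of the combination.
    Positive values indicate synergy under the Bliss model.
    """
    expected = fa_a + fa_b - fa_a * fa_b  # independent-action expectation
    return fa_ab - expected

# Hypothetical viability data along a 3-dose checkerboard diagonal:
excess = bliss_excess(np.array([0.2, 0.4, 0.6]),
                      np.array([0.1, 0.3, 0.5]),
                      np.array([0.35, 0.70, 0.90]))
```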
Nearly three times as many people detained in jails have a serious mental illness (SMI) as in community samples. Once individuals with SMI become involved in the criminal justice system, they are more likely than their counterparts without mental illness to stay in the system, face repeated incarcerations, and return to prison more quickly.
The Cal-DSH Diversion Guidelines provide 10 general guidelines that jurisdictions should consider when developing diversion programs for individuals with a serious mental illness (SMI) who become involved in the criminal justice system. Screening for SMI in a jail setting is reviewed. In addition, important treatment interventions for SMI and substance use disorders are highlighted, along with the need to address criminogenic risk factors.
The World Health Organization (WHO; Geneva, Switzerland) recommends lay first responder (LFR) programs as a first step toward establishing formal Emergency Medical Services (EMS) in low- and middle-income countries (LMICs) to address injury. There is a scarcity of research investigating LFR program development in predominantly rural settings of LMICs.
Study Objective:
A pilot LFR program was launched and assessed over 12 months to investigate the feasibility of leveraging pre-existing transportation providers to scale up prehospital emergency care in rural, low-resource settings of LMICs.
Methods:
An LFR program was established in rural Chad to evaluate curriculum efficacy, using a validated 15-question pre-/post-test to measure participant knowledge improvement. Pre-/post-test score distributions were compared using a Wilcoxon signed-rank test. For test evaluation, each pre-test question was mapped to its corresponding post-test analog and compared using McNemar’s chi-squared test to examine knowledge acquisition on a by-question basis. Longitudinal prehospital care was evaluated with incident reports, while program cost was tracked using a one-way sensitivity analysis. Qualitative follow-up surveys and semi-structured interviews were conducted at 12 months with initial participants and randomly sampled motorcycle taxi drivers, using a constructivist grounded theory approach to understand the factors motivating continued voluntary participation and to inform future program continuity. The consolidated criteria for reporting qualitative research (COREQ) checklist was used to guide design, analysis, and reporting of the qualitative results.
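Both named tests are standard; a minimal Python sketch of the paired Wilcoxon signed-rank comparison and a per-question McNemar’s test, using invented scores and counts, might look like this:

```python
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

# Invented paired scores (out of 15) for five participants
pre  = np.array([7, 9, 6, 10, 8])
post = np.array([12, 13, 11, 14, 12])
print(wilcoxon(pre, post))  # paired, non-parametric test of score change

# Per-question McNemar's test: pre vs post correctness for one question
#                 post correct   post incorrect
table = np.array([[40,            5],    # pre correct
                  [30,           33]])   # pre incorrect
print(mcnemar(table, exact=True))
```

McNemar’s test uses only the discordant cells (here 5 and 30), which is what makes it appropriate for paired before/after correctness on the same participants.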
Results:
A total of 108 motorcycle taxi participants demonstrated significant knowledge improvement (P <.001) across three of four curricular categories: scene safety, airway and breathing, and bleeding control. Lay first responders treated 71 patients over six months, encountering five deaths, and provided patient transport in 82% of encounters. Lay first responders reported an average confidence score of 8.53/10 (n = 38). In qualitative follow-up surveys and semi-structured interviews, the ability to care for the injured, new knowledge/skills, and the resultant gain in social status and customer acquisition motivated continued involvement as LFRs. Ninety-six percent of untrained, randomly sampled motorcycle taxi drivers reported they would be willing to pay to participate in future training courses.
Conclusion:
Lay first responder programs appear feasible and cost-effective in rural LMIC settings. Participants demonstrate significant knowledge acquisition, and after 12 months of providing emergency care, report sustained voluntary participation due to social and financial benefits, suggesting sustainability and scalability of LFR programs in low-resource settings.
There is a continual need for invasive plant science to develop approaches for cost-effectively benefiting native over nonnative species in dynamic management and biophysical contexts, including within predominantly nonnative plant landscapes containing only small patches of native plants. Our objective was to test the effectiveness of a minimal-input strategy for enlarging native species patches within a nonnative plant matrix. In Pecos National Historical Park, New Mexico, USA, we identified 40 native perennial grass patches within a matrix of the nonnative annual forb kochia [Bassia scoparia (L.) A.J. Scott]. We mechanically cut B. scoparia in a 2-m-wide ring surrounding the perimeters of half the native grass patches (with the other half as uncut controls) and measured change in native grass patch size (relative to pretreatment) for 3 yr. Native grass patches around which B. scoparia was cut grew quickly the first posttreatment year and by the third year had increased in size four times more than control patches. Treated native grass patches expanded by an average of 25 m2, from 4 m2 in October 2015 before treatment to 29 m2 in October 2018. The experiment occurred during a dry period, conditions that should favor B. scoparia and contraction of the native grasses, suggesting that the observed increase in native grasses occurred despite suboptimal climatic conditions. Strategically treating around native patches to enlarge them over time showed promise as a minimal-input technique for increasing the proportion of the landscape dominated by native plants.
Glyphosate-resistant (GR) kochia has been reported across the western and midwestern United States. From 2011 to 2014, kochia seed was collected from agronomic regions across Colorado to evaluate the frequency and distribution of glyphosate-, dicamba-, and fluroxypyr-resistant kochia, and to assess the frequency of multiple resistance. Here we report resistance frequency as percent resistance within a population, and resistance distribution as the percentage and locations of accessions classified as resistant to a discriminating herbicide dose. In 2011, kochia accessions were screened with glyphosate only, whereas from 2012 to 2014 kochia accessions were screened with glyphosate, dicamba, and fluroxypyr. From 2011 to 2014, the percentages of GR kochia accessions were 60%, 45%, 39%, and 52%, respectively. The percentages of dicamba-resistant kochia accessions from 2012 to 2014 were 33%, 45%, and 28%, respectively. No fluroxypyr-resistant accessions were identified. Multiple-resistant accessions (low resistance or resistant to both glyphosate and dicamba) from 2012 to 2014 were identified in 14%, 15%, and 20% of total sampled accessions, respectively. This confirmation of multiple glyphosate and dicamba resistance in kochia accessions emphasizes that diversity in herbicide site of action is critical to extending the usefulness of remaining effective herbicides, such as fluroxypyr, for management of this weed.
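To make the frequency/distribution distinction concrete, a hypothetical tabulation might look like the following Python sketch, where per-plant survival of a discriminating dose gives the within-accession resistance frequency, and a classification threshold (the 20% used here is an invented example, not the study's criterion) gives the across-accession distribution:

```python
import pandas as pd

# Hypothetical screen: one row per plant surviving (1) or killed (0) by
# the discriminating dose of a given herbicide.
plants = pd.DataFrame({
    "accession": ["A", "A", "A", "A", "B", "B", "B", "B"],
    "herbicide": ["glyphosate"] * 8,
    "survived":  [1, 1, 0, 1, 0, 0, 0, 1],
})

# Resistance frequency: percent resistant plants within each accession
freq = plants.groupby(["accession", "herbicide"])["survived"].mean() * 100

# Resistance distribution: share of accessions classified resistant,
# using an invented >=20% within-accession survival threshold
distribution = (freq >= 20).groupby("herbicide").mean() * 100
print(freq, distribution, sep="\n")
```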