Home care for older people in England is commissioned through local authorities working predominantly with independent providers of care. Commissioners operate in a market model, planning and procuring home care services for local populations. Their role involves ‘managing’ and ‘shaping’ the market to ensure an adequate supply of care providers. Another imperative, emerging from the principles of personalisation, is the drive to achieve user outcomes rather than ‘time and task’ objectives. Little formal research has investigated the way commissioners reconcile these different requirements and organise commissioning. This study investigated commissioning approaches using qualitative telephone interviews with ten commissioners from different local authorities in England. The characteristics of commissioning were analysed thematically. Findings indicated (a) commissioning involved complex systems and processes, uniquely shaped for the local context but frequently changed, suggesting a constant need for reframing commissioning arrangements; (b) partnerships with providers were mainly transactional, with occasional examples of collaborative models, which were considered to facilitate more flexible services better suited to commissioning for personalised outcomes; and (c) only a small number of commissioners had attempted to reconcile the competing and incompatible goals of tightly prescribed contracting and working collaboratively with providers. A better understanding of flexible contracting arrangements and the hallmarks of a trusting collaboration is required to move beyond the procedural elements of contracting and commissioning.
Congenital heart disease (CHD) describes the abnormalities of the heart or great vessels that are present at birth and that significantly impair the function of the cardiovascular system. It is the most common birth defect, affecting up to 2% of live-born children: according to the British Heart Foundation (BHF Statistics 2018), CHD is detected in 1 out of 180 babies (excluding bicuspid aortic valve), which translates into at least 4000 affected infants in the UK per year. CHD is diagnosed in over 8% of premature births and is a leading cause of infant mortality (up to 10% of cases). Cardiac abnormalities account for more than 9% of all stillbirths after 20 weeks and up to 4% of spontaneous miscarriages before 20 weeks of pregnancy. It is estimated that in the European Union, 3000 children with heart defects die annually as ‘terminations of pregnancy for fetal anomaly’, late fetal death or early neonatal death. Some malformations, such as aortic valve anomalies, often do not manifest at birth, and as more diagnoses are being made later in life, the number of CHD cases only increases [1–3].
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
Given the evidence of multi-parameter risk factors in shaping cognitive outcomes in aging, including sleep, inflammation, cardiometabolism, and mood disorders, multidimensional investigations of their impact on cognition are warranted. We sought to determine the extent to which self-reported sleep disturbances, metabolic syndrome (MetS) factors, cellular inflammation, depressive symptomatology, and diminished physical mobility were associated with cognitive impairment and poorer cognitive performance.
This is a cross-sectional study.
Participants with elevated, well-controlled blood pressure were recruited from the local community for a Tai Chi and healthy-aging intervention study.
One hundred forty-five older adults (72.7 ± 7.9 years old; 66% female), 54 (37%) with evidence of cognitive impairment (CI) based on Montreal Cognitive Assessment (MoCA) score ≤24, underwent medical, psychological, and mood assessments.
CI and cognitive domain performance were assessed using the MoCA. Univariate correlations were computed to determine relationships between risk factors and cognitive outcomes. Bootstrapped logistic regression was used to determine significant predictors of CI risk and linear regression to explore cognitive domains affected by risk factors.
The CI group was slower on the mobility task, satisfied more MetS criteria, and reported poorer sleep than normocognitive individuals (all p < 0.05). Multivariate logistic regression indicated that sleep disturbances, but no other risk factors, predicted increased risk of CI (OR = 2.00, 95% CI: 1.26–4.87, 99% CI: 1.08–7.48). Further examination of MoCA cognitive subdomains revealed that sleep disturbances predicted poorer executive function (β = –0.26, 95% CI: –0.51 to –0.06, 99% CI: –0.61 to –0.02), with lesser effects on visuospatial performance (β = –0.20, 95% CI: –0.35 to –0.02, 99% CI: –0.39 to 0.03) and memory (β = –0.29, 95% CI: –0.66 to –0.01, 99% CI: –0.76 to 0.08).
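As a rough illustration of the bootstrapped logistic-regression approach described above, the sketch below fits a one-predictor logistic model by gradient ascent and derives a percentile-bootstrap confidence interval for the odds ratio. The data are simulated (true OR ≈ 2.0, echoing the reported point estimate); nothing here uses the study's actual dataset, and all variable names are illustrative.

```python
import math
import random

def fit_logistic(xs, ys, lr=1.0, iters=300):
    """Fit y ~ sigmoid(b0 + b1*x) by batch gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def bootstrap_or_ci(xs, ys, n_boot=100, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for the odds ratio exp(b1)."""
    rng = random.Random(seed)
    n = len(xs)
    ors = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        _, b1 = fit_logistic([xs[i] for i in idx], [ys[i] for i in idx])
        ors.append(math.exp(b1))
    ors.sort()
    return ors[int(alpha / 2 * n_boot)], ors[int((1 - alpha / 2) * n_boot) - 1]

# Simulated data: a standardized "sleep disturbance" score with true OR = e^0.7 ≈ 2.0
# for a binary impairment outcome (n = 145 mirrors the study's sample size only).
rng = random.Random(0)
xs, ys = [], []
for _ in range(145):
    x = rng.gauss(0.0, 1.0)
    p = 1.0 / (1.0 + math.exp(-(-0.5 + 0.7 * x)))
    xs.append(x)
    ys.append(1 if rng.random() < p else 0)

_, b1 = fit_logistic(xs, ys)
point_or = math.exp(b1)
ci_lo, ci_hi = bootstrap_or_ci(xs, ys)
```

Reporting both 95% and 99% percentile intervals, as the abstract does, only requires repeating the percentile step with a second `alpha`.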
Our results indicate that the deleterious impact of self-reported sleep disturbances on cognitive performance was prominent over other risk factors and illustrate the importance of clinician evaluation of sleep in patients with or at risk of diminished cognitive performance. Future, longitudinal studies implementing a comprehensive neuropsychological battery and objective sleep measurement are warranted to further explore these associations.
Open Strategy has drawn increasing attention in recent years. A growing number of studies have captured greater transparency and heightened inclusion in the strategic practices of contemporary organizations (e.g., Whittington et al., 2011; Hautz et al., 2017). Information Technology (IT) often facilitates the involvement of a wider range of stakeholders in the generation of strategic content and knowledge (Chesbrough & Appleyard, 2007; Wulf & Butel, 2016), and in the practice of strategy (Whittington et al., 2011; Whittington, 2014). However, despite the widely recognized role of technologies such as online platforms (Malhotra et al., 2017) and social media (Huang et al., 2013; Baptista et al., 2017) in enabling openness in strategy, literature with an explicit focus on IT has been surprisingly sparse to date (Tavakoli et al., 2015; 2017). Thus far, most papers have been published in Management and Strategic Management outlets (e.g., Whittington et al., 2011; Stieger et al., 2012; Seidl & Werle, 2017), including a special issue on Open Strategy in Long Range Planning (e.g., Hautz et al., 2017).
Kochia is one of the most problematic weeds in the United States. Field studies were conducted in five states (Wyoming, Colorado, Kansas, Nebraska, and South Dakota) over 2 yr (2010 and 2011) to evaluate kochia control with selected herbicides registered in five common crop scenarios (winter wheat, fallow, corn, soybean, and sugar beet) to provide insight for diversifying kochia management in crop rotations. Kochia control varied by experimental site such that more variation in kochia control and biomass production was explained by experimental site than herbicide choice within a crop. Kochia control with herbicides currently labeled for use in sugar beet averaged 32% across locations. Kochia control was greatest and most consistent from corn herbicide programs (99%), followed by fallow (97%) and soybean (96%) herbicide programs. Kochia control from wheat herbicide programs was 93%. With respect to the availability of effective herbicide options, glyphosate-resistant kochia control was easiest in corn, soybean, and fallow, followed by wheat; and difficult to manage with herbicides in sugar beet.
Though theory suggests that individual differences in neuroticism (a tendency to experience negative emotions) would be associated with altered functioning of the amygdala (which has been linked with emotionality and emotion dysregulation in childhood, adolescence, and adulthood), results of functional neuroimaging studies have been contradictory and inconclusive. We aimed to clarify the relationship between neuroticism and three hypothesized neural markers derived from functional magnetic resonance imaging during negative emotion face processing: amygdala activation, amygdala habituation, and amygdala-prefrontal connectivity, each of which plays an important role in the experience and regulation of emotions. We used general linear models to examine the relationship between trait neuroticism and the hypothesized neural markers in a large sample of over 500 young adults. Although neuroticism was not significantly associated with magnitude of amygdala activation or amygdala habituation, it was associated with amygdala–ventromedial prefrontal cortex connectivity, which has been implicated in emotion regulation. Results suggest that trait neuroticism may represent a failure in top-down control and regulation of emotional reactions, rather than overactive emotion generation processes, per se. These findings suggest that neuroticism, which has been associated with increased rates of transdiagnostic psychopathology, may represent a failure in the inhibitory neurocircuitry associated with emotion regulation.
Background: The semantic variant of primary progressive aphasia (svPPA) is a form of dementia, mainly featuring language impairment, for which the extent of white matter (WM) damage is less well described than its associated grey matter (GM) atrophy. Our study aimed to characterise the extent of this damage using a sensitive and unbiased approach. Methods: We conducted a between-group study comparing 10 patients with a clinical diagnosis of svPPA, recruited between 2011 and 2014 at a tertiary reference centre, with 9 cognitively healthy, age-matched controls. From diffusion tensor imaging (DTI) data, we extracted fractional anisotropy (FA) values using a tract-based spatial statistics approach. We further obtained GM volumetric data using the Freesurfer automated segmentation tool. We compared both groups using non-parametric Wilcoxon rank-sum tests, correcting for multiple comparisons. Results: Demographic data showed that patients and controls were comparable. As expected, clinical data showed lower results in svPPA patients than controls on cognitive screening tests. Tractography showed impaired diffusion in svPPA patients, with FA decreased mostly in the longitudinal, uncinate, cingulum and external capsule fasciculi. Volumetric data showed significant atrophy in svPPA patients, mostly in the left entorhinal, amygdala, inferior temporal, middle temporal, superior temporal and temporal pole cortices, and bilateral fusiform gyri. Conclusions: This syndrome appears to be associated not only with GM but also with significant WM degeneration. Thus, DTI could play a role in the differential diagnosis of atypical dementia by specifying WM damage specific to svPPA.
This research explores media reporting of Indigenous students’ Programme for International Student Assessment (PISA) results in two national and 11 metropolitan Australian newspapers from 2001 to 2015. Of almost 300 articles on PISA, only 10 focused on reporting of Indigenous PISA results. While general or non-Indigenous PISA results featured in media reports, especially at the time of the publication of PISA results, there was overwhelming neglect of Indigenous results and the performance gap. A thematic analysis of articles showed that mainstream PISA reporting included critical commentary that was not found in the Indigenous PISA articles. The three themes identified were: a lack of teacher quality in remote and rural schools; the debate on Gonski funding recommendations; and the PISA achievement gap between Indigenous and non-Indigenous students. This study concluded that the overwhelming neglect is linked to media bias, which continues to drive mainstream media coverage of Indigenous Australians.
Many patients with advanced serious illness or at the end of life experience delirium, a potentially reversible form of acute brain dysfunction, which may impair ability to participate in medical decision-making and to engage with their loved ones. Screening for delirium provides an opportunity to address modifiable causes. Unfortunately, delirium remains underrecognized. The main objective of this pilot was to validate the brief Confusion Assessment Method (bCAM), a two-minute delirium-screening tool, in a veteran palliative care sample.
This was a pilot prospective, observational study that included hospitalized patients evaluated by the palliative care service at a single Veterans Affairs Medical Center. The bCAM was compared against the reference standard, the Diagnostic and Statistical Manual of Mental Disorders, fifth edition. Both assessments were blinded and conducted within 30 minutes of each other.
We enrolled 36 patients with a median age of 67 years (interquartile range, 63–73). The primary reasons for admission to the hospital were sepsis or severe infection (33%), severe cardiac disease (including heart failure, cardiogenic shock, and myocardial infarction) (17%), or gastrointestinal/liver disease (17%). The bCAM performed well against the Diagnostic and Statistical Manual of Mental Disorders, fifth edition, for detecting delirium, with a sensitivity (95% confidence interval) of 0.80 (0.4, 0.96) and specificity of 0.87 (0.67, 0.96).
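Sensitivity and specificity of this kind derive from a 2×2 screening table. A minimal sketch follows, using hypothetical cell counts chosen only to match the reported sensitivity of 0.80 (the abstract does not give the raw counts), with Wilson score intervals for the proportions:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Hypothetical 2x2 counts (illustrative, NOT the study's raw data):
tp, fn = 8, 2    # bCAM-positive / bCAM-negative among delirious patients
tn, fp = 23, 3   # bCAM-negative / bCAM-positive among non-delirious patients

sensitivity = tp / (tp + fn)   # 0.80
specificity = tn / (tn + fp)   # about 0.88 with these hypothetical counts
sens_lo, sens_hi = wilson_ci(tp, tp + fn)
spec_lo, spec_hi = wilson_ci(tn, tn + fp)
```

The Wilson interval is one common choice for small samples; the abstract does not state which interval method the authors used.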
Significance of Results
Delirium was present in 27% of patients enrolled and never recognized by the palliative care service in routine clinical care. The bCAM provided good sensitivity and specificity in a pilot of palliative care patients, providing a method for nonpsychiatrically trained personnel to detect delirium.
OBJECTIVES/SPECIFIC AIMS: Delirium, a form of acute brain dysfunction characterized by changes in attention and alertness, is a known independent predictor of mortality in the Intensive Care Unit (ICU). We sought to understand whether catatonia, a more recently recognized form of acute brain dysfunction, is associated with increased 30-day mortality in critically ill older adults. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Coma was defined as a Richmond Agitation-Sedation Scale score of −4 or −5. We used a Cox proportional hazards model predicting 30-day mortality, adjusting for delirium, coma and catatonia status. RESULTS/ANTICIPATED RESULTS: We enrolled 335 medical, surgical or trauma critically ill patients with 1103 matched delirium and catatonia assessments. Median age was 58 years (IQR: 48–67). Main indications for admission to the ICU included airway disease or protection (32%; N=100) and sepsis and/or shock (25%; N=79). In the unadjusted analysis, regardless of the presence of catatonia, non-delirious individuals had the highest median survival times, while delirious patients had the lowest median survival time. Comparing the absence and presence of catatonia, the presence of catatonia worsened survival (Figure 1). In a time-dependent Cox model, holding catatonia status constant, delirious individuals had 1.72 times the hazard of death (IQR: 1.321, 2.231) relative to non-delirious individuals, while those with coma had 5.48 times the hazard of death (IQR: 4.298, 6.984). For DSM-5 catatonia scores, a 1-unit increase in the score was associated with 1.18 times the hazard of in-hospital mortality.
Comparing two individuals with the same delirium status, each additional DSM-5 catatonia item carries 1.178 times the hazard of death (IQR: 1.086, 1.278), so an individual with 3 catatonia items present (catatonia) will have 1.63 times the hazard of death of an individual with a score of 0 (no catatonia). DISCUSSION/SIGNIFICANCE OF IMPACT: Non-delirious individuals have the highest median survival times, while those who are comatose have the lowest median survival times after a critical illness, holding catatonia status constant. Comparing the absence and presence of catatonia, the presence of catatonia appears to worsen survival. Those individuals who are both comatose and catatonic have the lowest median survival time.
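The two figures reported above are mutually consistent: a per-unit hazard ratio compounds multiplicatively across score points, so three catatonia items correspond to 1.178 cubed:

```python
per_unit_hr = 1.178            # hazard ratio per 1-unit increase in DSM-5 catatonia score
hr_3_items = per_unit_hr ** 3  # hazard ratio for a 3-item increase
print(round(hr_3_items, 2))    # → 1.63, matching the reported figure
```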
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
A Veterans Affairs hospital.
This study included 75 patients in contact precautions for MRSA colonization or infection.
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, increased shedding occurred ≤0.9 m versus >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% versus 6 of 23, 26%; P = .04).
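The bivariate wound-versus-shedding comparison (17 of 29 vs 6 of 23) can be checked on the 2×2 table; the abstract does not name the test used, so the choice of a Fisher exact test in this sketch is an assumption:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for a 2x2 table [[a, b], [c, d]]:
    sum of hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):  # probability of the table with x in the top-left cell
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Table reported above: MRSA wound present (17 shed, 12 did not)
# versus wound absent (6 shed, 17 did not)
p = fisher_exact_two_sided(17, 12, 6, 17)
```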
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium; p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age.
Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) than those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Conclusions: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shape risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
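From the cohort counts reported above (42 both, 58 delirium only, 4 catatonia only, 32 neither), a crude cross-product odds ratio for the catatonia-delirium association can be computed. This unadjusted figure is purely illustrative; the abstract's reported association was adjusted for age:

```python
# Counts reported in the cohort of 136 patients
both, delirium_only, catatonia_only, neither = 42, 58, 4, 32

# Crude (unadjusted) odds ratio: (both * neither) / (catatonia_only * delirium_only)
crude_or = (both * neither) / (catatonia_only * delirium_only)
print(round(crude_or, 1))  # → 5.8
```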
In “Toward a Theory of Race, Crime, and Urban Inequality,” Sampson and Wilson (1995) argued that racial disparities in violent crime are attributable in large part to the persistent structural disadvantages that are disproportionately concentrated in African American communities. They also argued that the ultimate causes of crime were similar for both Whites and Blacks, leading to what has been labeled the thesis of “racial invariance.” In light of the large scale social changes of the past two decades and the renewed political salience of race and crime in the United States, this paper reassesses and updates evidence evaluating the theory. In so doing, we clarify key concepts from the original thesis, delineate the proper context of validation, and address new challenges. Overall, we find that the accumulated empirical evidence provides broad but qualified support for the theoretical claims. We conclude by charting a dual path forward: an agenda for future research on the linkages between race and crime, and policy recommendations that align with the theory’s emphasis on neighborhood level structural forces but with causal space for cultural factors.
Antineuronal antibodies are associated with psychosis, although their clinical significance in first episode of psychosis (FEP) is undetermined.
To examine all patients admitted for treatment of FEP for antineuronal antibodies and describe clinical presentations and treatment outcomes in those who were antibody positive.
Individuals admitted for FEP to six mental health units in Queensland, Australia, were prospectively tested for serum antineuronal antibodies. Antibody-positive patients were referred for neurological and immunological assessment and therapy.
Of 113 consenting participants, six had antineuronal antibodies (anti-N-methyl-D-aspartate receptor antibodies [n = 4], voltage-gated potassium channel antibodies [n = 1] and antibodies against uncharacterised antigen [n = 1]). Five received immunotherapy, which prompted resolution of psychosis in four.
A small subgroup of patients admitted to hospital with FEP have antineuronal antibodies detectable in serum and are responsive to immunotherapy. Early diagnosis and treatment is critical to optimise recovery.
Direct ink writing of silicone elastomers enables printing with precise control of porosity and mechanical properties of ordered cellular solids, suitable for shock absorption and stress mitigation applications. With the ability to manipulate structure and feedstock stiffness, the design space becomes challenging to parse to obtain a solution producing a desired mechanical response. Here, we derive an analytical design approach for a specific architecture. Results from finite element simulations and quasi-static mechanical tests of two different parallel strand architectures were analyzed to understand the structure-property relationships under uniaxial compression. Combining effective stiffness-density scaling with least squares optimization of the stress responses yielded general response curves parameterized by resin modulus and strand spacing. An analytical expression of these curves serves as a reduced order model, which, when optimized, provides a rapid design capability for filament-based 3D printed structures. As a demonstration, the optimal design of a face-centered tetragonal architecture is computed that satisfies prescribed minimum and maximum load constraints.
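The stiffness-density scaling step can be illustrated with a generic Gibson-Ashby-style power-law fit in log-log space; the data and parameter values below are synthetic and are not the paper's fitted reduced-order model:

```python
import math

def fit_power_law(rel_density, rel_stiffness):
    """Least-squares fit of E*/Es = C * (rho*/rho_s)**n, done as ordinary
    linear regression in log-log space (log E = log C + n * log rho)."""
    xs = [math.log(r) for r in rel_density]
    ys = [math.log(e) for e in rel_stiffness]
    m = len(xs)
    mx = sum(xs) / m
    my = sum(ys) / m
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    exponent = sxy / sxx
    prefactor = math.exp(my - exponent * mx)
    return prefactor, exponent

# Synthetic noiseless data following E*/Es = 0.5 * rho**2 (values illustrative)
rho = [0.1, 0.2, 0.3, 0.4, 0.5]
stiff = [0.5 * r ** 2 for r in rho]
C, n_exp = fit_power_law(rho, stiff)   # recovers C = 0.5, n = 2
```

In the paper's workflow, a fit of this kind would feed the parameterized response curves that are then optimized against the load constraints; the exponent and prefactor here are placeholders, not measured values.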
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
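The Weibull emergence model described above can be sketched as follows. The parameters here are illustrative choices (not the fitted mixed-effects values), picked so that 10% emergence lands near the 168 GDD reported for Kansas:

```python
import math

def weibull_cum_emergence(gdd, lam, k):
    """Cumulative emergence (%) as a function of growing degree days,
    using a two-parameter Weibull curve."""
    return 100.0 * (1.0 - math.exp(-((gdd / lam) ** k)))

def gdd_for_emergence(pct, lam, k):
    """Invert the Weibull curve: GDD at which `pct` % emergence is reached."""
    return lam * (-math.log(1.0 - pct / 100.0)) ** (1.0 / k)

# Illustrative scale (lam) and shape (k) parameters, NOT the study's estimates
lam, k = 700.0, 1.6
g10 = gdd_for_emergence(10.0, lam, k)  # GDD at 10% emergence, roughly 170 here
```

The study's actual models also included site-year random effects; this sketch shows only the fixed-effect curve shape.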