Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean (Glycine max L. Merr.) sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.) and forage radish (Raphanus sativus L.) interseeded into corn grain was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance than the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
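The enterprise-budget logic behind the profitability comparison can be sketched as follows. All prices, yields and input costs below are purely hypothetical illustrations, not values from the study:

```python
def net_return_per_ha(yield_mg_ha, price_per_mg, input_costs_per_ha):
    """Simple enterprise-budget net return: grain revenue minus input costs."""
    return yield_mg_ha * price_per_mg - input_costs_per_ha

# Hypothetical numbers (NOT from the study): reduced tillage lowers input
# costs, but a lower yield can still leave the sequence less profitable.
tilled = net_return_per_ha(yield_mg_ha=3.0, price_per_mg=800, input_costs_per_ha=900)
reduced = net_return_per_ha(yield_mg_ha=2.72, price_per_mg=800, input_costs_per_ha=760)
```

With these illustrative inputs the tilled sequence remains more profitable despite its higher costs, mirroring the direction of the study's finding.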
In preparation for a multisite antibiotic stewardship intervention, we assessed knowledge and attitudes toward management of asymptomatic bacteriuria (ASB) plus teamwork and safety climate among providers, nurses, and clinical nurse assistants (CNAs).
Prospective surveys during January–June 2018.
All acute and long-term care units of 4 Veterans’ Affairs facilities.
The survey instrument included 2 previously tested subcomponents: the Kicking CAUTI survey (ASB knowledge and attitudes) and the Safety Attitudes Questionnaire (SAQ).
A total of 534 surveys were completed, with an overall response rate of 65%. Cognitive biases affecting management of ASB were identified. For example, providers presented with a case scenario of an asymptomatic patient with a positive urine culture were more likely to give antibiotics if the organism was resistant to antibiotics. Additionally, more than 80% of both nurses and CNAs indicated that foul smell is an appropriate indication for a urine culture. We found significant interprofessional differences in teamwork and safety climate (defined as attitudes about issues relevant to patient safety), with CNAs having the highest scores and resident physicians the lowest on self-reported perceptions of teamwork and safety climate (P < .001). Among providers, higher safety-climate scores were significantly associated with appropriate risk perceptions related to ASB, whereas social norms concerning ASB management were correlated with higher teamwork-climate ratings.
Our survey revealed substantial misunderstanding regarding management of ASB among providers, nurses, and CNAs. Educating and empowering these professionals to discourage unnecessary urine culturing and inappropriate antibiotic use will be key components of antibiotic stewardship efforts.
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also looting, graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology—uses of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
Alzheimer's disease and vascular dementia are associated with overlapping symptoms of anxiety and depression. More accurate discrimination between emerging neuropsychiatric and cognitive symptoms would better assist illness detection. The potential for protection against cognitive decline and dementia following early identification and intervention of neuropsychiatric symptoms warrants investigation.
The hippocampus plays an important role in psychopathology and treatment outcome. While posterior hippocampus (PH) may be crucial for the learning process that exposure-based treatments require, affect-focused treatments might preferentially engage anterior hippocampus (AH). Previous studies have distinguished the different functions of these hippocampal sub-regions in memory, learning, and emotional processes, but not in treatment outcome. Examining two independent clinical trials, we hypothesized that anterior hippocampal volume would predict outcome of affect-focused treatments [Interpersonal Psychotherapy (IPT); Panic-Focused Psychodynamic Psychotherapy (PFPP)], whereas posterior hippocampal volume would predict outcome of exposure-based treatments [Prolonged Exposure (PE); Cognitive Behavioral Therapy (CBT); Applied Relaxation Training (ART)].
Thirty-five patients with posttraumatic stress disorder (PTSD) and 24 with panic disorder (PD) underwent structural magnetic resonance imaging (MRI) before randomization to affect-focused (IPT for PTSD; PFPP for PD) or exposure-based treatments (PE for PTSD; CBT or ART for PD). AH and PH volume were regressed with clinical outcome changes.
Baseline whole hippocampal volume did not predict post-treatment clinical severity scores in any treatment. For affect-focused treatments, but not exposure-based treatments, anterior hippocampal volume predicted clinical improvement. Smaller AH correlated with greater affect-focused treatment improvement. Posterior hippocampal volume did not predict treatment outcome.
This is the first study to explore associations between hippocampal volume sub-regions and treatment outcome in PTSD and PD. Convergent results suggest that affect-focused treatment may influence the clinical outcome through the ‘limbic’ AH, whereas exposure-based treatments do not. These preliminary, theory-congruent, therapeutic findings require replication in a larger clinical trial.
Bipolar disorder is less prevalent in older people but accounts for 8–10% of psychiatric admissions. Treating and managing bipolar disorder in older people is challenging because of medical comorbidity. We review the cognitive problems observed in older people, explore why these are important and consider current treatment options. There are very few studies examining the cognitive profiles of older people with bipolar disorder and symptomatic depression and mania, and these show significant impairments in executive function. Most studies have focused on cognitive impairment in euthymic older people: as in euthymic adults of working age, significant impairments are observed in tests of attention, memory and executive function/processing speeds. Screening tests are not always helpful in euthymic older people as the impairment can be relatively subtle, and more in-depth neuropsychological testing may be needed to show impairments. Cognitive impairment may be more pronounced in older people with ‘late-onset’ bipolar disorder than in those with ‘early-onset’ disorder. Strategies to address symptomatic cognitive impairment in older people include assertive treatment of the mood disorder, minimising drugs that can adversely affect cognition, optimising physical healthcare and reducing relapse rates.
After reading this article you will be able to:
•understand that cognitive impairment in euthymic older people with bipolar disorder is similar to that in working-age adults with the disorder, affecting attention, memory and executive function/processing speeds
•recognise that cognitive impairment in older people is likely to be a major determinant of functional outcomes
•implement approaches to treat cognitive impairment in bipolar disorder
DECLARATION OF INTEREST
B.J.S. consults for Cambridge Cognition, PEAK (www.peak.net) and Mundipharma.
The chemical composition of soil from the Glasgow (UK) urban area was used to identify the controls on the availability of potentially harmful elements (PHEs) in soil to humans. Total and bioaccessible concentrations of arsenic (As), chromium (Cr) and lead (Pb) in 27 soil samples, collected from different land uses, were coupled to information on their solid-phase partitioning derived from sequential extraction data. The total element concentrations in the soils were in the range <0.1–135 mg kg−1 for As, 65–3680 mg kg−1 for Cr and 126–2160 mg kg−1 for Pb, with bioaccessible concentrations averaging 27, 5 and 27% of the total values, respectively. Land use does not appear to be a predictor of contamination; however, the history of the contamination is critically important. The Chemometric Identification of Substrates and Element Distribution (CISED) sequential chemical extraction and associated self-modelling mixture resolution analysis identified three sample groupings and 16 geochemically distinct phases (substrates). These were related to iron (n=3), aluminium–silicon (Al–Si; n=2), calcium (n=3), phosphorus (n=1), magnesium (Mg; n=3), manganese (n=1) and easily extractable (n=3), the last being predominantly made up of sodium and sulphur. As, Cr and Pb were found in 9, 10 and 12 of the identified phases, respectively, with bioaccessible As predominantly associated with easily extractable phases, bioaccessible Cr with the Mg-dominated phases and bioaccessible Pb with both the Mg-dominated and Al–Si phases. Using a combination of the Unified BARGE Method to measure the bioaccessibility of PHEs and CISED to identify the geochemical sources has allowed a much better understanding of the complexity of PHE mobility in the Glasgow urban environment. This approach can be applied to other urban environments and cases of soil contamination, and made part of land-use planning.
While our fascination with understanding the past is sufficient to warrant an increased focus on synthesis, solutions to important problems facing modern society require understandings based on data that only archaeology can provide. Yet, even as we use public monies to collect ever-greater amounts of data, modes of research that can stimulate emergent understandings of human behavior have lagged behind. Consequently, a substantial amount of archaeological inference remains at the level of the individual project. We can more effectively leverage these data and advance our understandings of the past in ways that contribute to solutions to contemporary problems if we adapt the model pioneered by the National Center for Ecological Analysis and Synthesis to foster synthetic collaborative research in archaeology. We propose the creation of the Coalition for Archaeological Synthesis coordinated through a U.S.-based National Center for Archaeological Synthesis. The coalition will be composed of established public and private organizations that provide essential scholarly, cultural heritage, computational, educational, and public engagement infrastructure. The center would seek and administer funding to support collaborative analysis and synthesis projects executed through coalition partners. This innovative structure will enable the discipline to address key challenges facing society through evidentially based, collaborative synthetic research.
Fontan survivors have depressed cardiac index that worsens over time. Serum biomarker measurement is minimally invasive, rapid, widely available, and may be useful for serial monitoring. The purpose of this study was to identify biomarkers that correlate with lower cardiac index in Fontan patients.
Methods and results
This study was a multi-centre case series assessing the correlations between biomarkers and cardiac magnetic resonance-derived cardiac index in Fontan patients ⩾6 years of age with biochemical and haematopoietic biomarkers obtained ±12 months from cardiac magnetic resonance. Medical history and biomarker values were obtained by chart review. Spearman's rank correlation assessed associations between biomarker z-scores and cardiac index. Receiver operating characteristic curves were constructed and the area under the curve estimated for biomarkers with significant correlations. In total, 97 cardiac magnetic resonances in 87 patients met inclusion criteria: median age at cardiac magnetic resonance was 15 (6–33) years. Significant correlations were found between cardiac index and total alkaline phosphatase (−0.26, p=0.04), estimated creatinine clearance (0.26, p=0.02), and mean corpuscular volume (−0.32, p<0.01). Area under the curve for the three individual biomarkers was 0.63–0.69. Area under the curve for the three-biomarker panel was 0.75. Comparison of cardiac index above and below the receiver operating characteristic curve-identified cut-off points revealed significant differences for each biomarker (p<0.01) and for the composite panel [median cardiac index for higher-risk group=2.17 L/minute/m2 versus lower-risk group=2.96 L/minute/m2 (p<0.01)].
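The two statistics driving this analysis, Spearman's rank correlation and the area under the receiver operating characteristic curve, can be sketched in plain Python. This is an illustrative implementation with toy data, not the study's analysis code:

```python
def average_ranks(values):
    """Assign 1-based ranks; tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean rank of the tied run, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation computed on the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def auc(scores, labels):
    """AUC as the probability a random positive outscores a random negative
    (ties count half) -- equivalent to the area under the ROC curve."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

In practice, `scipy.stats.spearmanr` and `sklearn.metrics.roc_auc_score` compute the same quantities (with p-values and vectorized handling).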
Higher total alkaline phosphatase and mean corpuscular volume as well as lower estimated creatinine clearance identify Fontan patients with lower cardiac index. Using biomarkers to monitor haemodynamics and organ-specific effects warrants prospective investigation.
The relative contribution of demographic, lifestyle and medication factors to the association between affective disorders and cardiometabolic diseases is poorly understood.
To assess the relationship between cardiometabolic disease and features of depression and bipolar disorder within a large population sample.
Cross-sectional study of 145 991 UK Biobank participants: multivariate analyses of associations between features of depression or bipolar disorder and five cardiometabolic outcomes, adjusting for confounding factors.
There were significant associations between mood disorder features and ‘any cardiovascular disease’ (depression odds ratio (OR) = 1.15, 95% CI 1.12–1.19; bipolar OR = 1.28, 95% CI 1.14–1.43) and with hypertension (depression OR = 1.15, 95% CI 1.13–1.18; bipolar OR = 1.26, 95% CI 1.12–1.42). Individuals with features of mood disorder taking psychotropic medication were significantly more likely than controls not on psychotropics to report myocardial infarction (depression OR = 1.47, 95% CI 1.24–1.73; bipolar OR = 2.23, 95% CI 1.53–3.57) and stroke (depression OR = 2.46, 95% CI 2.10–2.80; bipolar OR = 2.31, 95% CI 1.39–3.85).
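An odds ratio and its Wald-type 95% confidence interval, of the kind reported above, can be computed from a 2×2 exposure-by-outcome table. The function below is a generic sketch; the counts used to exercise it are made up and do not come from the Biobank analysis:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a log-scale Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lo, hi
```

The published estimates were adjusted for confounders via multivariate regression, which this unadjusted calculation does not reproduce; it only shows where the ratio-and-interval form comes from.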
Associations between features of depression or bipolar disorder and cardiovascular disease outcomes were statistically independent of demographic, lifestyle and medication confounders. Psychotropic medication may also be a risk factor for cardiometabolic disease in individuals without a clear history of mood disorder.
Analyzing historical trajectories of social interactions at varying scales can lead to complementary interpretations of relationships among archaeological settlements. We use social network analysis combined with geographic information systems at three spatial scales over time in the western U.S. Southwest to show how the same social processes affected network dynamics at each scale. The period we address, A.D. 1200–1450, was characterized by migration and demographic upheaval. The tumultuous late thirteenth-century interval was followed by population coalescence and the development of widespread religious movements in the fourteenth and fifteenth centuries. In the southern Southwest these processes resulted in a highly connected network that drew in members of different settlements within and between different valleys that had previously been distinct. In the northern Southwest networks were initially highly connected followed by a more fragmented social landscape. We examine how different network textures emerged at each scale through 50-year snapshots. The results demonstrate the usefulness of applying a multiscalar approach to complex historical trajectories and the potential for social network analysis as applied to archaeological data.
Several studies demonstrating that central line–associated bloodstream infections (CLABSIs) are preventable prompted a national initiative to reduce the incidence of these infections.
We conducted a collaborative cohort study to evaluate the impact of the national “On the CUSP: Stop BSI” program on CLABSI rates among participating adult intensive care units (ICUs). The program goal was to achieve a unit-level mean CLABSI rate of less than 1 case per 1,000 catheter-days using standardized definitions from the National Healthcare Safety Network. Multilevel Poisson regression modeling compared infection rates before, during, and up to 18 months after the intervention was implemented.
A total of 1,071 ICUs from 44 states, the District of Columbia, and Puerto Rico, reporting 27,153 ICU-months and 4,454,324 catheter-days of data, were included in the analysis. The overall mean CLABSI rate significantly decreased from 1.96 cases per 1,000 catheter-days at baseline to 1.15 at 16–18 months after implementation. CLABSI rates decreased during all observation periods compared with baseline, with adjusted incidence rate ratios steadily decreasing to 0.57 (95% confidence interval, 0.50–0.65) at 16–18 months after implementation.
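The unadjusted counterpart of the incidence rate ratios reported above follows directly from case counts and catheter-days. The sketch below shows that calculation with a log-scale Wald interval; the published ratios came from multilevel Poisson regression (adjusting for clustering and covariates), which this does not reproduce:

```python
import math

def rate_ratio_ci(cases_after, days_after, cases_before, days_before, z=1.96):
    """Unadjusted incidence rate ratio with a log-scale Wald 95% CI.
    Rates are cases per unit of exposure time (e.g. per catheter-day)."""
    irr = (cases_after / days_after) / (cases_before / days_before)
    se = math.sqrt(1 / cases_after + 1 / cases_before)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi
```

For the full adjusted model, `statsmodels` offers Poisson GLMs with an exposure offset, which is the standard way such rate ratios are estimated.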
Coincident with the implementation of the national “On the CUSP: Stop BSI” program was a significant and sustained decrease in CLABSIs among a large and diverse cohort of ICUs, demonstrating an overall 43% decrease and suggesting the majority of ICUs in the United States can achieve additional reductions in CLABSI rates.
To describe the frequency of use of all types of urinary catheters, including but not limited to indwelling catheters, as well as positive cultures associated with the various types. We also determined the accuracy of catheter-days reporting at our institution.
Prospective, observational trial based on patient-level review of the electronic medical record. Chart review was compared with standard methods of catheter surveillance and reporting by infection control personnel.
Ten internal medicine and 5 long-term care wards in 2 tertiary care Veterans Affairs hospitals in Texas from July 2010 through June 2011.
The study included 7,866 inpatients.
Measurements included patient bed-days; days of use of indwelling, external, suprapubic, and intermittent urinary catheters; number of urine cultures obtained and culture results; and infection control reports of indwelling catheter-days.
We observed 7,866 inpatients with 128,267 bed-days on acute medicine and extended care wards during the study. A urinary catheter was used on 36.9% of the total bed-days observed. Acute medicine wards collected more urine cultures per 1,000 bed-days than did the extended care wards (75.9 and 10.4 cultures per 1,000 bed-days, respectively; P < .0001). Catheter-days were divided among indwelling-catheter-days (47.8%), external-catheter-days (48.4%), and other (intermittent- and suprapubic-catheter-days, 3.8%). External catheters contributed to 376 (37.3%) of the 1,009 catheter-associated positive urine cultures. Urinary-catheter-days reported to the infection control department missed 20.1% of the actual days of indwelling catheter use, whereas 12.0% of their reported catheter-days were false.
Urinary catheter use was extremely common. External catheters accounted for a large portion of catheter-associated bacteriuria, and standard practices for tracking urinary-catheter-days were unreliable.
To examine the use of vitamin D supplements during infancy among the participants in an international infant feeding trial.
Information about vitamin D supplementation was collected through a validated FFQ at the age of 2 weeks and monthly between the ages of 1 month and 6 months.
Infants (n 2159) with a biological family member affected by type 1 diabetes and with increased human leucocyte antigen-conferred susceptibility to type 1 diabetes from twelve European countries, the USA, Canada and Australia.
Daily use of vitamin D supplements was common during the first 6 months of life in Northern and Central Europe (>80 % of the infants), with somewhat lower rates observed in Southern Europe (>60 %). In Canada, vitamin D supplementation was more common among exclusively breast-fed than other infants (e.g. 71 % v. 44 % at 6 months of age). Less than 2 % of infants in the USA and Australia received any vitamin D supplementation. Across the study, higher gestational age, older maternal age and higher maternal education were associated with greater use of vitamin D supplements.
Most of the infants received vitamin D supplements during the first 6 months of life in the European countries, whereas in Canada only half and in the USA and Australia very few were given supplementation.