Legislative solutions to pressing problems like balancing the budget, climate change, and poverty usually require compromise. Yet national, state, and local legislators often reject compromise proposals that would move policy in their preferred direction. Why do legislators reject such agreements? This engaging and relevant investigation into how politicians think reveals that legislators refuse compromise, and thereby exacerbate gridlock, because they fear punishment from voters in primary elections. Prioritizing these electoral interests can lead lawmakers to act in ways that hurt their policy interests and also overlook the broader electorate's preferences by representing only a subset of voters with rigid positions. With their solution-oriented approach, Anderson, Butler, and Harbridge-Yong demonstrate that improving the likelihood of legislative compromise may require moving negotiations outside of the public spotlight. Highlighting key electoral motives underlying polarization, this book is an excellent resource for scholars and students studying Congress, American politics, public policy, and political behavior.
The science of studying diamond inclusions for understanding Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history. It examines how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of the diamonds in Earth’s mantle, and the evolution of diamonds through time.
Early-life environmental and nutritional exposures are thought to contribute to differences in cardiovascular disease (CVD) burden. Among sub-Saharan African populations, the association between markers of early-life exposures, such as leg length and sitting height, and CVD risk has yet to be investigated. This study assessed the association between leg length, sitting height, and estimated 10-year atherosclerotic cardiovascular disease (ASCVD) risk among Ghanaian-born populations in Europe and Ghana. We constructed sex-specific quintiles of sitting height and leg length for 3250 participants aged 40–70 years (mean age 52 years; men 39.6%; women 60.4%) in the cross-sectional multicenter Research on Diabetes and Obesity among African Migrants study. Ten-year risk of ASCVD was estimated using the Pooled Cohort Equations; risk ≥7.5% was defined as “elevated” CVD risk. Prevalence ratios (PRs) were estimated to determine the associations between sitting height, leg length, and estimated 10-year ASCVD risk. For both men and women, mean sitting height and leg length were highest in Europe and lowest in rural Ghana. Sitting height was inversely associated with 10-year ASCVD risk among all women (PR per 1 standard deviation increase in sitting height: 0.75; 95% confidence interval: 0.67, 0.85). Among men, the inverse association between sitting height and 10-year ASCVD risk was significant after adjustment for study site and adult and parental education but was attenuated when further adjusted for height. No association was found between leg length and estimated 10-year ASCVD risk. Early-life and childhood exposures that influence sitting height could be important determinants of ASCVD risk in this adult population.
Needlestick and sharps injury (NSSI) is a common occupational hazard of orthopedic surgery training. The purpose of this study was to examine the incidence and surrounding circumstances of intraoperative NSSI in orthopedic surgery residents and fellows and to examine postexposure reporting.
A 35-question cross-sectional survey.
The study was conducted by orthopedic surgery residents and faculty at a nonprofit regional hospital.
The questionnaire was distributed to US allopathic orthopedic surgery residency and fellowship programs; 300 orthopedic surgery trainees participated in the survey.
Of 223 trainees who had completed at least 1 year of residency, 172 (77.1%) sustained an NSSI during residency, and 57 of 63 trainees (90.5%) who had completed at least 4 years sustained an NSSI during residency. The most common causes of NSSI were solid needles, followed by solid pins or wires. The surgical activity most associated with NSSI was wound closure, followed by fracture fixation. The type of surgery most frequently associated with NSSI was orthopedic trauma, followed by hip and knee arthroplasty. Of 177 trainees who had sustained a prior NSSI, 99 (55.9%) failed to report all events to their institution’s occupational health department.
The incidence of NSSI during residency training is high: >90% of trainees in their fifth year of training or later sustained an injury, with a mean of >4 separate events. Most trainees with an NSSI did not report all of their events, suggesting that incident reporting processes need universal improvement.
Sucralose is an artificial non-nutritive sweetener used in foods to reduce sugar and energy intake. Although sucralose has been assumed to be metabolically inert, evidence suggests the opposite. The gut microbiome has emerged as a factor shaping metabolic responses to sweetener consumption. We examined the short-term effect of sucralose consumption on glucose homeostasis and the gut microbiome of healthy male volunteers. We performed a randomised, double-blind study in thirty-four subjects divided into two groups, one administered sucralose capsules (780 mg/d for 7 d; n 17) and a control group receiving placebo (n 17). Before and after the intervention, glycaemic and insulinaemic responses were assessed with a standard oral glucose load (75 g). Insulin resistance was determined using the homeostasis model assessment of insulin resistance and Matsuda indexes. The gut microbiome was evaluated before and after the intervention by 16S rRNA sequencing. During the study, body weight remained constant in both groups. Glycaemic control and insulin resistance were not affected during the 7-d period. At the phylum level, the gut microbiome was not modified in either group. We classified subjects according to their change in insulinaemia after the intervention to compare the microbiomes of responders and non-responders. Independent of consuming sucralose or placebo, individuals with a higher insulinaemic response after the intervention had lower Bacteroidetes and higher Firmicutes abundances. In conclusion, consumption of high doses of sucralose for 7 d does not alter glycaemic control, insulin resistance, or the gut microbiome in healthy individuals. However, our findings highlight the need to address individual responses to sucralose.
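The two insulin-resistance indexes named above have standard published definitions, sketched below for reference. The formulas follow their common literature forms; the input values are hypothetical illustrations, not data from this study.

```python
# Illustrative calculation of the homeostasis model assessment of insulin
# resistance (HOMA-IR) and the Matsuda index. Input values are hypothetical.
import math

def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR = (fasting glucose [mmol/l] * fasting insulin [uU/ml]) / 22.5."""
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

def matsuda_index(g0: float, i0: float, g_mean: float, i_mean: float) -> float:
    """Matsuda index = 10000 / sqrt(G0 * I0 * Gmean * Imean), with glucose in
    mg/dl and insulin in uU/ml (fasting values and OGTT means)."""
    return 10000.0 / math.sqrt(g0 * i0 * g_mean * i_mean)

print(round(homa_ir(5.0, 9.0), 2))             # 2.0 for these hypothetical values
print(round(matsuda_index(90, 9, 120, 50), 2))
```

Higher HOMA-IR and lower Matsuda values both indicate greater insulin resistance, which is why studies such as this one report the two together.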
Identifying risk factors for individuals in a clinical high-risk (CHR) state for psychosis is vital to prevention and early intervention efforts. Among prodromal abnormalities, cognitive functioning has shown intermediate levels of impairment in CHR individuals relative to those with first-episode psychosis and healthy controls, highlighting a potential role as a risk factor for transition to psychosis and other negative clinical outcomes. The current study used the AX-CPT, a brief 15-min computerized task, to determine whether cognitive control impairments in CHR individuals at baseline could predict clinical status at 12-month follow-up.
Baseline AX-CPT data were obtained from 117 CHR individuals participating in two studies, the Early Detection, Intervention, and Prevention of Psychosis Program (EDIPPP) and the Understanding Early Psychosis Programs (EP) and used to predict clinical status at 12-month follow-up. At 12 months, 19 individuals converted to a first episode of psychosis (CHR-C), 52 remitted (CHR-R), and 46 had persistent sub-threshold symptoms (CHR-P). Binary logistic regression and multinomial logistic regression were used to test prediction models.
Baseline AX-CPT performance (d-prime context) was less impaired in the CHR-R group than in the CHR-P and CHR-C groups. AX-CPT predictive validity was robust (0.723) for discriminating converters v. non-converters, and even greater (0.771) when predicting the three CHR subgroups.
These longitudinal outcome data indicate that cognitive control deficits, as measured by AX-CPT d-prime context, are a strong predictor of clinical outcome in CHR individuals. The AX-CPT is a brief, easily implemented, and cost-effective measure that may be valuable for large-scale prediction efforts.
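The d-prime context statistic above comes from signal detection theory: in the AX-CPT it is commonly computed as the z-transformed hit rate on AX target trials minus the z-transformed false-alarm rate on BX trials. The sketch below assumes that common definition; the trial counts are hypothetical.

```python
# Sketch of the signal-detection computation behind "d-prime context".
# Assumes the common AX-CPT convention: hits on AX trials, false alarms on
# BX trials. Trial counts below are hypothetical.
from statistics import NormalDist

def d_prime(hits: int, n_target: int, false_alarms: int, n_nontarget: int) -> float:
    # Log-linear correction keeps rates away from 0 and 1, avoiding infinite z.
    hit_rate = (hits + 0.5) / (n_target + 1)
    fa_rate = (false_alarms + 0.5) / (n_nontarget + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical participant: 28/30 AX hits, 3/30 BX false alarms.
print(round(d_prime(28, 30, 3, 30), 2))
```

Larger values indicate better use of the contextual cue; impaired cognitive control shows up as a lower d-prime context score.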
Childhood disruptive behaviors are highly prevalent and associated with adverse long-term social and economic outcomes. Trajectories of welfare receipt in early adulthood and the association of childhood behaviors with high welfare receipt trajectories have not been examined.
Boys (n = 1000) from low socioeconomic backgrounds were assessed by kindergarten teachers for inattention, hyperactivity, aggression, opposition, and prosociality, and were prospectively followed up for 30 years. We used group-based trajectory modeling to estimate trajectories of welfare receipt from age 19 to 36 years using government tax return records, then examined the association between teacher-rated behaviors and trajectory group membership using mixed-effects multinomial regression models.
Three trajectories of welfare receipt were identified: low (70.8%), declining (19.9%), and chronic (9.3%). The mean annual personal employment earnings (US$) for the three groups at age 35/36 years were $36 500 (s.d. = $24 000), $15 600 (s.d. = $16 275), and $1700 (s.d. = $4800), respectively. Relative to the low welfare receipt group, a unit increase in inattention (mean = 2.64; s.d. = 2.32, range = 0–8) at age 6 was associated with an increased risk of being in the chronic group (relative risk ratio; RRR = 1.16, 95% CI 1.03–1.31) and in the declining group (RRR = 1.13, 95% CI 1.03–1.23), after adjustment for child IQ and family adversity, and independent of other behaviors. Family adversity was more strongly associated with trajectories of welfare receipt than any behavior.
Boys from disadvantaged backgrounds exhibiting high inattention in kindergarten are at elevated risk of chronic welfare receipt during adulthood. Screening and support for inattentive behaviors beginning in kindergarten could have long-term social and economic benefits for individuals and society.
Medical residents are an important group for antimicrobial stewardship programs (ASPs) to target with interventions aimed at improving antibiotic prescribing. In this study, we compared antimicrobial prescribing practices of 2 academic medical teams receiving different ASP training approaches along with a hospitalist control group.
Retrospective cohort study comparing guideline-concordant antibiotic prescribing for 3 common infections among a family medicine (FM) resident service, an internal medicine (IM) resident service, and hospitalists.
Community teaching hospital.
Adult patients admitted between July 1, 2016, and June 30, 2017, with a discharge diagnosis of pneumonia, cellulitis, and urinary tract infections were reviewed.
All 3 medical teams received identical baseline ASP education and daily antibiotic prescribing audit with feedback via clinical pharmacists. The FM resident service received an additional layer of targeted ASP intervention that included biweekly stewardship-focused rounds with an ASP physician and clinical pharmacist leadership. Guideline-concordant prescribing was assessed based on the institution’s ASP guidelines.
Of 1,572 patients, 295 (18.8%) were eligible for inclusion (FM, 96; IM, 69; hospitalist, 130). The percentage of patients receiving guideline-concordant antibiotic selection empirically was similar between groups for all diagnoses (FM, 87.5%; IM, 87%; hospitalist, 83.8%; P = .702). No differences were observed in appropriate definitive antibiotic selection among groups (FM, 92.4%; IM, 89.1%; hospitalist, 89.9%; P = .746). The FM resident service was more likely to prescribe a guideline-concordant duration of therapy across all diagnoses (FM, 74%; IM, 56.5%; hospitalist, 44.6%; P < .001).
Adding dedicated stewardship-focused rounds to the graduate medical curriculum was associated with increased guideline adherence, specifically regarding duration-of-therapy recommendations.
We analyzed antibiotic use data from 29 southeastern US hospitals over a 5-year period to determine changes in antibiotic use after the fluoroquinolone US Food and Drug Administration (FDA) advisory update in 2016. Fluoroquinolone use declined both before and after the FDA announcement, and the use of select, alternative antibiotics increased after the announcement.
Fluoroquinolones are among the 4 most commonly prescribed antibiotic classes.1,2 Postmarketing reports of serious adverse events linked to fluoroquinolones include tendonitis, neuropathy, hypoglycemia, psychiatric side effects, and possible aortic vessel rupture, leading to safety label changes in July 2008 and August 2013.3 In July 2016, the US Food and Drug Administration (FDA) strengthened the “black box” warning following an initial safety announcement in May 2016, recommending avoidance of fluoroquinolones for uncomplicated infections such as acute exacerbation of chronic bronchitis, uncomplicated urinary tract infections, and acute bacterial sinusitis.4 Concerns over safety and the association with Clostridioides difficile infection have led inpatient antimicrobial stewardship programs (ASPs) to develop initiatives to promote avoidance of fluoroquinolones. The objective of this study was to quantify the effect of the 2016 FDA “black box” update on inpatient antibiotic use among a cohort of southeastern US hospitals.
“Uncertain futures” refers to a set of policy problems that possess some combination of the following characteristics: (i) they potentially cause irreversible changes; (ii) they are widespread, so that policy responses may make sense only on a global scale; (iii) network effects are difficult to understand and may amplify (or moderate) consequences; (iv) time horizons are long; and (v) the likelihood of catastrophic outcomes is unknown or even unknowable. These characteristics tend to make uncertain futures intractable to market solutions because property rights are not clearly defined and essential information is unavailable. These same factors also pose challenges for benefit-cost analysis (BCA) and other traditional decision analysis tools. The diverse policy decisions confronting decision-makers today demand “dynamic BCA,” analytic frameworks that incorporate uncertainties and trade-offs across policy areas, recognizing that: perceptions of risks can be uninformed, misinformed, or inaccurate; risk characterization can suffer from ambiguity; and experts’ tendency to focus on one risk at a time may blind policymakers to important trade-offs. Dynamic BCA – which recognizes trade-offs, anticipates the need to learn from experience, and encourages learning – is essential for lowering the likelihoods and mitigating the consequences of uncertain futures while encouraging economic growth, reducing fragility, and increasing resilience.
Alveolar echinococcosis is a neglected parasitic zoonosis caused by the metacestode Echinococcus multilocularis, which grows as a malignant tumour-like infection in the liver of humans. Albendazole (ABZ) is the antiparasitic drug of choice for the treatment of the disease. However, its effectiveness is low, due to its poor absorption from the gastro-intestinal tract. It is also parasitostatic and in some cases produces side-effects. Therefore, an alternative treatment for this severe human disease is necessary. In this context, the repositioning of drugs combined with nanotechnology to improve the bioavailability of drugs emerges as a useful, fast and inexpensive tool for the treatment of neglected diseases. The in vitro and in vivo efficacy of dichlorophen (DCP), an antiparasitic agent for intestinal parasites, and silica nanoparticles modified with DCP (NP-DCP) was evaluated against the E. multilocularis larval stage. Both formulations showed a time- and dose-dependent in vitro effect against protoscoleces. The NP-DCP had a greater in vitro efficacy than the drug alone or ABZ. In vivo studies demonstrated that the NP-DCP (4 mg kg−1) had similar efficacy to ABZ (25 mg kg−1) and greater activity than the free DCP. Therefore, the repurposing of DCP combined with silica nanoparticles could be an alternative for the treatment of echinococcosis.
The value of floriculture in the United States exceeds $5.8 billion. Environmental challenges, market trends, and diseases complicate breeding priorities. To inform breeders’ and geneticists’ research efforts, we set out to gather consumers’ preferences in the form of willingness to pay (WTP) for different rose attributes in a discrete choice experiment. The responses are modeled in WTP space, using polynomials to account for heterogeneity. Consumer preferences indicate that heat and disease tolerance were the most important attributes for subjects in the sample, followed by drought resistance. To the best of our knowledge, this is the first study to identify breeding priorities in rosaceous plants from a consumer perspective.
Review findings on the role of dietary patterns in preventing depression are inconsistent, possibly due to variation in assessment of dietary exposure and depression. We studied the association between dietary patterns and depressive symptoms in six population-based cohorts and meta-analysed the findings using a standardised approach that defined dietary exposure, depression assessment and covariates.
Included were cross-sectional data from 23 026 participants in six cohorts: InCHIANTI (Italy), LASA, NESDA, HELIUS (the Netherlands), ALSWH (Australia) and Whitehall II (UK). Analysis of incidence was based on three cohorts with repeated measures of depressive symptoms at 5–6 years of follow-up in 10 721 participants: Whitehall II, InCHIANTI, and ALSWH. Three a priori dietary patterns, the Mediterranean diet score (MDS), the Alternative Healthy Eating Index (AHEI-2010), and the Dietary Approaches to Stop Hypertension (DASH) diet, were investigated in relation to depressive symptoms. Cohort-level analyses adjusted for a fixed set of confounders, and the meta-analysis used a random-effects model.
Cross-sectional and prospective analyses showed statistically significant inverse associations of the three dietary patterns with depressive symptoms (continuous and dichotomous). In cross-sectional analysis, the association of diet with depressive symptoms using a cut-off yielded an adjusted OR of 0.87 (95% confidence interval 0.84–0.91) for MDS, 0.93 (0.88–0.98) for AHEI-2010, and 0.94 (0.87–1.01) for DASH. Similar associations were observed prospectively: 0.88 (0.80–0.96) for MDS; 0.95 (0.84–1.06) for AHEI-2010; 0.90 (0.84–0.97) for DASH.
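Random-effects meta-analysis of the kind used to pool the odds ratios above is often implemented with the DerSimonian-Laird estimator; a minimal sketch follows. The per-cohort odds ratios and standard errors below are hypothetical illustrations, not the study's actual cohort-level estimates.

```python
# Minimal DerSimonian-Laird random-effects pooling of log odds ratios.
# Cohort ORs and standard errors are hypothetical.
import math

def pool_random_effects(log_ors, ses):
    w = [1 / se**2 for se in ses]                            # fixed-effect weights
    fe = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)   # fixed-effect estimate
    q = sum(wi * (y - fe)**2 for wi, y in zip(w, log_ors))   # Cochran's Q
    df = len(log_ors) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                            # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]              # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))

# Three hypothetical cohort ORs with standard errors on the log scale:
or_hat, ci = pool_random_effects(
    [math.log(0.85), math.log(0.90), math.log(0.88)], [0.04, 0.05, 0.06])
print(round(or_hat, 2))  # pooled OR of about 0.87 for these inputs
```

With little between-cohort heterogeneity the estimator collapses to inverse-variance fixed-effect pooling; when heterogeneity is present, tau2 widens the confidence interval accordingly.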
Population-scale observational evidence indicates that adults following a healthy dietary pattern have fewer depressive symptoms and lower risk of developing depressive symptoms.
In recent years, the discovery of massive quasars at high redshift has provided a striking challenge to our understanding of the origin and growth of supermassive black holes in the early Universe. Mounting observational and theoretical evidence indicates the viability of massive seeds, formed by the collapse of supermassive stars, as a progenitor model for such early, massive accreting black holes. Although considerable progress has been made in our theoretical understanding, many questions remain regarding how (and how often) such objects may form, how they live and die, and how next-generation observatories may yield new insight into the origin of these primordial titans. This review focusses on our present understanding of this remarkable formation scenario, based on the discussions held at the Monash Prato Centre from November 20 to 24, 2017, during the workshop ‘Titans of the Early Universe: The Origin of the First Supermassive Black Holes’.