Antibiotics are widely used by all specialties in the hospital setting. We evaluated previously defined high-risk antibiotic use in relation to Clostridioides difficile infections (CDIs).
We analyzed 2016–2017 data from 171 hospitals. High-risk antibiotics included second-, third-, and fourth-generation cephalosporins, fluoroquinolones, carbapenems, and lincosamides. A CDI case was a positive stool C. difficile toxin or molecular assay result from a patient without a positive result in the previous 8 weeks. Hospital-associated (HA) CDI cases included specimens collected >3 calendar days after admission or ≤3 calendar days after admission from a patient with a prior same-hospital discharge within 28 days. We used a multivariable Poisson regression model to estimate the relative risk (RR) of high-risk antibiotic use on HA CDI, controlling for confounders.
The median days of therapy for high-risk antibiotic use was 241.2 (interquartile range [IQR], 192.6–295.2) per 1,000 days present; the overall HA CDI rate was 33 (IQR, 24–43) per 10,000 admissions. The overall correlation of high-risk antibiotic use and HA CDI was 0.22 (P = .003), and higher correlation was observed in teaching hospitals (0.38; P = .002). For every 100-day (per 1,000 days present) increase in high-risk antibiotic therapy, there was a 12% increase in HA CDI (RR, 1.12; 95% CI, 1.04–1.21; P = .002) after adjusting for confounders.
High-risk antibiotic use is an independent predictor of HA CDI. This assessment of poststewardship implementation in the United States highlights the importance of tracking trends of antimicrobial use over time as it relates to CDI.
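The adjusted estimate above is reported per 100-day increase in high-risk antibiotic therapy; because a Poisson model is linear on the log-rate scale, the same coefficient rescales to any other increment by exponentiation. A minimal sketch of that arithmetic (the coefficient is back-calculated from the reported RR of 1.12; no study data are used):

```python
import math

# Reported adjusted RR: 1.12 per 100-day increase in high-risk antibiotic
# days of therapy (per 1,000 days present). In a Poisson model this implies
# a per-day log-rate coefficient of beta = ln(1.12) / 100.
rr_per_100 = 1.12
beta = math.log(rr_per_100) / 100

def rr(days_increase):
    """Relative risk of HA CDI implied for a given increase in high-risk DOT."""
    return math.exp(beta * days_increase)

print(round(rr(100), 2))  # 1.12, matching the reported estimate
print(round(rr(50), 3))   # 1.058 for a 50-day increase
```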
Previous research has identified different subtypes of social withdrawal based on motivations to approach or avoid social interactions. Each of these motivations is uniquely related to indices of maladjustment during emerging adulthood, including aspects of the self. However, research has yet to investigate whether relationship quality moderates these associations. The purpose of this study was to examine whether relationship quality with best friends, romantic partners, mothers, and fathers, respectively, serves as a protective factor in the negative links of shyness and avoidance with self-worth. The participants included 519 college students (Mage = 19.87, SD = 1.99, 61% female) from four universities across the United States. Results revealed that relationship quality with both best friends and romantic partners moderated the relation between shyness and self-worth. The differences between parent and peer relationships are discussed.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
Development involves synergistic interplay among genotypes and the physical and cultural environments, and integrating genetics into experimental designs that manipulate the environment can improve understanding of developmental psychopathology and intervention efficacy. Consistent with differential susceptibility theory, individuals vary in their sensitivity to environmental conditions, including intervention, partly as a function of their genotype. As a consequence, understanding genetic influences on intervention response is critical. Empirically, we tested an interaction between a genetic index representing sensitivity to the environment and the Family Check-Up intervention. Participants were drawn from the Early Steps Multisite randomized prevention trial, which included a low-income and racially/ethnically diverse sample of children and their families followed longitudinally (n = 515). As hypothesized, polygenic sensitivity to the environment moderated the effects of the intervention on 10-year-old children's symptoms of internalizing psychopathology, such that children who were genetically sensitive and randomly assigned to the intervention had fewer symptoms of child psychopathology than genetically sensitive children assigned to the control condition. A significant difference in internalizing symptoms assessed with a clinical interview emerged between the intervention and control groups for those 0.493 SD above the mean on polygenic sensitivity, or 25% of the sample. As in personalized medicine, it is time to understand individual and sociocultural differences in treatment response and to individualize psychosocial interventions to reduce the burden of child psychopathology and maximize well-being for children growing up in a wide range of physical environments and cultures.
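The moderation reported here is, statistically, an interaction term between polygenic sensitivity and treatment assignment in a regression on symptom scores. A minimal illustrative sketch on simulated data (all variable names and effect sizes are hypothetical and are not taken from the study; only the sample size matches the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 515  # matches the abstract; everything else below is simulated

sens = rng.standard_normal(n)             # standardized polygenic sensitivity (hypothetical)
tx = rng.integers(0, 2, n).astype(float)  # 1 = intervention arm, 0 = control (random assignment)

# Simulated internalizing symptoms: the intervention helps mainly when
# sensitivity is high (a made-up differential-susceptibility pattern).
y = 10.0 + 0.8 * sens - 0.5 * tx - 1.5 * sens * tx + rng.standard_normal(n)

# OLS with an interaction term: y ~ 1 + sens + tx + sens:tx
X = np.column_stack([np.ones(n), sens, tx, sens * tx])
b0, b_sens, b_tx, b_int = np.linalg.lstsq(X, y, rcond=None)[0]

# "Simple slope" of the intervention at +1 SD of sensitivity:
effect_at_plus1sd = b_tx + b_int * 1.0
print(b_int < 0, effect_at_plus1sd < b_tx)  # expected: True True
```

A negative interaction coefficient means the intervention's symptom reduction grows with sensitivity, which is the pattern the abstract describes.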
The seventh-century AD switch from gold to silver currencies transformed the socio-economic landscape of North-west Europe. The source of the silver, however, has proven elusive. Recent research, integrating ice-core data from the Colle Gnifetti drill site in the Swiss Alps, geoarchaeological records and numismatic and historical data, has provided new evidence for this transformation. Annual-resolution ice-core data are combined with lead pollution analysis to demonstrate that significant new silver mining facilitated the change to silver coinage, and to date the introduction of such coinage to c. AD 660. Archaeological evidence and atmospheric modelling of lead pollution locate the probable source of the silver to mines at Melle, in France.
To test the hypothesis that long-term care facility (LTCF) residents with Clostridium difficile infection (CDI) or asymptomatic carriage of toxigenic strains are an important source of transmission in the LTCF and in the hospital during acute-care admissions.
A 6-month cohort study with identification of transmission events was conducted based on tracking of patient movement combined with restriction endonuclease analysis (REA) and whole-genome sequencing (WGS).
Veterans Affairs hospital and affiliated LTCF.
The study included 29 LTCF residents identified as asymptomatic carriers of toxigenic C. difficile based on perirectal screening every other week, and 37 healthcare facility-associated CDI cases (i.e., diagnosis >3 days after admission or within 4 weeks of discharge to the community), including 26 hospital-associated and 11 LTCF-associated cases.
Of the 37 CDI cases, 7 (18.9%) were linked to LTCF residents with LTCF-associated CDI or asymptomatic carriage, including 3 of 26 hospital-associated CDI cases (11.5%) and 4 of 11 LTCF-associated cases (36.4%). Of the 7 transmissions linked to LTCF residents, 5 (71.4%) were linked to asymptomatic carriers versus 2 (28.6%) to CDI cases, and all involved transmission of epidemic BI/NAP1/027 strains. No incident hospital-associated CDI cases were linked to other hospital-associated CDI cases.
Our findings suggest that LTCF residents with asymptomatic carriage of C. difficile or CDI contribute to transmission both in the LTCF and in the affiliated hospital during acute-care admissions. Greater emphasis on infection control measures and antimicrobial stewardship in LTCFs is needed, and these efforts should focus on LTCF residents during hospital admissions.
One of Andrew Ayton's many contributions to his field is the way he helped both his own and the next generation of historians appreciate that, as Nigel Saul puts it in his foreword to this volume, an Edwardian army ‘was held together by the social authority and resources of its principal commander’, and that ‘in the field, the force's effectiveness would depend largely on the individual commander's authority and on the day-to-day relations between his immediate lieutenants’. As Richard Barber has emphasised, Edward III consciously used his immense social and chivalric authority to foster powerful bonds of affection and mutual understanding with and among his principal military captains, and one of the most important means he employed to that end was the creation of a new chivalric order in 1348.
Britain's ‘Most Noble Order of the Garter’ is the oldest, most famous, and most prestigious of all the monarchical orders of knighthood. The buckled blue band with its gold motto of honi soit qui mal y pense (‘shamed be he who thinks ill of it’) has become part of the royal arms of the United Kingdom, and can be seen in countless places in Britain and the Commonwealth as a result. Knights and Ladies of the Order are entitled to display their own heraldic arms encircled by the Garter, and for this reason also, many garter-emblems can be found in churches, colleges, hospitals and other public structures throughout Britain and across the world.
When they first encounter the garter symbol and motto, students and tourists often react with some puzzlement. To twenty-first-century sensibilities, a garter seems a strange choice for a chivalric emblem, while the motto naturally provokes the question ‘who thinks ill of what?’ Efforts to satisfy the curiosity aroused by the heraldry of the Order have not been lacking. According to an oft-told story, of which there are several variations, King Edward retrieved a garter that slipped from a lady's leg at a ball, and pronounced the motto as a gentlemanly effort to shield her from embarrassment. The earliest version of this tale, however, dates only to the 1460s and has no foundation whatsoever in any source even close to contemporary with the Order's establishment.
Introduction: In many rural and remote communities in BC, family physicians who are providing excellent primary and emergency care would like to access useful, timely, and collegial support to ensure the highest quality of health services for their patients. We undertook a real-time virtual support project in Robson Valley, located in northern BC, to evaluate the use of digital technologies such as videoconferencing for on-demand consultation between family physicians at rural sites and emergency physicians at a regional site. Telehealth consults also occurred between rural sites, with nurses at community emergency rooms consulting with local on-call physicians. Our aim was to use telehealth to facilitate timely access to high-quality, comprehensive, coordinated team-based care. An evaluation framework based on the Triple Aim sought to: 1) identify telehealth use cases and assess impact on patient outcomes, patient and health professional experience, and cost of health care delivery; and 2) assess the role of relationships among care team members in progressing from uptake to normalization of telehealth into routine usage. Methods: Using a participatory approach, all members of the pilot project were involved in shaping the pilot, including the co-development of the evaluation itself. Evaluation was used iteratively throughout implementation for ongoing quality improvement via regular team meetings, sharing and reflecting on findings, and adjusting processes as required. Mixed methods were used, including: interviews with family physicians, nurses, and patients at rural sites and emergency physicians at the regional site; review of records such as technology use statistics; and stakeholder focus groups. Results: From November 2016 to July 2017, 26 cases of telehealth use were captured and evaluated. Findings indicate that telehealth has positively impacted the care team, patients, and the health system.
Benefits for the care team at the rural sites included confidence in diagnoses through timely access to advice and support, while emergency physicians at the regional site gained a deeper understanding of the practice settings of rural colleagues. Nevertheless, telehealth has complicated emergency department workflow and increased physician workload. Findings demonstrated efficiencies for the health system, including reducing the need for patient transfer. Patients expressed confidence in the physicians and the telehealth system; by receiving care closer to home, they experienced personal cost savings. Implementation saw a move away from scheduled telehealth visits toward real-time use of technology for timely access. Conclusion: Evidence of the benefits of telehealth in emergency settings is needed to support stakeholder engagement to address issues of workflow and capacity. This pilot shows early indications of significant local impact and will inform the expansion of emergency telehealth in all emergency settings in BC.
States often violate international agreements, both accidentally and intentionally. To process complaints efficiently, states can create formal, pretrial procedures in which governments can negotiate with litigants before a case ever goes to court. If disputes are resolved during pretrial negotiations, it can be very difficult to tell what has happened. Are governments coming into compliance? If so, are they only doing so when they have accidentally committed a violation or even when they are intentionally resisting? Or are challenges simply being dropped? This paper presents a formal model to address these questions. We develop our theory in the context of the European Union (EU). To test our model, we collect a new dataset of over 13,000 Commission infringement cases against EU member states (2003–2013). Our results suggest that accidental and intentional noncompliance both occur, but that intentional noncompliance is more common in the EU. We find that the Commission is an effective, if imperfect, monitor and enforcer of international law. The Commission can correct intentional noncompliance, but not always. It strategically drops cases that it believes it is unlikely to win.
Little is known about the association of cortical Aβ with depression and anxiety among cognitively normal (CN) elderly persons.
We conducted a cross-sectional study derived from the population-based Mayo Clinic Study of Aging in Olmsted County, Minnesota, involving CN persons aged ≥60 years who underwent PiB-PET scans and completed the Beck Depression Inventory-II (BDI-II) and Beck Anxiety Inventory (BAI). Cognitive diagnosis was made by an expert consensus panel. Participants were classified as having abnormal (≥1.4; PiB+) or normal (<1.4; PiB−) PiB-PET using a global cortical-to-cerebellar ratio. Multivariable logistic regression analyses were performed to calculate odds ratios (OR) and 95% confidence intervals (95% CI) after adjusting for age and sex.
Of 1,038 CN participants (53.1% males), 379 were PiB+. Each one-point symptom increase on the BDI-II (OR = 1.03; 95% CI, 1.00–1.06) and BAI (OR = 1.04; 95% CI, 1.01–1.08) was associated with increased odds of being PiB+. The number of participants with BDI-II > 13 (clinical depression) was greater in the PiB+ than the PiB− group, but the difference was not significant (OR = 1.42; 95% CI, 0.83–2.43). Similarly, the number of participants with BAI > 10 (clinical anxiety) was greater in the PiB+ than the PiB− group, but the difference was not significant (OR = 1.77; 95% CI, 0.97–3.22).
As expected, depression and anxiety levels were low in this community-dwelling sample, which likely reduced our statistical power. However, we observed an informative albeit weak association between increased BDI and BAI scores and elevated cortical amyloid deposition. This observation needs to be tested in a longitudinal cohort study.
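Per-point odds ratios from a logistic model compound multiplicatively on the odds scale, which helps translate the small per-point effects reported above into clinically sized score differences. A quick sketch using only the reported estimates (the 10-point increment is an illustrative choice, not taken from the study):

```python
# Reported per-point odds ratios for abnormal PiB-PET (PiB+):
# BDI-II: OR = 1.03 per point; BAI: OR = 1.04 per point.
def or_for_increase(or_per_point, points):
    """Odds ratio implied by a `points`-unit score increase on the log-odds scale."""
    return or_per_point ** points

print(round(or_for_increase(1.03, 10), 2))  # 1.34 for a 10-point BDI-II difference
print(round(or_for_increase(1.04, 10), 2))  # 1.48 for a 10-point BAI difference
```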
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large, well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group: age M=42.65 years, SD=8.93; 86% male; education M=13.17 years, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education, and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%) and of domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed for participants of Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially those of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
Beginning in 1999, Curtis Signorino challenged the use of traditional logit and probit analysis for testing discrete-choice strategic models. Signorino argues that the complex parametric relationships generated by even the simplest strategic models can lead to wildly inaccurate inferences if one applies these traditional approaches. In their stead, Signorino proposes generating stochastic formal models from which one can directly derive a maximum likelihood estimator. We propose a simpler, alternative methodology for theoretically and empirically accounting for strategic behavior. In particular, we propose carefully and correctly deriving one's comparative statics from one's formal model (whether it is stochastic or deterministic does not particularly matter) and using standard logit or probit estimation techniques to test the predictions. We demonstrate that this approach performs almost identically to Signorino's more complex suggestion.