To assess the effectiveness and acceptability of antimicrobial stewardship-focused implementation strategies targeting inpatient fluoroquinolone use.
Stewardship champions at 15 hospitals were surveyed regarding the use and acceptability of strategies to improve fluoroquinolone prescribing. Antibiotic days of therapy (DOT) per 1,000 days present (DP) for sites with and without prospective audit and feedback (PAF) and/or prior approval were compared.
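To make the primary metric concrete, the sketch below computes each site's fluoroquinolone DOT per 1,000 DP and compares the two site groups. The column names, toy values, and the choice of Welch's t-test are assumptions for illustration only, not the authors' analysis code.

```python
# Hedged sketch: days of therapy (DOT) per 1,000 days present (DP) by site,
# compared between sites with and without PAF/prior approval.
# All values below are toy data; the paper's exact test is not specified here.
import pandas as pd
from scipy import stats

sites = pd.DataFrame({
    "fq_dot":       [310, 120, 450, 280, 95, 600],       # fluoroquinolone DOT
    "days_present": [9000, 3500, 7000, 8000, 3000, 9500],
    "paf_or_pa":    [True, True, True, False, False, False],
})
sites["rate"] = 1000 * sites["fq_dot"] / sites["days_present"]  # DOT/1,000 DP

with_strategy = sites.loc[sites["paf_or_pa"], "rate"]
without_strategy = sites.loc[~sites["paf_or_pa"], "rate"]
t_stat, p_value = stats.ttest_ind(with_strategy, without_strategy,
                                  equal_var=False)  # Welch's t-test
print(sites.groupby("paf_or_pa")["rate"].agg(["mean", "std"]))
print(f"P = {p_value:.2f}")
```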
Among all sites, 60% had PAF or prior approval implemented for fluoroquinolones. Compared to sites using neither strategy (64.2 ± 34.4 DOT/1,000 DP), fluoroquinolone prescribing rates were lower at sites that employed PAF and/or prior approval (35.5 ± 9.8 DOT/1,000 DP; P = .03) and decreased from 2017 to 2018 (P < .001). This decrease occurred without an increase in advanced-generation cephalosporins. Total antibiotic rates were 13% lower for sites with PAF and/or prior approval, but this difference did not reach statistical significance (P = .20). Sites reporting that PAF and/or prior approval were “completely” accepted had lower fluoroquinolone rates than sites where these strategies were “moderately” accepted (34.2 ± 5.7 vs 48.7 ± 4.5; P < .01). Sites reported that clinical pathways and/or local guidelines (93%), prior approval (93%), and order forms (80%) “would” or “may” be effective in improving fluoroquinolone use. Although most sites (73%) indicated that requiring infectious disease consults would or may be effective in improving fluoroquinolone use, 87% perceived implementation to be difficult.
PAF and prior approval implementation strategies focused on fluoroquinolones were associated with significantly lower fluoroquinolone prescribing rates and nonsignificant decreases in total antibiotic use, suggesting limited class substitution. The association between strategy acceptability and lower prescribing rates highlights the importance of institutional culture. These results may reflect increased acceptability of implementation strategies and/or heightened sensitivity to FDA warnings.
United States dentists prescribe 10% of all outpatient antibiotics. Assessing appropriateness of antibiotic prescribing has been challenging due to a lack of guidelines for oral infections. In 2019, the American Dental Association (ADA) published clinical practice guidelines (CPG) on the management of acute oral infections. Our objective was to describe baseline national antibiotic prescribing for acute oral infections prior to the release of the ADA CPG and to identify patient-level variables associated with an antibiotic prescription.
We performed an analysis of national Veterans Affairs (VA) data from January 1, 2017, to December 31, 2017. We identified cases of acute oral infections using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes. Antibiotics prescribed by a dentist within ±7 days of a visit were included. Multivariable logistic regression identified patient-level variables associated with an antibiotic prescription.
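A minimal sketch of the case-identification logic described above, using a tiny inline dataset. The ICD-10-CM codes shown are an illustrative subset, and the column names are hypothetical stand-ins; this is not the study's code.

```python
# Hedged sketch: flag visits coded with acute oral infections, then mark
# whether a dentist-prescribed antibiotic was filled within +/-7 days.
# Codes, columns, and data are illustrative stand-ins, not VA records.
import pandas as pd

ORAL_INFECTION_ICD10 = {"K04.01", "K04.5", "K04.7"}  # e.g., irreversible
# pulpitis, chronic apical periodontitis, periapical abscess (subset only)

visits = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "visit_date": pd.to_datetime(["2017-03-01", "2017-06-10", "2017-05-20"]),
    "icd10": ["K04.01", "Z01.20", "K04.7"],
})
rx = pd.DataFrame({
    "patient_id": [1, 2],
    "fill_date": pd.to_datetime(["2017-03-04", "2017-06-05"]),
})

cases = visits[visits["icd10"].isin(ORAL_INFECTION_ICD10)].copy()
fills = rx.groupby("patient_id")["fill_date"].agg(list)

def antibiotic_within_window(row, days=7):
    # True if any antibiotic fill falls within +/-`days` of the visit date
    return any(abs((d - row.visit_date).days) <= days
               for d in fills.get(row.patient_id, []))

cases["antibiotic"] = cases.apply(antibiotic_within_window, axis=1)
print(cases[["patient_id", "visit_date", "antibiotic"]])
# A multivariable logistic regression on patient-level covariates would then
# model this antibiotic flag.
```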
Of the 470,039 VA dental visits with oral infections coded, 12% of patient visits with irreversible pulpitis, 17% with apical periodontitis, and 28% with acute apical abscess received antibiotics. Although the median days’ supply was 7, prolonged use of antibiotics was frequent (≥8 days, 42%–49%). Patients with high-risk cardiac conditions, prosthetic joints, and endodontic, implant, and oral and maxillofacial surgery dental procedures were more likely to receive antibiotics.
Most treatments of irreversible pulpitis and apical periodontitis cases were concordant with new ADA guidelines. However, in cases where antibiotics were prescribed, prolonged antibiotic courses >7 days were frequent. These findings demonstrate opportunities for the new ADA guidelines to standardize and improve dental prescribing practices.
To characterize postextraction antibiotic prescribing patterns, predictors of antibiotic prescribing, and the incidence of and risk factors for postextraction oral infection.
Retrospective analysis of a random sample of veterans who received tooth extractions from January 1, 2017 through December 31, 2017.
VA dental clinics.
Overall, 69,610 patients met inclusion criteria, of whom 404 were randomly selected for analysis. Adjunctive antibiotics were prescribed to 154 patients (38.1%).
Patients who received or did not receive an antibiotic were compared for the occurrence of postextraction infection as documented in the electronic health record. Multivariable logistic regression was performed to identify factors associated with antibiotic receipt.
There was no difference in the frequency of postextraction oral infection identified among patients who did and did not receive antibiotics (4.5% vs 3.2%; P = .59). Risk factors for postextraction infection could not be identified due to the low frequency of this outcome. Patients who received antibiotics were more likely to have a greater number of teeth extracted (aOR, 1.10; 95% CI, 1.03–1.18), documentation of acute infection at time of extraction (aOR, 3.02; 95% CI, 1.57–5.82), molar extraction (aOR, 1.78; 95% CI, 1.10–2.86) and extraction performed by an oral maxillofacial surgeon (aOR, 2.29; 95% CI, 1.44–3.58) or specialty dentist (aOR, 5.77; 95% CI, 2.05–16.19).
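The adjusted odds ratios quoted above are exponentiated coefficients from a multivariable logistic model. Below is a hedged sketch on simulated toy data; variable names and effect sizes are illustrative assumptions, not the study's data or code.

```python
# Hedged sketch: fit a multivariable logistic model and report adjusted odds
# ratios (aOR) with 95% CIs as exponentiated coefficients and bounds.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 404  # matches the sample size above; the data themselves are simulated
df = pd.DataFrame({
    "n_teeth": rng.integers(1, 9, n),
    "acute_infection": rng.integers(0, 2, n),
    "molar": rng.integers(0, 2, n),
    "oral_surgeon": rng.integers(0, 2, n),
})
log_odds = (-1.5 + 0.10 * df["n_teeth"] + 1.1 * df["acute_infection"]
            + 0.58 * df["molar"] + 0.83 * df["oral_surgeon"])
df["antibiotic"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

fit = smf.logit("antibiotic ~ n_teeth + acute_infection + molar + oral_surgeon",
                data=df).fit(disp=False)
table = pd.concat(
    [np.exp(fit.params).rename("aOR"),
     np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})],
    axis=1,
)
print(table.round(2))
```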
Infectious complications occurred at a low incidence among veterans undergoing tooth extraction who did and did not receive postextraction antibiotics. These results suggest that antibiotics have a limited role in preventing postprocedural infection; however, future studies are necessary to more clearly define the role of antibiotics for this indication.
In response to the 2013–2016 Ebola virus disease outbreak, the US government designated certain healthcare institutions as Ebola treatment centers (ETCs) to better prepare for future emerging infectious disease outbreaks. This study investigated ETC experiences and critical care policies for patients with viral hemorrhagic fever (VHF).
A 58-item questionnaire elicited information on policies for 9 critical care interventions, factors that limited care provision, and innovations developed to deliver care.
Setting and participants:
The questionnaire was sent to 82 ETCs.
We analyzed ordinal and categorical data pertaining to the ETC characteristics and descriptive data about their policies and perceived challenges. Statistical analyses assessed whether ETCs with experience caring for VHF patients were more likely to have critical care policies than those that did not.
Of the 27 ETCs that responded, 17 (63%) were included. Among them, 8 (47%) reported experience caring for persons under investigation or confirmed cases of VHF. Most felt ready to provide intubation, chest compressions, and renal replacement therapy to these patients. The factors most often cited as limiting care were staff safety and clinical futility. Innovations developed to better provide care included increased simulation training and alternative technologies for procedures and communication.
Critical care policies and their limitations were broadly similar across institutions. Few institutions felt ready to provide several interventions, namely extracorporeal membrane oxygenation (ECMO) and cricothyrotomy. Future studies could identify obstacles to providing these interventions and explore policy changes after increased experience with novel infectious diseases, such as COVID-19.
This research communication reports the results from questionnaires used to identify the impact of recent research into the disinfection of cattle foot-trimming equipment, aimed at preventing bovine digital dermatitis (BDD) transmission, on (a) the biosecurity knowledge and (b) the hygiene practice of foot health professionals. An initial questionnaire found that more than half of participating farmers, veterinary surgeons and commercial foot-trimmers were not considering hand or hoof-knife hygiene in their working practices. The following year, after the release of a foot-trimming hygiene protocol and a comprehensive knowledge exchange programme by the University of Liverpool, a second survey showed that 35/80 (43.8%) of the farmers, veterinary surgeons and commercial foot-trimmers sampled considered themselves more aware of the risk of spreading BDD during foot-trimming. Furthermore, 36/80 (45.0%) had enhanced their hygiene practice in the last year, impacting an estimated 1383 farms and 5130 cows trimmed each week. Participants who reported having seen both the foot-trimming hygiene protocol we developed with AHDB Dairy and other articles about foot-trimming hygiene in the farming and veterinary press were significantly more likely to have changed their working practices. Difficulties accessing water and cleaning facilities on farms were identified as the greatest barrier to improving biosecurity practices. Participants' preferred priority for future research was continued collection of evidence for the importance and efficacy of good foot-trimming hygiene practices.
This study compared the level of education and tests from multiple cognitive domains as proxies for cognitive reserve.
The participants were educationally, ethnically, and cognitively diverse older adults enrolled in a longitudinal aging study. We examined independent and interactive effects of education, baseline cognitive scores, and MRI measures of cortical gray matter change on longitudinal cognitive change.
Baseline episodic memory was related to cognitive decline independent of brain and demographic variables and moderated (weakened) the impact of gray matter change. Education moderated (strengthened) the gray matter change effect. Non-memory cognitive measures did not incrementally explain cognitive decline or moderate gray matter change effects.
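A sketch of the moderation logic reported above: interaction terms test whether education or baseline memory changes the slope of gray-matter change on cognitive change. The variable names and simulated data are illustrative assumptions, not the study's measures.

```python
# Hedged sketch: OLS with interaction (moderation) terms. A significant
# gm_change:education coefficient would mean education moderates the effect
# of gray-matter change on cognitive change. Toy data for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "gm_change": rng.normal(0, 1, n),        # annualized cortical GM change
    "education": rng.integers(6, 21, n),     # years of education
    "baseline_memory": rng.normal(0, 1, n),  # episodic memory z-score
})
df["cog_change"] = (0.4 * df["gm_change"]
                    - 0.05 * df["gm_change"] * df["baseline_memory"]
                    + rng.normal(0, 1, n))

fit = smf.ols("cog_change ~ gm_change * education"
              " + gm_change * baseline_memory", data=df).fit()
print(fit.params.round(3))
```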
Episodic memory showed strong construct validity as a measure of cognitive reserve. Education effects on cognitive decline were dependent upon the rate of atrophy, indicating education effectively measures cognitive reserve only when atrophy rate is low. Results indicate that episodic memory has clinical utility as a predictor of future cognitive decline and better represents the neural basis of cognitive reserve than other cognitive abilities or static proxies like education.
Accumulating evidence suggests that wakeful rest (a period of minimal cognitive stimulation) enhances memory in clinical populations with memory impairment. However, no study has previously examined the efficacy of this technique in stroke survivors, despite the high prevalence of post-stroke memory difficulties. We aimed to investigate whether wakeful rest enhances verbal memory in stroke survivors and healthy controls.
Twenty-four stroke survivors and 24 healthy controls were presented with two short stories; one story was followed by a 10-minute period of wakeful rest and the other was followed by a 10-minute visual interference task. A mixed factorial analysis of variance (ANOVA) with pairwise comparisons was used to compare participants’ story retention at two time points.
After 15–30 minutes, stroke survivors (p = .002, d = .73) and healthy controls (p = .001, d = .76) retained more information from the story followed by wakeful rest than from the story followed by an interference task. While wakeful rest remained the superior condition in healthy controls after 7 days (p = .01, d = .58), the beneficial effect was not maintained in stroke survivors (p = .35, d = .19).
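For readers who want to see the design in code, here is a hedged sketch of a 2 (group) × 2 (condition) mixed ANOVA with pairwise comparisons using pingouin, one possible tool for this analysis; the retention scores are simulated, and this is not the authors' code.

```python
# Hedged sketch of the 2 (group) x 2 (condition) mixed design using pingouin.
# Retention scores below are simulated toy values for illustration only.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
rows = []
for pid in range(48):                      # 24 stroke survivors + 24 controls
    group = "stroke" if pid < 24 else "control"
    for condition, boost in [("wakeful_rest", 8), ("interference", 0)]:
        rows.append({"id": pid, "group": group, "condition": condition,
                     "retention": rng.normal(50 + boost, 10)})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="retention", within="condition",
                     subject="id", between="group")
posthoc = pg.pairwise_tests(data=df, dv="retention", within="condition",
                            subject="id", between="group", effsize="cohen")
print(aov, posthoc, sep="\n\n")
```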
Wakeful rest is a promising technique, which significantly enhanced verbal memory after 15–30 minutes in both groups; however, no significant benefit of wakeful rest was observed after 7 days in stroke survivors. Preliminary findings suggest that wakeful rest enhances early memory consolidation processes by protecting against the effects of interference after learning in stroke survivors.
This is an epidemiological study of carbapenem-resistant Enterobacteriaceae (CRE) in Veterans Affairs medical centers (VAMCs). In 2017, almost 75% of VAMCs had at least 1 CRE case. We observed substantial geographic variability, with more cases in urban, complex facilities. This finding supports the benefit of tailoring infection control strategies to facility characteristics.
Language and cognitive impairments are common consequences of stroke, and these difficulties persist: long after stroke onset, 60% of stroke survivors continue to experience memory problems, 50% attention deficits and 61% communication problems. Such deficits are ‘invisible’ – evident only through patient report, behavioural observation or formal assessment. The impacts of such deficits are considerable and can include prolonged hospital stays, poorer functional recovery and reduced quality of life. Effective and timely rehabilitation of language (auditory comprehension, expressive language, reading and writing) and cognitive abilities (memory, attention, spatial awareness, perception and executive function) is crucial to optimise recovery after stroke. In this chapter we review the current evidence base and relevant clinical guidelines relating to language and cognitive impairments, and consider the implications for stroke rehabilitation practice and future research. Speech and language therapy offers benefit to people with aphasia after stroke; intensive intervention, if tolerated, likely augments the benefits. Interventions for deficits in all non-language cognitive domains exist, but need refining and evaluating more thoroughly with a wider range of methodologies.
Understanding risk factors for death from Covid-19 is key to providing good quality clinical care. We assessed the presenting characteristics of the ‘first wave’ of patients with Covid-19 at Royal Oldham Hospital, UK and undertook logistic regression modelling to investigate factors associated with death. Of 470 patients admitted, 169 (36%) died. The median age was 71 years (interquartile range 57–82), and 255 (54.3%) were men. The most common comorbidities were hypertension (n = 218, 46.4%), diabetes (n = 143, 30.4%) and chronic neurological disease (n = 123, 26.1%). The most frequent complications were acute kidney injury (AKI) (n = 157, 33.4%) and myocardial injury (n = 21, 4.5%). Forty-three (9.1%) patients required intubation and ventilation, and 39 (8.3%) received non-invasive ventilation. Independent risk factors for death were increasing age (odds ratio (OR) per 10 year increase above 40 years 1.87, 95% confidence interval (CI) 1.57–2.27), hypertension (OR 1.72, 95% CI 1.10–2.70), cancer (OR 2.20, 95% CI 1.27–3.81), platelets <150 × 10³/μl (OR 1.93, 95% CI 1.13–3.30), C-reactive protein ≥100 μg/ml (OR 1.68, 95% CI 1.05–2.68), >50% chest radiograph infiltrates (OR 2.09, 95% CI 1.16–3.77) and AKI (OR 2.60, 95% CI 1.64–4.13). There was no independent association between death and gender, ethnicity, deprivation level, fever, SpO2/FiO2, lymphopoenia or other comorbidities. These findings will inform clinical and shared decision making, including use of respiratory support and therapeutic agents.
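As a worked note on the reporting convention used above, an “OR per 10-year increase” is simply the per-year log-odds coefficient rescaled by 10 and exponentiated. The coefficient below is a hypothetical value chosen only to reproduce the reported 1.87, not a number from the study's model.

```python
# Minimal worked example (hypothetical coefficient, not from the study):
# if beta is the log-odds change per 1 year of age, the odds ratio for a
# 10-year increase is exp(10 * beta).
import math

beta_per_year = 0.0626  # hypothetical log-odds increase per year of age
or_per_decade = math.exp(10 * beta_per_year)
print(f"OR per 10-year increase: {or_per_decade:.2f}")  # -> 1.87
```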
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. Most of this nuclear-matter signature lies in the 2–4 kHz frequency band, outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity would change the expected detection rate of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could potentially allow the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon the seed-shatter phenology of the target weed species at crop maturity: seed retained on the plant can be collected and processed at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
To scale out an experiential teaching kitchen in Parks and Recreation centres' after-school programming in a large urban setting among predominantly low-income, minority children.
We evaluated the implementation of a skills-based, experiential teaching kitchen to gauge programme success. Effectiveness outcomes included pre–post measures of child-reported cooking self-efficacy, attitudes towards cooking, fruit and vegetable preference, intention to eat fruits and vegetables and willingness to try new fruits and vegetables. Process outcomes included attendance (i.e., intervention dose delivered), cost, fidelity and adaptations to the intervention.
After-school programming in Parks and Recreation Community centres in Nashville, TN.
Predominantly low-income minority children aged 6–14 years.
Of the twenty-five city community centres, twenty-one successfully implemented the programme, and nineteen of the twenty-five delivered seven or more of the eight planned sessions. Among children with pre–post data (n 369), mean age was 8·8 (sd 1·9) years, and 53·7 % were female. All five effectiveness measures improved significantly (P < 0·001). Attendance varied widely: 36·3 % of children attended no sessions, while 36·6 % attended at least four. Across all centres, fidelity was 97·5 %. The average food cost per serving was $1·37.
This type of nutritional education and skills building experiential teaching kitchen can be successfully implemented in a community setting with high fidelity, effectiveness and organisational alignment, while also expanding reach to low-income, underserved children.
Although infections caused by Acinetobacter baumannii are often healthcare-acquired, difficult to treat, and associated with high mortality, epidemiologic data for this organism are limited. We describe the epidemiology, clinical characteristics, and outcomes for patients with extensively drug-resistant Acinetobacter baumannii (XDRAB).
Retrospective cohort study
Department of Veterans Affairs Medical Centers (VAMCs)
Patients with XDRAB cultures (defined as nonsusceptible to at least 1 agent in all but 2 or fewer antimicrobial classes) at VAMCs between 2012 and 2018.
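The parenthetical XDR rule can be stated compactly in code. A hedged sketch follows, with illustrative class names; the study's actual tested panel is not listed here.

```python
# Hedged sketch of the XDR definition quoted above: an isolate is XDR when it
# remains fully susceptible in 2 or fewer of the antimicrobial classes tested
# (i.e., nonsusceptible to >=1 agent in all but <=2 classes).
def is_xdr(fully_susceptible_classes: set, tested_classes: set) -> bool:
    return len(fully_susceptible_classes & tested_classes) <= 2

tested = {"polymyxins", "tetracyclines", "carbapenems",
          "aminoglycosides", "fluoroquinolones", "sulfonamides"}
print(is_xdr({"polymyxins", "tetracyclines"}, tested))                 # True
print(is_xdr({"polymyxins", "tetracyclines", "carbapenems"}, tested))  # False
```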
Microbiology and clinical data were extracted from national VA datasets. We used descriptive statistics to summarize patient characteristics and outcomes, and bivariate analyses to compare outcomes by culture source.
Among 11,546 patients with 15,364 A. baumannii cultures, 408 patients (3.5%) had 667 XDRAB cultures (4.3%). Patients with XDRAB tended to be older (mean age, 68 years; SD, 12.2), with a median Charlson comorbidity index of 3 (interquartile range, 1–5). Respiratory specimens (n = 244, 36.6%) and urine samples (n = 187, 28%) were the most frequent sources, and the greatest proportion of patients were from the South (n = 162, 39.7%). Most patients had antibiotic exposures (n = 362, 88.7%) and hospital or long-term care admissions (n = 331, 81%) in the prior 90 days. Polymyxins, tigecycline, and minocycline retained the highest susceptibility rates. Both 30-day mortality (n = 96, 23.5%) and 1-year mortality (n = 199, 48.8%) were high, with significantly higher mortality among patients with positive blood cultures.
The proportion of Acinetobacter baumannii in the VA that was XDR was low, but treatment options are extremely limited and clinical outcomes were poor. Prevention of healthcare-associated XDRAB infection should remain a priority, and novel antibiotics for XDRAB treatment are urgently needed.
A survey of Veterans Affairs medical centers on control of carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing CRE (CP-CRE) demonstrated that most facilities use VA guidelines, but few screen for CRE/CP-CRE colonization regularly or routinely communicate CRE/CP-CRE status at patient transfer. Most respondents were knowledgeable about CRE guidelines but cited a lack of adequate resources.
Liquid phase (or liquid cell) transmission electron microscopy (LP-TEM) has been established as a powerful tool for observing dynamic processes in liquids at nanometer to atomic length scales. However, the simple act of observation using electrons irreversibly alters the nature of the sample. A clear understanding of electron-beam-driven processes during LP-TEM is required to interpret in situ observations and utilize the electron beam as a stimulus to drive nanoscale dynamic processes. In this article, we discuss recent advances toward understanding, quantifying, mitigating, and harnessing electron-beam-driven chemical processes occurring during LP-TEM. We highlight progress in several research areas, including modeling electron-beam-induced radiolysis near interfaces, electron-beam-induced nanocrystal formation, and radiation damage of soft materials and biomolecules.
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
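A hedged sketch of the segmented model described above: level- and slope-change terms around the intervention date, a negative binomial family, and patient-days as an exposure offset. The variable names and simulated counts are illustrative, not the study's data or code.

```python
# Hedged sketch: interrupted time series via segmented negative binomial
# regression with a log(patient-days) offset. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 54                                    # monthly points, Aug 2013 - Jan 2018
df = pd.DataFrame({"month": np.arange(n)})
df["post"] = (df["month"] >= 30).astype(int)          # level change at policy
df["months_post"] = np.maximum(0, df["month"] - 30)   # slope change afterward
df["patient_days"] = rng.integers(8000, 12000, n)
mu = np.exp(-5.0 + 0.002 * df["month"] - 0.25 * df["post"]) * df["patient_days"]
df["urine_cultures"] = rng.poisson(mu)

fit = smf.glm("urine_cultures ~ month + post + months_post", data=df,
              family=sm.families.NegativeBinomial(),
              offset=np.log(df["patient_days"])).fit()
print(np.exp(fit.params).round(3))  # rate ratios; exp(post) = level change
```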
The study included 6 acute-care hospitals within the Veterans Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not decrease significantly relative to the preintervention period (5.9% decrease; P = .8) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.