To assess the potential for contamination of personnel, patients and the environment during use of contaminated N95 respirators and to compare the effectiveness of interventions to reduce contamination.
Simulation study of patient care interactions using N95 respirators contaminated with higher and lower inocula of the benign virus bacteriophage MS2.
Twelve healthcare personnel performed 3 standardized examinations of mannequins including: 1) Control with suboptimal respirator handling technique; 2) Improved technique with glove change after each N95 contact; and 3) Control with 1-minute ultraviolet-C light (UV-C) treatment prior to donning. The order of the examinations was randomized within subject. The frequencies of contamination were compared among groups. Observations and simulations with fluorescent lotion were used to assess routes of transfer leading to contamination.
With suboptimal respirator handling technique, bacteriophage MS2 was frequently transferred to the participants, mannequin, and environmental surfaces and fomites. Improved technique resulted in significantly reduced transfer of MS2 in the higher inoculum simulations (P<0.01), whereas UV-C treatment reduced transfer in both the higher and lower inoculum simulations (P<0.01). Observations and simulations with fluorescent lotion demonstrated multiple potential routes of transfer to participants, mannequin, and surfaces, including both direct contact with the contaminated respirator and indirect contact via contaminated gloves.
Reuse of contaminated N95 respirators can result in contamination of personnel and the environment even when correct technique is used. Decontamination technologies such as UV-C could reduce the risk for transmission.
The ongoing coronavirus pandemic erupted with the first confirmed cases in Wuhan, China in December 2019; it is caused by the novel coronavirus SARS-CoV-2, and the resulting disease is referred to as “COVID-19.” The World Health Organization (WHO) confirmed the outbreak and declared it a global pandemic. The pandemic has infected nearly 100 million people and killed over 2 million, smashing every public health barrier, guardrail, and safety measure in underdeveloped and highly developed countries alike, with peaks and troughs across time. Regions experiencing conflict and war are among the most heavily impacted: morbidity and mortality rise steeply in at-risk communities that lack the capacity to promote basic preventive measures. States around the globe struggle to unify responses, improve preparedness, and identify and symptomatically treat positive cases, while laboratories frantically roll out vaccines and develop effective surveillance and therapeutic mechanisms. The incidence and prevalence of COVID-19 may continue to increase globally as long as no unified disaster response materializes and disinformation spreads. Amid this failure in response, virus variants are erupting at a dizzying pace, and ungoverned spaces where non-state actors predominate, along with active war zones, may become the next epicenters of COVID-19 fatalities.
As incidence rates continue to rise, hospitals in North America and Europe exceed surge capacity, and post-infection immunity remains inadequately characterized. Previously high-quality, robust healthcare systems in the most developed economies are failing the challenge posed by COVID-19; how will less developed economies, and healthcare infrastructures destroyed by war and conflict, cope until adequate vaccine penetrance in these communities or adequate treatment is established? Ukraine and other states in the Black Sea region are under threat, exposed daily to armed Russian aggression against their territorial sovereignty. Ukraine, where Russia has been waging war since 2014, faces this specific dual threat: disaster response to both violence and a deadly infectious disease. To best serve biosurveillance, aid pandemic disaster response, and bolster health security in Europe and across the North Atlantic Treaty Organization (NATO) and Black Sea regions, NATO integration across Ukraine’s disaster response structures within the Ministries of Health, Defense, and Interior must be reinforced and expanded to mitigate the COVID-19 disaster.
It remains unclear whether pragmatic language skills and core language skills (grammar and vocabulary) are distinct language domains. The present work aimed to tease apart these domains using a novel online assessment battery administered to almost 400 children aged 7 to 13 years. Confirmatory factor analysis indicated that pragmatic and core language domains could be measured separately, but that both domains were highly related (r = .79). However, zero-order correlations between pragmatic tests were quite small, indicating that task-specific skills played an important role in performance, and follow-up exploratory factor analysis suggested that pragmatics might be best understood as a family of skills rather than a domain. This means that these different pragmatic skills may have different cognitive underpinnings and also need to be assessed separately. However, our overall results supported the idea that pragmatic and core aspects of language are closely related during development, with one area scaffolding development in the other.
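As a rough illustration of the factor-analytic logic described above (a hedged sketch on simulated placeholder data, not the study’s battery or results), one can generate six test scores from two latent abilities correlated at r = .79 and then attempt to recover the structure with an exploratory factor analysis:

```python
# Hedged sketch: simulate scores on six tests driven by two correlated
# latent abilities (stand-ins for "core" and "pragmatic" language), then
# run an exploratory factor analysis. All values are placeholders.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 400                                  # roughly the study's sample size
cov = [[1.0, 0.79], [0.79, 1.0]]         # latent correlation of r = .79
latent = rng.multivariate_normal([0, 0], cov, size=n)

# Tests 0-2 load on factor 1, tests 3-5 on factor 2, plus unique noise
# (the noise plays the role of task-specific skill).
loadings = np.zeros((2, 6))
loadings[0, :3] = 0.7
loadings[1, 3:] = 0.7
scores = latent @ loadings + rng.normal(0, 0.6, size=(n, 6))

fa = FactorAnalysis(n_components=2, rotation='varimax').fit(scores)
print(np.round(fa.components_, 2))       # rows ~ factors, columns ~ tests
```

With factors this highly correlated, an orthogonal rotation recovers the two blocks only imperfectly, which echoes the difficulty of cleanly separating pragmatic from core language skills.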
Vitamin D deficiency is associated with an increased risk of falls and fractures. Assuming this association is causal, we aimed to identify the number and proportion of hospitalisations for falls and hip fractures attributable to vitamin D deficiency (25 hydroxy D (25(OH)D) <50 nmol/l) in Australians aged ≥65 years. We used 25(OH)D data from the 2011/12 Australian Health Survey and relative risks from published meta-analyses to calculate population-attributable fractions for falls and hip fracture. We applied these to data published by the Australian Institute of Health and Welfare to calculate the number of events each year attributable to vitamin D deficiency. In men and women combined, 8·3 % of hospitalisations for falls (7991 events) and almost 8 % of hospitalisations for hip fractures (1315 events) were attributable to vitamin D deficiency. These findings suggest that, even in a sunny country such as Australia, vitamin D deficiency contributes to a considerable number of hospitalisations as a consequence of falls and for treatment of hip fracture in older Australians; in countries where the prevalence of vitamin D deficiency is higher, the impact will be even greater. It is important to mitigate vitamin D deficiency, but whether this should occur through supplementation or increased sun exposure needs consideration of the benefits, harms, practicalities and costs of both approaches.
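As a hedged arithmetic sketch of the attributable-burden calculation described above: Levin’s population-attributable fraction combines exposure prevalence p with relative risk RR as PAF = p(RR − 1) / (1 + p(RR − 1)), and attributable events are PAF times total events. The inputs below are illustrative placeholders, not the survey’s figures.

```python
# Levin's population-attributable fraction (PAF) and attributable events.
# PAF = p * (RR - 1) / (1 + p * (RR - 1)); inputs below are placeholders,
# not the Australian Health Survey values.

def paf(prevalence: float, relative_risk: float) -> float:
    """Population-attributable fraction (Levin's formula)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

p, rr = 0.23, 1.4                       # hypothetical prevalence and RR
total_hospitalisations = 96_000         # hypothetical annual event count
fraction = paf(p, rr)
print(f"PAF = {fraction:.1%}; attributable events ~ "
      f"{fraction * total_hospitalisations:,.0f}")
```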
To evaluate the impact of changes in import tariffs on sweetened beverages.
Interrupted time series analysis was used to examine sweetened beverage tariff increases of 40–60 % in 2008 and to 75 % in 2012, and an approximately 11 % decrease in 2014 when an excise tax replaced the tariff. Post-tax trends were compared with a counterfactual modelled on the pre-tax trend for: quarterly price of an indicator beverage, monthly beverage import volumes (both 2001–2017) and quarterly sales volumes (2012–2017). In a controlled analysis, taxed beverage imports were compared with a sugary snacks control.
In the first year after the 2008 tariff increase, the price of the selected indicator soft drink increased by 7·3 % (95 % CI 6·3 %, 8·3 %), but after the 2012 tariff increase it decreased by 13·9 % (95 % CI –14·9 %, –12·8 %). Over the same periods, the import volumes of taxed beverages decreased by 13·2 % (95 % CI –38·1 %, 17·8 %) and 2·9 % (95 % CI –41·6 %, 72·5 %), respectively, and decreased by 24·8 % (95 % CI –36·9, –9·8) and 10·2 % (95 % CI –37·1, 37·5) in the controlled analysis. After the 2014 tax decrease, the price of the indicator soft drink decreased by 23·6 % (95 % CI –26·0 %, –21·1 %), sweetened beverage imports increased by 4·5 % (95 % CI –39·5 %, 156·0 %) and sales of full-sugar soft drinks increased by 31 % (95 % CI –21 %, 243 %).
The increased import tariffs on sweetened beverages appeared to be effective for reducing import volumes, but this was partly reversed by the reduced tax/tariff in 2014.
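For readers unfamiliar with the design, the following is a minimal sketch of the segmented-regression form of interrupted time series analysis, fit on simulated monthly volumes rather than the study’s data; the level-change and slope-change coefficients play the role of the tariff effects, and the counterfactual is the pre-change trend extrapolated forward.

```python
# Segmented regression for an interrupted time series, on simulated data.
# Terms: pre-existing trend, level change at the policy date, and a
# change in slope afterwards. All numbers are illustrative placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, change = 120, 60                       # 120 months, policy at month 60
t = np.arange(n)
post = (t >= change).astype(float)        # 1 after the policy change
t_post = np.where(t >= change, t - change, 0)   # post-change slope term

# Simulated outcome: rising baseline, a level drop, and a flatter slope.
y = 100 + 0.2 * t - 10 * post - 0.3 * t_post + rng.normal(0, 2, n)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
print(fit.params)    # [intercept, pre-trend, level change, slope change]

# Counterfactual for the post period: pre-change trend carried forward.
counterfactual = fit.params[0] + fit.params[1] * t[t >= change]
```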
To investigate the timing and routes of contamination of the rooms of patients newly admitted to the hospital.
Observational cohort study and simulations of pathogen transfer.
A Veterans’ Affairs hospital.
Patients newly admitted to the hospital with no known carriage of healthcare-associated pathogens.
Interactions between the participants and personnel or portable equipment were observed, and cultures of high-touch surfaces, floors, bedding, and patients’ socks and skin were collected for up to 4 days. Cultures were processed for Clostridioides difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Simulations were conducted with bacteriophage MS2 to assess plausibility of transfer from contaminated floors to high-touch surfaces and to assess the effectiveness of wearing slippers in reducing transfer.
Environmental cultures became positive for at least 1 pathogen in 10 (59%) of the 17 rooms, with cultures positive for MRSA, C. difficile, and VRE in the rooms of 10 (59%), 2 (12%), and 2 (12%) participants, respectively. For all 14 instances of pathogen detection, the initial site of recovery was the floor, followed in a subset of patients by detection on sock bottoms, bedding, and high-touch surfaces. In simulations, wearing slippers over hospital socks dramatically reduced transfer of bacteriophage MS2 from the floor to hands and to high-touch surfaces.
Floors may be an underappreciated source of pathogen dissemination in healthcare facilities. Simple interventions such as having patients wear slippers could potentially reduce the risk for transfer of pathogens from floors to hands and high-touch surfaces.
Gloves and gowns are used during patient care to reduce contamination of personnel and prevent pathogen transmission.
To determine whether the use of gowns adds a substantial benefit over gloves alone in preventing patient-to-patient transfer of a viral DNA surrogate marker.
In total, 30 source patients had 1 cauliflower mosaic virus surrogate marker applied to their skin and clothing and a second to their bed rail and bedside table. Personnel caring for the source patients were randomized to wear gloves, gloves plus cover gowns, or no barrier. Interactions with up to 7 subsequent patients were observed, and the percentages of transfer of the DNA markers were compared among the 3 groups.
In comparison to the no-barrier group (57.8% transfer of 1 or both markers), there were significant reductions in transfer of the DNA markers in the gloves group (31.1% transfer; odds ratio [OR], 0.16; 95% confidence interval [CI], 0.02–0.73) and the gloves-plus-gown group (25.9% transfer; OR, 0.11; 95% CI, 0.01–0.51). The addition of a cover gown to gloves during the interaction with the source patient did not significantly reduce the transfer of the DNA marker (P = .53). During subsequent patient interactions, transfer of the DNA markers was significantly reduced if gloves plus gowns were worn and if hand hygiene was performed (P < .05).
Wearing gloves or gloves plus gowns reduced the frequency of patient-to-patient transfer of a viral DNA surrogate marker. The use of gloves plus gowns during interactions with the source patient did not reduce transfer in comparison to gloves alone.
New Zealand has a long-running campylobacter infection (campylobacteriosis) epidemic with contaminated fresh chicken meat as the major source. This is both the highest impact zoonosis and the largest food safety problem in the country. Adding to this burden is the recent rapid emergence of antibiotic resistance in these campylobacter infections acquired from locally-produced chicken. Campylobacteriosis rates halved in 2008, as compared with the previous 5 years, following the introduction of regulatory limits on allowable contamination levels in fresh chicken meat, with large health and economic benefits resulting. In the following decade, disease rates do not appear to have declined further. The cumulative impact would equate to an estimated 539 000 cases, 5480 hospitalisations, 284 deaths and economic costs of approximately US$380 million during the last 10 years (2009–2018). Additional regulatory interventions, that build on previously successful regulations in this country, are urgently needed to control the source of this epidemic.
This is an epidemiological study of carbapenem-resistant Enterobacteriaceae (CRE) in Veterans’ Affairs medical centers (VAMCs). In 2017, almost 75% of VAMCs had at least 1 CRE case. We observed substantial geographic variability, with more cases in urban, complex facilities. This supports the benefit of tailoring infection control strategies to facility characteristics.
Conflicts between humans and bears have occurred since prehistory. Through time, the catalogue of human–bear conflicts (HBC) has been changing depending on the values and needs of human societies and their interactions with bears. Even today, conflict situations vary among the eight species of bears and geographically across these species’ ranges. This results in a broad range of interactions between bears and humans that may be considered as conflicts, including: (1) predation of domestic or semiwild animals, including bees, hunting dogs, and pet animals; (2) damage due to foraging on cultivated berries, fruits, agricultural products, and the tree bark in forest plantations; (3) economic loss due to destruction of beehives, fences, silos, houses, and other human property; (4) bear attacks on humans causing mild or fatal trauma; (5) bluff charges and bear intrusions into residential areas; and (6) vehicle collisions with bears and traffic accidents. In this chapter we aim to outline the principal types of HBC and geographical differences in the occurrence of conflicts and the coexistence between people and bears.
Population analyses of functional connectivity have provided a rich understanding of how brain function differs across time, individual, and cognitive task. An important but challenging task in such population analyses is the identification of reliable features that describe the function of the brain, while accounting for individual heterogeneity. Our work is motivated by two particularly important challenges in this area: first, how can one analyze functional connectivity data over populations of individuals, and second, how can one use these analyses to infer group similarities and differences. To address these challenges, we model population connectivity data as a multilayer network and develop the multi-node2vec algorithm, an efficient and scalable embedding method that automatically learns continuous node feature representations from multilayer networks. We use multi-node2vec to analyze resting state fMRI scans over a group of 74 healthy individuals and 60 patients with schizophrenia. We demonstrate how multilayer network embeddings can be used to visualize, cluster, and classify functional regions of the brain for these individuals. We furthermore compare the multilayer network embeddings of the two groups. We identify significant differences between the groups in the default mode network and salience network—findings that are supported by the triple network model theory of cognitive organization. Our findings reveal that multi-node2vec is a powerful and reliable method for analyzing multilayer networks. Data and publicly available code are available at https://github.com/jdwilson4/multi-node2vec.
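The authors’ implementation is available at the repository linked above. As a hedged sketch of the general idea only (not their algorithm or code), one can generate random walks that occasionally hop between layers sharing a node and feed the walks to a skip-gram model:

```python
# Node2vec-style embedding of a toy multilayer network: random walks that
# sometimes switch layers at the current node, then skip-gram via gensim.
# Graph sizes, walk lengths, and dimensions are arbitrary placeholders;
# this is not the multi-node2vec implementation.
import random
import networkx as nx
from gensim.models import Word2Vec

# Two toy layers over the same node set (e.g., two subjects' networks).
layers = [nx.erdos_renyi_graph(50, 0.1, seed=s) for s in (1, 2)]

def walk(layers, start, length=20, switch_prob=0.2):
    """Random walk that occasionally switches layers at the current node."""
    layer = random.randrange(len(layers))
    node, trace = start, [str(start)]
    for _ in range(length):
        if random.random() < switch_prob:
            layer = random.randrange(len(layers))
        nbrs = list(layers[layer].neighbors(node))
        if not nbrs:
            break
        node = random.choice(nbrs)
        trace.append(str(node))
    return trace

walks = [walk(layers, v) for v in layers[0].nodes for _ in range(10)]
model = Word2Vec(walks, vector_size=32, window=5, min_count=0, sg=1)
vector = model.wv['0']                 # 32-dimensional feature for node 0
```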
There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
Simulated patient care interactions.
In randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and fluorescent tracer) while wearing no barriers, gloves, or gloves plus gowns followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002). Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared to the no-barriers group, wearing a cover gown and gloves resulted in reduced contamination of clothing (OR, 0.15; P < .001), but wearing gloves alone did not.
Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
The Cognitive Battery of the National Institutes of Health Toolbox (NIH-TB) is a collection of assessments that have been adapted and normed for administration across the lifespan and is increasingly used in large-scale population-level research. However, despite increasing adoption in longitudinal investigations of neurocognitive development, and growing recommendations that the Toolbox be used in clinical applications, little is known about the long-term temporal stability of the NIH-TB, particularly in youth.
The present study examined the long-term temporal reliability of the NIH-TB in a large cohort of youth (9–15 years old) recruited across two data collection sites. Participants were invited to complete testing annually for 3 years.
Reliability was generally low-to-moderate, with intraclass correlation coefficients ranging between 0.31 and 0.76 for the full sample. There were multiple significant differences between sites, with one site generally exhibiting stronger temporal stability than the other.
Reliability of the NIH-TB Cognitive Battery was lower than expected given early work examining shorter test-retest intervals. Moreover, there were very few instances of tests meeting stability requirements for use in research; none of the tests exhibited adequate reliability for use in clinical applications. Reliability is paramount to establishing the validity of the tool, thus the constructs assessed by the NIH-TB may vary over time in youth. We recommend further refinement of the NIH-TB Cognitive Battery and its norming procedures for children before further adoption as a neuropsychological assessment. We also urge researchers who have already employed the NIH-TB in their studies to interpret their results with caution.
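For reference, the stability metric reported above can be reproduced on simulated data; the following is a hedged sketch of a two-way random-effects, absolute-agreement, single-measurement ICC (often written ICC(2,1)) computed from ANOVA mean squares, with placeholder scores rather than NIH-TB data.

```python
# ICC(2,1): two-way random effects, absolute agreement, single measure,
# computed from ANOVA mean squares on simulated placeholder scores.
import numpy as np

rng = np.random.default_rng(42)
n, k = 200, 3                           # 200 children, 3 annual sessions
ability = rng.normal(50, 10, size=(n, 1))          # stable trait
scores = ability + rng.normal(0, 8, size=(n, k))   # noisy repeat testing

grand = scores.mean()
row_means = scores.mean(axis=1, keepdims=True)     # per-child means
col_means = scores.mean(axis=0, keepdims=True)     # per-session means

msr = k * ((row_means - grand) ** 2).sum() / (n - 1)    # between-subject
msc = n * ((col_means - grand) ** 2).sum() / (k - 1)    # between-session
sse = ((scores - row_means - col_means + grand) ** 2).sum()
mse = sse / ((n - 1) * (k - 1))                         # residual

icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(f"ICC(2,1) = {icc21:.2f}")   # ~0.6 given these variance choices
```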
Background: Antibiotics are the most prescribed medicines worldwide, accounting for 20%–30% of total drug expenditures in most settings. Antimicrobial stewardship activities can provide guidance for the most appropriate antibiotic use. Objective: In an effort to generate baseline data to guide antimicrobial stewardship recommendations, we conducted point-prevalence surveys at 3 hospitals in Kenya. Methods: Sites included referral hospitals located in Nairobi (2,000 beds), Eldoret (900 beds) and Mombasa (700 beds). [Results are presented in this order.] Hospital administrators, heads of infection prevention and control units, and laboratory department heads were interviewed about ongoing antimicrobial stewardship activities, existing infection prevention and control programs, and microbiology diagnostic capacities. Patient-level data were collected by a clinical or medical officer and a pharmacist. A subset of randomly selected, consenting hospital patients was enrolled, and data were abstracted from their medical records, treatment sheets, and nursing notes using a modified WHO point-prevalence survey form. Results: Overall, 1,071 consenting patients were surveyed from the 3 hospitals (n = 579, n = 263, and n = 229, respectively) of whom >60% were aged >18 years and 53% were female. Overall, 489 of 1,071 patients (46%) received ≥1 antibiotic, of whom 254 of 489 (52%) received 1 antibiotic, 201 of 489 (41%) received 2 antibiotics, 31 of 489 (6%) received 3 antibiotics, and 3 of 489 (1%) received 4 antibiotics. Antibiotic use was higher among those aged <5 years: 150 of 244 (62%) compared with older individuals (337 of 822, 41%). Amoxicillin/clavulanate was the most commonly used antibiotic (66 of 387, 17%) at the largest hospital (in Nairobi) whereas ceftriaxone was the most common at the other 2 facilities: 57 of 184 (31%) in Eldoret and 55 of 190 (29%) in Mombasa. Metronidazole was the next most commonly prescribed antibiotic (15%–19%). Meropenem was the only carbapenem reported: 22 of 387 patients (6%) in Nairobi, 2 of 190 patients (1%) in Eldoret, and 8 of 184 patients (4%) in Mombasa. Stop dates or review dates were not indicated for 106 of 390 patients (27%) in Nairobi, 75 of 190 patients (40%) in Eldoret, and 113 of 184 patients (72%) in Mombasa receiving antibiotics. Of 761 antibiotic prescriptions, 45% had at least 1 missed dose. Culture and antibiotic susceptibility tests were limited to 50 of 246 patients (20%) in Nairobi, 17 of 124 patients (14%) in Eldoret, and 23 of 119 patients (19%) in Mombasa who received antibiotics. The largest hospital had an administratively recognized antimicrobial stewardship committee. Conclusions: The prevalence of antibiotic use found by our study was 46%, generally lower than the rates reported in 3 similar studies from other African countries, which ranged from 56% to 65%. However, these survey findings indicate that ample opportunities exist for improving antimicrobial stewardship efforts in Kenya considering the high usage of empiric therapy and low microbiologic diagnostic utilization.
Background: With the emergence of antibiotic resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate, Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition for multidrug-resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v 9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp, and Enterococcus spp represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible. Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp, the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥ 10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis). Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of criteria used. At best, only half of antibiotic treatment met published prescribing criteria. Although insufficient documentation of infection signs, symptoms and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
The hands of healthcare personnel are the most important source for transmission of healthcare-associated pathogens. The role of contaminated fomites such as portable equipment, stethoscopes, and clothing of personnel in pathogen transmission is unclear.
To study routes of transmission of cauliflower mosaic virus DNA markers from 31 source patients and from environmental surfaces in their rooms.
A 3-month observational cohort study.
A Veterans’ Affairs hospital.
After providing care for source patients, healthcare personnel were observed during interactions with subsequent patients. Putative routes of transmission were identified based on recovery of DNA markers from sites of contact with the patient or environment. To assess the plausibility of fomite-mediated transmission, we compared the frequency of transfer of methicillin-resistant Staphylococcus aureus (MRSA) from the skin of 25 colonized patients via gloved hands versus fomites.
Of 145 interactions involving contact with patients and/or the environment, 41 (28.3%) resulted in transfer of 1 or both DNA markers to the patient and/or the environment. The DNA marker applied to patients’ skin and clothing was transferred most frequently by stethoscopes, hands, and portable equipment, whereas the marker applied to environmental surfaces was transferred only by hands and clothing. The percentages of MRSA transfer from the skin of colonized patients via gloved hands, stethoscope diaphragms, and clothing were 52%, 40%, and 48%, respectively.
Fomites such as stethoscopes, clothing, and portable equipment may be underappreciated sources of pathogen transmission. Simple interventions such as decontamination of fomites between patients could reduce the risk for transmission.
This review aimed to identify factors influencing the prescribing of opioids as regular pain-management medication for older people.
Chronic pain occurs in 45%–85% of older people but appears to be under-recognised and under-treated. However, strong opioid prescribing is more prevalent among older people and is increasing at the fastest rate in this age group.
This review included all study types, published 1990–2017, which focused on opioid prescribing for pain management among older adults. Arksey and O’Malley’s framework was used to scope the literature. PubMed, EBSCO Host, the UK Drug Database, and Google Scholar were searched. Data extraction, carried out by two researchers, included factors explaining opioid prescribing patterns and prescribing trends.
A total of 613 papers were identified, and 53 were included in the final review: 35 research papers, 10 opinion pieces, and 8 grey literature sources. Factors associated with prescribing patterns were categorised according to whether they were patient-related, prescriber-driven, or system-driven. Patient factors included age, gender, race, and cognition; prescriber factors included attitudes towards opioids and judgements about ‘normal’ pain; and policy/system factors related to the changing policy landscape over the last three decades, particularly in the USA.
A large number of context-dependent factors appeared to influence opioid prescribing for chronic pain management in older adults, but the findings were inconsistent. There is a gap in the literature relating to the UK healthcare system; the prescriber and the patient perspective; and within the context of multi-morbidity and treatment burden.
Reduction in the use of fluoroquinolone antibiotics has been associated with reductions in Clostridioides difficile infections (CDIs) due to fluoroquinolone-resistant strains.
To determine whether facility-level fluoroquinolone use predicts healthcare facility-associated (HCFA) CDI due to fluoroquinolone-resistant 027 strains.
Using a nationwide cohort of hospitalized patients in the Veterans’ Affairs Healthcare System, we identified hospitals that categorized >80% of CDI cases as positive or negative for the 027 strain for at least one-quarter of fiscal years 2011–2018. Within these facilities, we used visual summaries and multilevel logistic regression models to assess the association between facility-level fluoroquinolone use and rates of HCFA-CDI due to 027 strains, controlling for time and facility complexity level, and adjusting for correlated outcomes within facilities.
Between 2011 and 2018, 55 hospitals met criteria for reporting 027 results, including a total of 5,091 HCFA-CDI cases, with 1,017 infections (20.0%) due to 027 strains. Across these facilities, the use of fluoroquinolones decreased by 52% from 2011 to 2018, with concurrent reductions in the overall HCFA-CDI rate and the proportion of HCFA-CDI cases due to the 027 strain of 13% and 55%, respectively. A multilevel logistic model demonstrated a significant effect of facility-level fluoroquinolone use on the proportion of infections in the facility due to the 027 strain, most noticeably in low-complexity facilities.
Our findings provide support for interventions to reduce use of fluoroquinolones as a control measure for CDI, particularly in settings where fluoroquinolone use is high and fluoroquinolone-resistant strains are common causes of infection.
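As a hedged illustration of the kind of multilevel model described above (simulated placeholder data and variable names, not the study’s dataset or exact specification), a random-intercept logistic model relating facility-level fluoroquinolone use to the odds that a case involves the 027 strain can be sketched with statsmodels:

```python
# Random-intercept (multilevel) logistic regression: odds that an
# HCFA-CDI case is due to the 027 strain as a function of facility
# fluoroquinolone use. All data and names below are simulated placeholders.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(7)
n_fac, per_fac = 55, 40                       # 55 facilities, 40 cases each
fac = np.repeat(np.arange(n_fac), per_fac)
fq_use = rng.normal(0, 1, n_fac)[fac]         # standardized FQ use
fac_eff = rng.normal(0, 0.5, n_fac)[fac]      # facility random intercepts
logit = -1.4 + 0.6 * fq_use + fac_eff
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({'is_027': y, 'fq_use': fq_use, 'facility': fac})
model = BinomialBayesMixedGLM.from_formula(
    'is_027 ~ fq_use',                 # fixed effect of fluoroquinolone use
    {'facility': '0 + C(facility)'},   # random intercept per facility
    df)
result = model.fit_vb()                # variational Bayes estimation
print(result.summary())
```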