To develop, implement, and evaluate the effectiveness of a unique centralized surveillance infection prevention (CSIP) program.
Design:
Observational quality improvement project.
Setting:
An integrated academic healthcare system.
Intervention:
The CSIP program comprises senior infection preventionists who are responsible for healthcare-associated infection (HAI) surveillance and reporting, allowing local infection preventionists (LIPs) to devote a greater portion of their time to non-surveillance patient safety activities. Four CSIP team members accrued HAI responsibilities at 8 facilities.
Methods:
We evaluated the effectiveness of the CSIP program using 4 measures: recovery of LIP time, efficiency of surveillance activities by LIPs and CSIP staff, surveys characterizing LIP perception of their effectiveness in HAI reduction, and nursing leaders’ perception of LIP effectiveness.
Results:
The amount of time spent by LIP teams on HAI surveillance was highly variable, whereas CSIP time commitment and efficiency were steady. Post-CSIP implementation, 76.9% of LIPs agreed that they spend adequate time on inpatient units, compared to 15.4% pre-CSIP; LIPs also reported having more time to allot to non-surveillance activities. Nursing leaders reported greater satisfaction with LIP involvement in HAI reduction practices.
Conclusion:
CSIP programs are a little-reported strategy for easing the burden on LIPs by reallocating HAI surveillance. The analyses presented here will aid health systems in anticipating the benefit of CSIP programs.
Food fortification improves vitamin D intakes but is not yet mandated in many countries. Combining vitamin D with different dietary lipids altered vitamin D absorption in in vitro and postprandial studies. This randomised, placebo-controlled trial examined the effect of the lipid composition of a vitamin D fortified dairy drink on change in 25-hydroxyvitamin D (25(OH)D) concentrations. Sixty-three healthy adults aged 50+ years were randomised to one of the following for 4 weeks: vitamin D fortified olive oil dairy drink, vitamin D fortified coconut oil dairy drink, vitamin D supplement, or placebo control dairy drink. All vitamin D groups received 20 µg of vitamin D3 daily. Serum was collected at baseline and post-intervention to measure 25(OH)D concentrations and biomarkers of metabolic health. Repeated measures general linear model analysis of covariance (RM GLM ANCOVA) compared changes over time. There was a significant time*treatment interaction effect on 25(OH)D concentrations for those classified as vitamin D insufficient (p<0.001) and sufficient at baseline (p=0.004). 25(OH)D concentrations increased significantly for all insufficient participants receiving vitamin D3 in any form. However, for vitamin D sufficient participants at baseline, 25(OH)D concentrations only increased significantly with the coconut oil dairy drink and supplement. There was no effect of vitamin D on biomarkers of metabolic health. Vitamin D fortification of lipid-containing foods may be used in lieu of supplementation when supplement adherence is low or for individuals with dysphagia. These results are important given the recent recommendation to increase vitamin D intakes to 15–20 µg for older adults in Ireland.
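As an illustration of the baseline-adjusted comparison described in the abstract above, the sketch below fits an ANCOVA-style model in Python with post-intervention 25(OH)D as the outcome, treatment group as a factor, and baseline 25(OH)D as the covariate. The file and column names are hypothetical, and this is a simplified stand-in for the repeated-measures GLM ANCOVA the authors report, not their actual analysis.

```python
# Minimal sketch of a baseline-adjusted group comparison (ANCOVA-style),
# assuming a tidy data frame with hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical columns: baseline and post-intervention 25(OH)D (nmol/L)
# and the randomised group (olive_oil, coconut_oil, supplement, placebo).
df = pd.read_csv("vitd_trial.csv")  # hypothetical file

# Post-intervention 25(OH)D modelled with treatment group as a factor
# and baseline 25(OH)D as the covariate.
model = smf.ols("ohd_post ~ C(group) + ohd_baseline", data=df).fit()
print(anova_lm(model, typ=2))  # F-test for the overall group effect
print(model.params)            # adjusted group differences vs the reference group
```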
Innovative shoe insoles, designed to enhance sensory information on the plantar surface of the feet, could help to improve walking in people with multiple sclerosis.
Objective:
To compare the effects of wearing textured versus smooth insoles on measures of gait, foot sensation, and patient-reported outcomes in people with multiple sclerosis.
Methods:
A prospective, randomised controlled trial was conducted with concealed allocation, assessor blinding and intention-to-treat analysis. Thirty ambulant men and women with multiple sclerosis (MS) (Disease Steps rating 1–4) were randomly allocated to wear textured or smooth insoles for 12 weeks. Self-reported insole wear and falls diaries were completed over the intervention period. Laboratory assessments of spatiotemporal gait patterns, foot sensation and proprioception, and patient-reported outcomes, were performed at Weeks 0 (Baseline 1), 4 (Baseline 2) and 16 (Post-Intervention). The primary outcome was the size of the mediolateral base of support (stride/step width) when walking over even and uneven surfaces. Independent t-tests were performed on change from baseline (average of baseline measures) to post-intervention.
Results:
There were no differences in stride width between groups, when walking over the even or uneven surfaces (P ≥ 0.20) at post-intervention. There were no between-group differences for any secondary outcomes including gait (all P values > 0.23), foot sensory function (all P values ≥ 0.08) and patient-reported outcomes (all P values ≥ 0.23).
Conclusions:
In our small trial, prolonged wear of textured insoles did not appear to alter walking or foot sensation in people with MS who have limited foot sensory loss. Further investigation is needed to explore optimal insole design.
Clinical Trial Registration:
Australian and New Zealand Clinical Trials Registry (ACTRN12615000421538).
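To make the primary analysis in the trial above concrete, the following sketch computes each participant's change in step width from the averaged baseline assessments to post-intervention and compares the two insole groups with an independent t-test. The file and column names are hypothetical placeholders, not the trial's actual dataset.

```python
# Sketch of a change-score comparison: change from the averaged baseline
# (Weeks 0 and 4) to post-intervention (Week 16), compared between insole
# groups with an independent t-test. Column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("insole_trial.csv")  # hypothetical file
df["baseline_mean"] = df[["step_width_wk0", "step_width_wk4"]].mean(axis=1)
df["change"] = df["step_width_wk16"] - df["baseline_mean"]

textured = df.loc[df["group"] == "textured", "change"]
smooth = df.loc[df["group"] == "smooth", "change"]
t, p = stats.ttest_ind(textured, smooth)
print(f"t = {t:.2f}, p = {p:.3f}")
```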
Hospitals are increasingly consolidating into health systems. Some systems have appointed healthcare epidemiologists to lead system-level infection prevention programs. Ideal program infrastructure and support resources have not been described. We informally surveyed 7 healthcare epidemiologists with recent experience building and leading system-level infection prevention programs. Key facilitators and barriers for program structure and implementation are described.
To evaluate the effectiveness of ultraviolet-C (UV-C) disinfection as an adjunct to standard chlorine-based disinfectant terminal room cleaning in reducing transmission of hospital-acquired multidrug-resistant organisms (MDROs) from a prior room occupant.
Design:
A retrospective cohort study was conducted to compare rates of MDRO transmission by UV-C status from January 1, 2016, through December 31, 2018.
Setting:
Acute-care, single-patient hospital rooms at 6 hospitals within an academic healthcare system in Pennsylvania.
Methods:
Transmission of hospital-acquired MDRO infection was assessed in patients subsequently assigned to a single-patient room of a source occupant with carriage of 1 or more MDROs on or during admission. Acquisition of 5 pathogens was compared between exposed patients in rooms with standard-of-care chlorine-based disinfectant terminal cleaning with or without adjunct UV-C disinfection. Logistic regression analysis was used to estimate the adjusted risk of pathogen transfer with adjunctive use of UV-C disinfection.
Results:
In total, 33,771 exposed patient admissions were evaluated; the source occupants carried 46,688 unique pathogens. Prior to the 33,771 patient admissions, 5,802 rooms (17.2%) were treated with adjunct UV-C disinfection. After adjustment for covariates, exposed patients in rooms treated with adjunct UV-C were at comparable risk of transfer of any pathogen (odds ratio, 1.06; 95% CI, 0.84–1.32; P = .64).
Conclusion:
Our analysis does not support the use of UV-C in addition to post-discharge cleaning with chlorine-based disinfectant to lower the risk of prior room occupant pathogen transfer.
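The adjusted risk estimate reported above is the kind of quantity a covariate-adjusted logistic regression produces. The sketch below shows one minimal way to obtain an odds ratio and 95% CI for adjunct UV-C exposure; the covariates, file, and column names are illustrative assumptions, not the study's actual adjustment set.

```python
# Illustrative logistic regression: outcome is acquisition of a pathogen
# carried by the prior room occupant, exposure is adjunct UV-C treatment.
# Covariates shown are placeholders for whatever the analysis adjusts for.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("room_exposures.csv")  # hypothetical file
model = smf.logit("acquired ~ uvc_treated + age + icu_stay + los_days",
                  data=df).fit()
or_ci = np.exp(model.conf_int().loc["uvc_treated"])
print(f"OR = {np.exp(model.params['uvc_treated']):.2f}, "
      f"95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f}")
```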
Background: While mechanical thrombectomy (MT) has become broadly used, many nuances around its performance are still contentious. In particular, the optimal sedation strategy for MT is not clear in the literature. Methods: This was a single-center retrospective cohort study of a prospectively collected database. Age, gender, pre-treatment NIH Stroke Scale score (NIHSS), Alberta Stroke Program Early CT Score (ASPECTS), quality of collateralization, whether the patient underwent thrombectomy, tandem carotid occlusion, and thrombolysis in cerebral infarction (TICI) score were recorded in the database. Results: We identified 228 patients who underwent anterior circulation MT; 91 strokes were right-sided and 108 were left-sided. Collaterals were graded as good in 135 (71.4%), moderate in 44 (23.2%), and poor in 10 (5.3%). The average pre-MT ASPECTS was 8.1 (range). We found significant differences in age, baseline NIHSS, collateralization, and TICI revascularization score between all patients, patients with good outcome (mRS 0-2), and patients who died. Multivariate analysis showed significant associations of sidedness, collateralization, TICI score, and hemorrhage with neurological outcome: right-sided stroke, better collaterals, higher TICI score, and absence of hemorrhage were associated with better outcomes. Conclusions: We found outcomes comparable to those reported in the literature with use of general anesthetic, and we identified several factors that influence outcomes.
Background: Visual impairment can impact 70% of individuals who have experienced a stroke. Identification and remediation of visual impairments can improve overall function and perceived quality of life. Our project aimed to improve visual assessment and timely intervention for patients with post-stroke visual impairment (PSVI). Methods: We conducted a quality improvement initiative to create a standardized screening and referral process for patients with PSVI to access an orthoptist. Post-stroke visual impairment was identified using the Visual Screen Assessment (VISA) tool. Patients filled out a VFQ-25 questionnaire before and after orthoptic assessment, and differences between scores were evaluated. Results: Eighteen patients completed the VFQ-25 both before and after orthoptic assessment. Of the vision-related constructs, there were significant improvements in reported outcomes for general vision (M=56.9, SD=30.7; M=48.6, SD=16.0), p=0.002, peripheral vision (M=88.3, SD=16; M=75, SD=23.1), p=0.027, ocular pain (M=97.2, SD=6.9; M=87.5, SD=21.4), p=0.022, near activities (M=82.4, SD=24.1; M=67.8, SD=25.6), p<0.001, social functioning (M=90.2, SD=19; M=78.5, SD=29.3), p=0.019, mental health (M=84.0, SD=25.9; M=70.5, SD=31.2), p=0.017, and role difficulties (M=84.7, SD=26.3; M=67.4, SD=37.9), p=0.005. Conclusions: Orthoptic assessment for those with PSVI significantly improved perceived quality of life across numerous vision-related constructs, suggesting it is a valuable part of a patient’s post-stroke recovery.
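The pre/post VFQ-25 comparison described above can be illustrated with a paired test per construct, as in the minimal sketch below. The file and column names, and the choice of a paired t-test, are assumptions for illustration; the abstract does not specify the exact test used.

```python
# Illustrative paired comparison of VFQ-25 construct scores before and after
# orthoptic assessment. One row per patient; column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("vfq25_scores.csv")  # hypothetical file
for construct in ["general_vision", "peripheral_vision", "near_activities"]:
    pre = df[f"{construct}_pre"]
    post = df[f"{construct}_post"]
    t, p = stats.ttest_rel(post, pre)
    print(f"{construct}: mean change = {(post - pre).mean():.1f}, p = {p:.3f}")
```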
Background: Visual impairment exists for an estimated 70% of individuals who have experienced a stroke. Identification and remediation of visual impairments can improve overall function and perceived quality of life. Our project aims to improve visual assessment and timely intervention for patients with post-stroke visual impairment (PSVI). Methods: We conducted a quality improvement initiative to create a standardized screening and referral process for patients with PSVI to access an orthoptist. Post-stroke visual impairment was assessed with the Visual Screen Assessment (VISA) tool, administered by an occupational therapist. Patients filled out a VFQ-25 questionnaire before and after orthoptic assessment and intervention. The VFQ-25 is a validated post-stroke survey assessing a patient’s perceived quality of life. Differences between pre- and post-orthoptic assessment scores will be evaluated. Results: Data collection is currently ongoing. The benefits of a standardized screen for PSVI, standardized referral to an orthoptist, and experience with orthoptic assessment will be determined. Lessons learned will also inform how we can expand the program to benefit a wider demographic of patients. Conclusions: The data gathered and the subsequent analysis will be instrumental in guiding ongoing improvement initiatives for patients with PSVI.
Large truncated spherical near-field systems with conductive or absorbing floors are typically used to measure the performance of vehicle-installed antennas. The main advantage of conductive-floor systems is the ease of accommodating the vehicle under test, but their performance is affected by interaction with the reflecting ground floor. Absorber-based systems emulating free-space conditions minimize the effect of interaction with the floor, but they generally require longer setup times, especially at lower frequencies (70–400 MHz), where bulky absorbers are typically used to improve reflectivity levels. Using scaled measurements of a vehicle model, the performance of these two typical implementations is analyzed in the 84–1500 MHz range and compared to free-space measurements. Absorbers with different dimensions and reflectivity have been installed in the scaled measurement setup, and the measured data have been investigated with appropriate post-processing to verify the applicability to realistic systems. Figures of merit of interest for automotive applications, such as gain and partial radiated powers, have been compared to free space to evaluate the impact of the different scenarios.
High levels of early emotionality (of either negative or positive valence) are hypothesized to be important precursors to early psychopathology, with attention-deficit/hyperactivity disorder (ADHD) a prime early target. The positive and negative affect domains are prime examples of Research Domain Criteria (RDoC) concepts that may enrich a multilevel mechanistic map of psychopathology risk. Utilizing both variable-centered and person-centered approaches, the current study examined whether levels and trajectories of infant negative and positive emotionality, considered either in isolation or together, predicted children's ADHD symptoms at 4 to 8 years of age. In variable-centered analyses, higher levels of infant negative affect (at as early as 3 months of age) were associated with childhood ADHD symptoms. Findings for positive affect did not reach the statistical threshold. Results from person-centered trajectory analyses suggest that additional information is gained by simultaneously considering the trajectories of positive and negative emotionality. Specifically, only when infants exhibited moderate, stable, or low levels of positive affect did negative affect and its trajectory relate to child ADHD symptoms. These findings add to a growing literature suggesting that infant negative emotionality is a promising early-life marker of future ADHD risk and secondarily suggest that moderation by positive affectivity warrants more consideration.
The primary aim of this study was to assess the epidemiology of carbapenem-resistant Acinetobacter baumannii (CRAB) for 9 months following a regional outbreak with this organism. We also aimed to determine the differential positivity rate from different body sites and characterize the longitudinal changes of surveillance test results among CRAB patients.
Design:
Observational study.
Setting:
A 607-bed tertiary-care teaching hospital in Milwaukee, Wisconsin.
Patients:
Any patient admitted from postacute care facilities and any patient housed in the same inpatient unit as a positive CRAB patient.
Methods:
Participants underwent CRAB surveillance cultures from tracheostomy secretions, skin, and stool from December 5, 2018, to September 6, 2019. Cultures were performed using a validated, qualitative culture method, and final bacterial identification was performed using mass spectrometry.
Results:
In total, 682 patients were tested for CRAB, of whom 16 (2.3%) were positive. Of the 16 CRAB-positive patients, 14 (87.5%) were residents from postacute care facilities and 11 (68.8%) were African American. Among positive patients, the positivity rates by body site were 38% (6 of 16) for tracheal aspirations, 56% (9 of 16) for skin, and 82% (13 of 16) for stool.
Conclusions:
Residents from postacute care facilities were more frequently colonized by CRAB than patients admitted from home. Stool had the highest yield for identification of CRAB.
To describe interfacility transfer communication (IFTC) methods for notification of multidrug-resistant organism (MDRO) status in a diverse sample of acute-care hospitals.
Design:
Cross-sectional survey.
Participants:
Hospitals within the Society for Healthcare Epidemiology of America (SHEA) Research Network (SRN).
Methods:
SRN members completed an electronic survey on protocols and methods for IFTC. We assessed differences in IFTC frequency, barriers, and perceived benefit by presence of an IFTC protocol.
Results:
Among 136 hospital representatives who were sent the survey, 54 (40%) responded, of whom 72% reported having an IFTC protocol in place. The presence of a protocol did not differ significantly by hospital size, academic affiliation, or international status. Of those with IFTC protocols, 44% reported consistent notification of MDRO status (>75% of the time) to receiving facilities, as opposed to 13% from those with no IFTC protocol (P = .04). Respondents from hospitals with IFTC protocols reported significantly fewer barriers to communication compared to those without (2.8 vs 4.3; P = .03). Overall, however, most respondents (56%) reported a lack of standardization in communication. Presence of an IFTC protocol did not affect whether respondents perceived IFTC protocols as having a significant impact on infection prevention or antimicrobial stewardship.
Conclusions:
Most respondents reported having an IFTC protocol, which was associated with reduced communication barriers at transfer. Standardization of protocols and clarity about expectations for sending and receipt of information related to MDRO status may facilitate IFTC and promote appropriate and timely infection prevention practices.
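For readers who want to see the shape of the comparisons above, the sketch below contrasts consistent-notification rates by protocol status with Fisher's exact test and mean reported barriers with an independent t-test. The file, columns, and choice of tests are illustrative assumptions rather than the survey's documented analysis plan.

```python
# Illustrative survey comparisons by IFTC protocol status.
# Assumes hypothetical boolean columns 'has_protocol' and
# 'consistent_notification', and a numeric 'n_barriers' count.
import pandas as pd
from scipy import stats

df = pd.read_csv("iftc_survey.csv")  # hypothetical file

# 2x2 table of protocol status vs consistent notification (>75% of the time).
table = pd.crosstab(df["has_protocol"], df["consistent_notification"])
odds, p_fisher = stats.fisher_exact(table)
print(f"Fisher's exact p = {p_fisher:.3f}")

# Mean number of reported barriers, with vs without a protocol.
with_protocol = df.loc[df["has_protocol"], "n_barriers"]
without_protocol = df.loc[~df["has_protocol"], "n_barriers"]
t, p_t = stats.ttest_ind(with_protocol, without_protocol)
print(f"Barriers: {with_protocol.mean():.1f} vs {without_protocol.mean():.1f}, p = {p_t:.3f}")
```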
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Design:
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
Setting:
Melbourne, Australia.
Participants:
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
Results:
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. Nutrients were not associated with risk for women. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
Conclusions:
Diet is a possible risk factor for GERD, but food considered as triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
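The per-increment odds ratios reported above (e.g., OR per 5 g/d of total fat, per 30 g/d of carbohydrate) correspond to rescaling exposures before fitting sex-specific logistic regressions, as in the hedged sketch below. The covariates, file, and column names are placeholders, not the cohort's actual model specification.

```python
# Illustrative sex-specific logistic regressions with nutrient intakes rescaled
# so that coefficients are per 5 g/d (fat) or per 30 g/d (carbohydrate).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort_diet.csv")  # hypothetical file
df["fat_per5"] = df["total_fat_g"] / 5
df["carb_per30"] = df["total_carb_g"] / 30

for sex, sub in df.groupby("sex"):
    model = smf.logit("gerd ~ fat_per5 + carb_per30 + age + energy_kj",
                      data=sub).fit(disp=False)
    # Exponentiated coefficients are ORs per 5 g/d fat and per 30 g/d carbohydrate.
    print(sex, np.exp(model.params[["fat_per5", "carb_per30"]]).round(2))
```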
Clarifying the relationship between depression symptoms and cardiometabolic and related health conditions could identify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
Methods
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Results
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
Conclusions
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
To conduct a pilot study implementing combined genomic and epidemiologic surveillance for hospital-acquired multidrug-resistant organisms (MDROs) to predict transmission between patients and to estimate the local burden of MDRO transmission.
Design:
Pilot prospective multicenter surveillance study.
Setting:
The study was conducted in 8 university hospitals (2,800 beds total) in Melbourne, Australia (population 4.8 million), including 4 acute-care, 1 specialist cancer care, and 3 subacute-care hospitals.
Methods:
All clinical and screening isolates from hospital inpatients (April 24 to June 18, 2017) were collected for 6 MDROs: vanA VRE, MRSA, ESBL Escherichia coli (ESBL-Ec) and Klebsiella pneumoniae (ESBL-Kp), and carbapenem-resistant Pseudomonas aeruginosa (CRPa) and Acinetobacter baumannii (CRAb). Isolates were analyzed and reported as routine by hospital laboratories, underwent whole-genome sequencing at the central laboratory, and were analyzed using open-source bioinformatic tools. MDRO burden and transmission were assessed using combined genomic and epidemiologic data.
Results:
In total, 408 isolates were collected from 358 patients; 47.5% were screening isolates. ESBL-Ec was the most common organism (52.5%), followed by MRSA (21.6%), vanA VRE (15.7%), and ESBL-Kp (7.6%). Most MDROs (88.3%) were isolated from patients with recent healthcare exposure.
Combining genomics and epidemiology identified that at least 27.1% of MDROs were likely acquired in a hospital; most of these transmission events would not have been detected without genomics. The highest proportion of transmission occurred with vanA VRE (88.4% of patients).
Conclusions:
Genomic and epidemiologic data from multiple institutions can feasibly be combined prospectively, providing substantial insights into the burden and distribution of MDROs, including in-hospital transmission. This analysis enables infection control teams to target interventions more effectively.
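As a rough illustration of how genomic and epidemiologic data can be combined to flag putative in-hospital transmission, the sketch below pairs isolates of the same species that lie within a small SNP distance and whose patients had overlapping stays at the same hospital. The SNP threshold, file layout, and overlap rule are illustrative assumptions, not the study's published pipeline.

```python
# Illustrative sketch: flag isolate pairs as putative in-hospital transmission
# when they are the same species, closely related by SNP distance, and the
# patients' admissions overlapped at the same hospital. File names, columns,
# and the SNP cutoff are assumptions for illustration only.
import pandas as pd

isolates = pd.read_csv("isolates.csv",
                       parse_dates=["admit", "discharge"])  # hypothetical metadata
snp = pd.read_csv("snp_distances.csv")  # hypothetical: isolate_a, isolate_b, snp_dist

SNP_THRESHOLD = 15  # illustrative cutoff for "closely related"

meta = isolates.set_index("isolate_id")
putative = []
for _, row in snp.iterrows():
    a, b = meta.loc[row["isolate_a"]], meta.loc[row["isolate_b"]]
    same_species = a["species"] == b["species"]
    same_hospital = a["hospital"] == b["hospital"]
    overlap = (a["admit"] <= b["discharge"]) and (b["admit"] <= a["discharge"])
    if same_species and same_hospital and overlap and row["snp_dist"] <= SNP_THRESHOLD:
        putative.append((row["isolate_a"], row["isolate_b"]))

print(f"{len(putative)} putative transmission pairs")
```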
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.