Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
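The pre/post-flush comparison above uses the Wilcoxon rank-sum test. A minimal, self-contained sketch of that test (normal approximation, no ties) is shown below; the concentration values are illustrative, not the study's measurements.

```python
from math import sqrt, erf

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test (normal approximation, no ties)."""
    n1, n2 = len(x), len(y)
    pooled = sorted(x + y)
    # 1-based rank of each x value in the pooled sample
    w = sum(pooled.index(v) + 1 for v in x)          # rank sum of sample x
    mean = n1 * (n1 + n2 + 1) / 2                    # expected rank sum under H0
    sd = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return z, p

# simulated particle concentrations (particles/L) at one sampling distance
preflush  = [12.0, 9.5, 11.2, 8.7, 10.1, 9.9, 12.4, 10.8]
postflush = [15.3, 14.1, 16.0, 13.2, 15.8, 14.9, 16.4, 15.1]
z, p = rank_sum_test(preflush, postflush)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The nonparametric rank-sum test is a natural fit here because bioaerosol counts are typically skewed and small-sample.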
This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. Using an intent-to-treat design, results indicate that the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection, for prevention of central-line-associated bloodstream infection (CLABSI).
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate analyses were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo-quiz results were also evaluated.
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the same multivariable analysis, lines with a CLISA score of 2 or 3 were removed 3.19 days faster after the intervention (P < .001). Line dwell time also decreased by 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased by 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
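The interrupted time-series analysis above can be illustrated with a minimal segmented-regression sketch: a level-and-slope-change OLS fit done with plain-Python linear algebra. The monthly percentages below are simulated, not the study's data, and the study's mixed-effects modeling is not reproduced here.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def its_fit(y, t0):
    """OLS fit of y = b0 + b1*t + b2*post + b3*(t - t0)*post (segmented ITS)."""
    rows = []
    for t, _ in enumerate(y):
        post = 1.0 if t >= t0 else 0.0
        rows.append([1.0, float(t), post, (t - t0) * post])
    p = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    return solve(XtX, Xty)

# illustrative monthly % of lines with CLISA score 2-3; intervention at month 8
series = [22.0, 21.5, 22.3, 21.8, 22.1, 21.7, 22.0, 21.9,
          6.0, 5.5, 5.2, 5.0, 4.9, 4.8, 4.7, 4.7]
b0, b1, b2, b3 = its_fit(series, 8)
print(f"pre-trend slope {b1:.3f}; immediate level change {b2:.1f} points")
```

The `b2` coefficient captures the immediate level drop at the intervention, the quantity reported as "a significant immediate decrease" in the results.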
Given the evidence that multiple risk factors, including sleep, inflammation, cardiometabolic dysfunction, and mood disorders, shape cognitive outcomes in aging, multidimensional investigations of their impact on cognition are warranted. We sought to determine the extent to which self-reported sleep disturbances, metabolic syndrome (MetS) factors, cellular inflammation, depressive symptomatology, and diminished physical mobility were associated with cognitive impairment and poorer cognitive performance.
This is a cross-sectional study.
Participants with elevated, well-controlled blood pressure were recruited from the local community for a Tai Chi and healthy-aging intervention study.
One hundred forty-five older adults (72.7 ± 7.9 years old; 66% female), 54 (37%) with evidence of cognitive impairment (CI) based on Montreal Cognitive Assessment (MoCA) score ≤24, underwent medical, psychological, and mood assessments.
CI and cognitive domain performance were assessed using the MoCA. Univariate correlations were computed to determine relationships between risk factors and cognitive outcomes. Bootstrapped logistic regression was used to determine significant predictors of CI risk and linear regression to explore cognitive domains affected by risk factors.
The CI group was slower on the mobility task, met more MetS criteria, and reported poorer sleep than normocognitive individuals (all p < 0.05). Multivariate logistic regression indicated that sleep disturbances, but no other risk factors, predicted increased risk of CI (OR = 2.00, 95% CI: 1.26–4.87, 99% CI: 1.08–7.48). Further examination of MoCA cognitive subdomains revealed that sleep disturbances predicted poorer executive function (β = –0.26, 95% CI: –0.51 to –0.06, 99% CI: –0.61 to –0.02), with lesser effects on visuospatial performance (β = –0.20, 95% CI: –0.35 to –0.02, 99% CI: –0.39 to 0.03) and memory (β = –0.29, 95% CI: –0.66 to –0.01, 99% CI: –0.76 to 0.08).
Our results indicate that the deleterious impact of self-reported sleep disturbances on cognitive performance was prominent over other risk factors and illustrate the importance of clinician evaluation of sleep in patients with or at risk of diminished cognitive performance. Future, longitudinal studies implementing a comprehensive neuropsychological battery and objective sleep measurement are warranted to further explore these associations.
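The bootstrapped estimation used above can be sketched in a hedged, self-contained form. Rather than refitting the study's logistic model, the example below bootstraps a simple 2x2 odds ratio on simulated (sleep-disturbed, cognitively-impaired) pairs; all counts are illustrative.

```python
import random

random.seed(0)  # seeded for reproducibility

# simulated subjects: (disturbed sleep, cognitively impaired)
data = ([(1, 1)] * 30 + [(1, 0)] * 25 +   # disturbed sleep: impaired / not
        [(0, 1)] * 24 + [(0, 0)] * 66)    # normal sleep:    impaired / not

def odds_ratio(rows):
    """Odds ratio from a 2x2 table of exposure vs. outcome."""
    a = b = c = d = 0
    for exposed, impaired in rows:
        if exposed and impaired:
            a += 1
        elif exposed:
            b += 1
        elif impaired:
            c += 1
        else:
            d += 1
    return (a * d) / (b * c)

# percentile bootstrap: resample subjects with replacement, recompute the OR
boots = sorted(odds_ratio([random.choice(data) for _ in data])
               for _ in range(2000))
lo, hi = boots[50], boots[1949]  # approximate 2.5th / 97.5th percentiles
print(f"OR = {odds_ratio(data):.2f}, bootstrap 95% CI ({lo:.2f}, {hi:.2f})")
```

The percentile bootstrap makes no normality assumption about the sampling distribution of the odds ratio, which is why it is attractive for modest samples like the one in this study.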
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as reactive and intense responses to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and that have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this literature by using a genetically informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and the early childhood discriminators of such patterns from the toddler period to adolescence. The sample comprises a cohort of 731 toddlers from diverse families recruited on the basis of socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child-, family-, and community-level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and to use the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent-low versus persistent-high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and, specifically, for prevention efforts addressing early child and family risk.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain postoperatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6 ± 2.9 to 3.8 ± 2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8 ± 2.9 to 3.5 ± 3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly greater proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% of those with mild/no pain worsened to moderate/severe pain (P < 0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
Introduction: Many recent studies have examined the effectiveness of LEAN principles for improving wait times in emergency departments (EDs), but relatively few have applied these concepts to length of stay (LOS) in the ED. This research explores the initial feasibility of applying the LEAN model to length-of-stay metrics in an ED by identifying areas of non-value-added time for patients staying in the ED. Methods: We used a sample of 10,000 ED visits at the Health Science Centre in St. John's over a 1-year period and compared patients' LOS in the ED on four criteria: day of the week, hour of presentation, whether laboratory tests were ordered, and whether diagnostic imaging was ordered. Two sets of analyses were performed. First, a two-sided Wilcoxon rank-sum test was used to evaluate whether ordering either lab tests or diagnostic imaging affected LOS. Second, a generalized linear model (GLM) was fit using 10-fold cross-validation with a LASSO operator to estimate the effect size and significance of each of the four criteria on LOS. Additionally, a post-test analysis of the GLM was performed on a second sample of 10,000 ED visits from the same 1-year period to assess its predictive power and infer the degree to which a patient's LOS is determined by the four criteria. Results: In the Wilcoxon rank-sum test, there was no significant difference in LOS between patients who were ordered diagnostic imaging and those who were not (p = 0.6998), but there was a statistically significant decrease in LOS for patients who were ordered lab tests compared to those who were not (p = 2.696 × 10⁻¹⁰). The GLM yielded two significant takeaways: ordering lab tests reduced LOS (95% CI: 42.953–68.173 min reduction), and arriving at the ED on a Thursday significantly increased LOS (95% CI: 6.846–52.002 min increase).
Conclusion: This preliminary analysis identified several factors that increase patients' LOS in the ED and would be suitable targets for potential LEAN interventions. The longer LOS both for patients who are not ordered lab tests and for those who visit the ED on Thursdays warrants further investigation to identify causal factors. Finally, while this analysis revealed several actionable criteria for improving ED LOS, the relatively low predictive power of the final GLM in the post-test analysis (R² = 0.00363) indicates that additional criteria influencing LOS remain to be explored in future analyses.
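The LASSO component of the GLM above can be sketched with cyclic coordinate descent and soft-thresholding on a tiny simulated, mean-centered design. This is not the study's model; the feature names and LOS values are illustrative, though the sign convention mirrors the reported direction (lab tests associated with shorter LOS, Thursday arrival with longer LOS).

```python
def soft_threshold(rho, lam):
    """Shrink rho toward zero by lam (the core of the LASSO update)."""
    if rho > lam:
        return rho - lam
    if rho < -lam:
        return rho + lam
    return 0.0

def lasso(X, y, lam, sweeps=50):
    """Coordinate-descent LASSO for centered data (no intercept term)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(sweeps):
        for j in range(p):
            # correlation of feature j with the partial residual
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * beta[k]
                                            for k in range(p) if k != j))
                      for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            beta[j] = soft_threshold(rho, lam) / z
    return beta

# columns: labs ordered, Thursday arrival, irrelevant feature (all centered)
X = [[0.5, 0.5, 0.5], [0.5, -0.5, 0.5], [0.5, 0.5, -0.5], [0.5, -0.5, -0.5],
     [-0.5, 0.5, -0.5], [-0.5, -0.5, -0.5], [-0.5, 0.5, 0.5], [-0.5, -0.5, 0.5]]
# centered LOS (minutes): shorter when labs are ordered, longer on Thursdays
y = [-12.5, -37.5, -12.5, -37.5, 37.5, 12.5, 37.5, 12.5]

beta = lasso(X, y, lam=10.0)
print("coefficients (labs, thursday, junk):", beta)
```

The L1 penalty drives the irrelevant feature's coefficient exactly to zero while shrinking the real effects toward zero, which is why the LASSO is useful for selecting which of the four criteria matter.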
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We adapted existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community, and remote/rural emergency physicians and nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists, and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails, and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback, discussion, and an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (≤0.5%: no further testing), moderate (0.6%–5%: further testing required), and high (>5%: computed tomography, magnetic resonance imaging, or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns.
D-dimer can be used to reduce the probability of AAS in the intermediate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment, for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
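The risk-stratified testing logic described above can be expressed as a small decision function. The thresholds follow the abstract; the function itself is a hypothetical illustration, not part of the published algorithm.

```python
def aas_testing_tier(pretest_probability_pct):
    """Map an AAS pretest probability (in %) to a testing recommendation.

    Thresholds follow the abstract's risk levels; this sketch is
    illustrative and not the published clinical algorithm.
    """
    if pretest_probability_pct <= 0.5:
        return "low: no further testing"
    if pretest_probability_pct <= 5.0:
        return "moderate: further testing required (e.g., D-dimer)"
    return "high: CT, MRI, or transesophageal echocardiography"

for prob in (0.3, 2.0, 12.0):
    print(f"{prob}% -> {aas_testing_tier(prob)}")
```

Note that the D-dimer caveat above fits this structure: it applies only inside the moderate (intermediate-risk) branch.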
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.