This study examined the long-term effects of a randomized controlled trial of the Family Check-Up (FCU) intervention initiated at age 2 on inhibitory control in middle childhood and adolescent internalizing and externalizing problems. We hypothesized that the FCU would promote higher inhibitory control in middle childhood relative to the control group, which in turn would be associated with lower internalizing and externalizing symptomatology at age 14. Participants were 731 families, with half (n = 367) of the families assigned to the FCU intervention. In intent-to-treat analyses, the FCU intervention was indirectly associated with both lower internalizing and externalizing symptoms at age 14 via its effect on increased inhibitory control in middle childhood (i.e., ages 8.5–10.5). Findings highlight the potential for interventions initiated in toddlerhood to have long-term impacts on self-regulation processes, which can further reduce the risk for behavioral and emotional difficulties in adolescence.
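Indirect effects of this kind are commonly evaluated with a bootstrapped product-of-coefficients approach. The sketch below is a rough, hypothetical illustration in Python with simulated data and invented variable names; the study's actual model was more elaborate (covariates, repeated measures), so this is a sketch of the technique only.

```python
# Hypothetical sketch of a bootstrapped indirect effect (a*b product):
# FCU assignment -> inhibitory control -> age-14 symptoms.
# Simulated stand-ins only; not the authors' data or full path model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 731
treat = rng.integers(0, 2, n)                    # FCU (1) vs. control (0)
inhib = 0.3 * treat + rng.normal(size=n)         # inhibitory control, ages 8.5-10.5
sympt = -0.4 * inhib + rng.normal(size=n)        # symptom score at age 14

def indirect_effect(idx):
    # a path: treatment -> mediator
    a = sm.OLS(inhib[idx], sm.add_constant(treat[idx])).fit().params[1]
    # b path: mediator -> outcome, controlling for treatment
    X = sm.add_constant(np.column_stack([inhib[idx], treat[idx]]))
    b = sm.OLS(sympt[idx], X).fit().params[1]
    return a * b

boot = [indirect_effect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect, 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```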
In a tertiary-care hospital and affiliated long-term care facility, a stewardship intervention focused on patients with Clostridioides difficile infection (CDI) was associated with a significant reduction in unnecessary non-CDI antibiotic therapy. However, there was no significant reduction in total non-CDI therapy or in the frequency of CDI recurrence.
Sink drainage systems are not amenable to standard methods of cleaning and disinfection. Disinfectants applied as a foam might enhance efficacy of drain decontamination due to greater persistence and increased penetration into sites harboring microorganisms.
To examine the efficacy and persistence of foam-based products in reducing sink drain colonization with gram-negative bacilli.
During a 5-month period, different methods for sink drain disinfection in patient rooms were evaluated in a hospital and its affiliated long-term care facility. We compared the efficacy of a single treatment with 4 different foam products in reducing the burden of gram-negative bacilli in the sink drain to a depth of 2.4 cm (1 inch) below the strainer. For the most effective product, we then evaluated foam versus liquid-pouring application and the effect of repeated foam treatments.
A foam product containing 3.13% hydrogen peroxide and 0.05% peracetic acid was significantly more effective than the other 3 foam products. In comparison to pouring the hydrogen peroxide and peracetic acid disinfectant, the foam application resulted in significantly reduced recovery of gram-negative bacilli on days 1, 2, and 3 after treatment with a return to baseline by day 7. With repeated treatments every 3 days, a progressive decrease in the bacterial load recovered from sink drains was achieved.
An easy-to-use foaming application of a hydrogen peroxide- and peracetic acid-based disinfectant suppressed sink-drain colonization for at least 3 days. Intermittent application of the foaming disinfectant could potentially reduce the risk for dissemination of pathogens from sink drains.
The Minnesota Center for Twin and Family Research (MCTFR) comprises multiple longitudinal, community-representative investigations of twin and adoptive families that focus on psychological adjustment, personality, cognitive ability and brain function, with a special emphasis on substance use and related psychopathology. The MCTFR includes the Minnesota Twin Registry (MTR), a cohort of twins who have completed assessments in middle and older adulthood; the Minnesota Twin Family Study (MTFS) of twins assessed from childhood and adolescence into middle adulthood; the Enrichment Study (ES) of twins oversampled for high risk for substance-use disorders assessed from childhood into young adulthood; the Adolescent Brain (AdBrain) study, a neuroimaging study of adolescent twins; and the Siblings Interaction and Behavior Study (SIBS), a study of adoptive and nonadoptive families assessed from adolescence into young adulthood. Here we provide a brief overview of key features of these established studies and describe new MCTFR investigations that follow up and expand upon existing studies or recruit and assess new samples, including the MTR Study of Relationships, Personality, and Health (MTR-RPH); the Colorado-Minnesota (COMN) Marijuana Study; the Adolescent Brain Cognitive Development (ABCD) study; the Colorado Online Twins (CoTwins) study and the Children of Twins (CoT) study.
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
We evaluated CLISA score impact on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate analyses were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated.
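For readers unfamiliar with interrupted time-series analysis, a minimal segmented-regression sketch is shown below; the monthly data, change point, and variable names are illustrative assumptions, not the study's dataset or full mixed-effects specification.

```python
# Minimal segmented-regression ITS sketch (hypothetical monthly data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(36)                        # study month
post = (months >= 10).astype(int)             # intervention starts at month 10
pct = 22 - 0.1 * months - 15 * post + rng.normal(0, 1.5, 36)  # % lines scoring 2-3

df = pd.DataFrame({"pct": pct, "time": months, "post": post,
                   "time_after": np.clip(months - 10, 0, None)})
# 'post' captures the immediate level change; 'time_after' the slope change.
fit = smf.ols("pct ~ time + post + time_after", data=df).fit()
print(fit.summary().tables[1])
```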
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001), and removal of lines with a CLISA score of 2 or 3 was 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
Quaternary processes and environmental changes are often difficult to assess in remote subantarctic islands due to high surface erosion rates and overprinting of sedimentary products in locations that can be a challenge to access. We present a set of high-resolution, multichannel seismic lines and complementary multibeam bathymetry collected off the eastern (leeward) side of the subantarctic Auckland Islands, about 465 km south of New Zealand's South Island. These data constrain the erosive and depositional history of the island group, and they reveal an extensive system of sediment-filled valleys that extend offshore to depths that exceed glacial low-stand sea level. Although shallow, marine, U-shaped valleys and moraines are imaged, the rugged offshore geomorphology of the paleovalley floors and the stratigraphy of infill sediments suggest that the valley floors were shaped by submarine fluvial erosion and subsequently filled by lacustrine, fjord, and fluvial sedimentary processes.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined if time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
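As a hedged illustration of the analytic pipeline described above (PCA-derived patterns, then log-binomial regression for prevalence ratios), the following Python sketch uses simulated intake data and invented variable names; the actual models also adjusted for the covariates listed above.

```python
# Illustrative pipeline: PCA-derived eating patterns, then a log-binomial
# GLM whose exponentiated coefficients are prevalence ratios (PRs).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
intake = rng.gamma(2.0, 1.0, size=(1304, 24))        # energy intake by hour (simulated)
scores = PCA(n_components=3).fit_transform(intake)   # pattern scores per participant

mood = rng.integers(0, 2, 1304)                      # lifetime mood disorder (simulated)
late_third = pd.qcut(scores[:, 2], 3, labels=False)  # thirds of a 'Late'-like score

X = sm.add_constant((late_third == 2).astype(float)) # highest third vs. rest
fit = sm.GLM(mood, X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print("PR, highest third vs. rest:", np.exp(fit.params[1]))
```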
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared with those in the lowest third of the respective pattern at baseline and follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder during the 5-year follow-up [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first-onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 times higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
In a crossover trial, a gown designed to increase skin coverage at the hands and wrists significantly reduced contamination of personnel during personal protective equipment (PPE) removal, and education on donning and doffing technique further reduced contamination. Simple modifications of PPE and education can reduce contamination during PPE removal.
This study investigates suicide risk in late childhood and early adolescence in relation to a family-centered intervention, the Family Check-Up, for problem behavior delivered in early childhood. At age 2, 731 low-income families receiving nutritional services from Women, Infants, and Children programs were randomized to the Family Check-Up intervention or to a control group. Trend-level main effects were observed on endorsement of suicide risk by parents or teachers from ages 7.5 to 14, with higher rates of suicide risk endorsement in youth in the control versus intervention condition. A significant indirect effect of intervention was also observed, with treatment-related improvements in inhibitory control across childhood predicting reductions in suicide-related risk both at age 10.5, assessed via diagnostic interviews with parents and youth, and at age 14, assessed via parent and teacher reports. Results add to the emerging body of work demonstrating long-term reductions in suicide risk related to family-focused preventive interventions, and highlight improvements in youth self-regulatory skills as an important mechanism of such reductions in risk.
Building on prior work using Tom Dishion's Family Check-Up, the current article examined intervention effects on dysregulated irritability in early childhood. Dysregulated irritability, defined as reactive and intense response to frustration and prolonged angry mood, is an ideal marker of neurodevelopmental vulnerability to later psychopathology because it is a transdiagnostic indicator of decrements in self-regulation that are measurable in the first years of life and have lifelong implications for health and disease. This study is perhaps the first randomized trial to examine the direct effects of an evidence- and family-based intervention, the Family Check-Up (FCU), on irritability in early childhood and the effects of reductions in irritability on later risk of child internalizing and externalizing symptomatology. Data from the geographically and sociodemographically diverse multisite Early Steps randomized prevention trial were used. Path modeling revealed intervention effects on irritability at age 4, which predicted lower externalizing and internalizing symptoms at age 10.5. Results indicate that family-based programs initiated in early childhood can reduce early childhood irritability and later risk for psychopathology. This holds promise for earlier identification and prevention approaches that target transdiagnostic pathways. Implications for future basic and prevention research are discussed.
Several research teams have previously traced patterns of emerging conduct problems (CP) from early or middle childhood. The current study expands on this previous literature by using a genetically-informed, experimental, and long-term longitudinal design to examine trajectories of early-emerging conduct problems and early childhood discriminators of such patterns from the toddler period to adolescence. The sample represents a cohort of 731 toddlers and diverse families recruited based on socioeconomic, child, and family risk, varying in urbanicity and assessed on nine occasions between ages 2 and 14. In addition to examining child, family, and community level discriminators of patterns of emerging conduct problems, we were able to account for genetic susceptibility using polygenic scores and the study's experimental design to determine whether random assignment to the Family Check-Up (FCU) discriminated trajectory groups. In addition, in accord with differential susceptibility theory, we tested whether the effects of the FCU were stronger for those children with higher genetic susceptibility. Results augmented previous findings documenting the influence of child (inhibitory control [IC], gender) and family (harsh parenting, parental depression, and educational attainment) risk. In addition, children in the FCU were overrepresented in the persistent low versus persistent high CP group, but such direct effects were qualified by an interaction between the intervention and genetic susceptibility that was consistent with differential susceptibility. Implications are discussed for early identification and specifically, prevention efforts addressing early child and family risk.
People with cerebral palsy (CP) are less physically active than the general population and, consequently, are at increased risk of preventable disease. Evidence indicates that low-moderate doses of physical activity can reduce disease risk and improve fitness and function in people with CP. Para athletes with CP typically engage in ‘performance-focused’ sports training, which is undertaken for the sole purpose of enhancing sports performance. Anecdotally, many Para athletes report that participation in performance-focused sports training confers meaningful clinical benefits which exceed those reported in the literature; however, supporting scientific evidence is lacking. The aim of this paper is to describe the protocol for an 18-month study evaluating the clinical effects of a performance-focused swimming training programme for people with CP who have high support needs.
This study will use a concurrent multiple-baseline, single-case experimental design across three participants with CP who have high support needs. Each participant will complete a five-phase trial comprising: baseline (A1); training phase 1 (B1); maintenance phase 1 (A2); training phase 2 (B2); and maintenance phase 2 (A3). For each participant, measurement of swim velocity, health-related quality of life and gross motor functioning will be carried out a minimum of five times in each of the five phases.
The study described will produce Level II evidence regarding the effects of performance-focused swimming training on clinical outcomes in people with CP who have high support needs. Findings are expected to provide an indication of the potential for sport to augment outcomes in neurological rehabilitation.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain postoperatively in CSM. Methods: This ambispective study located 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly high proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
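A responder-style MCID analysis like the one reported can be computed directly from baseline and follow-up scores. The sketch below uses simulated values matched to the reported means, with the MCID threshold taken from the abstract; it is an illustration, not the study's analysis.

```python
# Simulated responder analysis against the reported VAS-NP MCID of 2.6
# (baseline/12-month distributions mimic the reported means; not CSORN data).
import numpy as np

rng = np.random.default_rng(3)
baseline = rng.normal(5.6, 2.9, 402).clip(0, 10)   # VAS-NP at baseline
month12 = rng.normal(3.8, 2.7, 402).clip(0, 10)    # VAS-NP at 12 months

improvement = baseline - month12
responders = improvement >= 2.6                    # met the VAS-NP MCID
print(f"VAS-NP MCID responders: {responders.mean():.1%}")
```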
Consumption of certain berries appears to slow postprandial glucose absorption, an effect attributable to polyphenols, which may benefit exercise and cognition and reduce appetite and/or oxidative stress. This randomised, crossover, placebo-controlled study determined whether polyphenol-rich fruits added to carbohydrate-based foods produce a dose-dependent moderation of postprandial glycaemic, glucoregulatory hormone, appetite and ex vivo oxidative stress responses. Twenty participants (eighteen males/two females; 24 (sd 5) years; BMI: 27 (sd 3) kg/m2) consumed one of five cereal bars (approximately 88 % carbohydrate) containing no fruit ingredients (reference), freeze-dried black raspberries (10 or 20 % total weight; LOW-Rasp and HIGH-Rasp, respectively) or cranberry extract (0·5 or 1 % total weight; LOW-Cran and HIGH-Cran), on trials separated by ≥5 d. Postprandial peak/nadir from baseline (Δmax) and incremental postprandial AUC over 60 and 180 min for glucose and other biochemistries were measured to examine dose-dependent effects. Glucose AUC0–180 min trended towards being higher (43 %) after HIGH-Rasp v. LOW-Rasp (P=0·06), with no glucose differences between the raspberry and reference bars. Relative to reference, HIGH-Rasp resulted in a 17 % lower Δmax insulin, a 3 % lower C-peptide (AUC0–60 min) and a 3 % lower glucose-dependent insulinotropic polypeptide (AUC0–180 min), all P<0·05. No treatment effects were observed for the cranberry bars regarding glucose and glucoregulatory hormones, nor were there any treatment effects for either berry type regarding ex vivo oxidation, appetite-mediating hormones or appetite. Fortification with freeze-dried black raspberries (approximately 25 g, containing 1·2 g of polyphenols) seems to slightly improve the glucoregulatory hormone and glycaemic responses to a high-carbohydrate food item in young adults but did not affect appetite or oxidative stress responses at the doses or with the methods studied herein.
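Incremental AUC of the kind reported here is conventionally computed with the trapezoidal rule on baseline-subtracted concentrations. The sketch below uses illustrative glucose values and one common convention for handling below-baseline dips, which may differ from the authors' exact method.

```python
# Incremental AUC over 0-180 min by the trapezoidal rule, ignoring
# below-baseline dips (one common convention; the authors' may differ).
import numpy as np

t = np.array([0, 15, 30, 45, 60, 90, 120, 180])               # minutes
glucose = np.array([5.0, 6.8, 7.9, 7.2, 6.5, 5.8, 5.3, 5.0])  # mmol/l, illustrative

delta = np.clip(glucose - glucose[0], 0, None)  # rise above baseline only
iauc = np.trapz(delta, t)                       # np.trapezoid in NumPy >= 2.0
print(f"iAUC(0-180 min) = {iauc:.1f} mmol/l x min")
```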
Introduction: Recently there have been many studies on the effectiveness of implementing LEAN principles to improve wait times in emergency departments (EDs), but relatively few on applying these concepts to length of stay (LOS) in the ED. This research aims to explore the initial feasibility of applying the LEAN model to length-of-stay metrics in an ED by identifying areas of non-value-added time for patients staying in the ED. Methods: In this project we used a sample of 10,000 ED visits at the Health Science Centre in St. John's over a 1-year period and compared patients' LOS in the ED on four criteria: day of the week, hour of presentation, whether laboratory tests were ordered, and whether diagnostic imaging was ordered. Two sets of analyses were then performed. First, a two-sided Wilcoxon rank-sum test was used to evaluate whether ordering either lab tests or diagnostic imaging affected LOS. Second, a generalized linear model (GLM) was created using 10-fold cross-validation with a LASSO operator to analyze the effect size and significance of each of the four criteria on LOS. Additionally, a post-test analysis of the GLM was performed on a second sample of 10,000 ED visits from the same 1-year period to assess its predictive power and infer the degree to which a patient's LOS is determined by the four criteria. Results: For the Wilcoxon rank-sum test, there was no significant difference in LOS for patients who were ordered diagnostic imaging compared to those who were not (p = 0.6998), but there was a statistically significant decrease in LOS for patients who were ordered lab tests compared to those who were not (p = 2.696 × 10⁻¹⁰). When assessing the GLM, there were two significant takeaways: ordering lab tests reduced LOS (95% CI = 42.953–68.173 min reduction), and arriving at the ED on a Thursday increased LOS significantly (95% CI = 6.846–52.002 min increase). Conclusion: This preliminary analysis identified several factors that increased patients' LOS in the ED and that would be suitable targets for potential LEAN interventions. The increase in LOS for patients who are not ordered lab tests and for those who visit the ED on a Thursday warrants further investigation to identify causal factors. Finally, while this analysis revealed several actionable criteria for improving ED LOS, the relatively low predictive power of the final GLM in the post-test analysis (R² = 0.00363) indicates that additional criteria influencing LOS remain to be explored in future analyses.
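The two analyses named in the Methods can be re-created in outline as follows; the sketch uses simulated visits and hypothetical effect sizes rather than the actual ED data, and swaps in scikit-learn's LassoCV as one plausible realization of "10-fold cross-validation with a LASSO operator".

```python
# Simulated re-creation of the two analyses (hypothetical effect sizes).
import numpy as np
from scipy.stats import ranksums          # two-sided Wilcoxon rank-sum test
from sklearn.linear_model import LassoCV  # linear model with 10-fold CV LASSO

rng = np.random.default_rng(4)
n = 10_000
labs = rng.integers(0, 2, n)              # lab tests ordered?
imaging = rng.integers(0, 2, n)           # diagnostic imaging ordered?
thursday = rng.integers(0, 2, n)          # presented on a Thursday?
hour = rng.integers(0, 24, n)             # hour of presentation
los = 240 - 55 * labs + 30 * thursday + rng.normal(0, 90, n)  # LOS, minutes

stat, p = ranksums(los[labs == 1], los[labs == 0])
print(f"rank-sum test, labs vs. no labs: p = {p:.3g}")

X = np.column_stack([labs, imaging, thursday, hour])
model = LassoCV(cv=10).fit(X, los)        # LASSO shrinks weak criteria to zero
print("coefficients (labs, imaging, thursday, hour):", model.coef_)
```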
The prehospital disaster and emergency medical services community stands on the front line in the response to events such as novel influenza, multidrug-resistant tuberculosis, and other high consequence diseases such as Ebola virus disease.
To address provider and community safety, we developed an online educational program utilizing a Multi-Pathogen Approach to infectious disease personal protective equipment (PPE) deployment by prehospital providers. Such vigilance begins with syndromic recognition and quickly extends to operational issues, clinical interventions, and public health integration.
The University of Maryland, Baltimore County (Maryland, USA) Department of Emergency Health Services partnered with the Maryland State Department of Health (USA) to develop an online educational curriculum. The curriculum, developed through an expert panel consensus group that included prehospital providers, is hybrid in design, combining awareness-level training with procedural guidance.
Currently deployed online, this educational content demonstrating the Multi-Pathogen Approach is freely accessible worldwide via YouTube on computers, tablets, and smartphones. The curriculum is also available for continuing medical education to over 50,000 prehospital, hospital, and clinic personnel throughout Maryland and the National Capital Region of the United States. It consists of twelve modules of didactic and live videotaped demonstrations.
The development of the Multi-Pathogen Approach for the deployment of PPE and the use of online education modules has given prehospital providers an easily accessible open-access tool for high consequence disease management. The development of educational efforts such as these can help ensure better patient care and prehospital EMS system readiness.
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee consisting of 21 members from across Canada, including academic, community and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best practice diagnostic algorithm defined risk levels as Low (≤0.5%: no further testing), Moderate (0.6–5%: further testing required) and High (>5%: computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns. D-dimer can be used to reduce the probability of AAS in the intermediate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
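The reported risk tiers map naturally onto a small decision function. The following is a hypothetical encoding for illustration only (the ≤0.5% low-risk cutoff is inferred from the 0.6–5% moderate band); it is not the published algorithm and is no substitute for clinical judgment.

```python
# Hypothetical encoding of the reported AAS risk tiers; illustration only,
# not the published algorithm and not a substitute for clinical judgment.
def aas_next_step(pretest_probability_pct: float) -> str:
    """Map an AAS pretest probability (in percent) to the recommended work-up."""
    if pretest_probability_pct <= 0.5:          # inferred low-risk cutoff
        return "Low risk: no further testing"
    if pretest_probability_pct <= 5.0:
        return "Moderate risk: further testing (e.g., D-dimer) required"
    return "High risk: CT, MRI, or transesophageal echocardiography"

for prob in (0.3, 2.0, 8.0):
    print(f"{prob}% -> {aas_next_step(prob)}")
```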
Geophysical survey and excavations from 2010–2016 at Lawrenz Gun Club (11CS4), a late pre-Columbian village located in the central Illinois River valley in Illinois, identified 10 mounds, a central plaza, and dozens of structures enclosed within a stout 10 hectare bastioned palisade. Nineteen radiocarbon (¹⁴C) measurements were taken from single entities of wood charcoal, short-lived plants, and animal bones. A site chronology has been constructed using a Bayesian approach that considers the stratigraphic contexts and feature formation processes. The village was host to hundreds of years of continuous human activity during the Mississippi Period. Mississippian activity at the site is estimated to have begun in cal AD 990–1165 (95% probability), ended in cal AD 1295–1450 (95% probability), and lasted 150–420 yr (95% probability) in the primary Bayesian model with similar results obtained in two alternative models. The palisade is estimated to have been constructed in cal AD 1150–1230 (95% probability) and was continuously repaired and rebuilt for 15–125 yr (95% probability), probably for 40–85 yr (68% probability). Comparison to other studies demonstrates that the bastioned palisade at Lawrenz was one of the earliest constructed in the midcontinental United States.