This study evaluated ‘Water Schools’, a simple environmental intervention in Lower Austria that provided free refillable water bottles and educational material.
Non-randomised controlled cluster trial with three measurements: at baseline (T0), after the intervention at 9 months (T1) and after 1-year follow-up (T2).
Half-day elementary schools in Lower Austria (Austria).
Third-grade pupils from twenty-two schools in the intervention group (IG) and thirty-two schools in the control group (CG) participated in the study. Data were analysed for 569 to 598 pupils in the IG and for 545 to 613 in the CG, depending on the time of measurement.
The consumption of tap water increased in the IG from baseline to T1 and then decreased again at T2, but this was similar in the CG (no statistically significant difference in the time trend between the IG and CG). Similar results were seen for tap water consumption in the mornings. The proportion of children who only drank tap water on school mornings increased significantly from baseline to T1 in the IG compared to the CG (P = 0·020). No difference in the changes over time occurred between the groups for the proportion of pupils drinking approximately one bottle of tap water during school mornings.
Not only the children in the IG but also those in the CG drank more tap water after 1 school year than at the beginning. The measurement of drinking habits in the CG may have been intervention enough to bring about changes or to initiate projects.
Many of today’s products and services are, as such, legal but still unhealthy. Think of smoking as an example. This paper deals with the question of which mix of legal instruments should ideally be used in the light of such products and services. It distinguishes between products and services that predominantly lead to “harm to oneself” or “harm to others”. In the end, most law and economics arguments point in the direction of ex ante safety regulation. However, when there are concerns about the quality of regulation (e.g., due to lobbying efforts), liability has important added value. In the “harm to others” scenario, safety regulation is even more warranted because the larger number of victims and the increased difficulty of proving the causal link render litigation burdensome. Strict liability has several advantages over negligence and is, therefore, generally preferred. A strong point of negligence, however, is that it enables the court to conduct its own weighing of costs and benefits, which in specific circumstances may be better than that of the regulator or the producer. Furthermore, it enables the judge to consider not only the harm to the user, but also the harm to others in setting the due care standard.
Nosocomial transmission of influenza is a major concern for infection control. We aimed to dissect transmission dynamics of influenza, including asymptomatic transmission events, in acute care.
Prospective surveillance study during 2 influenza seasons.
Volunteer sample of inpatients on medical wards and healthcare workers (HCWs).
Participants provided daily illness diaries and nasal swabs for influenza A and B detection and whole-genome sequencing for phylogenetic analyses. Contacts between study participants were tracked. Secondary influenza attack rates were calculated based on spatial and temporal proximity and phylogenetic evidence for transmission.
In total, 152 HCWs and 542 inpatients were included; 16 HCWs (10.5%) and 19 inpatients (3.5%) tested positive for influenza on 109 study days. Study participants had symptoms of disease on most of the days they tested positive for influenza (83.1% and 91.9% for HCWs and inpatients, respectively). Also, 11 (15.5%) of 71 influenza-positive swabs among HCWs and 3 (7.9%) of 38 influenza-positive swabs among inpatients were collected on days without symptoms; 2 (12.5%) of 16 HCWs and 2 (10.5%) of 19 inpatients remained fully asymptomatic. The secondary attack rate was low: we recorded 1 transmission event over 159 contact days (0.6%) that originated from a symptomatic case. No transmission event occurred in 61 monitored days of contacts with asymptomatic influenza-positive individuals.
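The secondary attack rate arithmetic reported above (1 transmission event over 159 symptomatic contact days, and 0 over 61 asymptomatic contact days) can be reproduced with a short sketch; the function name and structure are illustrative, not taken from the study protocol.

```python
def attack_rate(transmissions: int, contact_days: int) -> float:
    """Secondary attack rate as a percentage of monitored contact days."""
    if contact_days <= 0:
        raise ValueError("contact_days must be positive")
    return 100.0 * transmissions / contact_days

# Symptomatic contacts: 1 event over 159 contact days
print(round(attack_rate(1, 159), 1))  # → 0.6
# Asymptomatic contacts: 0 events over 61 contact days
print(round(attack_rate(0, 61), 1))   # → 0.0
```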
Influenza in acute care is common, and individuals regularly shed influenza virus without harboring symptoms. Nevertheless, both symptomatic and asymptomatic transmission events proved rare. Healthcare-associated influenza prevention strategies based on preseason vaccination and barrier precautions for symptomatic individuals therefore appear to be effective.
To assess influenza symptoms, adherence to mask use recommendations, absenteeism, and presenteeism in acute care healthcare workers (HCWs) during influenza epidemics.
The TransFLUas influenza transmission study in acute healthcare prospectively followed HCWs over 2 consecutive influenza seasons. Symptom diaries capturing respiratory symptoms and adherence to mask use recommendations were completed daily, and study participants provided midturbinate nasal swabs for influenza testing.
In total, 152 HCWs (65.8% nurses and 13.2% physicians) were included: 89.1% of study participants reported at least 1 influenza symptom during their study season and 77.8% suffered from respiratory symptoms. Also, 28.3% of HCWs missed at least 1 working day during the study period: 82.6% of these days were missed because of symptoms of influenza illness. Of all participating HCWs, 67.9% worked with symptoms of influenza infection on 8.8% of study days. On 0.3% of study days, symptomatic HCWs were shedding influenza virus while at work. Among HCWs with respiratory symptoms, 74.1% adhered to the policy to wear a mask at work on 59.1% of days with respiratory symptoms.
Respiratory disease is frequent among HCWs and imposes a significant economic burden on hospitals due to the number of working days lost. Presenteeism with respiratory illness, including influenza, is also frequent and poses a risk for patients and staff.
Background: Antibiotic time outs (ABTOs), formal reassessments of all new antimicrobial regimens by the care team, can optimize antimicrobial regimens, reducing antimicrobial overuse and potentially improving outcomes. Implementation of ABTOs is a substantial challenge. We used quality improvement methods to implement robust, meaningful, team-driven ABTOs in general medicine ward services. Methods: We identified and engaged stakeholders to serve as champions for the quality improvement initiative. On October 1, 2018, 2 internal medicine teaching services (services A and B) began conducting ABTOs on all patients admitted to their services receiving systemic antimicrobials for at least 36 hours. Eligible patients were usually identified by the team pharmacist. ABTOs were completed within 72 hours of antibiotic initiation and were documented in the electronic medical record (EMR) by providers using a template. The process was modified as necessary in response to feedback from frontline clinicians using plan-do-study-act (PDSA) methods. We subsequently spread the project to 2 additional internal medicine services (services C and D); 2 family medicine teams (services E and F); and 1 general pediatric service (service G). The project is ongoing. We collected data for the following metrics: (1) proportion of ABTO-eligible patients with an ABTO; (2) proportion of ABTOs conducted within the recommended time frame; (3) documented plan changes as a result of ABTO (e.g., change IV antibiotics to PO); (4) proportion of documented plan changes actually completed within 24 hours. Results: Within 12 weeks, services A and B were successfully completing time outs in >80% of their patients. This target was consistently reached by services C, D, E, F, and G almost immediately following launch on those services. As of June 29, 2019, >80% of eligible patients across all participating services have had a time out conducted for 16 consecutive weeks.
ABTOs have resulted in a change in management in 35% of cases, including IV-to-PO change in 19% of cases and discontinuation in 5%. Overall, 77% of time outs occurred during the 36–72-hour window. Ultimately, 95% of documented plan changes were completed within 24 hours. Conclusions: ABTOs are effective but implementation is challenging. We achieved high compliance with ABTOs without using electronic reminders. Our results suggest that ABTOs were impactful in the non–critical-care general medicine setting. Next steps include (1) development of EMR-based tools to facilitate identifying eligible patients and ABTO documentation; (2) continued spread through our health care system; and (3) analysis of ABTO impact using ABTO-unexposed patients as a control group.
The majority of running geothermal plants worldwide are located in geological settings with convection- or advection-dominant heat transport. In Germany, as in most regions in Europe, conduction is the dominating heat transport mechanism, with a resulting average geothermal gradient. The geothermal play type concept is a modern methodology to group geothermal resources according to their geological setting and characteristic heat transport mechanisms. In particular, the quantity of heat transport is related to fluid flow in natural or engineered geothermal reservoirs. Hence, the permeability structure is a key element for geothermal play typing. Following the existing geothermal play type catalogue, four major geothermal play types can be identified for Germany: intracratonic basins, foreland basins and basement/crystalline rock provinces as conduction-dominated play types, and extensional terrains as the convection-dominated play type. The installed capacity of geothermal facilities summed to 397.1 MWth by the end of 2018. District heating plants accounted for the largest portion, with about 337.0 MWth. The majority of these installations are located in the play type ‘foreland basin’, namely the Molasse Basin in southern Germany. The stratigraphic unit for geothermal use is the Upper Jurassic, also known as the ‘Malm’ formation, a carbonate reservoir with high variability in porosity and permeability. Recently drilled wells in the southernmost Molasse Basin indicate the Upper Jurassic as a tight, fracture-controlled reservoir, not usable for conventional hydrothermal well doublets. Our new data compilation, including the recently drilled deep geothermal well Geretsried, reveals the relation of porosity and permeability to depth. The results suggest that diagenetic processes control permeability with depth in carbonate rock, diminishing the predictability of reservoir porosity and permeability.
The play type concept helps to delineate these property variations in play type levels because it is based on geological constraints common in exploration geology. Following the general idea of play typing, the results from this play analysis can be transferred to geological analogues, such as carbonate rock play levels at varying depths.
Approximately 100 years ago, Bleuler famously declared that “Sensory response to external stimulus is quite normal” in schizophrenia, followed however by the cryptic statement: “Busch and Kraepelin have found in perception experiment (using the shutter and revolving drum apparatus) that schizophrenics show many more errors and particularly omissions than do the healthy … Using accurate apparatus, we were unable to substantiate these findings”.
To update current estimates of non–device-associated pneumonia (ND pneumonia) rates and their frequency relative to ventilator associated pneumonia (VAP), and identify risk factors for ND pneumonia.
Academic teaching hospital.
All adult hospitalizations between 2013 and 2017 were included. Pneumonia (device associated and non–device associated) were captured through comprehensive, hospital-wide active surveillance using CDC definitions and methodology.
From 2013 to 2017, there were 163,386 hospitalizations (97,485 unique patients) and 771 pneumonia cases (520 ND pneumonia and 191 VAP). The rate of ND pneumonia remained stable, with 4.15 and 4.54 ND pneumonia cases per 10,000 hospitalization days in 2013 and 2017, respectively (P = .65). In 2017, 74% of pneumonia cases were ND pneumonia. Male sex and increasing age were both associated with increased risk of ND pneumonia. Additionally, patients with chronic bronchitis or emphysema (hazard ratio [HR], 2.07; 95% confidence interval [CI], 1.40–3.06), congestive heart failure (HR, 1.48; 95% CI, 1.07–2.05), or paralysis (HR, 1.72; 95% CI, 1.09–2.73) were also at increased risk, as were those who were immunosuppressed (HR, 1.54; 95% CI, 1.18–2.00) or in the ICU (HR, 1.49; 95% CI, 1.06–2.09). We did not detect a change in ND pneumonia risk with use of chlorhexidine mouthwash, total parenteral nutrition, any of the medications of interest, or prior ventilation.
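The rate normalization used here (cases per 10,000 hospitalization days) follows the standard surveillance formula: cases divided by the at-risk denominator, scaled to 10,000. The sketch below illustrates the calculation; the denominator value is illustrative, not a figure from the study.

```python
def rate_per_10000(cases: int, hospitalization_days: int) -> float:
    """Incidence rate per 10,000 hospitalization days (standard surveillance scaling)."""
    if hospitalization_days <= 0:
        raise ValueError("hospitalization_days must be positive")
    return 10000.0 * cases / hospitalization_days

# Illustrative example: 120 cases over a hypothetical 289,000 hospitalization days
print(round(rate_per_10000(120, 289_000), 2))  # → 4.15
```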
The incidence rate of ND pneumonia did not change from 2013 to 2017, and 3 of 4 nosocomial pneumonia cases were non–device associated. Hospital infection prevention programs should consider expanding the scope of surveillance to include non-ventilated patients. Future research should continue to look for modifiable risk factors and should assess potential prevention strategies.
Surgical site infections (SSIs) are common surgical complications that lead to increased costs. Depending on payer type, however, they do not necessarily translate into deficits for every hospital.
We investigated how surgical site infections (SSIs) influence the contribution margin in 2 reimbursement systems based on diagnosis-related groups (DRGs).
This preplanned observational health cost analysis was nested within a Swiss multicenter randomized controlled trial on the timing of preoperative antibiotic prophylaxis in general surgery between February 2013 and August 2015. A simulation of cost and income in the National Health Service (NHS) England reimbursement system was conducted.
Of 5,175 patients initially enrolled, 4,556 had complete cost and income data as well as SSI status available for analysis. SSI occurred in 228 of 4,556 patients (5%). Patients with SSIs were older, more often male, and had higher BMIs, compulsory insurance, longer operations, and more frequent ICU admissions. SSIs led to higher hospital cost and income. The median contribution margin was negative in cases of SSI. In SSI cases, median contribution margin was Swiss francs (CHF) −2,045 (IQR, −12,800 to 4,848) versus CHF 895 (IQR, −2,190 to 4,158) in non-SSI cases. Higher ASA class and private insurance were associated with higher contribution margins in SSI cases, and ICU admission led to greater deficits. Private insurance had a strong increasing effect on contribution margin at the 10th, 50th (median), and 90th percentiles of its distribution, leading to overall positive contribution margins for SSIs in Switzerland. The NHS England simulation with 3,893 patients revealed similar but less pronounced effects of SSI on contribution margin.
Depending on payer type, reimbursement systems with DRGs offer only minor financial incentives to the prevention of SSI.
To update current estimates of non–device-associated urinary tract infection (ND-UTI) rates and their frequency relative to catheter-associated UTIs (CA-UTIs) and to identify risk factors for ND-UTIs.
Academic teaching hospital.
All adult hospitalizations between 2013 and 2017 were included. UTIs (device and non-device associated) were captured through comprehensive, hospital-wide active surveillance using Centers for Disease Control and Prevention case definitions and methodology.
From 2013 to 2017, there were 163,386 hospitalizations (97,485 unique patients) and 1,273 UTIs (715 ND-UTIs and 558 CA-UTIs). The rate of ND-UTIs remained stable, decreasing slightly from 6.14 to 5.57 ND-UTIs per 10,000 hospitalization days during the study period (P = .15). However, the proportion of UTIs that were non–device related increased from 52% to 72% (P < .0001). Female sex (hazard ratio [HR], 1.94; 95% confidence interval [CI], 1.50–2.50) and increasing age were associated with increased ND-UTI risk. Additionally, the following conditions were associated with increased risk: peptic ulcer disease (HR, 2.25; 95% CI, 1.04–4.86), immunosuppression (HR, 1.48; 95% CI, 1.15–1.91), trauma admissions (HR, 1.36; 95% CI, 1.02–1.81), total parenteral nutrition (HR, 1.99; 95% CI, 1.35–2.94) and opioid use (HR, 1.62; 95% CI, 1.10–2.32). Urinary retention (HR, 1.41; 95% CI, 0.96–2.07), suprapubic catheterization (HR, 2.28; 95% CI, 0.88–5.91), and nephrostomy tubes (HR, 2.02; 95% CI, 0.83–4.93) may also increase risk, but estimates were imprecise.
More than 70% of UTIs are now non–device associated. Current targeted surveillance practices should be reconsidered in light of this changing landscape. We identified several modifiable risk factors for ND-UTIs, and future research should explore the impact of prevention strategies that target these factors.
A lasting legacy of the International Polar Year (IPY) 2007–2008 was the promotion of the Permafrost Young Researchers Network (PYRN), initially an IPY outreach and education activity by the International Permafrost Association (IPA). With the momentum of IPY, PYRN developed into a thriving network that still connects young permafrost scientists, engineers, and researchers from other disciplines. This research note summarises (1) PYRN’s development since 2005 and the IPY’s role, (2) the first 2015 PYRN census and survey results, and (3) PYRN’s future plans to improve international and interdisciplinary exchange between young researchers. The review concludes that PYRN is an established network within the polar research community that has continually developed since 2005. PYRN’s successful activities were largely fostered by IPY. With >200 of the 1200 registered members active and engaged, PYRN is capitalising on the availability of social media tools and rising to meet environmental challenges while maintaining its role as a successful network honouring the legacy of IPY.
Zoledronate (Zol) is a bone-preserving, anti-tumoral drug that is widely used for the treatment of many cancers, including spinal bone metastases. The high systemic Zol doses required to elicit an adequate effect in the spine often lead to significant side effects, limiting its prolonged use and effectiveness. Here, we aim to provide an alternative strategy to deliver Zol locally at the tumor site. We show that nanoporous 3D-printed scaffolds can be loaded with Zol and possess the ability to release Zol (10–28%) over a sustained period. Additionally, we demonstrate that Zol-impregnated scaffolds, most notably Gel Lay, impair the proliferation of the prostate cancer cell line LAPC4 and of prostate-derived bone metastasis cells in vitro. 3D-printed nanoporous polymers offer a novel and versatile opportunity for potential local delivery of drugs in future clinical settings. These polymers can decrease systemic exposure and related side effects of Zol while concentrating the drug effect at the tumor site, thereby inhibiting tumor proliferation. Also, these scaffolds could be co-printed or coupled with other materials to produce custom implants that offer better structural support for bone growth at the tumor site following resection.
OBJECTIVES/SPECIFIC AIMS: Triple-negative breast cancer (TNBC) accounts for one-fifth of the breast cancer patient population. The heterogeneous nature of TNBC and lack of options for targeted therapy make its treatment a constant adventure. The deficiency of tumor suppressors p53 and ARF is one of the known genetic signatures enriched in TNBC. Crucial questions remain about how TNBC is regulated by these genetic alterations. METHODS/STUDY POPULATION: In order to address this issue, we established p53/ARF-defective murine embryonic fibroblast and mammary epithelial cell to study the molecular and phenotypic consequences. Moreover, transgenic mice were generated to investigate the effect of p53/ARF deficiency on mammary tumor development in vivo. RESULTS/ANTICIPATED RESULTS: Increased proliferation and transformation capability were observed in p53/ARF-defective cells, and an aggressive form of mammary tumor was also seen in p53−/−ARF−/− mice. Gene expression profiling and knock-down experiments using shRNAs were conducted to identify inflammatory marker ISG15 and RNA-editing enzyme ADAR1 as potential culprits for the elevated oncogenic potential. Interestingly, we found that the overexpression of ISG15 and ADAR1 is also prevalent in human TNBC cell lines. Reducing ADAR1 expression abrogated the oncogenic potential of human TNBC cell lines, while non-TNBC cells are less susceptible. DISCUSSION/SIGNIFICANCE OF IMPACT: These results indicate critical roles played by the tumor suppressors p53 and ARF in the pathogenesis of TNBC, likely through regulating ADAR1-mediated RNA modifications. Further understanding of this pathway promises to shed light on genetics-driven vulnerabilities of TNBC and inform development of more effective therapeutic strategies.
Some centres favour early intervention for ureteral colic while others prefer trial of spontaneous passage, and relative outcomes are poorly described. Calgary and Vancouver have similar populations and physician expertise, but differing approaches to ureteral colic. We studied 60-day hospitalization and intervention rates for patients having a first emergency department (ED) visit for ureteral colic in these diverse systems.
We used administrative data and structured chart review to study all Vancouver and Calgary patients with an index visit for ureteral colic during 2014. Patient demographics, arrival characteristics and triage category were captured from ED information systems, while ED visits and admissions were captured from linked regional hospital databases. Laboratory results were obtained from electronic health records and stone characteristics were abstracted from diagnostic imaging reports. Our primary outcome was hospitalization or urological intervention from 0 to 60 days. Secondary outcomes included ED revisits, readmissions and rescue interventions. Time-to-event analysis was conducted, and Cox proportional hazards modelling was performed to adjust for covariate imbalance.
We studied 3283 patients with CT-defined stones. Patient and stone characteristics were similar for the cities. Hospitalization or intervention occurred in 60.9% of Calgary patients and 31.3% of Vancouver patients (p<0.001). Calgary patients had higher index intervention rates (52.1% v. 7.5%), and experienced more ED revisits and hospital readmissions during follow-up. The data suggest that outcome events were associated with overtreatment of small stones in one city and undertreatment of large stones in the other.
An early interventional approach was associated with higher ED revisit, hospitalization and intervention rates. If these events are markers of patient disability, then a less interventional approach to small stones and earlier definitive management of large stones may reduce system utilization and improve outcomes for patients with acute ureteral colic.
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future use of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed ultraviolet (UV) disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.