The direct carbonate procedure for accelerator mass spectrometry radiocarbon (AMS 14C) dating of submilligram samples of biogenic carbonate without graphitization is becoming widely used in a variety of studies. We compare the results of 153 paired direct carbonate and standard graphite 14C determinations on single specimens of an assortment of biogenic carbonates. A reduced major axis regression shows a strong relationship between direct carbonate and graphite percent Modern Carbon (pMC) values (m = 0.996; 95% CI [0.991–1.001]). An analysis of differences and a 95% confidence interval on pMC values reveals that there is no significant difference between direct carbonate and graphite pMC values for 76% of analyzed specimens, although variation in direct carbonate pMC is underestimated. The difference between the two methods is typically within 2 pMC, with 61% of direct carbonate pMC measurements being higher than their paired graphite counterpart. Of the 36 specimens that did yield significant differences, all but three missed the 95% significance threshold by 1.2 pMC or less. These results show that direct carbonate 14C dating of biogenic carbonates is a cost-effective and efficient complement to standard graphite 14C dating.
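A minimal sketch (not the authors' code) of the two comparisons described above: a reduced major axis (RMA) regression of paired pMC values and a paired-difference test with a 95% confidence interval. The arrays are simulated placeholders, not the study data.

```python
import numpy as np
from scipy import stats

def rma_regression(x, y):
    """Reduced major axis fit: slope = sign(r) * sd(y) / sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

# Hypothetical paired measurements on the same specimens (placeholders only)
rng = np.random.default_rng(0)
graphite_pmc = rng.uniform(5, 105, size=153)            # standard graphite targets
carbonate_pmc = graphite_pmc + rng.normal(0.3, 1.0, 153)  # direct carbonate targets

slope, intercept = rma_regression(graphite_pmc, carbonate_pmc)

# Paired differences: is direct carbonate systematically offset from graphite?
diff = carbonate_pmc - graphite_pmc
t_stat, p_value = stats.ttest_rel(carbonate_pmc, graphite_pmc)
ci = stats.t.interval(0.95, len(diff) - 1, loc=diff.mean(), scale=stats.sem(diff))
print(f"RMA slope = {slope:.3f}; mean difference = {diff.mean():.2f} pMC; 95% CI = {ci}")
```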
Identify risk factors that could increase progression to severe disease and mortality in hospitalized SARS-CoV-2 patients in the Southeast US.
Design, Setting, and Participants
Multicenter, retrospective cohort including 502 adults hospitalized with laboratory-confirmed COVID-19 between March 1, 2020 and May 8, 2020 within one of 15 participating hospitals in 5 health systems across 5 states in the Southeast US.
Methods
The study objectives were to identify risk factors that could increase progression to hospital mortality and severe disease (defined as a composite of intensive care unit admission or requirement of mechanical ventilation) in hospitalized SARS-CoV-2 patients in the Southeast US.
Results
A total of 502 patients were included, and the majority (476/502, 95%) had clinically evaluable outcomes. Hospital mortality was 16% (76/476), while 35% (177/502) required ICU admission and 18% (91/502) required mechanical ventilation. In both univariate and adjusted multivariate analyses, hospital mortality was independently associated with age (adjusted odds ratio [aOR] 2.03 for each decade increase; 95% CI, 1.56–2.69), male sex (aOR 2.44; 95% CI, 1.34–4.59), and cardiovascular disease (aOR 2.16; 95% CI, 1.15–4.09). As with mortality, risk of severe disease was independently associated with age (aOR 1.17 for each decade increase; 95% CI, 1.00–1.37), male sex (aOR 2.34; 95% CI, 1.54–3.60), and cardiovascular disease (aOR 1.77; 95% CI, 1.09–2.85).
Conclusions
In an adjusted multivariate analysis, advanced age, male sex, and cardiovascular disease increased risk of severe disease and mortality in patients with COVID-19 in the Southeast US. In-hospital mortality risk doubled with each subsequent decade of life.
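A minimal sketch (assumed, not the study's code) of a multivariable logistic regression producing adjusted odds ratios of the kind reported above; scaling age by 10 makes the coefficient a per-decade odds ratio. The DataFrame and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 476
df = pd.DataFrame({
    "died": rng.integers(0, 2, n),                  # in-hospital mortality (0/1)
    "age_decades": rng.uniform(18, 95, n) / 10.0,   # age scaled so the OR is per decade
    "male": rng.integers(0, 2, n),
    "cvd": rng.integers(0, 2, n),                   # cardiovascular disease (0/1)
})

model = smf.logit("died ~ age_decades + male + cvd", data=df).fit(disp=False)
aor = np.exp(model.params)        # adjusted odds ratios
ci = np.exp(model.conf_int())     # 95% confidence intervals on the OR scale
print(pd.concat([aor.rename("aOR"), ci], axis=1))
```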
Intimate partner violence (IPV) and unhealthy alcohol use are common yet often unaddressed public health problems in low- and middle-income countries. In a randomized trial, we found that the common elements treatment approach (CETA), a multi-problem, flexible, transdiagnostic intervention, was effective in reducing IPV and unhealthy alcohol use among couples in Zambia at a 12-month post-baseline assessment. In this follow-up study, we investigated whether treatment effects were sustained among CETA participants at 24-months post-baseline.
Methods
Participants were heterosexual couples in Zambia in which the woman reported IPV perpetrated by the male partner and in which the male had hazardous alcohol use. Couples were randomized to CETA or treatment as usual plus safety checks. Measures were the Severity of Violence Against Women Scale (SVAWS) and the Alcohol Use Disorders Identification Test (AUDIT). The trial was stopped early upon recommendation by the trial's DSMB due to CETA's effectiveness following the 12-month assessment. Control participants exited the study and were offered CETA. This brief report presents data from an additional follow-up assessment conducted among original CETA participants at a 24-month visit.
Results
There were no meaningful changes in SVAWS or AUDIT scores between 12 and 24 months. The within-group treatment effect from baseline to 24 months was d = 1.37 (p < 0.0001) for the SVAWS and d = 0.85 (p < 0.0001) for the AUDIT.
Conclusions
The lack of change in levels of IPV and unhealthy alcohol use between the 12- and 24-month post-baseline timepoints suggests that treatment gains among participants who received CETA were sustained for at least two years after the intervention began.
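For context, a minimal sketch of one common way to compute a within-group standardized mean difference like those quoted above. The report does not state which standardizer was used, so dividing by the baseline SD is an assumption, and the arrays are placeholders.

```python
import numpy as np

def within_group_d(baseline, followup):
    """Standardized mean change; using the baseline SD as the standardizer is an assumption."""
    return (np.mean(baseline) - np.mean(followup)) / np.std(baseline, ddof=1)

rng = np.random.default_rng(2)
svaws_baseline = rng.normal(60, 20, 120)   # hypothetical SVAWS scores at baseline
svaws_24m = rng.normal(33, 18, 120)        # hypothetical SVAWS scores at 24 months
print(f"d = {within_group_d(svaws_baseline, svaws_24m):.2f}")
```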
To consider the principal effect of an interaction between year (pre- and post-Universal Infant Free School Meals (UIFSM)) and school on pupils’ dietary intakes.
Design:
A repeated cross-sectional survey using dietary data from 2008–2009 (pre-) and 2017–2018 (post-UIFSM).
Setting:
Two primary schools, NE England.
Participants:
Pupils aged 4–7 years (2008–2009 n 121; 2017–2018 n 87).
Results:
At lunchtime, there was a statistically significant decrease in pupils’ non-milk extrinsic sugars intake (%E NMES) from pre- to post-UIFSM (mean change –4·6 %; 95 % CI –6·3, –2·9); this was reflected in total diet (–3·8 %; –5·2, –2·7 %). A year-by-school interaction was found for mean Ca intakes: post-UIFSM, pupils in School 2 had a similar mean intake to pre-UIFSM, whereas intakes in School 1 had increased (difference of differences: –120 mg; 95 % CI –179, –62); this was not reflected in total diet. Post-UIFSM, mean portions of yogurt decreased in School 2 and remained similar in School 1 (–0·25; –0·46, –0·04); the pattern was similar for ‘cake/pudding’ and fruit.
Conclusions:
Within the limitations of the study, these findings highlight both positive outcomes and shortcomings following UIFSM implementation and demonstrate the role of school-level food practices in shaping pupils’ choices. To realise the maximum potential of UIFSM, national levers, such as discussions on updating school food standards, including sugars, could consider removing the daily ‘pudding’ option and advocating ‘fruit only’ options 1 d/week, as some schools do currently. Small school-level changes could maximise positive health impacts by decreasing NMES intake. A more robust evaluation is needed to assess dietary impacts, equitability and wider effects on schools and families.
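The ‘difference of differences’ reported for calcium corresponds to a year-by-school interaction term. A minimal sketch of how such an interaction can be estimated with ordinary least squares is below; the column names and simulated data are hypothetical, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 208
df = pd.DataFrame({
    "calcium_mg": rng.normal(350, 90, n),
    "year": rng.choice(["pre_UIFSM", "post_UIFSM"], n),
    "school": rng.choice(["school1", "school2"], n),
})

# The C(year):C(school) coefficient is the difference-of-differences estimate.
fit = smf.ols("calcium_mg ~ C(year) * C(school)", data=df).fit()
print(fit.summary().tables[1])
```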
Mortality risk is known to be associated with many physiological or biochemical risk factors, and polygenic risk scores (PRSs) may offer an additional or alternative approach to risk stratification. We have compared the predictive value of common biochemical tests, PRSs and information on parental survival in a cohort of twins and their families. Common biochemical test results were available for up to 13,365 apparently healthy men and women, aged 17–93 years (mean 49.0, standard deviation [SD] 13.7) at blood collection. PRSs for longevity were available for 14,169 study participants and reported parental survival for 25,784 participants. A search for information on date and cause of death was conducted through the Australian National Death Index, with median follow-up of 11.3 years. Cox regression was used to evaluate associations with mortality from all causes, cancers, cardiovascular diseases and other causes. Linear relationships with all-cause mortality were strongest for C-reactive protein, gamma-glutamyl transferase, glucose and alkaline phosphatase, with hazard ratios (HRs) of 1.16 (95% CI [1.07, 1.24]), 1.15 (95% CI [1.04, 1.21]), 1.13 (95% CI [1.08, 1.19]) and 1.11 (95% CI [1.05, 1.88]) per SD difference, respectively. Significant nonlinear effects were found for urea, uric acid and butyrylcholinesterase. Lipid risk factors were not statistically significant for mortality in our cohort. Family history and PRS showed weaker but significant associations with survival, with HRs in the range 1.05 to 1.09 per SD difference. In conclusion, biochemical tests currently predict long-term mortality more strongly than genetic scores based on genotyping or on reported parental survival.
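A minimal sketch (assumed, not the study's code) of a Cox regression in which biomarkers are standardized to z-scores so that the hazard ratios are per SD difference, as quoted above. It uses the lifelines package; the data and column names are hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "crp": rng.lognormal(0.5, 0.8, n),       # C-reactive protein (hypothetical)
    "ggt": rng.lognormal(3.0, 0.6, n),       # gamma-glutamyl transferase (hypothetical)
    "follow_up_years": rng.uniform(0.5, 15, n),
    "died": rng.integers(0, 2, n),
})

# Standardize each biomarker so its coefficient is per one SD difference
for col in ["crp", "ggt"]:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```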
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with $\sim$15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination $+41^\circ$ made over a 288-MHz band centred at 887.5 MHz.
We aimed to evaluate the prevalence, clinical determinants, and consequences (falls and hospitalization) of frailty in older adults with mental illness.
Design:
Retrospective clinical cohort study.
Setting:
We collected the data in a specialized psychogeriatric ward, in Boston, USA, between July 2018 and June 2019.
Participants:
Two hundred and forty-four inpatients aged 65 years and over.
Measurements:
Psychiatric diagnosis was based on a multi-professional consensus meeting according to DSM-5 criteria. Frailty was assessed according to two common instruments, that is, the FRAIL questionnaire and the deficit accumulation model (also known as the Frailty Index [FI]). Multiple linear regression analyses were conducted to evaluate the association between frailty and sample demographics (age, female sex, and non-Caucasian ethnicity) and clinical characteristics (dementia, number of clinical diseases, current infection, and number of psychotropic and non-psychotropic medications in use). Multiple regression analyses between the frailty assessments and either falls or the number of hospital admissions in the last 6 and 12 months, respectively, were also conducted and adjusted for covariates.
Results:
Prevalence of frailty was high: 83.6% according to the FI and 55.3% according to the FRAIL questionnaire. Age, the number of clinical (somatic) diseases, and the number of non-psychotropic medications were independently associated with frailty identified by the FRAIL questionnaire. Dementia, current infection, the number of clinical (somatic) diseases, and the number of non-psychotropic medications were independently associated with frailty according to the FI. Falls were significantly associated with both frailty instruments. However, the number of hospital admissions was significantly associated only with the FI.
Conclusion:
Frailty is highly prevalent among geriatric psychiatry inpatients. The FRAIL questionnaire and the FI may capture different dimensions of frailty, with the former probably more closely aligned with the phenotype model and the latter with multimorbidity.
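To illustrate the deficit-accumulation model mentioned above, a minimal sketch (an assumption about the scoring, not the study's instrument) of a Frailty Index computed as the proportion of assessed deficits that are present; the commonly used cut-off of 0.25 and the example items are assumptions.

```python
import numpy as np

def frailty_index(deficits):
    """deficits: 0/1 (or graded 0-1) indicators for each assessed health deficit."""
    return float(np.nanmean(np.asarray(deficits, dtype=float)))

patient_deficits = [1, 0, 1, 1, 0, 0, 1, 0.5, 0, 1]   # hypothetical 10-item example
fi = frailty_index(patient_deficits)
print(f"FI = {fi:.2f}, frail (FI >= 0.25) = {fi >= 0.25}")
```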
A cylindrical and inclined jet in crossflow is studied under two distinct velocity ratios, $r=1$ and $r=2$, using highly resolved large eddy simulations. First, an investigation of turbulent scalar mixing sheds light onto the previously observed but unexplained phenomenon of negative turbulent diffusivity. We identify two distinct types of counter-gradient transport, prevalent in different regions: the first, throughout the windward shear layer, is caused by cross-gradient transport; the second, close to the wall right after injection, is caused by non-local effects. Then, we propose a deep learning approach for modelling the turbulent scalar flux by adapting the tensor basis neural network previously developed to model Reynolds stresses (Ling et al., J. Fluid Mech., vol. 807, 2016a, pp. 155–166). This approach uses a deep neural network with embedded coordinate frame invariance to predict a tensorial turbulent diffusivity that is not explicitly available in the high-fidelity data used for training. After ensuring analytically that the matrix diffusivity leads to a stable solution for the advection–diffusion equation, we apply this approach to the inclined jets in crossflow under study. The results show significant improvement compared to a simple model, particularly where cross-gradient effects play an important role in turbulent mixing. The model proposed herein is not limited to jets in crossflow; it can be used in any turbulent flow where the Reynolds-averaged transport of a scalar is considered.
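A minimal sketch (not the authors' implementation) of a tensor-basis network in the spirit of the approach described above: an MLP maps scalar invariants to basis coefficients, which are contracted with a set of basis tensors to produce a tensorial diffusivity; the turbulent scalar flux is then modelled as minus that tensor acting on the mean scalar gradient. The number of invariants (5), the number of basis tensors (6), the layer sizes, and the omission of the paper's stability constraint are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class TensorBasisDiffusivity(nn.Module):
    def __init__(self, n_invariants=5, n_basis=6, hidden=64):
        super().__init__()
        self.coeff_net = nn.Sequential(
            nn.Linear(n_invariants, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_basis),
        )

    def forward(self, invariants, basis):
        # invariants: (batch, n_invariants); basis: (batch, n_basis, 3, 3)
        coeffs = self.coeff_net(invariants)                         # (batch, n_basis)
        diffusivity = torch.einsum("bn,bnij->bij", coeffs, basis)   # (batch, 3, 3)
        return diffusivity

# Usage sketch with random placeholder inputs: flux_i = -D_ij * dc/dx_j
model = TensorBasisDiffusivity()
invariants = torch.randn(8, 5)
basis = torch.randn(8, 6, 3, 3)
grad_c = torch.randn(8, 3)
D = model(invariants, basis)
flux = -torch.einsum("bij,bj->bi", D, grad_c)
print(D.shape, flux.shape)
```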
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and the shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Results
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These processes remain to be elucidated through more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Presenteeism is an expensive and challenging problem in the healthcare industry. In anticipation of the staffing challenges expected with the COVID-19 pandemic, we examined a decade of payroll data for a healthcare workforce. We aimed to determine the effect of seasonal influenza-like illness (ILI) on absences to support COVID-19 staffing plans.
Design:
Retrospective cohort study.
Setting:
Large academic medical center in the United States.
Participants:
Employees of the academic medical center who were on payroll between the years of 2009 and 2019.
Methods:
Biweekly institutional payroll data were evaluated for unscheduled absences as a marker of acute illness-related work absences. Linear regression models, stratified by payroll status (salaried vs hourly employees), were developed for unscheduled absences as a function of local ILI.
Results:
Both hours worked and unscheduled absences were significantly related to the community prevalence of influenza-like illness in our cohort. These effects were stronger in hourly employees.
Conclusions:
Organizations should target their messaging at encouraging salaried staff to stay home when ill.
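A minimal sketch (assumed, with hypothetical column names and simulated data) of the stratified analysis described above: separate linear models of unscheduled absences as a function of local ILI prevalence for salaried and hourly employees.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 260                                   # roughly ten years of biweekly pay periods
df = pd.DataFrame({
    "ili_prevalence": rng.uniform(0, 8, n),           # local ILI activity (hypothetical units)
    "unscheduled_absence_hours": rng.normal(500, 80, n),
    "payroll_status": rng.choice(["salaried", "hourly"], n),
})

# Fit one model per payroll stratum and report the ILI slope and its p-value
for status, grp in df.groupby("payroll_status"):
    fit = smf.ols("unscheduled_absence_hours ~ ili_prevalence", data=grp).fit()
    print(status, round(fit.params["ili_prevalence"], 2), round(fit.pvalues["ili_prevalence"], 3))
```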
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Critical shortages of personal protective equipment, especially N95 respirators, during the coronavirus disease 2019 (COVID-19) pandemic continue to be a source of concern. Novel methods of N95 filtering face-piece respirator decontamination that can be scaled up for in-hospital use can help address this concern and keep healthcare workers (HCWs) safe.
Methods:
A multidisciplinary pragmatic study was conducted to evaluate the use of an ultrasonic room high-level disinfection system (HLDS) that generates aerosolized peracetic acid (PAA) and hydrogen peroxide for decontamination of large numbers of N95 respirators. A cycle duration that consistently achieved disinfection of N95 respirators (defined as ≥6 log10 reductions in bacteriophage MS2 and Geobacillus stearothermophilus spores inoculated onto respirators) was identified. The treated masks were assessed for changes to their hydrophobicity, material structure, strap elasticity, and filtration efficiency. PAA and hydrogen peroxide off-gassing from treated masks were also assessed.
Results:
The PAA room HLDS was effective for disinfection of bacteriophage MS2 and G. stearothermophilus spores on respirators in a 2,447 cubic-foot (69.6 cubic-meter) room with an aerosol deployment time of 16 minutes and a dwell time of 32 minutes. The total cycle time was 1 hour and 16 minutes. After 5 treatment cycles, no adverse effects were detected on filtration efficiency, structural integrity, or strap elasticity. There was no detectable off-gassing of PAA and hydrogen peroxide from the treated masks at 20 and 60 minutes after the disinfection cycle, respectively.
Conclusion:
The PAA room disinfection system provides a rapidly scalable solution for in-hospital decontamination of large numbers of N95 respirators during the COVID-19 pandemic.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities; that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and periodic point-prevalence surveys reported to public health, (2) cohorting or private rooms with contact precautions for CRE patients, (3) combining hand hygiene adherence monitoring with general infection control education and guidance by project coordinators and public health, and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs in a 13-mile radius from the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE culture reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. Also, 75 Illinois hospitals adopted automated alerts (56 during the intervention period). Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive public health regional CRE intervention was implemented that included enhanced interfacility communication and targeted infection prevention. There was a significant decline in incident CRE clinical cases in Cook County, despite high persistent CRE colonization prevalence in intervention facilities. vSNFs, where understaffing or underresourcing were common and lengths of stay ranged from months to years, had a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
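A minimal sketch (assumed, not the project's code) of the segmented (interrupted time-series) regression described above, with a level-change term at the start of the intervention and a slope-change term afterwards. The monthly case counts below are simulated placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
months = np.arange(60)                      # e.g. 36 baseline + 24 intervention months
intervention_start = 36
df = pd.DataFrame({
    "month": months,
    "post": (months >= intervention_start).astype(int),
    "months_since_intervention": np.clip(months - intervention_start, 0, None),
    "cre_cases": np.round(rng.normal(59, 8, 60) - 15 * (months >= intervention_start)),
})

fit = smf.ols("cre_cases ~ month + post + months_since_intervention", data=df).fit()
print(fit.params)   # 'post' = level change at intervention; 'months_since_intervention' = slope change
```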
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NH and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member quarter), hospital bed days/member-quarter, and expenditures/member quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident. Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
Funding: None
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
Background: When control mechanisms such as water temperature and biocide level are insufficient, Legionella, the causative bacteria of Legionnaires’ disease, can proliferate in water distribution systems in buildings. Guidance and oversight bodies are increasingly prioritizing water safety programs in healthcare facilities to limit Legionella growth. However, ensuring optimal implementation in large buildings is challenging. Much is unknown, and sometimes assumed, about whether building and campus characteristics influence Legionella growth. We used an extensive real-world environmental Legionella data set in the Veterans Health Administration (VHA) healthcare system to examine infrastructure characteristics and Legionella positivity. Methods: VHA medical facilities across the country perform quarterly potable water sampling of healthcare buildings for Legionella detection as part of a comprehensive water safety program. Results are reported to a standardized national database. We did an exploratory univariate analysis of facility-reported Legionella data from routine potable water samples taken in 2015 to 2018, in conjunction with infrastructure characteristics available in a separate national data set. This review examined the following characteristics: building height (number of floors), building age (reported construction year), and campus acreage. Results: The final data set included 201,936 water samples from 819 buildings. Buildings with 1–5 floors (n = 634) had a Legionella positivity rate of 5.3%, 6–10 floors (n = 104) had a rate of 6.4%, 11–15 floors (n = 36) had a rate of 8.1%, and 16–22 floors (n = 9) had a rate of 8.8%. All rates were significantly different from each other except 11–15 floors and 16–22 floors (P < .05, χ2). The oldest buildings (1800s) had significantly lower (P < .05, χ2) Legionella positivity than those built between 1900 and 1939 and between 1940 and 1979, but they were no different than the newest buildings (Fig. 1). In newer buildings (1980–2019), all decades had buildings with Legionella positivity (Fig. 1 inset). Campus acreage varied from ~3 acres to almost 500 acres. Although significant differences were found in Legionella positivity for different campus sizes, there was no clear trend and campus acreage may not be a suitable proxy for the extent or complexity of water systems feeding buildings. Conclusions: The analysis of this large, real-world data set supports an assumption that taller buildings are more likely to be associated with Legionella detection, perhaps a result of more extensive piping. In contrast, the assumption that newer buildings are less associated with Legionella was not fully supported. These results demonstrate the variability in Legionella positivity in buildings, and they also provide evidence that can inform implementation of water safety programs.
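A minimal sketch (assumed) of a chi-square comparison of Legionella positivity across building-height categories like the one described above. The positive/negative sample counts below are hypothetical, since the abstract reports rates rather than raw counts per category.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows = height categories (1-5, 6-10, 11-15, 16-22 floors); columns = [positive, negative]
# Counts are illustrative placeholders only.
table = np.array([
    [5300, 94700],
    [640, 9360],
    [405, 4595],
    [88, 912],
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}, dof = {dof}")
```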
Funding: None
Disclosures: Chetan Jinadatha, principal investigator/co-investigator: Research: NIH/NINR, AHRQ, NSF; principal investigator: Research: Xenex Healthcare Services (funds provided to institution). Inventor: Methods for organizing the disinfection of one or more items contaminated with biological agents. Owner: Department of Veterans Affairs. Licensed to Xenex Disinfection System, San Antonio, TX.