Background: There is controversy regarding whether the addition of cover gowns offers a substantial benefit over gloves alone in reducing personnel contamination and preventing pathogen transmission.
Design: Simulated patient care interactions.
Objective: To evaluate the efficacy of different types of barrier precautions and to identify routes of transmission.
Methods: In randomly ordered sequence, 30 personnel each performed 3 standardized examinations of mannequins contaminated with pathogen surrogate markers (cauliflower mosaic virus DNA, bacteriophage MS2, nontoxigenic Clostridioides difficile spores, and fluorescent tracer) while wearing no barriers, gloves, or gloves plus gowns, followed by examination of a noncontaminated mannequin. We compared the frequency and routes of transfer of the surrogate markers to the second mannequin or the environment.
Results: For a composite of all surrogate markers, transfer by hands occurred at significantly lower rates in the gloves-alone group (OR, 0.02; P < .001) and the gloves-plus-gown group (OR, 0.06; P = .002). Transfer by stethoscope diaphragms was common in all groups and was reduced by wiping the stethoscope between simulations (OR, 0.06; P < .001). Compared to the no-barriers group, wearing a cover gown and gloves resulted in reduced contamination of clothing (OR, 0.15; P < .001), but wearing gloves alone did not.
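The odds ratios above summarize how often marker transfer occurred under each barrier condition relative to no barriers. As a rough illustration of that kind of comparison (a hedged sketch, not the study's actual analysis), the code below computes an odds ratio and Fisher's exact P value from a hypothetical 2×2 table of transfer counts.

```python
# Minimal sketch: odds ratio and Fisher's exact test for marker transfer,
# gloves alone vs. no barriers. The counts are hypothetical placeholders,
# not the study's data.
from scipy.stats import fisher_exact

# Rows: barrier condition; columns: [transfer occurred, no transfer].
table = [[2, 88],    # gloves alone (hypothetical)
         [45, 45]]   # no barriers (hypothetical)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.3f}, P = {p_value:.3g}")
```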
Conclusions: Wearing gloves alone or gloves plus gowns reduces hand transfer of pathogens but may not address transfer by devices such as stethoscopes. Cover gowns reduce the risk of contaminating the clothing of personnel.
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (i.e., symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
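To make the idea of applying published prescribing criteria programmatically concrete, the sketch below encodes a simplified paraphrase of the Loeb minimum criteria for non-catheterized residents and tallies how many hypothetical treatment starts they would support. The field names, thresholds, and example records are illustrative assumptions, not the EIP data elements or the criteria's exact published wording.

```python
# Sketch: a simplified, paraphrased rule for the Loeb minimum criteria
# (non-catheterized residents); illustrative only.
from dataclasses import dataclass

@dataclass
class Resident:
    acute_dysuria: bool
    fever: bool                # e.g., >37.9 C or 1.5 C above baseline
    new_urgency: bool
    new_frequency: bool
    suprapubic_pain: bool
    gross_hematuria: bool
    cva_tenderness: bool
    new_incontinence: bool

def loeb_minimum_met(r: Resident) -> bool:
    """Acute dysuria alone, or fever plus at least one localizing sign."""
    localizing = any([r.new_urgency, r.new_frequency, r.suprapubic_pain,
                      r.gross_hematuria, r.cva_tenderness, r.new_incontinence])
    return r.acute_dysuria or (r.fever and localizing)

# Two hypothetical treatment starts: the first is supported, the second is not.
records = [
    Resident(False, True, False, True, False, False, False, False),
    Resident(False, False, False, False, False, False, False, True),
]
supported = sum(loeb_minimum_met(r) for r in records)
print(f"{supported}/{len(records)} starts supported "
      f"({100 * supported / len(records):.0f}%)")
```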
Background: The hands of healthcare personnel are the most important source for transmission of healthcare-associated pathogens. The role of contaminated fomites such as portable equipment, stethoscopes, and clothing of personnel in pathogen transmission is unclear.
Objective: To study routes of transmission of cauliflower mosaic virus DNA markers from 31 source patients and from environmental surfaces in their rooms.
Design: A 3-month observational cohort study.
Setting: A Veterans’ Affairs hospital.
Methods: After providing care for source patients, healthcare personnel were observed during interactions with subsequent patients. Putative routes of transmission were identified based on recovery of DNA markers from sites of contact with the patient or environment. To assess the plausibility of fomite-mediated transmission, we assessed the frequency of transfer of methicillin-resistant Staphylococcus aureus (MRSA) from the skin of 25 colonized patients via gloved hands versus fomites.
Results: Of 145 interactions involving contact with patients and/or the environment, 41 (28.3%) resulted in transfer of 1 or both DNA markers to the patient and/or the environment. The DNA marker applied to patients’ skin and clothing was transferred most frequently by stethoscopes, hands, and portable equipment, whereas the marker applied to environmental surfaces was transferred only by hands and clothing. The percentages of MRSA transfer from the skin of colonized patients via gloved hands, stethoscope diaphragms, and clothing were 52%, 40%, and 48%, respectively.
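Transfer percentages such as those above carry sampling uncertainty that the abstract does not report. As a small illustration, the sketch below attaches an exact (Clopper-Pearson) confidence interval to one such percentage; the count of 13 of 25 patients is back-calculated from the 52% figure purely for illustration.

```python
# Sketch: exact binomial confidence interval for a transfer percentage.
from statsmodels.stats.proportion import proportion_confint

transfers, patients = 13, 25   # illustrative counts (52% via gloved hands)
low, high = proportion_confint(transfers, patients, alpha=0.05, method="beta")
print(f"Transfer: {transfers / patients:.0%} (95% CI {low:.0%} to {high:.0%})")
```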
Conclusions: Fomites such as stethoscopes, clothing, and portable equipment may be underappreciated sources of pathogen transmission. Simple interventions such as decontamination of fomites between patients could reduce the risk for transmission.
This chapter examines the networks and connections within the early modern legal community of Aberdeen, Scotland. It reconstructs a particular master-apprentice network of the early-seventeenth to the mid-eighteenth centuries, showing the importance of this educational mechanism both for entrance into the local legal profession and for establishing professional contacts. This chapter also reconstructs the networks which were focused on two of Aberdeen’s most important courts of the period—the sheriff and commissary courts. It shows the extent to which the men who held offices in these courts were interconnected, both personally and professionally, and reflects on what this discovery reveals about contemporaneous local court practice. Finally, this chapter concludes by reflecting on how men of law may have regarded their own networks, through an examination of their children’s god-parentage records.
Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flow rates did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirement, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
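The group-by-time difference in serum potassium implies a repeated-measures comparison. The sketch below shows one way such a comparison could be run, a linear mixed model with a random intercept per animal; the simulated data frame, column names, and model choice are assumptions for illustration and are not taken from the study's statistical plan.

```python
# Sketch: group x time comparison of serum potassium with a random
# intercept per pig. All values are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for pig in range(12):
    group = "ImpRRT" if pig < 6 else "CRRT"
    for hour in range(5):
        rows.append({"pig": pig, "group": group, "hour": hour,
                     "potassium": 5.5 - 0.2 * hour + rng.normal(0, 0.3)})
df = pd.DataFrame(rows)

model = smf.mixedlm("potassium ~ group * hour", df, groups="pig").fit()
print(model.summary())
```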
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
OBJECTIVES/GOALS: African-Americans have a 3-fold higher risk of end-stage kidney disease (ESKD) compared to Whites due in part to APOL1 risk alleles. Whether resistant hypertension (RH) magnifies the risk of ESKD among African-Americans beyond APOL1 is not known. We examined the interaction between RH and race on ESKD risk and the independent effect of RH beyond APOL1. METHODS/STUDY POPULATION: We designed a retrospective cohort of 240,038 veterans with HTN, enrolled in the Million Veteran Program with an estimated glomerular filtration rate (eGFR) >30 ml/min/1.73 m². The primary exposure was incident RH (time-varying). The primary outcome was incident ESKD during a 13.5-year follow-up (2004-2017). Secondary outcomes were myocardial infarction (MI), stroke, and death. Incident RH was defined as failure to achieve outpatient blood pressure (BP) <140/90 mmHg with 3 antihypertensive drugs, including a thiazide, or use of 4 or more drugs. Poisson models were used to estimate incidence rates and test additive interaction with race and APOL1 genotype. Multivariable Cox models (with Fine-Gray competing-risks models as sensitivity analyses) were used to examine independent effects. RESULTS/ANTICIPATED RESULTS: The cohort comprised 235,046 veterans; median age was 60 years; 21% were African-American and 6% were women, with 23,010 incident RH cases observed over a median follow-up time of 10.2 years [interquartile range, 5.6-12.6]. Patients with RH had higher incidence rates [per 1000 person-years] of ESKD (4.5 vs. 1.3), myocardial infarction (6.5 vs. 3.0), stroke (16.4 vs. 7.6) and death (12.0 vs. 6.9) than non-resistant hypertension (NRH). African-Americans with RH had a 2.6-fold higher risk of ESKD compared to African-Americans with NRH; 3-fold the risk of Whites with RH, and 9.6-fold the risk of Whites with NRH [p-interaction < .001]. Among African-Americans, RH was associated with a 2.2-fold (95% CI, 1.86-2.58) higher risk of incident ESKD in models adjusted for APOL1 genotype, and in the subset of African-Americans with no APOL1 risk alleles, RH was associated with an adjusted 2.75-fold (95% CI: 2.00-3.50) higher risk of incident ESKD. DISCUSSION/SIGNIFICANCE OF IMPACT: RH was independently associated with a higher risk of ESKD and cardiovascular outcomes, especially among African-Americans. This elevated risk is independent of APOL1 genotype. Interventions that achieve BP targets among patients with RH could curtail the incidence of ESKD and cardiovascular outcomes in this high-risk population. CONFLICT OF INTEREST DESCRIPTION: None.
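The incidence rates quoted per 1000 person-years are simply event counts divided by person-time. The sketch below illustrates that calculation and a Poisson regression with a log person-time offset on hypothetical aggregated counts; it shows only the multiplicative form and does not reproduce the study's additive-interaction testing, Cox models, or competing-risks analyses.

```python
# Sketch: incidence rates per 1000 person-years and a Poisson model with a
# person-time offset. The aggregated counts are hypothetical, not MVP data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "rh":           [1, 1, 0, 0],          # resistant hypertension
    "black":        [1, 0, 1, 0],          # African-American
    "eskd_events":  [120, 60, 90, 70],     # hypothetical event counts
    "person_years": [20000, 30000, 50000, 140000],
})
df["rate_per_1000py"] = 1000 * df["eskd_events"] / df["person_years"]
print(df[["rh", "black", "rate_per_1000py"]])

model = smf.glm("eskd_events ~ rh * black", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["person_years"])).fit()
print(np.exp(model.params))   # rate ratios (multiplicative scale)
```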
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences between longissimus thoracis (LT) muscle from conventionally reared 16-month-old bulls and that from 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to expand this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum and slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC), to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were redder and had more intramuscular fat and higher cook loss than those from 16-C. No differences in objective muscle texture or in sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
In recent years, enhanced catabolism of serine, with or without porphyria, has been demonstrated in relation to a specific subtype of psychosis according to ICD-10 criteria, the acute polymorphic psychosis with or without symptoms of schizophrenia. Since sensory perceptual distortions play a key role in the symptomatology, this disorder is referred to as Acute Polymorphic Psychosis plus psychosensory phenomena (APP+). In a retrospective study including a total of 140 chronic psychiatric patients, we investigated the prevalence of Acute Intermittent Porphyria (AIP) and APP+. No subjects with AIP were found. In two patients APP+ could be demonstrated, based on both clinical characteristics and positive biochemical markers, i.e., lowered plasma serine concentration and an increased TSM ratio (100 × taurine (μmol/l) / serine concentration × methionine concentration). In three patients the psychotic disorder was suspected to be present. It is concluded that careful psychiatric diagnosis may reveal specific psychotic disorders with a distinct biological pathogenetic factor, i.e., a disturbed serine metabolism.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was then extended with active drug or blinded placebo for a further 3 months.
At 3 months, mean change from baseline Hamilton Anxiety Rating Scale (HAM-A) for pregabalin high- and low-dose, and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline Clinical Global Impression-Severity (CGI-S) scores ranged from -2.1 to -2.3 and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained for all 3 active drug groups, even when switched to placebo. HAM-A and CGI-S change from baseline scores ranged from -14.9 to -19.0 and -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low- (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
This article describes the development, implementation, and evaluation of a complex methotrexate ethics case used in teaching a Pharmacy Law and Ethics course. Qualitative analysis of student reflective writings provided useful insight into the students’ experience and comfort level with the final ethics case in the course. These data demonstrate a greater student appreciation of different perspectives, the potential for conflict in communicating about such cases, and the importance of patient autonomy. Faculty lessons learned are also described, facilitating adoption of this methotrexate ethics case by other healthcare profession educators.
Vitamin D deficiency has been commonly reported in elite athletes, but the vitamin D status of UK university athletes in different training environments remains unknown. The present study aimed to determine any seasonal changes in vitamin D status among indoor and outdoor athletes, and whether there was any relationship between vitamin D status and indices of physical performance and bone health. A group of forty-seven university athletes (indoor n 22, outdoor n 25) were tested during autumn and spring for serum vitamin D status, bone health and physical performance parameters. Blood samples were analysed for serum 25-hydroxyvitamin D (s-25(OH)D) status. Peak isometric knee extensor torque was assessed using an isokinetic dynamometer, and jump height was assessed using an Optojump. Aerobic capacity was estimated using the Yo-Yo intermittent recovery test. Peripheral quantitative computed tomography scans measured radial bone mineral density. Statistical analyses were performed using appropriate parametric/non-parametric testing depending on the normality of the data. s-25(OH)D fell significantly between autumn (52·8 (sd 22·0) nmol/l) and spring (31·0 (sd 16·5) nmol/l; P < 0·001). In spring, 34 % of participants were considered to be vitamin D deficient (<25 nmol/l) according to the revised 2016 UK guidelines. These data suggest that UK university athletes are at risk of vitamin D deficiency. Thus, further research is warranted to investigate the concomitant effects of low vitamin D status on health and performance outcomes in university athletes residing at northern latitudes.
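The abstract states that parametric or non-parametric tests were chosen according to the normality of the data. The sketch below illustrates that decision for the autumn-to-spring comparison on simulated values (the individual measurements are not available here): a Shapiro-Wilk check on the paired differences selects a paired t test or a Wilcoxon signed-rank test, and spring deficiency is tallied against the 25 nmol/l threshold.

```python
# Sketch: paired seasonal comparison of serum 25(OH)D on simulated data.
import numpy as np
from scipy.stats import shapiro, ttest_rel, wilcoxon

rng = np.random.default_rng(0)
autumn = rng.normal(52.8, 22.0, 47).clip(min=5)   # nmol/l, simulated
spring = rng.normal(31.0, 16.5, 47).clip(min=5)

diff = spring - autumn
test = ttest_rel if shapiro(diff).pvalue > 0.05 else wilcoxon
stat, p = test(autumn, spring)
print(f"{test.__name__}: statistic = {stat:.2f}, P = {p:.3g}")
print(f"Deficient in spring (<25 nmol/l): {np.mean(spring < 25) * 100:.0f}%")
```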
Background: Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
Objective: To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
Methods: In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Setting: Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Results: Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
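The pre- versus post-flush comparison described in the methods used Wilcoxon rank-sum tests. A minimal sketch of that test on made-up particle concentrations for a single size channel is shown below; the numbers are placeholders, not the sampled data.

```python
# Sketch: Wilcoxon rank-sum test on hypothetical pre- and post-flush
# particle concentrations for one size channel.
from scipy.stats import ranksums

preflush  = [0, 0, 1, 0, 2, 0, 1, 0]   # particles per litre, hypothetical
postflush = [1, 3, 2, 0, 4, 2, 3, 1]

stat, p = ranksums(preflush, postflush)
print(f"Wilcoxon rank-sum: statistic = {stat:.2f}, P = {p:.4f}")
```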
Conclusions: Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
Our understanding about the genetic influences on human disease has increased dramatically with the technological developments in genome and DNA analysis and the discovery of the human genome sequence. Whilst much remains unexplained, it is obvious that normal cardiac development is controlled by the genome and there is significant evidence that a proportion of cardiac malformations are caused by genetic factors. This is important for clinicians as an understanding of confirmed genetic factors is essential to estimate recurrence risks of congenital heart disease (CHD) within families and also screen for predicted associated anomalies. An accurate genetic diagnosis can provide important prognostic information for both the initial patient (proband) and other family members, for whom further genetic investigations may be indicated. There is likely to be a continued increase in demand for such investigations as improvement in surgical and medical management allows more individuals with CHD to survive to reproductive age and have families of their own. For some, the recurrence risk for a cardiac malformation may be as high as 50%; the actual figure varies with different genetic diagnoses. Accurate risk stratification is likely to become increasingly important and the rapidly developing technologies to detect genetic variation mean that genome-wide investigation is becoming more widely available in the clinical setting. An aim of this chapter is to introduce clinicians to principles that will help them embrace and understand the results from these investigations and appreciate the implications they have for their patients.
We systematically reviewed implementation research targeting depression interventions in low- and middle-income countries (LMICs) to assess gaps in methodological coverage.
PubMed, CINAHL, PsycINFO, and EMBASE were searched for evaluations of depression interventions in LMICs reporting at least one implementation outcome published through March 2019.
A total of 8714 studies were screened, 759 were assessed for eligibility, and 79 studies met inclusion criteria. Common implementation outcomes reported were acceptability (n = 50; 63.3%), feasibility (n = 28; 35.4%), and fidelity (n = 18; 22.8%). Only four studies (5.1%) reported adoption or penetration, and three (3.8%) reported sustainability. The Sub-Saharan Africa region (n = 29; 36.7%) had the most studies. The majority of studies (n = 59; 74.7%) reported outcomes for a depression intervention implemented in pilot researcher-controlled settings. Studies commonly focused on Hybrid Type-1 effectiveness-implementation designs (n = 53; 67.1%), followed by Hybrid Type-3 (n = 16; 20.3%). Only 21 studies (26.6%) tested an implementation strategy, with the most common being revising professional roles (n = 10; 47.6%). The most common intervention modality was individual psychotherapy (n = 30; 38.0%). Common study designs were mixed methods (n = 27; 34.2%), quasi-experimental uncontrolled pre-post (n = 17; 21.5%), and individual randomized trials (n = 16; 20.3%).
Existing research has focused on early-stage implementation outcomes. Most studies have utilized Hybrid Type-1 designs, with the primary aim to test intervention effectiveness delivered in researcher-controlled settings. Future research should focus on testing and optimizing implementation strategies to promote scale-up of evidence-based depression interventions in routine care. These studies should use high-quality pragmatic designs and focus on later-stage implementation outcomes such as cost, penetration, and sustainability.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
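A minimal sketch of the two-stage adjustment described above, logistic models for any psychotropic use fitted with and without the symptom scores, is given below using statsmodels' formula interface. The synthetic data frame, variable names, and reduced covariate set are assumptions for illustration, not the WIHS variables or the full model specification.

```python
# Sketch: food security and psychotropic use, before and after additionally
# adjusting for depression (CESD) and anxiety (GAD-7) scores. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 905
wihs = pd.DataFrame({
    "psychotropic_use": rng.integers(0, 2, n),
    "food_security": rng.choice(["high", "marginal", "low", "very_low"], n),
    "age": rng.normal(48, 9, n),
    "income_low": rng.integers(0, 2, n),
    "cesd_score": rng.integers(0, 40, n),
    "gad7_score": rng.integers(0, 21, n),
})

base = ("psychotropic_use ~ C(food_security, Treatment('high')) "
        "+ age + income_low")
m1 = smf.logit(base, data=wihs).fit(disp=False)
m2 = smf.logit(base + " + cesd_score + gad7_score", data=wihs).fit(disp=False)
print(np.exp(m1.params).round(2))   # odds ratios before symptom adjustment
print(np.exp(m2.params).round(2))   # odds ratios after CESD/GAD-7 adjustment
```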
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Sink drainage systems are not amenable to standard methods of cleaning and disinfection. Disinfectants applied as a foam might enhance efficacy of drain decontamination due to greater persistence and increased penetration into sites harboring microorganisms.
To examine the efficacy and persistence of foam-based products in reducing sink drain colonization with gram-negative bacilli.
During a 5-month period, different methods for sink drain disinfection in patient rooms were evaluated in a hospital and its affiliated long-term care facility. We compared the efficacy of a single treatment with 4 different foam products in reducing the burden of gram-negative bacilli in the sink drain to a depth of 2.4 cm (1 inch) below the strainer. For the most effective product, we then compared foam versus liquid-pouring application and evaluated the effect of repeated foam treatments.
A foam product containing 3.13% hydrogen peroxide and 0.05% peracetic acid was significantly more effective than the other 3 foam products. In comparison to pouring the hydrogen peroxide and peracetic acid disinfectant, the foam application resulted in significantly reduced recovery of gram-negative bacilli on days 1, 2, and 3 after treatment with a return to baseline by day 7. With repeated treatments every 3 days, a progressive decrease in the bacterial load recovered from sink drains was achieved.
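Reductions in drain burden like those above are often expressed as log10 reductions in recoverable colony-forming units. The short sketch below shows that arithmetic on hypothetical before-and-after counts; the CFU values are illustrative only.

```python
# Sketch: log10 reduction in recoverable gram-negative bacilli after treatment.
import numpy as np

baseline_cfu, posttreatment_cfu = 1.0e6, 2.0e3   # hypothetical counts
log10_reduction = np.log10(baseline_cfu) - np.log10(posttreatment_cfu)
print(f"{log10_reduction:.1f} log10 reduction")
```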
An easy-to-use foaming application of a hydrogen peroxide- and peracetic acid-based disinfectant suppressed sink-drain colonization for at least 3 days. Intermittent application of the foaming disinfectant could potentially reduce the risk for dissemination of pathogens from sink drains.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined if time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
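To make the analytic pipeline concrete, the sketch below derives pattern scores from an hour-by-hour intake matrix with PCA and then fits a log-binomial model, whose exponentiated coefficient is a prevalence ratio. The simulated intake matrix, the tertile handling, and the single-predictor model are simplifications and assumptions, not the cohort's data or its covariate-adjusted specification.

```python
# Sketch: PCA-derived eating-pattern scores and a log-binomial prevalence
# ratio. All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n = 1304
intake = rng.gamma(2.0, 100.0, size=(n, 24))          # kJ per hour of day

scores = PCA(n_components=3).fit_transform(intake)    # pattern scores
late_tertile = pd.qcut(scores[:, 2], 3, labels=[0, 1, 2]).astype(int)
top_third = (late_tertile == 2).astype(int)

mood = rng.integers(0, 2, n)                          # lifetime mood disorder
X = sm.add_constant(top_third)
# A log link on a binomial outcome yields prevalence ratios, not odds ratios.
fit = sm.GLM(mood, X, family=sm.families.Binomial(sm.families.links.Log())).fit()
print(np.exp(fit.params))                             # exp(coef) = PR
```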
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared with those in the lowest third of the respective pattern at both baseline and follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder during the 5-year follow-up [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first-onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a relative risk of 1.07 (95% CI 1.00–1.14) of being in a higher Late pattern score category at follow-up, compared with those without mood disorder.
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
Objective: We aimed to estimate the cost-effectiveness of brief weight-loss counselling by dietitian-trained practice nurses, in a high-income-country case study.
Design: A literature search of the impact of dietary counselling on BMI was performed to source the ‘best’ effect size for use in modelling. This was combined with multiple other input parameters (e.g. epidemiological and cost parameters for obesity-related diseases, likely uptake of counselling) in an established multistate life-table model with fourteen parallel BMI-related disease life tables using a 3 % discount rate.
Setting: New Zealand (NZ).
Participants: We calculated quality-adjusted life-years (QALY) gained and health-system costs over the remainder of the lifespan of the NZ population alive in 2011 (n 4·4 million).
Results: Counselling was estimated to result in an increase of 250 QALY (95 % uncertainty interval −70, 560 QALY) over the population’s lifetime. The incremental cost-effectiveness ratio was 2011 $NZ 138 200 per QALY gained (2018 $US 102 700). Per capita QALY gains were higher for Māori (Indigenous population) than for non-Māori, but were still not cost-effective. If willingness-to-pay was set to the level of gross domestic product per capita per QALY gained (i.e. 2011 $NZ 45 000 or 2018 $US 33 400), the probability that the intervention would be cost-effective was 2 %.
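As a worked illustration of the arithmetic behind these figures, the sketch below shows 3 % discount factors, recovers the reported ICER from a hypothetical incremental cost (chosen so that cost divided by 250 QALYs is about $NZ 138 200), and estimates the probability of cost-effectiveness as the share of simulated draws with positive net monetary benefit at the $NZ 45 000 willingness-to-pay threshold. The cost figure and the uncertainty draws are illustrative assumptions, not outputs of the life-table model.

```python
# Worked sketch of the cost-effectiveness arithmetic; every number not quoted
# in the abstract (the incremental cost, the uncertainty draws) is hypothetical.
import numpy as np

rng = np.random.default_rng(3)

# 3% annual discounting, as used in the multistate life-table model.
years = np.arange(0, 40)
discount_factors = 1 / (1 + 0.03) ** years
print(f"Discount factor at year 10: {discount_factors[10]:.3f}")

# ICER = incremental health-system cost / incremental QALYs gained.
incremental_qaly = 250                     # central estimate reported above
incremental_cost_nzd = 34_550_000          # hypothetical, chosen to match the ICER
icer = incremental_cost_nzd / incremental_qaly
print(f"ICER = NZ${icer:,.0f} per QALY gained")

# Probability of cost-effectiveness at a willingness-to-pay (WTP) threshold:
# the share of uncertainty draws with positive net monetary benefit.
wtp = 45_000                                             # NZ$ per QALY (2011)
qaly_draws = rng.normal(250, 160, 2000)                  # hypothetical draws
cost_draws = rng.normal(incremental_cost_nzd, 8e6, 2000)
nmb = wtp * qaly_draws - cost_draws
print(f"P(cost-effective at NZ${wtp:,}/QALY) = {np.mean(nmb > 0):.0%}")
```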
Conclusions: The study provides modelling-level evidence that brief dietary counselling for weight loss in primary care generates relatively small health gains at the population level and is unlikely to be cost-effective.