Dialysis patients may not have access to conventional renal replacement therapy (RRT) following disasters. We hypothesized that improvised renal replacement therapy (ImpRRT) would be comparable to continuous renal replacement therapy (CRRT) in a porcine acute kidney injury model.
Following bilateral nephrectomies and 2 hours of caudal aortic occlusion, 12 pigs were randomized to 4 hours of ImpRRT or CRRT. In the ImpRRT group, blood was circulated through a dialysis filter using a rapid infuser to collect the ultrafiltrate. Improvised replacement fluid, made with stock solutions, was infused pre-pump. In the CRRT group, commercial replacement fluid was used. During RRT, animals received isotonic crystalloids and norepinephrine.
There were no differences in serum creatinine, calcium, magnesium, or phosphorus concentrations. While there was a difference between groups in serum potassium concentration over time (P < 0.001), significance was lost in pairwise comparisons at specific time points. Replacement fluid and ultrafiltrate flows did not differ between groups. There were no differences in lactate concentration, isotonic crystalloid requirements, or norepinephrine doses. No difference was found in electrolyte concentrations between the commercial and improvised replacement solutions.
The ImpRRT system achieved similar performance to CRRT and may represent a potential option for temporary RRT following disasters.
There is a requirement in some beef markets to slaughter bulls at under 16 months of age. This requires high levels of concentrate feeding. Increasing the slaughter age of bulls to 19 months facilitates the inclusion of a grazing period, thereby decreasing the cost of production. Recent data indicate few quality differences in longissimus thoracis (LT) muscle from conventionally reared 16-month bulls and 19-month-old bulls that had a grazing period prior to finishing on concentrates. The aim of the present study was to expand this observation to additional commercially important muscles/cuts. The production systems selected were concentrates offered ad libitum and slaughter at under 16 months of age (16-C) or at 19 months of age (19-CC), to examine the effect of age per se, and the cheaper alternative for 19-month bulls described above (19-GC). The results indicate that muscles from 19-CC were redder, had more intramuscular fat and higher cook loss than those from 16-C. No differences in muscle objective texture or sensory texture and acceptability were found between treatments. The expected differences in composition and quality between the muscles were generally consistent across the production systems examined. Therefore, for the type of animal and range of ages investigated, the effect of the production system on LT quality was generally representative of the effect on the other muscles analysed. In addition, the data do not support the under-16-month age restriction, based on meat acceptability, in commercial suckler bull production.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
In recent years an enhanced catabolism of serine, with or without the existence of porphyria, has been demonstrated in relation to a specific subtype of psychosis according to ICD-10 criteria, the acute polymorphic psychosis with or without symptoms of schizophrenia. Since sensory perceptual distortions play a key role in the symptomatology, patients with this disorder are referred to as having Acute Polymorphic Psychosis plus psychosensory phenomena (APP+). In a retrospective study including a total of 140 chronic psychiatric patients, we investigated the prevalence of Acute Intermittent Porphyria (AIP) and APP+. No subjects with AIP were found. In two patients APP+ could be demonstrated, based on both clinical characteristics and positive biochemical markers, i.e. a lowered plasma serine concentration and an increased TSM ratio (100 × taurine / (serine × methionine), all concentrations in μmol/l). In three patients the psychotic disorder was suspected to be present. It is concluded that careful psychiatric diagnosis may reveal specific psychotic disorders with a distinct biological pathogenetic factor, i.e. a disturbed serine metabolism.
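The TSM ratio quoted in the abstract reduces to simple arithmetic. A minimal sketch, reading the abstract's formula as 100 × taurine divided by the product of serine and methionine (all in μmol/l); the grouping is our interpretation of the abstract's notation, and the concentration values below are hypothetical, for illustration only:

```python
def tsm_ratio(taurine_umol_l: float, serine_umol_l: float,
              methionine_umol_l: float) -> float:
    """TSM ratio as read from the abstract:
    100 * taurine / (serine * methionine), all concentrations in umol/l."""
    return 100.0 * taurine_umol_l / (serine_umol_l * methionine_umol_l)

# Hypothetical plasma concentrations (umol/l), not patient data:
print(tsm_ratio(taurine_umol_l=60.0, serine_umol_l=100.0,
                methionine_umol_l=25.0))  # → 2.4
```

Because serine appears in the denominator, a lowered plasma serine concentration raises the ratio, consistent with the abstract's pairing of "lowered serine" with "increased TSM ratio".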
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. The efficacy and safety of pregabalin for the treatment of adult and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with either high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was extended with drug or blinded placebo for a further 3 months.
At 3 months, mean change from baseline in Hamilton Anxiety Rating Scale (HAM-A) scores for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline in Clinical Global Impression-Severity (CGI-S) scores ranged from -2.1 to -2.3, and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained for all 3 active drug groups, even when switched to placebo. HAM-A and CGI-S change from baseline scores ranged from -14.9 to -19.0 and -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low- (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
This article describes the development, implementation, and evaluation of a complex methotrexate ethics case used in teaching a Pharmacy Law and Ethics course. Qualitative analysis of student reflective writings provided useful insight into the students’ experience and comfort level with the final ethics case in the course. These data demonstrate a greater student appreciation of different perspectives, the potential for conflict in communicating about such cases, and the importance of patient autonomy. Faculty lessons learned are also described, facilitating adoption of this methotrexate ethics case by other healthcare profession educators.
Vitamin D deficiency has been commonly reported in elite athletes, but the vitamin D status of UK university athletes in different training environments remains unknown. The present study aimed to determine any seasonal changes in vitamin D status among indoor and outdoor athletes, and whether there was any relationship between vitamin D status and indices of physical performance and bone health. A group of forty-seven university athletes (indoor n 22, outdoor n 25) were tested during autumn and spring for serum vitamin D status, bone health and physical performance parameters. Blood samples were analysed for serum 25-hydroxyvitamin D (s-25(OH)D) status. Peak isometric knee extensor torque was assessed using an isokinetic dynamometer, and jump height was assessed using an Optojump. Aerobic capacity was estimated using the Yo-Yo intermittent recovery test. Peripheral quantitative computed tomography scans measured radial bone mineral density. Statistical analyses were performed using appropriate parametric/non-parametric testing depending on the normality of the data. s-25(OH)D fell significantly between autumn (52·8 (sd 22·0) nmol/l) and spring (31·0 (sd 16·5) nmol/l; P < 0·001). In spring, 34 % of participants were considered to be vitamin D deficient (<25 nmol/l) according to the revised 2016 UK guidelines. These data suggest that UK university athletes are at risk of vitamin D deficiency. Thus, further research is warranted to investigate the concomitant effects of low vitamin D status on health and performance outcomes in university athletes residing at northern latitudes.
Clostridioides difficile infection (CDI) is the most frequently reported hospital-acquired infection in the United States. Bioaerosols generated during toilet flushing are a possible mechanism for the spread of this pathogen in clinical settings.
To measure the bioaerosol concentration from toilets of patients with CDI before and after flushing.
In this pilot study, bioaerosols were collected 0.15 m, 0.5 m, and 1.0 m from the rims of the toilets in the bathrooms of hospitalized patients with CDI. Inhibitory, selective media were used to detect C. difficile and other facultative anaerobes. Room air was collected continuously for 20 minutes with a bioaerosol sampler before and after toilet flushing. Wilcoxon rank-sum tests were used to assess the difference in bioaerosol production before and after flushing.
Rooms of patients with CDI at University of Iowa Hospitals and Clinics.
Bacteria were positively cultured from 8 of 24 rooms (33%). In total, 72 preflush and 72 postflush samples were collected; 9 of the preflush samples (13%) and 19 of the postflush samples (26%) were culture positive for healthcare-associated bacteria. The predominant species cultured were Enterococcus faecalis, E. faecium, and C. difficile. Compared to the preflush samples, the postflush samples showed significant increases in the concentrations of the 2 large particle-size categories: 5.0 µm (P = .0095) and 10.0 µm (P = .0082).
Bioaerosols produced by toilet flushing potentially contribute to hospital environmental contamination. Prevention measures (eg, toilet lids) should be evaluated as interventions to prevent toilet-associated environmental contamination in clinical settings.
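The pre- versus post-flush comparison described in the methods used Wilcoxon rank-sum tests. A self-contained sketch of that test, using the normal approximation without a tie correction (real statistics packages add refinements); the pre/post concentration values in the usage example are hypothetical, not the study's data:

```python
import math

def rank_sum_test(pre, post):
    """Two-sided Wilcoxon rank-sum test (normal approximation, no tie
    correction) comparing two independent samples."""
    combined = sorted((v, i) for i, v in enumerate(pre + post))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for tied values (1-based)
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg
        i = j + 1
    n1, n2 = len(pre), len(post)
    w = sum(ranks[:n1])                      # rank sum of the first sample
    mean = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical pre/post-flush concentrations (arbitrary units):
z, p = rank_sum_test([1.0, 2.0, 1.5, 0.5, 1.2], [3.0, 4.5, 2.8, 3.9, 5.1])
```

In practice one would call an established implementation such as `scipy.stats.ranksums`; the sketch above just shows what the test computes.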
Our understanding about the genetic influences on human disease has increased dramatically with the technological developments in genome and DNA analysis and the discovery of the human genome sequence. Whilst much remains unexplained, it is obvious that normal cardiac development is controlled by the genome and there is significant evidence that a proportion of cardiac malformations are caused by genetic factors. This is important for clinicians as an understanding of confirmed genetic factors is essential to estimate recurrence risks of congenital heart disease (CHD) within families and also screen for predicted associated anomalies. An accurate genetic diagnosis can provide important prognostic information for both the initial patient (proband) and other family members, for whom further genetic investigations may be indicated. There is likely to be a continued increase in demand for such investigations as improvement in surgical and medical management allows more individuals with CHD to survive to reproductive age and have families of their own. For some, the recurrence risk for a cardiac malformation may be as high as 50%; the actual figure varies with different genetic diagnoses. Accurate risk stratification is likely to become increasingly important and the rapidly developing technologies to detect genetic variation mean that genome-wide investigation is becoming more widely available in the clinical setting. An aim of this chapter is to introduce clinicians to principles that will help them embrace and understand the results from these investigations and appreciate the implications they have for their patients.
We systematically reviewed implementation research targeting depression interventions in low- and middle-income countries (LMICs) to assess gaps in methodological coverage.
PubMed, CINAHL, PsycINFO, and EMBASE were searched for evaluations of depression interventions in LMICs reporting at least one implementation outcome published through March 2019.
A total of 8714 studies were screened, 759 were assessed for eligibility, and 79 studies met inclusion criteria. Common implementation outcomes reported were acceptability (n = 50; 63.3%), feasibility (n = 28; 35.4%), and fidelity (n = 18; 22.8%). Only four studies (5.1%) reported adoption or penetration, and three (3.8%) reported sustainability. The Sub-Saharan Africa region (n = 29; 36.7%) had the most studies. The majority of studies (n = 59; 74.7%) reported outcomes for a depression intervention implemented in pilot researcher-controlled settings. Studies commonly focused on Hybrid Type-1 effectiveness-implementation designs (n = 53; 67.1%), followed by Hybrid Type-3 (n = 16; 20.3%). Only 21 studies (26.6%) tested an implementation strategy, with the most common being revising professional roles (n = 10; 47.6%). The most common intervention modality was individual psychotherapy (n = 30; 38.0%). Common study designs were mixed methods (n = 27; 34.2%), quasi-experimental uncontrolled pre-post (n = 17; 21.5%), and individual randomized trials (n = 16; 20.3%).
Existing research has focused on early-stage implementation outcomes. Most studies have utilized Hybrid Type-1 designs, with the primary aim to test intervention effectiveness delivered in researcher-controlled settings. Future research should focus on testing and optimizing implementation strategies to promote scale-up of evidence-based depression interventions in routine care. These studies should use high-quality pragmatic designs and focus on later-stage implementation outcomes such as cost, penetration, and sustainability.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
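The adjusted odds ratios reported above describe multiplicative changes in odds, not in probabilities directly. A small sketch of how an odds ratio maps a baseline probability to an implied probability; the 10% baseline prevalence below is hypothetical, chosen only to illustrate the conversion:

```python
def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1 - p)

def apply_odds_ratio(baseline_p: float, odds_ratio: float) -> float:
    """Probability implied by multiplying baseline odds by an odds ratio."""
    o = odds(baseline_p) * odds_ratio
    return o / (1 + o)

# Hypothetical 10% baseline probability of psychotropic use, OR = 2.06
# (the adjusted OR reported for marginal food security before adjustment
# for depression/anxiety):
implied = apply_odds_ratio(0.10, 2.06)
```

With a 10% baseline, doubling the odds raises the implied probability to roughly 19%, not 20%: the gap between odds ratios and risk ratios widens as the baseline probability grows.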
Sink drainage systems are not amenable to standard methods of cleaning and disinfection. Disinfectants applied as a foam might enhance efficacy of drain decontamination due to greater persistence and increased penetration into sites harboring microorganisms.
To examine the efficacy and persistence of foam-based products in reducing sink drain colonization with gram-negative bacilli.
During a 5-month period, different methods for sink drain disinfection in patient rooms were evaluated in a hospital and its affiliated long-term care facility. We compared the efficacy of a single treatment with 4 different foam products in reducing the burden of gram-negative bacilli in the sink drain to a depth of 2.4 cm (1 inch) below the strainer. For the most effective product, we then evaluated the effectiveness of foam versus liquid-pouring application and of repeated foam treatments.
A foam product containing 3.13% hydrogen peroxide and 0.05% peracetic acid was significantly more effective than the other 3 foam products. In comparison to pouring the hydrogen peroxide and peracetic acid disinfectant, the foam application resulted in significantly reduced recovery of gram-negative bacilli on days 1, 2, and 3 after treatment with a return to baseline by day 7. With repeated treatments every 3 days, a progressive decrease in the bacterial load recovered from sink drains was achieved.
An easy-to-use foaming application of a hydrogen peroxide- and peracetic acid-based disinfectant suppressed sink-drain colonization for at least 3 days. Intermittent application of the foaming disinfectant could potentially reduce the risk for dissemination of pathogens from sink drains.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined if time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
We aimed to estimate the cost-effectiveness of brief weight-loss counselling by dietitian-trained practice nurses, in a high-income-country case study.
A literature search of the impact of dietary counselling on BMI was performed to source the ‘best’ effect size for use in modelling. This was combined with multiple other input parameters (e.g. epidemiological and cost parameters for obesity-related diseases, likely uptake of counselling) in an established multistate life-table model with fourteen parallel BMI-related disease life tables using a 3 % discount rate.
New Zealand (NZ).
We calculated quality-adjusted life-years (QALY) gained and health-system costs over the remainder of the lifespan of the NZ population alive in 2011 (n 4·4 million).
Counselling was estimated to result in an increase of 250 QALY (95 % uncertainty interval −70, 560 QALY) over the population’s lifetime. The incremental cost-effectiveness ratio was 2011 $NZ 138 200 per QALY gained (2018 $US 102 700). Per capita QALY gains were higher for Māori (Indigenous population) than for non-Māori, but were still not cost-effective. If willingness-to-pay was set to the level of gross domestic product per capita per QALY gained (i.e. 2011 $NZ 45 000 or 2018 $US 33 400), the probability that the intervention would be cost-effective was 2 %.
The study provides modelling-level evidence that brief dietary counselling for weight loss in primary care generates relatively small health gains at the population level and is unlikely to be cost-effective.
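The incremental cost-effectiveness ratio (ICER) and 3 % discounting used in the model reduce to simple arithmetic. A sketch with an illustrative net-cost figure back-calculated from the reported NZ$138 200 per QALY; this figure is not a study input, only a number chosen so the division reproduces the published ratio:

```python
def icer(incremental_cost: float, incremental_qalys: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return incremental_cost / incremental_qalys

def discounted(value: float, years_ahead: float, rate: float = 0.03) -> float:
    """Present value of a cost or QALY accruing `years_ahead` years in the
    future at the model's 3% annual discount rate."""
    return value / (1 + rate) ** years_ahead

# Hypothetical net cost back-calculated for illustration:
# NZ$34.55 million over 250 QALYs gained reproduces NZ$138,200 per QALY.
print(icer(34_550_000, 250))  # → 138200.0
```

The willingness-to-pay comparison in the abstract is then a threshold test: at NZ$45 000 per QALY, an ICER of NZ$138 200 fails by roughly a factor of three.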
Carbonate glasses can be formed routinely in the system K2CO3–MgCO3. The enthalpy of formation for one such 0.55K2CO3–0.45MgCO3 glass was determined at 298 K to be 115.00 ± 1.21 kJ/mol by drop solution calorimetry in molten sodium molybdate (3Na2O·MoO3) at 975 K. The corresponding heat of formation from oxides at 298 K was −261.12 ± 3.02 kJ/mol. This ternary glass is shown to be slightly metastable with respect to binary crystalline components (K2CO3 and MgCO3) and may be further stabilized by entropy terms arising from cation disorder and carbonate group distortions. This high degree of disorder is confirmed by 13C MAS NMR measurement of the average chemical shift tensor values, which show asymmetry of the carbonate anion to be significantly larger than previously reported values. Molecular dynamics simulations show that the structure of this carbonate glass reflects the strong interaction between the oxygen atoms in distorted carbonate anions and potassium cations.
Schmidt-hammer exposure-age dating (SHD) of boulders on cryoplanation terrace treads and associated bedrock cliff faces revealed Holocene ages ranging from 0 ± 825 to 8890 ± 1185 yr. The cliffs were significantly younger than the inner treads, which tended to be younger than the outer treads. Radiocarbon dates from the regolith of 3854 to 4821 cal yr BP (2σ range) indicated maximum rates of cliff recession of ~0.1 mm/yr, which suggests the onset of terrace formation before the last glacial maximum. Age, angularity, and size of clasts, together with planation across bedrock structures and the seepage of groundwater from the cliff foot, all support a process-based conceptual model of cryoplanation terrace development in which frost weathering leads to parallel cliff recession and, hence, terrace extension. The availability of groundwater during autumn freezeback is viewed as critical for frost wedging and/or the growth of segregation ice during prolonged winter frost penetration. Permafrost promotes cryoplanation by providing an impermeable frost table beneath the active layer, focusing groundwater flow, and supplying water for sediment transport by solifluction across the tread. Snow beds are considered an effect rather than a cause of cryoplanation terraces, and cryoplanation is seen as distinct from nivation.
People with cerebral palsy (CP) are less physically active than the general population and, consequently, are at increased risk of preventable disease. Evidence indicates that low-moderate doses of physical activity can reduce disease risk and improve fitness and function in people with CP. Para athletes with CP typically engage in ‘performance-focused’ sports training, which is undertaken for the sole purpose of enhancing sports performance. Anecdotally, many Para athletes report that participation in performance-focused sports training confers meaningful clinical benefits which exceed those reported in the literature; however, supporting scientific evidence is lacking. The aim of this paper is to describe the protocol for an 18-month study evaluating the clinical effects of a performance-focused swimming training programme for people with CP who have high support needs.
This study will use a concurrent multiple-baseline, single-case experimental design across three participants with CP who have high support needs. Each participant will complete a five-phase trial comprising: baseline (A1); training phase 1 (B1); maintenance phase 1 (A2); training phase 2 (B2); and maintenance phase 2 (A3). For each participant, measurement of swim velocity, health-related quality of life and gross motor functioning will be carried out a minimum of five times in each of the five phases.
The study described will produce Level II evidence regarding the effects of performance-focused swimming training on clinical outcomes in people with CP who have high support needs. Findings are expected to provide an indication of the potential for sport to augment outcomes in neurological rehabilitation.
Starting in 2016, we initiated a pilot tele-antibiotic stewardship program at 2 rural Veterans Affairs medical centers (VAMCs). Antibiotic days of therapy decreased significantly (P < .05) in the acute and long-term care units at both intervention sites, suggesting that tele-stewardship can effectively support antibiotic stewardship practices in rural VAMCs.