The rapid spread of SARS-CoV-2 through key regions of the United States in early 2020 placed a premium on timely national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital COVID-19 data. This paper presents time series estimates of critical hospital capacity indicators during April 1–July 14, 2020.
From March 27–July 14, 2020, NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and availability/use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near real-time daily national and state estimates to be computed.
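The combination of multiple imputation and survey weighting described above can be illustrated with a toy sketch. The hospital IDs, weights, and the crude imputation draw below are invented for illustration; the actual NHSN estimation used proper imputation models and calibrated survey weights.

```python
import random

def impute_and_weight(reports, weights, n_imputations=5, seed=0):
    """Estimate a national total from partially reported hospital counts.

    `reports` maps hospital id -> daily count or None (missing).
    `weights` maps hospital id -> survey weight (inverse sampling/response
    probability). Missing counts are filled by drawing from the reported
    values (a crude stand-in for a real imputation model); the final
    estimate averages the weighted totals across imputations.
    """
    rng = random.Random(seed)
    observed = [v for v in reports.values() if v is not None]
    totals = []
    for _ in range(n_imputations):
        total = 0.0
        for hid, count in reports.items():
            value = count if count is not None else rng.choice(observed)
            total += weights[hid] * value
        totals.append(total)
    return sum(totals) / len(totals)

# Illustrative data: three reporting hospitals, one non-reporter ("C").
reports = {"A": 120, "B": 80, "C": None, "D": 45}
weights = {"A": 1.0, "B": 1.2, "C": 1.1, "D": 1.0}
estimate = impute_and_weight(reports, weights)
```

Averaging over several imputations propagates the uncertainty of the missing reports instead of committing to a single filled-in value.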
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased during April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
The NHSN hospital capacity estimates served as important near-real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. The estimates detected the rise of hospitalizations in specific geographic regions in June after the decline from the April peak. Patient outcomes appeared to improve from early April to mid-July.
The aim of the present study was to investigate the effects of changes in milk composition on the in vitro growth of bovine mastitis pathogens. The nutritional requirements of three major bovine mastitis pathogens, Escherichia coli (E. coli), Staphylococcus aureus (S. aureus), and Streptococcus uberis (S. uberis), were investigated in vitro. We used ultra-high-temperature (UHT) treated milk with different contents of fat, protein, and carbohydrates to test the influence of the availability of various milk constituents on pathogen growth characteristics. Additionally, bacterial growth was investigated under experimentally modified nutrient availability, achieved by dilution and subsequent supplementation with individual nutrients (carbohydrates, different nitrogen sources, minerals, and different types of B vitamins) of either milk or a conventional medium (thioglycolate broth, TB). Varying contents of fat, protein, or lactose did not affect bacterial growth, with the exception of S. uberis, whose growth was promoted in protein-enriched milk. The addition of nutrients to diluted whole milk and to TB partly produced different effects, indicating medium-specific growth-limiting factors after dilution. Supplementation of diluted milk with minerals did not affect the growth rate of any of the studied bacteria. Growth in diluted whole milk was decreased in S. aureus by the addition of high concentrations of amino acids, and in E. coli and S. aureus by urea and additional B vitamins. The growth rate of S. uberis was increased by the addition of B vitamins to diluted whole milk. The present results demonstrate that growth-limiting nutrients differ among pathogen types. Because reduced bacterial growth was shown only in diluted milk or TB, it is unlikely that alterations in nutrient availability arising from physiological changes of milk composition in the cow's udder directly affect the susceptibility to, or course of, bovine mastitis.
Annual grass weeds reduce the profits of wheat farmers in the Pacific Northwest. The very-long-chain fatty acid elongase (VLCFA)-inhibiting herbicides S-metolachlor and dimethenamid-P could expand options for control of annual grasses but are not registered in wheat because of crop injury. We evaluated a safener, fluxofenim, applied to wheat seed for protection of 19 soft white winter wheat varieties from the herbicides S-metolachlor, dimethenamid-P, and pyroxasulfone; investigated the response of six varieties (UI Sparrow, LWW 15-72223, UI Magic CL+, Brundage 96, UI Castle CL+, and UI Palouse CL+) to incremental doses of fluxofenim; established the fluxofenim dose required to optimally protect the varieties from VLCFA-inhibiting herbicides; and assessed the impact of fluxofenim dose on glutathione S-transferase (GST) activity in three varieties (UI Sparrow, Brundage 96, and UI Castle CL+). Fluxofenim increased the biomass of four varieties treated with S-metolachlor or dimethenamid-P and of one variety treated with pyroxasulfone. Three varieties tolerated the herbicides regardless of the fluxofenim treatment. Estimated fluxofenim doses resulting in 10% biomass reduction of wheat ranged from 0.55 to 1.23 g ai kg−1 seed. Fluxofenim doses resulting in 90% increased biomass after treatment with S-metolachlor, dimethenamid-P, and pyroxasulfone ranged from 0.07 to 0.55, 0.09 to 0.73, and 0.30 to 1.03 g ai kg−1 seed, respectively. Fluxofenim at 0.36 g ai kg−1 seed increased GST activity in UI Castle CL+, UI Sparrow, and Brundage 96 by 58%, 30%, and 38%, respectively. These results suggest that fluxofenim does not damage wheat seedlings at up to three times the rate labeled for sorghum and that it protects soft white winter wheat varieties from S-metolachlor, dimethenamid-P, or pyroxasulfone injury at the herbicide rates evaluated.
Infection prevention and control (IPC) workflows are often retrospective and manual. New tools, however, have entered the field to facilitate rapid prospective monitoring of infections in hospitals. Although artificial intelligence (AI)–enabled platforms facilitate timely, on-demand integration of clinical data feeds with pathogen whole-genome sequencing (WGS), a standardized workflow to fully harness the power of such tools is lacking. We report a novel, evidence-based workflow that promotes quicker infection surveillance via AI-assisted analysis of clinical and WGS data. The algorithm suggests clusters based on a combination of similar minimum inhibitory concentration (MIC) data, timing of sample collection, and shared location stays between patients. It helps to proactively guide IPC professionals during investigation of infectious outbreaks and surveillance of multidrug-resistant organisms and healthcare-acquired infections. Methods: Our team established a 1-year workgroup comprising IPC practitioners, clinical experts, and scientists in the field. We held weekly roundtables to study lessons learned in an ongoing surveillance effort at a tertiary care hospital—utilizing Philips IntelliSpace Epidemiology (ISEpi), an AI-powered system—to understand how such a tool can enhance practice. Based on real-time case discussions and evidence from the literature, a workflow guidance tool and checklist were codified. Results: In our workflow, data-informed clusters posed by ISEpi underwent triage and expert follow-up analysis to assess: (1) likelihood of transmission(s); (2) identity of potential vector(s); (3) need to request WGS; and (4) intervention(s) to be pursued, if warranted. In a representative sample (spanning October 17, 2019, to November 7, 2019) of 67 total isolates suggested for inclusion in 19 unique cluster investigations, we determined that 9 investigations merited follow-up.
Collectively, these 9 investigations involved 21 patients and required 115 minutes to review in ISEpi and an additional 70 minutes of review outside of ISEpi. After review, 6 investigations were deemed unlikely to represent a transmission; the other 3 had potential to represent transmission for which interventions would be performed. Conclusions: This study offers an important framework for adaptation of existing infection control workflow strategies to leverage the utility of rapidly integrated clinical and WGS data. This workflow can also facilitate time-sensitive decisions regarding sequencing of specific pathogens given the preponderance of available clinical data supporting investigations. In this regard, our work sets a new standard of practice: precision infection prevention (PIP). Ongoing effort is aimed at development of AI-powered capabilities for enterprise-level quality and safety improvement initiatives.
Funding: Philips Healthcare provided support for this study.
Disclosures: Alan Doty and Juan Jose Carmona report salary from Philips Healthcare.
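The cluster-suggestion heuristic described in this abstract (similar MIC/susceptibility data, collection timing, and shared location stays) can be sketched as a pairwise score. The weights, the 30-day window, and the field names below are assumptions for illustration only, not the actual ISEpi scoring model.

```python
from datetime import date

def cluster_score(iso_a, iso_b, w_ast=0.4, w_time=0.3, w_loc=0.3, time_window=30):
    """Score the plausibility that two isolates belong to one cluster.

    Each isolate is a dict with an 'ast' profile (drug -> S/I/R call),
    a 'collected' date, and a set of 'locations' (unit stays). Weights,
    window, and field names are illustrative, not the ISEpi internals.
    """
    # Fraction of shared antibiotics with matching susceptibility calls.
    drugs = set(iso_a["ast"]) & set(iso_b["ast"])
    ast_sim = sum(iso_a["ast"][d] == iso_b["ast"][d] for d in drugs) / len(drugs)
    # Temporal proximity: 1.0 if same day, 0.0 beyond the window.
    days = abs((iso_a["collected"] - iso_b["collected"]).days)
    time_sim = max(0.0, 1.0 - days / time_window)
    # Shared location stays (Jaccard overlap of units).
    union = iso_a["locations"] | iso_b["locations"]
    loc_sim = len(iso_a["locations"] & iso_b["locations"]) / len(union)
    return w_ast * ast_sim + w_time * time_sim + w_loc * loc_sim

# Hypothetical isolates: same resistance pattern, 7 days apart, one shared unit.
a = {"ast": {"cipro": "R", "mero": "S"}, "collected": date(2019, 10, 17),
     "locations": {"ICU-3"}}
b = {"ast": {"cipro": "R", "mero": "S"}, "collected": date(2019, 10, 24),
     "locations": {"ICU-3", "ED"}}
score = cluster_score(a, b)  # high scores would be triaged by IPC staff
```

Pairs scoring above a chosen threshold would be surfaced for the triage and follow-up steps (1)–(4) described in the workflow.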
Background: Infection prevention surveillance for cross-transmission is often performed by manual review of microbiologic culture results to identify geotemporally related clusters. However, the sensitivity and specificity of this approach remain uncertain. Whole-genome sequencing (WGS) analysis can provide a gold standard for identifying cross-transmission events. Objective: We employed a published WGS program, the Philips IntelliSpace Epidemiology platform, to compare the accuracy of two surveillance methods: (i) a virtual infection practitioner (VIP) with perfect recall and automated analysis of antibiotic susceptibility testing (AST), sample collection timing, and patient location data; and (ii) a novel clinical matching (CM) algorithm that provides cluster suggestions based on a nuanced weighted analysis of AST data, timing of sample collection, and shared location stays between patients. Methods: WGS was performed routinely on inpatient and emergency department isolates of Enterobacter cloacae, Enterococcus faecium, Klebsiella pneumoniae, and Pseudomonas aeruginosa at an academic medical center. Single-nucleotide variants (SNVs) were compared within core genome regions on a per-species basis to determine cross-transmission clusters. One unique strain per patient was included within each analysis, and duplicates were excluded from the final results. Results: Between May 2018 and April 2019, clinical data from 121 patients were paired with WGS data from 28 E. cloacae, 21 E. faecium, 61 K. pneumoniae, and 46 P. aeruginosa isolates. Previously published SNV relatedness thresholds were applied to define genomically related isolates. Mapping of genomic relatedness defined the following clusters: 4 patients in 2 E. faecium clusters and 2 patients in 1 P. aeruginosa cluster.
The VIP method identified 12 potential clusters involving 28 patients, all of which were “pseudoclusters.” The CM method identified 7 clusters consisting of 27 patients, which included 1 true E. faecium cluster of 2 patients with genomically related isolates. Conclusions: In light of the WGS data, all of the potential clusters identified by the VIP were pseudoclusters, lacking sufficient genomic relatedness. In contrast, the CM method showed increased sensitivity and specificity: it decreased the percentage of pseudoclusters by 14% and identified a related genomic cluster of E. faecium. These findings suggest that integrating clinical data analytics with WGS is likely to benefit institutions by limiting the expenditure of resources on pseudoclusters. Therefore, WGS combined with more sophisticated surveillance approaches than the standard methods modeled by the VIP is needed to better identify and address true cross-transmission events.
Funding: This study was supported by Philips Healthcare.
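The SNV-threshold step used to define genomically related isolates can be sketched as connected components over related pairs. The pairwise counts and the 20-SNV cut-off below are illustrative, not the published species-specific thresholds.

```python
def genomic_clusters(distances, threshold):
    """Group isolates into clusters of genomically related pairs.

    `distances` maps frozenset({a, b}) -> pairwise core-genome SNV count.
    Two isolates are related if their distance is at or below the
    species-specific `threshold`; clusters are the connected components
    of the resulting relatedness graph.
    """
    # Build adjacency from related pairs only.
    neighbours = {}
    for pair, snvs in distances.items():
        a, b = tuple(pair)
        neighbours.setdefault(a, set())
        neighbours.setdefault(b, set())
        if snvs <= threshold:
            neighbours[a].add(b)
            neighbours[b].add(a)
    # Depth-first search over connected components.
    clusters, seen = [], set()
    for start in neighbours:
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:
            node = stack.pop()
            if node in component:
                continue
            component.add(node)
            stack.extend(neighbours[node] - component)
        seen |= component
        if len(component) > 1:  # singletons are not clusters
            clusters.append(component)
    return clusters

# Illustrative pairwise SNV counts for three patients' isolates.
dists = {frozenset({"p1", "p2"}): 5,
         frozenset({"p1", "p3"}): 350,
         frozenset({"p2", "p3"}): 340}
clusters = genomic_clusters(dists, threshold=20)
```

Only the closely related pair (5 SNVs) forms a cluster; the distant isolate remains a singleton, mirroring how pseudoclusters are excluded once genomic distances are known.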
Subclinical (SCK) and clinical (CK) ketosis are metabolic disorders responsible for substantial losses in dairy production. Although Fourier-transform mid-infrared (FTIR) spectrometry has been studied extensively for predicting ketosis in cows exposed to great metabolic stress, little is known about its suitability for predicting hyperketonemia from individual samples, e.g. in small dairy herds or when only a few animals are at risk of ketosis. The objective of the present research was to determine the applicability of milk metabolites predicted by FTIR spectrometry in individual screening for ketosis. In experiment 1, blood and milk samples were taken every two weeks after calving from Holstein (n = 80), Brown Swiss (n = 72) and Swiss Fleckvieh (n = 58) cows. In experiment 2, samples from cows diagnosed with CK (n = 474) and 420 samples with blood β-hydroxybutyrate (BHB) <1.0 mmol/l were used to investigate whether CK could be detected by FTIR-predicted BHB and acetone from a preceding milk control. In experiment 3, correlations between data from an on-farm automatic milk analyser and FTIR-predicted BHB and acetone from the monthly milk controls were evaluated. Hyperketonemia occurred mostly during the first eight weeks of lactation. Correlations between blood BHB and FTIR-predicted BHB and acetone were low (r = 0.37 and 0.12, respectively, P < 0.0001), as was the percentage of true positive values (11.9 and 16.6%, respectively). No association of FTIR-predicted ketone bodies with the interval of milk sampling relative to CK diagnosis was found. Data obtained from the automatic milk analyser were moderately correlated with the same-day FTIR-predicted BHB analysis (r = 0.61). In conclusion, the low correlations with blood BHB and the small number of true positive samples discourage the use of milk mid-infrared spectrometry as the sole method for predicting hyperketonemia at the individual cow level.
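The screening evaluation reported here (true-positive rates of a milk-based predictor against a blood-BHB gold standard) can be sketched as follows. The cut-offs and sample values are invented purely for illustration and are not the study's data.

```python
def screening_performance(blood_bhb, milk_pred, blood_cut=1.0, milk_cut=0.10):
    """Sensitivity and positive predictive value of a milk-based screen.

    Blood BHB >= `blood_cut` (mmol/l) defines true hyperketonemia; the
    FTIR-predicted milk value is flagged at `milk_cut`. Both cut-offs
    and the paired values below are illustrative assumptions.
    """
    pairs = list(zip(blood_bhb, milk_pred))
    tp = sum(b >= blood_cut and m >= milk_cut for b, m in pairs)
    fn = sum(b >= blood_cut and m < milk_cut for b, m in pairs)
    fp = sum(b < blood_cut and m >= milk_cut for b, m in pairs)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    ppv = tp / (tp + fp) if tp + fp else 0.0
    return sensitivity, ppv

# Six hypothetical cows: paired blood BHB and FTIR-predicted milk BHB.
blood = [0.4, 1.3, 0.6, 1.8, 0.9, 1.1]
milk = [0.05, 0.12, 0.11, 0.08, 0.04, 0.13]
sens, ppv = screening_performance(blood, milk)
```

A low proportion of flagged samples that are truly hyperketonemic (low PPV), as found in the study, argues against relying on the milk prediction alone.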
This study aimed to examine the association between the availability of firearms at home and the proportion of firearm suicides in Switzerland in an ecological analysis. The data series were analysed by canton and yielded a fairly high correlation (Spearman's rho = 0.60). Thus, the association holds at the sub-national level as well.
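A rank correlation of the kind reported (Spearman's rho across cantons) can be computed as below. The per-canton numbers are invented solely to demonstrate the calculation and are not Swiss data.

```python
def spearman_rho(x, y):
    """Spearman rank correlation between two equal-length series.

    Ranks are averaged for ties; the coefficient is the Pearson
    correlation of the ranks.
    """
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank of the tie group (1-based)
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Illustrative per-canton values: household gun availability (%) vs
# share of suicides committed with a firearm (%).
availability = [12, 35, 20, 41, 28, 15]
firearm_share = [10, 30, 25, 38, 22, 12]
rho = spearman_rho(availability, firearm_share)
```

Because the statistic uses ranks rather than raw values, it captures the monotone cantonal association without assuming linearity.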
We examined the change in Swiss suicide rates since 1969, breaking the rates down by the method used. Descriptive analyses of the main suicide methods are presented. Suicide rates reached a peak in the late 1970s/early 1980s and declined in more recent years. Firearm suicides and suicides by falls were the exception, sustaining their upward trend until the 1990s. Suicide by vehicle exhaust asphyxiation showed a rapid decline following the introduction of catalytic converters in motor vehicles. No substantial method substitution was observed. Suicide by poisoning declined in the 1990s but rose again following an increase in assisted suicide among somatically incurable patients. Suicide is too often regarded as a homogeneous phenomenon. With regard to the method they choose, suicide victims are a heterogeneous population, and it is evident that different suicide methods are chosen by different people. A better understanding of the varying patterns of change over time in the different suicide methods used may lead to differentiated preventive strategies.
To address and better understand the problem of high suicide rates in widows and widowers.
Sex- and age-specific suicide data collated by marital group were extracted from Swiss mortality statistics for the period 1991–2003. Mortality in the first week, month and year of widowhood was calculated on a person-year basis.
Cross-sectional analysis by sex and age-group confirms the existence of different suicide rate patterns according to marital status. Moreover, the profiles of suicide methods differ. In particular, suicide methods which may be associated with impulsive suicides, such as firearms or poisoning, are relatively frequent in the widowed. The suicide risk of widowed persons is extremely high in the days and weeks immediately after bereavement.
Suicide risk and suicide behavior vary systematically according to marital status. In particular, widows and widowers emerge as a group suitable for preventive measures because a time window of increased risk exists. Moreover, widowed persons are a clear-cut risk group within the purview of undertakers, priests and perhaps general practitioners.
The human endocannabinoid system interacts with various neurotransmitter systems, and the endocannabinoid anandamide has been found significantly elevated in CSF and inversely correlated with psychopathology (Giuffrida et al. 2004), providing a link to the neurobiology of schizophrenia. While delta-9-tetrahydrocannabinol, the psychoactive compound of Cannabis sativa, shows psychedelic properties, the major herbal cannabinoid compound cannabidiol was recently suggested to be a re-uptake inhibitor of anandamide. In addition, potential antipsychotic properties have been hypothesized.
We performed an explorative, 4-week, double-blind, controlled clinical trial on the effects of purified cannabidiol in acute schizophrenia compared to the antipsychotic amisulpride. The antipsychotic properties of both drugs were the primary target of the study. Furthermore, side-effects and anxiolytic capabilities of both treatments were investigated.
Forty-two patients fulfilling DSM-IV criteria for acute paranoid schizophrenia or schizophreniform psychosis participated in the study. Both treatments were associated with a significant decrease in psychotic symptoms after 2 and 4 weeks, as assessed by the BPRS and PANSS. However, there was no statistically significant difference between the two treatment groups. In contrast, cannabidiol induced significantly fewer side effects (EPS, increase in prolactin, weight gain) compared with amisulpride.
Cannabidiol demonstrated substantial antipsychotic properties in acute schizophrenia. This is in line with our suggestion of an adaptive role of the endocannabinoid system in paranoid schizophrenia, and provides further evidence that this adaptive mechanism may represent a valuable target for antipsychotic treatment strategies.
The Stanley Medical Research Institute (00-093 to FML) and the Koeln Fortune Program (107/2000 + 101/2001 to FML) funded this study.
The unprecedented attacks of September 11, 2001 resulted in high rates of PTSD in the months following the attacks. Little information exists on the long-term effects of 9/11 in high-risk immigrant urban populations.
We will present findings from an NIMH-funded longitudinal study that aimed to estimate the prevalence, comorbidity, disability, mental health treatment and service utilization associated with posttraumatic stress disorder (PTSD) in a systematic sample of economically disadvantaged adult, mostly Latino immigrant, primary care patients (n=720) in New York City, interviewed approximately 1 and 5 years after the attacks of September 11, 2001.
The presentation will focus on: 1) trajectories of 9/11-related PTSD; 2) risk and protective factors for the development and persistence of 9/11-related PTSD; 3) the role of ethnicity and acculturation in the expression of physical and mental symptoms; and 4) the role of post-disaster social support and secondary stressors in mediating the disaster effects.
Our findings will highlight the specific needs for mental health care associated with long term post-disaster psychopathology among high risk populations and will underscore the importance of developing evidence based post-disaster care, including screening and treatment capacities for individuals exposed to trauma in general medical practices.
Various studies have reported a positive relationship between child maltreatment and personality disorders (PDs). However, few studies included all DSM-IV PDs and even fewer adjusted for other forms of childhood adversity, e.g. bullying or family problems.
We analyzed questionnaires completed by 512 participants of the ZInEP epidemiology survey, a comprehensive psychiatric survey of the general population in Zurich, Switzerland. Associations between childhood adversity and PDs were analyzed bivariately via simple regression analyses and multivariately via multiple path analysis.
The bivariate analyses revealed that all PD dimensions were significantly related to various forms of family and school problems as well as child abuse. In contrast, according to the multivariate analysis only school problems and emotional abuse were associated with various PDs. Poverty was uniquely associated with schizotypal PD, conflicts with parents with obsessive-compulsive PD, physical abuse with antisocial PD, and physical neglect with narcissistic PD. Sexual abuse was statistically significantly associated with schizotypal and borderline PD, but corresponding effect sizes were small.
Childhood adversity has a serious impact on PDs. Bullying and violence in schools and emotional abuse appear to be more salient markers of general personality pathology than other forms of childhood adversity. Associations with sexual abuse were negligible when adjusted for other forms of adversity.
Patients’ expectancies have long been considered to contribute to treatment outcome. Whereas research has concentrated on different types of expectancies in predicting outcome, it has not examined their interactive contribution, therapist factors, or the development of expectancies over time. Therefore, the present study aims to investigate the independent as well as the interactive contributions of outcome expectancies (OE) and negative mood regulation expectancies (NMRE) to outcome. One hundred and forty depressed outpatients in cognitive-behavioral psychotherapy completed measures of OE and NMRE at pretreatment and midtreatment, as well as outcome measures at midtreatment and posttreatment. Patients’ OE were assessed using the Patients’ Therapy Expectation and Evaluation Questionnaire (PATHEV; Schulte, 2005), and NMRE using the short form of the Negative Mood Regulation Scale (NMR; Backenstrass et al., 2010). Outcome was measured using the German version of the Beck Depression Inventory-II (BDI-II; Hautzinger, Keller, & Kühner, 2006) and the Inventory of Depressive Symptomatology - Clinician Rated, 30-item version (IDS-C; Rush, Carmody, & Reimitz, 2000). We will perform three-level longitudinal hierarchical analyses, with assessment time points as the first level, nested in patients (second level), who are nested in therapists (third level), controlling for comorbidities. We expect OE and NMRE to change significantly during therapy, and these changes to be related to outcome at both midtreatment and posttreatment. We also expect to find a significant interaction between OE and NMRE in predicting outcome, as well as a significant influence of therapists on patients’ expectancies. Theoretical and clinical implications of the results will be discussed.
The aim of the study was to analyse the influence of crude fibre in piglets' rations on tail-biting in undocked pigs during the rearing period. All pigs were fed the same pre-starter until weaning. The study comprised two trials with four experimental groups each. The first trial contained a control group (CG1) with conventional feed (up to 40 g/kg crude fibre), two groups with an increased crude fibre content of up to 50 g/kg (G5) and 60 g/kg (G6), respectively, and one group with conventional feed and ad libitum provision of crude fibre (AL). The second trial consisted of a control group (CG2), which received the same conventional feed as CG1, and three treatment groups with either soya hulls (SS), dried sugar beet pulp (DP) or oat fibre (OF) admixed to their ration to achieve a crude fibre content of 60 g/kg in all three groups. The rearing week, the batch, the treatment group (only in trial one) and the interaction between batch and treatment group had a significant influence on tail lesions (P < 0.05). The tail-biting process started in rearing week 3 (trial one) and week 5 (trial two), respectively. Given the low frequency of tail-biting during the present study, crude fibre seems to have no major influence on tail-biting during the rearing period. This unexpected result may be explained by the optimized conditions in which the piglets were kept and the intensive animal observation carried out by the employees. However, the batch effect was the most influential.
Deficits of mismatch negativity (MMN) in schizophrenia and individuals at risk for psychosis have been replicated many times. Several studies have also demonstrated the occurrence of subclinical psychotic symptoms within the general population. However, none has yet investigated MMN in individuals from the general population who report subclinical psychotic symptoms.
The MMN to duration-, frequency-, and intensity deviants was recorded in 217 nonclinical individuals classified into a control group (n = 72) and three subclinical groups: paranoid (n = 44), psychotic (n = 51), and mixed paranoid-psychotic (n = 50). Amplitudes of MMN at frontocentral electrodes were referenced to average. Based on a three-source model of MMN generation, we conducted an MMN source analysis and compared the amplitudes of surface electrodes and sources among groups.
We found no significant differences in MMN amplitudes of surface electrodes. However, significant differences in MMN generation among the four groups were revealed at the frontal source for duration-deviant stimuli (P = 0.01). We also detected a trend-level difference (P = 0.05) in MMN activity among those groups for frequency deviants at the frontal source.
Individuals from the general population who report psychotic symptoms are a heterogeneous group. However, alterations exist in their frontal MMN activity. This increased activity might be an indicator of more sensitive perception regarding changes in the environment for individuals with subclinical psychotic symptoms.
The increasing lactational performance of dairy cows over the last few decades is closely related to higher nutritional requirements. The decrease in dry matter intake during the peripartal period results in considerable mobilisation of body tissues (mainly fat reserves and muscle mass) to compensate for the prevailing lack of energy and nutrients. Despite the activation of adaptive mechanisms to mobilise nutrients from body tissues for maintenance and milk production, the increased metabolic load remains a risk factor for animal health. The prevalence of production diseases, particularly subclinical ketosis, is high in early lactation. Increased β-hydroxybutyrate (BHB) concentrations further depress gluconeogenesis, feed intake and the immune system. Because the adaptive responses to nutrient and energy deficit vary among dairy cows, early and non-invasive detection of developing metabolic disorders in milk samples would be useful. The frequent and regular milking of dairy cows makes it possible to obtain samples at any stage of lactation. Routine identification of biomarkers that accurately characterise the physiological status of an animal is crucial for timely management decisions. The present overview recapitulates established markers measured in milk that are associated with the metabolic health of dairy cows. Specifically, measurements of milk fat, protein, lactose and urea concentrations are evaluated. Changes in the ratio of milk fat to protein may indicate an increased risk of rumen acidosis and ketosis. The costly determination of individual fatty acids in milk remains a barrier to the routine grouping of fatty acids into saturated, mono- and polyunsaturated classes. Novel approaches include mid-IR (MIR)-based predictions of BHB and acetone in milk, although these are not measured directly but estimated via indirect associations with concomitantly altered milk composition during (sub)clinical ketosis.
Although MIR-based ketone body concentrations in milk are not suitable for monitoring the metabolic status of the individual cow, they provide an earlier estimate for the overall herd or for specific groups of animals at a particular stage of lactation. Management decisions can thus be made earlier, and animal health improved by adjusting diet composition.
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of the clinical and environmental isolates suspected to be involved in those clusters. Recent advances in genomic sequencing and cloud computing now allow rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integrating genomic and clinical epidemiologic data can augment infection control surveillance, both for identifying cross-transmission events and for including missed and excluding misidentified outbreaks (i.e., false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise for reducing healthcare-acquired infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.