Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina state health department since 1992. State-coordinated review of HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
Prevention of central-line–associated bloodstream infection (CLABSI) represents a complex challenge for the teams involved in device insertion and maintenance. First-tier practices for CLABSI prevention are well established.
We describe second-tier prevention practices in Israeli medical-surgical ICUs and assess their association with CLABSI rates.
In June 2017, an online survey assessing infection prevention practices in general ICUs was sent to all Israeli acute-care hospitals. The survey comprised 14 prevention measures supplementary to the established measures that are standard of care for CLABSI prevention. These measures fall into 2 domains: technology and implementation. The association between the number of prevention measures and CLABSI rate during the first 6 months of 2017 was assessed using Spearman’s correlation. We used negative binomial regression to calculate the incidence rate ratio (IRR) associated with the overall number of prevention measures and with each measure individually.
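As a rough illustration of this analysis (a minimal sketch with made-up per-ICU counts, not the study's data; Python with scipy and statsmodels assumed available), the Spearman correlation and the negative binomial IRR per additional measure, with central-line days as the exposure offset, can be computed as follows:

```python
# Illustrative sketch only: hypothetical per-ICU data, not the study's.
import numpy as np
import statsmodels.api as sm
from scipy.stats import spearmanr

# Number of prevention measures in use, CLABSI counts, and
# central-line days for each ICU over the study period.
measures = np.array([2, 4, 5, 7, 8, 10, 11, 13], dtype=float)
clabsi_events = np.array([9, 7, 6, 5, 3, 2, 1, 0])
line_days = np.array([900, 1100, 800, 1200, 1000, 950, 1300, 700])

# Spearman correlation between measure count and CLABSI rate.
rates = clabsi_events / line_days * 1000
rho, p = spearmanr(measures, rates)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")

# Negative binomial regression with log(central-line days) as the
# exposure offset; exp(coefficient) is the IRR per additional measure.
X = sm.add_constant(measures)
fit = sm.GLM(clabsi_events, X,
             family=sm.families.NegativeBinomial(),
             offset=np.log(line_days)).fit()
print(f"IRR per additional measure = {np.exp(fit.params[1]):.2f}")
```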
The CLABSI rates in 24 general ICUs varied between 0.0 and 17.0 per 1,000 central-line days. Greater use of preventive measures was associated with lower CLABSI rates (ρ, –0.70; P < .001). For each additional measure, the incidence of CLABSI decreased by 19% (IRR, 0.81; 95% CI, 0.73–0.89). Specific measures associated with lower rates were involvement of ward champions (IRR, 0.47; 95% CI, 0.31–0.71), auditing of insertions by infection control staff (IRR, 0.35; 95% CI, 0.19–0.64), and simulation-based training (IRR, 0.38; 95% CI, 0.22–0.64).
Implementation of second-tier preventive practices was protective against CLABSI. Use of more practices was correlated with lower rates.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
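For reference, the bispectrum is the third-order analogue of the power spectrum. Schematically, and up to normalization conventions that differ between estimators, both the direct and gridded approaches average triple products of Fourier modes (visibilities or gridded uv samples) over closed triangles of wave vectors:

$$
\hat{B}(\boldsymbol{k}_1,\boldsymbol{k}_2,\boldsymbol{k}_3) \;\propto\; \big\langle\, \tilde{T}(\boldsymbol{k}_1)\,\tilde{T}(\boldsymbol{k}_2)\,\tilde{T}(\boldsymbol{k}_3) \,\big\rangle, \qquad \boldsymbol{k}_1+\boldsymbol{k}_2+\boldsymbol{k}_3=\boldsymbol{0}.
$$

The closure condition is what the redundantly spaced triangles of Phase II antenna tiles sample directly, which is what makes the direct estimator possible.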
Ice scallops are a small-scale (5–20 cm) quasi-periodic ripple pattern that occurs at the ice–water interface. Previous work has suggested that scallops form due to a self-reinforcing interaction between an evolving ice-surface geometry, an adjacent turbulent flow field and the resulting differential melt rates that occur along the interface. In this study, we perform a series of laboratory experiments in a refrigerated flume to quantitatively investigate the mechanisms of scallop formation and evolution in high resolution. Using particle image velocimetry, we probe an evolving ice–water boundary layer at sub-millimetre scales and 15 Hz frequency. Our data reveal three distinct regimes of ice–water interface evolution: a transition from flat to scalloped ice; an equilibrium scallop geometry; and an adjusting scallop interface. We find that scalloped-ice geometry produces a clear modification to the ice–water boundary layer, characterized by a time-mean recirculating eddy feature that forms in the scallop trough. Our primary finding is that scallops form due to a self-reinforcing feedback between the ice-interface geometry and shear production of turbulent kinetic energy in the flow interior. The length of this shear production zone is therefore hypothesized to set the scallop wavelength.
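For context, the shear production referred to here is the standard source term in the turbulent-kinetic-energy budget. In its boundary-layer form, with mean streamwise velocity $\overline{U}(z)$ and Reynolds stress $-\overline{u'w'}$, it reads

$$
P \;=\; -\,\overline{u'w'}\,\frac{\partial \overline{U}}{\partial z},
$$

so the evolving interface geometry, by shaping the mean shear and Reynolds stress near the boundary, controls where turbulence (and hence melt) is enhanced, which is the feedback the study identifies.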
After five positive randomized controlled trials showed benefit of mechanical thrombectomy in the management of acute ischemic stroke with emergent large-vessel occlusion, a multi-society meeting was organized during the 17th Congress of the World Federation of Interventional and Therapeutic Neuroradiology in October 2017 in Budapest, Hungary. This multi-society meeting was dedicated to establishing standards of practice in acute ischemic stroke intervention, aiming for consensus on the minimum requirements for centers providing such treatment. In an ideal situation, all patients would be treated at a center offering a full spectrum of neuroendovascular care (a level 1 center). However, for geographical reasons, some patients are unable to reach such a center in a reasonable period of time. With this in mind, the group paid special attention to defining recommendations on the prerequisites for organizing stroke centers that provide mechanical thrombectomy for acute ischemic stroke, but not treatment of other neurovascular diseases (level 2 centers). Finally, some centers will have a stroke unit and offer intravenous thrombolysis, but not any endovascular stroke therapy (level 3 centers). Together, these level 1, 2, and 3 centers form a complete stroke system of care. The multi-society group provides recommendations and a framework for the development of mechanical thrombectomy services worldwide.
OBJECTIVES/SPECIFIC AIMS: Clostridium difficile infection (CDI) is the most common cause of antibiotic-associated diarrhea and an increasingly common infection in children in both hospital and community settings. Between 20% and 30% of pediatric patients will have a recurrence of symptoms in the days to weeks following an initial infection. Multiple recurrences have been successfully treated with fecal microbiota transplantation (FMT), though the body of evidence in pediatric patients is limited primarily to case reports and case series. The goal of our study was to better understand the practices, success, and safety of FMT in children, as well as to identify risk factors associated with a failed FMT in our pediatric patients. METHODS/STUDY POPULATION: This multicenter retrospective analysis included 373 patients who underwent FMT for CDI between January 1, 2006 and January 1, 2017 at 18 pediatric centers. Demographics, baseline characteristics, FMT practices, C. difficile outcomes, and post-FMT complications were collected through chart abstraction. Successful FMT was defined as no recurrence of CDI within 60 days after FMT. Of the 373 patients in the cohort, 342 had known outcome data at two months post-FMT; six of these, who underwent FMT for refractory CDI, were excluded, leaving 336 patients in the primary analysis evaluating risk factors for recurrence post-FMT. Unadjusted analysis was performed using the Wilcoxon rank-sum test, Pearson χ2 test, or Fisher exact test where appropriate. Stepwise logistic regression was used to determine independent predictors of success. RESULTS/ANTICIPATED RESULTS: The median age of included patients was 10 years (IQR, 3.0–15.0) and 50% of patients were female. The majority of the cohort was White (89.0%). Comorbidities included 120 patients with inflammatory bowel disease (IBD) and 14 patients who had undergone a solid organ or stem cell transplantation. Of the 336 patients with known outcomes at two months, 272 (81%) had a successful outcome. Of the 64 (19%) patients who did have a recurrence, 35 underwent repeat FMT, which was successful in 20 of the 35 (57%). The overall success rate of FMT in preventing further episodes of CDI in the cohort with known outcome data was 87%. Unadjusted predictors of a primary FMT response are summarized. Based on stepwise logistic regression modeling, the use of fresh stool, FMT delivery via colonoscopy, the lack of a feeding tube, and a lower number of CDI episodes before undergoing FMT were independently associated with a successful outcome. There were 20 adverse events in the cohort assessed to be related to FMT, 6 of which were considered severe. There were no deaths assessed to be related to FMT in the cohort. DISCUSSION/SIGNIFICANCE OF IMPACT: The overall success of FMT in pediatric patients with recurrent or severe CDI is 81% after a single FMT. Children without a feeding tube, and those who receive FMT with fresh stool, via colonoscopy, or after fewer CDI episodes, are less likely to have a recurrence of CDI in the 2 months following FMT. This is the first large study of FMT for CDI in a pediatric cohort. These findings, if confirmed by additional prospective studies, will support alterations in the practice of FMT in children.
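The stepwise modeling can be sketched as follows (a minimal forward-selection illustration; the predictor names and simulated data below are hypothetical stand-ins, not the study's actual variables or selection protocol):

```python
# Forward stepwise logistic regression by AIC (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 336  # patients with known two-month outcomes
df = pd.DataFrame({
    "fresh_stool": rng.integers(0, 2, n),
    "colonoscopy": rng.integers(0, 2, n),
    "feeding_tube": rng.integers(0, 2, n),
    "prior_cdi_episodes": rng.integers(1, 8, n),
})
# Simulated outcome: success (no CDI recurrence within 60 days).
lin = (0.8 * df.fresh_stool + 0.6 * df.colonoscopy
       - 0.9 * df.feeding_tube - 0.2 * df.prior_cdi_episodes + 1.0)
df["success"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

def fit(cols):
    """Fit a logistic model on the given predictors (intercept always in)."""
    X = sm.add_constant(df[cols].to_numpy(dtype=float)) if cols \
        else np.ones((n, 1))
    return sm.Logit(df["success"].to_numpy(), X).fit(disp=0)

selected, remaining = [], list(df.columns[:-1])
best_aic = fit(selected).aic
while remaining:
    aics = {c: fit(selected + [c]).aic for c in remaining}
    cand = min(aics, key=aics.get)
    if aics[cand] >= best_aic:
        break  # no candidate improves AIC; stop
    selected.append(cand)
    remaining.remove(cand)
    best_aic = aics[cand]
print("Selected predictors:", selected)
```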
To compare the epidemiology, clinical characteristics, and mortality of patients with bloodstream infections (BSI) caused by extended-spectrum β-lactamase (ESBL)-producing Escherichia coli (ESBL-EC) versus ESBL-producing Klebsiella pneumoniae (ESBL-KP), and to examine the differences in clinical characteristics and outcome between BSIs caused by isolates with CTX-M versus other ESBL genotypes.
As part of the INCREMENT project, 33 tertiary hospitals in 12 countries retrospectively collected data on adult patients diagnosed with ESBL-EC BSI or ESBL-KP BSI between 2004 and 2013. Risk factors for ESBL-EC versus ESBL-KP BSI and for 30-day mortality were examined by bivariate analysis followed by multivariable logistic regression.
The study included 909 patients: 687 with ESBL-EC BSI and 222 with ESBL-KP BSI. ESBL genotype, determined by polymerase chain reaction (PCR) amplification, was available for 286 isolates. ESBL-KP BSI was associated with intensive care unit admission, cardiovascular and neurological comorbidities, onset of bacteremia >14 days after admission, and a nonurinary source. Overall, 30-day mortality was significantly higher in patients with ESBL-KP BSI than with ESBL-EC BSI (33.7% vs 17.4%; odds ratio, 1.64; P = .016). CTX-M was the most prevalent ESBL subtype identified (218 of 286 PCR-tested isolates, 76%). No differences in clinical characteristics or in mortality between CTX-M and non–CTX-M ESBLs were detected.
Clinical characteristics and risk of mortality differ significantly between ESBL-EC and ESBL-KP BSI; ESBL-producing Enterobacteriaceae should therefore not be considered a homogeneous group. No differences in outcomes between ESBL genotypes were detected.
Since 2006, Israel has been confronting an outbreak of carbapenem-resistant Enterobacteriaceae (CRE), and in 2007 Israel implemented a national strategy to contain spread. The intervention was initially directed toward acute-care hospitals and later expanded to include an established reservoir of carriage in long-term-care hospitals. It included regular reporting of CRE cases to a central registry and daily oversight of management of the outbreak at the institutional level. Microbiological methodologies were standardized in clinical laboratories nationwide. Uniform requirements for carrier screening and isolation were established, and a protocol for discontinuation of carrier status was formulated. In response to the evolving epidemiology of CRE in Israel and the continued need for uniform guidelines for carrier detection and isolation, the Ministry of Health in 2016 issued a regulatory circular updating the requirements for CRE screening, laboratory diagnosis, molecular characterization, and carrier isolation, as well as reporting and discontinuation of isolation in healthcare institutions nationwide. The principal elements of the circular are contained herein.
Behavioral and psychological symptoms of dementia (BPSD) are a common problem in long-term care facilities (LTC). Clinical guidelines dictate that first-line treatments for BPSD are psychosocial and behavioral interventions; if these are unsuccessful, psychotropic medications may be trialed at low doses and their effects can be monitored.
There have previously been no studies with nationally representative samples investigating psychotropic administration in LTCs in Australia. This study determines the prevalence of psychotropic administration in a representative stratified random sample of 446 residents living with dementia from 53 Australian LTCs. Questionnaire and medical chart data in this study are drawn from a larger cross-sectional, mixed-methods study on quality of life in Australian LTCs.
It was found that 257 (58%) residents were prescribed psychotropic medications, including antipsychotics (n = 160, 36%), benzodiazepines (n = 136, 31%), antidepressants (n = 117, 26%), and anti-dementia medications (n = 9, 2%). BPSD were found to be very common in the sample, with 82% (n = 364) of participants experiencing at least one BPSD. The most prevalent BPSD were depression (n = 286, 70%) and agitation (n = 299, 67%).
Although detailed background information was not collected on individual cases, the prevalence found is indicative of systematic, industry-wide over-prescription of psychotropic medications as a first-line treatment for BPSD. This study highlights a clear need for further research and interventions in this area.
Interest in planting mixtures of cover crop species has grown in recent years as farmers seek to increase the breadth of ecosystem services cover crops provide. As part of a multidisciplinary project, we quantified the degree to which monocultures and mixtures of cover crops suppress weeds during the fall-to-spring cover crop growing period. Weed-suppressive cover crop stands can limit weed seed rain from summer- and winter-annual species, reducing weed population growth and ultimately weed pressure in future cash crop stands. We established monocultures and mixtures of two legumes (medium red clover and Austrian winter pea), two grasses (cereal rye and oats), and two brassicas (forage radish and canola) in a long fall growing window following winter wheat harvest and in a shorter window following silage corn harvest. In fall of the long window, grass cover crops and mixtures were the most weed suppressive, whereas legume cover crops were the least weed suppressive. All mixtures also effectively suppressed weeds. This was likely primarily due to the presence of fast-growing grass species, which were effective even when they were seeded at only 20% of their monoculture rate. In spring, weed biomass was low in all treatments due to winter kill of summer-annual weeds and low germination of winter annuals. In the short window following silage corn, biomass accumulation by cover crops and weeds in the fall was more than an order of magnitude lower than in the longer window. However, there was substantial weed seed production in the spring in all treatments not containing cereal rye (monoculture or mixture). Our results suggest that cover crop mixtures require only low seeding rates of aggressive grass species to provide weed suppression. This creates an opportunity for other species to deliver additional ecosystem services, though careful species selection may be required to maintain mixture diversity and avoid dominance of winter-hardy cover crop grasses in the spring.
The long-term stability of mechanically exfoliated MoS2 flakes was compared for storage in air and storage under vacuum. Significant changes were observed in MoS2 flakes stored in air, whereas similar flakes on samples stored under vacuum underwent no change. Small speckles were observed to appear on the surface of flakes stored in air, followed by thinning and eventual decomposition of the MoS2 flakes. The speckles are suspected to form by oxidation of MoS2 in the presence of atmospheric oxygen and water molecules, resulting in the formation of hydrated MoO3.
The STEPWISE trial (STructured lifestyle Education for People WIth SchizophrEnia, schizoaffective disorder and first episode psychosis) is currently evaluating a lifestyle education programme in addition to usual care. However, it is difficult to define what constitutes ‘usual care’. We aimed to define ‘usual care’ for lifestyle management in people with schizophrenia, schizoaffective disorder and first-episode psychosis in STEPWISE study sites. Ten National Health Service (NHS) mental health trusts participated in a bespoke survey based on the National Institute for Health and Care Excellence (NICE) guidance.
Eight trusts reported offering lifestyle education programmes and nine offered smoking cessation support. Reported recording of biomedical measures varied.
Although recommended by NICE, lifestyle education programmes are not consistently offered across UK NHS mental health trusts. This highlights missed opportunities to improve the physical health of people with psychotic illness. Our survey benchmarks ‘usual care’ for the STEPWISE study, against which changes can be measured. Furthermore, future studies will be able to identify whether any progress in clinical practice has been made towards achieving the NICE recommendations.
Parity of esteem means valuing mental health as much as physical health in order to close inequalities in mortality, morbidity and delivery of care. There is clear evidence that patients with mental illness receive inferior medical, surgical and preventive care. This is further exacerbated by low help-seeking, high stigma, medication side-effects and relatively low resourcing of mental healthcare. As a result, patients with severe mental illness die 10–20 years prematurely and have high rates of cardiometabolic complications and other physical illnesses. Many physical healthcare guidelines and policy recommendations address parity of esteem, but their implementation to date has been poor. All clinicians should be aware that inequalities in care are adversely influencing mental health outcomes, and managers, healthcare organisations and politicians should provide the resources and education needed to close this gap.
• Understand the concept of parity of esteem
• Be aware of the current inequalities in mental healthcare
There is a need for clinical tools to identify cultural issues in diagnostic assessment.
To assess the feasibility, acceptability and clinical utility of the DSM-5 Cultural Formulation Interview (CFI) in routine clinical practice.
Mixed-methods evaluation of field trial data from six countries. The CFI was administered to diagnostically diverse psychiatric out-patients during a diagnostic interview. In post-evaluation sessions, patients and clinicians completed debriefing qualitative interviews and Likert-scale questionnaires. The duration of CFI administration and the full diagnostic session were monitored.
Mixed-methods data from 318 patients and 75 clinicians found the CFI feasible, acceptable and useful. Clinician feasibility ratings were significantly lower than patient ratings and other clinician-assessed outcomes. After administering one CFI, however, clinician feasibility ratings improved significantly and subsequent interviews required less time.
The CFI was included in DSM-5 as a feasible, acceptable and useful cultural assessment tool.
Rock art worldwide has proved extremely difficult to date directly. Here, the first radiocarbon dates for rock paintings in Botswana and Lesotho are presented, along with additional dates for Later Stone Age rock art in South Africa. The samples selected for dating were identified as carbon-blacks from short-lived organic materials, meaning that the sampled pigments and the paintings that they were used to produce must be of similar age. The results reveal that southern African hunter-gatherers were creating paintings on rockshelter walls as long ago as 5723–4420 cal BP in south-eastern Botswana: the oldest such evidence yet found in southern Africa.
Contrary to previous “sociolinguistic folklore” that African American (Vernacular) English has a uniform structure across different parts of the US, recent studies have shown that it varies regionally, especially phonologically (Wolfram, 2007; Thomas & Wassink, 2010). However, there is little research on how Americans perceive AAE variation. Based on a map-labeling task, we investigate the folk perception of AAE variation by 55 participants, primarily African Americans in Columbus, Ohio. The analysis focuses on the dialect regions recognized by the participants, the linguistic features associated with different regions, and the attitudes associated with these beliefs. While the perceived regional boundaries mostly align with those identified by speakers in previous perceptual dialectology studies on American English, the participants consistently identified linguistic features that were specific to AAE. The participants recognized substantial phonological and lexical variation and identified “proper” dialects that do not necessarily sound “white”. This study demonstrates the value of considering African Americans’ perspectives in describing African American varieties of English.
We describe a language-based, dynamic information flow control (IFC) system called LIO. Our system presents a new design point for IFC, influenced by the challenge of implementing IFC as a Haskell library, as opposed to the more typical approach of modifying the language runtime system. In particular, we take a coarse-grained, floating-label approach, previously used by IFC Operating Systems, and associate a single, mutable label—the current label—with all the data in a computation's context. This label is always raised to reflect the reading of sensitive information and it is used to restrict the underlying computation's effects. To preserve the flexibility of fine-grained systems, LIO also provides programmers with a means for associating an explicit label with a piece of data. Interestingly, these labeled values can be used to encapsulate the results of sensitive computations which would otherwise lead to the creeping of the current label. Unlike other language-based systems, LIO also bounds the current label with a current clearance, providing a form of discretionary access control that LIO programs can use to deal with covert channels. Moreover, LIO provides programmers with mutable references and exceptions. The latter, exceptions, are used in LIO to encode and recover from monitor failures, all while preserving data confidentiality and integrity—this addresses a longstanding concern that dynamic IFC is inherently prone to information leakage due to monitor failure.
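To make the floating-label discipline concrete, here is a minimal conceptual sketch in Python (an illustration of the idea only; LIO itself is a Haskell library and its actual API differs). A context carries a mutable current label that is raised on every read of labeled data, bounded above by a fixed clearance, and checked against the label of any sink before an effect is permitted:

```python
# Conceptual sketch of a coarse-grained, floating-label IFC monitor.
# Illustration only; LIO is a Haskell library with a different API.
# Labels here are sets of secrecy tags: l1 may flow to l2 iff l1 <= l2.

class IFCError(Exception):
    pass

class Labeled:
    """A value paired with an explicit label (cf. LIO's labeled values)."""
    def __init__(self, label, value):
        self.label = frozenset(label)
        self._value = value

class Context:
    def __init__(self, clearance):
        self.current = frozenset()             # the floating current label
        self.clearance = frozenset(clearance)  # upper bound on the float

    def unlabel(self, lv):
        """Read a labeled value, raising the current label to the join."""
        new = self.current | lv.label
        if not new <= self.clearance:
            raise IFCError("current label would exceed clearance")
        self.current = new
        return lv._value

    def write(self, sink_label, value):
        """An effect, allowed only if the current label flows to the sink."""
        if not self.current <= frozenset(sink_label):
            raise IFCError("write would leak data")
        print("wrote", repr(value), "to sink labeled", set(sink_label))

ctx = Context(clearance={"alice", "bob"})
secret = Labeled({"alice"}, 42)
x = ctx.unlabel(secret)        # current label floats up to {"alice"}
ctx.write({"alice"}, x + 1)    # ok: {"alice"} flows to {"alice"}
try:
    ctx.write(set(), x)        # blocked: {"alice"} does not flow to {}
except IFCError as e:
    print("blocked:", e)
```

Keeping sensitive results inside labeled values rather than unlabeling them immediately is what lets a computation avoid the label creep the abstract describes: the current label rises only when the value is actually read.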