Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist in large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care), and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF, where symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The multifaceted intervention included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was length of stay in ED in minutes from time of arrival to time of disposition, and this was analyzed at the individual patient level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods respectively were 413.0 vs. 354.0 minutes (P < 0.001). Comparing control to intervention, there was an increase in: use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs. 86.7%; P < 0.001).
There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01) and a small, nonsignificant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
Introduction: Acute heart failure (AHF) is a common emergency department (ED) presentation and may be associated with poor outcomes. Conversely, many patients rapidly improve with ED treatment and may not need hospital admission. Because there is little evidence to guide disposition decisions by ED and admitting physicians, we sought to create a risk score for predicting short-term serious outcomes (SSO) in patients with AHF. Methods: We conducted prospective cohort studies at 9 tertiary care hospital EDs from 2007 to 2019, and enrolled adult patients who required treatment for AHF. Each patient was assessed for standardized real-time clinical and laboratory variables, as well as for SSO (defined as death within 30 days or intubation, non-invasive ventilation (NIV), myocardial infarction, coronary bypass surgery, or new hemodialysis after admission). The fully pre-specified logistic regression model with 13 predictors (age, pCO2, and SaO2 were modeled using spline functions with 3 knots and heart rate and creatinine with 5 knots) was fitted to the 10 multiple imputation datasets. Harrell's fast stepdown procedure reduced the number of variables. We calculated the potential impact on sensitivity (95% CI) for SSO and hospital admissions and estimated a sample size of 170 SSOs. Results: The 2,246 patients had mean age 77.4 years, male sex 54.5%, EMS arrival 41.1%, IV NTG 3.1%, ED NIV 5.2%, admission on initial visit 48.6%. Overall there were 174 (7.8%) SSOs including 70 deaths (3.1%). The final risk scale comprises five variables (points) and had a c-statistic of 0.76 (95% CI: 0.73–0.80): (1) valvular heart disease (1); (2) ED non-invasive ventilation (2); (3) creatinine 150–300 (1), ≥300 (2); (4) troponin 2×–4× URL (1), ≥5× URL (2); (5) walk test failed (2). The probability of SSO ranged from 2.0% for a total score of 0 to 90.2% for a score of 10, showing good calibration. The model was stable over 1,000 bootstrap samples.
Choosing a risk model total point admission threshold of >2 would yield a sensitivity of 80.5% (95% CI 73.9-86.1) for SSO with no change in admissions from current practice (48.6% vs 48.7%). Conclusion: Using a large prospectively collected dataset, we created a concise and sensitive risk scale to assist with admission decisions for patients with AHF in the ED. Implementation of this risk scoring scale should lead to safer and more efficient disposition decisions, with more high-risk patients being admitted and more low-risk patients being discharged.
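The five-variable point tally described in this abstract can be sketched as a simple scoring function. This is an illustrative sketch only: the creatinine units, the handling of boundary cases (e.g. troponin between 4× and 5× URL), and the function name are assumptions for illustration, not details taken from the published scale.

```python
def ahf_risk_score(valvular_disease: bool,
                   ed_niv: bool,
                   creatinine: float,      # assumed micromol/L
                   troponin_ratio: float,  # multiple of the upper reference limit (URL)
                   walk_test_failed: bool) -> int:
    """Tally the five-variable AHF risk score points reported in the abstract.

    Boundary handling (e.g. troponin between 4x and 5x URL scoring 1 point)
    is an assumption, not taken from the published scale.
    """
    score = 0
    if valvular_disease:
        score += 1          # valvular heart disease: 1 point
    if ed_niv:
        score += 2          # ED non-invasive ventilation: 2 points
    if creatinine >= 300:
        score += 2          # creatinine >= 300: 2 points
    elif creatinine >= 150:
        score += 1          # creatinine 150-300: 1 point
    if troponin_ratio >= 5:
        score += 2          # troponin >= 5x URL: 2 points
    elif troponin_ratio >= 2:
        score += 1          # troponin 2x-4x URL: 1 point
    if walk_test_failed:
        score += 2          # walk test failed: 2 points
    return score

# Hypothetical patient: NIV in ED, creatinine 200, troponin 3x URL, failed walk test
print(ahf_risk_score(False, True, 200, 3.0, True))  # 2 + 1 + 1 + 2 = 6
```

Under the abstract's threshold of >2 points, this hypothetical patient would be flagged for admission.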
Participation in European surveillance for bloodstream infection (BSI) commenced in Ireland in 1999 with all laboratories (n = 39) participating by 2014. Observational hand hygiene auditing (OHHA) was implemented in 2011. The aim of this study was to evaluate the impact of OHHA on hand hygiene compliance, alcohol hand rub (AHR) procurement and the incidence of sensitive and resistant Staphylococcus aureus and Enterococcus faecium and faecalis BSI. A prospective segmented regression analysis was performed to determine the temporal association between OHHA and outcomes. Observed hand hygiene improved from 74.7% (73.7–75.6) in 2011 to 90.8% (90.1–91.3) in 2016. AHR procurement increased from 20.1 l/1000 bed days used (BDU) in 2009 to 33.2 l/1000 BDU in 2016. A pre-intervention reduction of 2% per quarter in the ratio of methicillin-sensitive Staphylococcus aureus BSI/BDU stabilized in the time period after the intervention (P < 0.01). The ratio of methicillin-resistant Staphylococcus aureus (MRSA) BSI/BDU was decreasing by 5% per quarter pre-intervention; this slowed to 2% per quarter post-intervention (P < 0.01). There was no significant change in the ratio of vancomycin-sensitive (P = 0.49) or vancomycin-resistant (P = 0.90) Enterococcus sp. BSI/BDU post-intervention. This study shows national OHHA increased observed hand hygiene compliance and AHR procurement; however, there was no associated reduction in BSI.
A primary barrier to translation of clinical research discoveries into care delivery and population health is the lack of sustainable infrastructure bringing researchers, policymakers, practitioners, and communities together to reduce silos in knowledge and action. As National Institutes of Health's (NIH) mechanism to advance translational research, Clinical and Translational Science Award (CTSA) awardees are uniquely positioned to bridge this gap. Delivering on this promise requires sustained collaboration and alignment between research institutions and public health and healthcare programs and services. We describe the collaboration of seven CTSA hubs with city, county, and state healthcare and public health organizations striving to realize this vision together. Partnership representatives convened monthly to identify key components, common and unique themes, and barriers in academic–public collaborations. All partnerships aligned the activities of the CTSA programs with the needs of the city/county/state partners, by sharing resources, responding to real-time policy questions and training needs, promoting best practices, and advancing community-engaged research, and dissemination and implementation science to narrow the knowledge-to-practice gap. Barriers included competing priorities, differing timelines, bureaucratic hurdles, and unstable funding. Academic–public health/health system partnerships represent a unique and underutilized model with potential to enhance community and population health.
There is demand for new, effective and scalable treatments for depression, and the development of new forms of cognitive bias modification (CBM) of negative emotional processing biases has been suggested as a possible intervention to meet this need.
We report two double-blind RCTs, in which volunteers with high levels of depressive symptoms (Beck Depression Inventory-II (BDI-II) > 14) completed a brief course of emotion recognition training (a novel form of CBM using faces) or sham training. In Study 1 (N = 36), participants completed a post-training emotion recognition task whilst undergoing functional magnetic resonance imaging to investigate neural correlates of CBM. In Study 2 (N = 190), measures of mood were assessed post-training, and at 2-week and 6-week follow-up.
In both studies, CBM resulted in an initial change in emotion recognition bias, which (in Study 2) persisted for 6 weeks after the end of training. In Study 1, CBM resulted in increased neural activation to happy faces, with this effect driven by an increase in neural activity in the medial prefrontal cortex and bilateral amygdala. In Study 2, CBM did not lead to a reduction in depressive symptoms on the BDI-II, or on related measures of mood, motivation and persistence, or depressive interpretation bias at either the 2- or 6-week follow-up.
CBM of emotion recognition has effects on neural activity that are similar in some respects to those induced by selective serotonin reuptake inhibitor (SSRI) administration (Study 1), but we find no evidence that this had any later effect on self-reported mood in an analogue sample of non-clinical volunteers with low mood (Study 2).
To further understand the contribution of feedstuff ingredients to gut health in swine, gut histology and intestinal bacterial profiles associated with the use of two high-quality protein sources, microbially enhanced soybean meal (MSBM) and Menhaden fishmeal (FM), were assessed. Weaned pigs were fed one of three experimental diets: (1) basic diet containing corn and soybean meal (Negative Control (NEG)), (2) basic diet + fishmeal (FM; Positive Control (POS)) and (3) basic diet + MSBM (MSBM). Phase I POS and MSBM diets (d 0 to d 7 post-wean) included FM or MSBM at 7.5%, while Phase II POS and MSBM diets (d 8 to d 21) included FM or MSBM at 5.0%. Gastrointestinal tissue and ileal digesta were collected from euthanised pigs at d 21 (eight pigs/diet) to assess gut histology and intestinal bacterial profiles, respectively. Data were analysed using Proc Mixed in SAS, with pig as the experimental unit and pig (treatment) as the random effect. Histological and immunohistochemical analyses of stomach and small intestinal tissue using haematoxylin–eosin, Periodic Acid Schiff/Alcian blue and inflammatory cell staining did not reveal detectable differences in host response to dietary treatment. Ileal bacterial composition profiles were obtained from next-generation sequencing of PCR generated amplicons targeting the V1 to V3 regions of the 16S rRNA gene. Lactobacillus-affiliated sequences were found to be the most highly represented across treatments, with an average relative abundance of 64.0%, 59.9% and 41.8% in samples from pigs fed the NEG, POS and MSBM diets, respectively. Accordingly, the three most abundant Operational Taxonomic Units (OTUs) were affiliated to Lactobacillus, showing a distinct abundance pattern relative to dietary treatment. One OTU (SD_Ssd_00001), most closely related to Lactobacillus amylovorus, was found to be more abundant in NEG and POS samples compared to MSBM (23.5% and 35.0% v. 9.2%).
Another OTU (SD_Ssd_00002), closely related to Lactobacillus johnsonii, was more highly represented in POS and MSBM samples compared to NEG (14.0% and 15.8% v. 0.1%). Finally, OTU Sd_Ssd-00011, with highest sequence identity to Lactobacillus delbrueckii, was found in highest abundance in ileal samples from MSBM-fed pigs (1.9% and 3.3% v. 11.3%, in POS, NEG and MSBM, respectively). There was no effect of protein source on bacterial taxa at the genus level or diversity based on principal component analysis. Dietary protein source may provide opportunity to enhance the presence of specific members of the Lactobacillus genus that are associated with immune-modulating properties without altering overall intestinal bacterial diversity.
We have observed the G23 field of the Galaxy And Mass Assembly (GAMA) survey using the Australian Square Kilometre Array Pathfinder (ASKAP) in its commissioning phase to validate the performance of the telescope and to characterise the detected galaxy populations. This observation covers ~48 deg² with a synthesised beam of 32.7 arcsec by 17.8 arcsec at 936 MHz, and ~39 deg² with a synthesised beam of 15.8 arcsec by 12.0 arcsec at 1320 MHz. At both frequencies, the root-mean-square (r.m.s.) noise is ~0.1 mJy/beam. We combine these radio observations with the GAMA galaxy data, which includes spectroscopy of galaxies that are i-band selected with a magnitude limit of 19.2. Wide-field Infrared Survey Explorer (WISE) infrared (IR) photometry is used to determine which galaxies host an active galactic nucleus (AGN). In properties including source counts, mass distributions, and IR versus radio luminosity relation, the ASKAP-detected radio sources behave as expected. Radio galaxies have higher stellar mass and luminosity in IR, optical, and UV than other galaxies. We apply optical and IR AGN diagnostics and find that they disagree for ~30% of the galaxies in our sample. We suggest possible causes for the disagreement. Some cases can be explained by optical extinction of the AGN, but for more than half of the cases we do not find a clear explanation. Radio sources are more likely (~6%) to have an AGN than radio-quiet galaxies (~1%), but the majority of AGN are not detected in radio at this sensitivity.
Limited research considers the ethnic and cultural diversity among the US Black population, and how this diversity influences diet. The purpose of the present qualitative study is to (1) explore the influence of culture, nativity and ethnicity on the diet of US-born, African-born and Caribbean/Latin American-born Blacks and (2) explore a model of dietary acculturation among the African-born and Caribbean/Latin American-born Blacks. The purposive sample included twenty-two US-born, fifteen Caribbean/Latin American-born and ten African-born Blacks (n 47) living in Boston, who participated in either an in-depth interview (n 12) or a focus group (five groups, size 5–9). Satia-Abouta's model of dietary acculturation informed the interview and focus group questions, which explored the influence of psychosocial factors, taste preferences and environmental factors on dietary changes. NVivo 10 software was utilised for the coding and analysis. Topics based on a priori and a posteriori analyses included differences in psychosocial factors, taste preferences and environmental factors by nativity. Caribbean/Latin American-born and African-born Blacks expressed the importance of cultural identity in their dietary preferences and found adaptive strategies to maintain cultural diet, while US-born Blacks demonstrated a variety of preferences for traditionally African American foods. Environmental factors varied by place of birth and residence, with US-born Blacks citing poorer quality and limited affordability of foods. These findings suggest the importance of psychosocial and environmental factors in shaping the diet of the ethnically diverse US Black population and underscore the dietary diversity within and across the different ethnic groups of Blacks.
Waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] and Palmer amaranth (Amaranthus palmeri S. Watson) are troublesome weeds of row-crop production in the United States. Their dioecious reproductive systems ensure outcrossing, facilitating rapid evolution and distribution of resistances to multiple herbicides. Little is known, however, about the genetic basis of dioecy in Amaranthus species. In this work, we use restriction site–associated DNA sequencing (RAD-Seq) to investigate the genetic basis of sex determination in A. tuberculatus and A. palmeri. For each species, approximately 200 plants of each sex were sampled and used to create RAD-Seq libraries. The resulting libraries were separately bar-coded and then pooled for sequencing with the Illumina platform, yielding millions of 64-bp reads. These reads were analyzed to identify sex-specific and sex-biased sequences. We identified 345 male-specific sequences from the A. palmeri data set and 2,754 male-specific sequences in A. tuberculatus. An unexpected 723 female-specific sequences were identified in a subset of the A. tuberculatus females; subsequent research, however, indicated female specificity of these markers was limited to the population from which they were identified. Primer sets designed to specifically amplify male-specific sequences were tested for accuracy on multiple, geographically distinct populations of A. tuberculatus and A. palmeri, as well as other Amaranthus species. Two primer sets for A. palmeri and four primer sets for A. tuberculatus were each able to distinguish between male and female plants with at least 95% accuracy. In the near term, sex-specific markers will be useful to the A. tuberculatus and A. palmeri research communities (e.g., to predict sex for crossing experiments). In the long term, this research will provide the foundational tools for detailed investigations into the molecular biology and evolution of dioecy in weedy Amaranthus species.
The late Pleistocene–early Holocene archaeological record of the interior Pacific Northwest is dominated by what has been regionally referred to as the Western Stemmed Tradition (WST). While various efforts have attempted to clarify the chronology of this tradition, these have largely focused on data from the Great Basin and have been disproportionately preoccupied with establishing the beginning of the tradition due to its temporal overlap with Clovis materials. Specifically focusing on the Columbia Plateau, we apply a series of Bayesian chronological models to create concise estimates of the most likely beginning, end, and span of the WST. We then further explore its chronology by modeling its temporal span under various parameters and criteria so as to better identify places in the chronology that need further work and those that are robust regardless of data iteration. Our analysis revealed four major findings: (1) WST conservatively dates between 13,000 and 11,000 cal BP, likely extending to ~13,500 cal BP; (2) the most problematic period for WST is its termination; (3) the WST is incredibly long-lived compared to roughly contemporary Paleoindian traditions; and (4) the WST was seemingly unaffected by the onset of the Younger Dryas.
In 2013, the national surveillance case definition for West Nile virus (WNV) disease was revised to remove fever as a criterion for neuroinvasive disease and require at most subjective fever for non-neuroinvasive disease. The aims of this project were to determine how often afebrile WNV disease occurs and assess differences among patients with and without fever. We included cases with laboratory evidence of WNV disease reported from four states in 2014. We compared demographics, clinical symptoms and laboratory evidence for patients with and without fever and stratified the analysis by neuroinvasive and non-neuroinvasive presentations. Among 956 included patients, 39 (4%) had no fever; this proportion was similar among patients with and without neuroinvasive disease symptoms. For neuroinvasive and non-neuroinvasive patients, there were no differences in age, sex, or laboratory evidence between febrile and afebrile patients, but hospitalisations were more common among patients with fever (P < 0.01). The only significant difference in symptoms was for ataxia, which was more common in neuroinvasive patients without fever (P = 0.04). Only 5% of non-neuroinvasive patients did not meet the WNV case definition due to lack of fever. The evidence presented here supports the changes made to the national case definition in 2013.
Health and social care face growing and conflicting pressures: mounting complex needs of an ageing population, restricted funding and a workforce recruitment and retention crisis. In response, in the UK the NHS Long Term Plan promises increased investment and an emphasis on better ‘integrated’ care. We describe key aspects of integration that need addressing.
Declaration of interest
D.K.T. and S.S.S. are on the editorial board of the British Journal of Psychiatry and executives of the Academic Faculty at the Royal College of Psychiatrists. A.J.B.J., H.P. and Z.M. have roles at the Royal College of Psychiatrists that include evaluation of integrated care systems. A.J.B.J. is married to Dr Sarah Wollaston, Member of Parliament for Totnes and Chair of the Health Select Committee.
We identified a pseudo-outbreak of Mycobacterium avium in an outpatient bronchoscopy clinic following an increase in clinic procedure volume. We terminated the pseudo-outbreak by increasing the frequency of automated endoscope reprocessor (AER) filter changes from quarterly to monthly. Filter-changing schedules should depend on use rather than fixed time intervals.
Few studies have investigated the patterns of posttraumatic stress disorder (PTSD) symptom change in prolonged exposure (PE) therapy. In this study, we aimed to understand the patterns of PTSD symptom change in both PE and present-centered therapy (PCT).
Participants were active duty military personnel (N = 326, 89.3% male, 61.2% white, 32.5 years old) randomized to spaced-PE (S-PE; 10 sessions over 8 weeks), PCT (10 sessions over 8 weeks), or massed-PE (M-PE; 10 sessions over 2 weeks). Using latent profile analysis, we determined the optimal number of PTSD symptom change classes over time and analyzed whether baseline and follow-up variables were associated with class membership.
Five classes, namely rapid responder (7–17%), steep linear responder (14–22%), gradual responder (30–34%), non-responder (27–33%), and symptom exacerbation (7–13%) classes, characterized each treatment. No baseline clinical characteristics predicted class membership for S-PE and M-PE; in PCT, more negative baseline trauma cognitions predicted membership in the non-responder v. gradual responder class. Class membership was robustly associated with PTSD, trauma cognitions, and depression up to 6 months after treatment for both S-PE and M-PE but not for PCT.
Distinct profiles of treatment response emerged that were similar across interventions. By and large, no baseline variables predicted responder class. Responder status was a strong predictor of future symptom severity for PE, whereas response to PCT was not as strongly associated with future symptoms.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to address specific aspects of the performance of X-ray powder diffraction instruments. This report describes SRM 1879b, the third generation of this powder diffraction SRM. SRM 1879b is intended for use in the preparation of calibration standards for the quantitative analyses of cristobalite by X-ray powder diffraction in accordance with National Institute for Occupational Safety and Health (NIOSH) Analytical Method 7500, or equivalent. A unit of SRM 1879b consists of approximately 5 g of cristobalite powder bottled in an argon atmosphere. It is certified with respect to crystalline phase purity, or amorphous phase content, and lattice parameter. Neutron powder diffraction, both time-of-flight and constant wavelength, was used to certify the phase purity using SRM 676a as an internal standard. A NIST-built diffractometer, incorporating many advanced design features, was used for certification measurements for lattice parameters.
Impaired β-cell development and insulin secretion are characteristic of intrauterine growth-restricted (IUGR) fetuses. In normally grown late-gestation fetal sheep, pancreatic β-cell numbers and insulin secretion are increased by 7–10 days of pulsatile hyperglycemia (PHG). Our objective was to determine if IUGR fetal sheep β-cell numbers and insulin secretion could also be increased by PHG or if IUGR fetal β-cells do not have the capacity to respond to PHG. Following chronic placental insufficiency producing IUGR in twin gestation pregnancies (n=7), fetuses were administered either a PHG infusion, consisting of 60-min, high-rate, pulsed infusions of dextrose three times a day with an additional continuous, low-rate infusion of dextrose to prevent a decrease in glucose concentrations between the pulses, or a control saline infusion. PHG fetuses were compared with their twin IUGR fetuses, which received the saline infusion for 7 days. The pulsed glucose infusion increased fetal arterial glucose concentrations by an average of 83% during the infusion. Following the 7-day infusion, a square-wave fetal hyperglycemic clamp was performed in both groups to measure insulin secretion. The rate of increase in fetal insulin concentrations during the first 20 min of the square-wave hyperglycemic clamp was 44% faster in the PHG fetuses compared with saline fetuses (P<0.05). There were no differences in islet size, the insulin+ area of the pancreas and of the islets, or β-cell mass between groups (P>0.23). Chronic PHG increases early-phase insulin secretion in response to acute hyperglycemia, indicating that IUGR fetal β-cells are functionally responsive to chronic PHG.