Major depressive disorder is one of the most common causes of ill health and functional impairment, and depression is a chronic, relapsing condition. Mindfulness-based therapies have been demonstrated to be effective in reducing anxiety, stress and depressive symptoms in adults.
Our goal was to assess the real-world clinical effectiveness of Mindfulness-Based Cognitive Therapy (MBCT) for Recurrent Depressive Disorder in three domains:
- Depression, anxiety and stress levels
- Mindfulness (5-Facet Mindfulness scale)
- Self-compassion
Patients with a diagnosis of Recurrent Depressive Disorder (primary or secondary diagnosis) were referred by their community mental health team to participate in an 8-week educational MBCT programme. Participants completed the Depression, Anxiety and Stress Scale (DASS), the 5-Facet Mindfulness scale and the Self-Compassion scale (all self-rated) before commencing and at the end of the course. They were also invited to give qualitative feedback at the end of the course.
Data were collected from four groups who completed the course over a period of twelve months. A paired-samples t-test was used to compare pre- and post-intervention scores and to determine effect size.
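As a sketch of the analysis described above, the paired t statistic and a standardized effect size (Cohen's d for paired designs) can be computed directly from pre/post scores. The scores below are illustrative placeholders, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_and_effect_size(pre, post):
    """Return the paired-samples t statistic and Cohen's d for pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]   # post - pre change per participant
    n = len(diffs)
    mean_d = mean(diffs)
    sd_d = stdev(diffs)                          # sample SD of the differences
    t = mean_d / (sd_d / sqrt(n))                # paired-samples t statistic
    d = mean_d / sd_d                            # Cohen's d for paired designs
    return t, d

# Hypothetical pre/post DASS-style scores for 5 participants (not study data):
pre = [28, 34, 22, 30, 26]
post = [14, 20, 15, 18, 12]
t, d = paired_t_and_effect_size(pre, post)       # negative t: scores decreased
```

A statistics package would add the p-value from the t distribution with n − 1 degrees of freedom; the statistic and effect size above are the quantities the abstract reports on.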
We had complete data for 19 of 34 participants. Pre-intervention scores were similar for both groups.
The mean age of the cohort was 47 years (SD 10 years); 3 participants were male and 16 female.
Patients showed a clinically significant reduction in depression, anxiety and stress symptoms, with respective reductions of 48%, 26% and 43% post intervention. Results were statistically significant for depression and stress (p < 0.001) but not for anxiety (p = 0.130).
Positive trends were seen in all domains of the 5-Facet Mindfulness and Self-Compassion scales, with mean improvements of 28.2% and 35.3% respectively. All results were statistically significant.
We also collected anonymized qualitative feedback which highlighted themes of empowerment, skill acquisition and improved coping.
Numerous studies have demonstrated poor compliance with the antidepressant treatments commonly prescribed in Recurrent Depressive Disorder. This small-scale study demonstrates a statistical and clinical benefit of MBCT for these patients, supporting greater use of such novel non-pharmacological therapeutic options as treatment strategies.
The majority of people with dementia will develop one or more behavioural or psychological symptoms of dementia (BPSD) as the illness progresses. Treating these symptoms in diverse residential environments is a challenge, with frequent prescribing of antipsychotic medications. The risks and limited benefits of antipsychotic use in this context are well recognised, prompting national guidelines in Ireland to improve prescribing patterns.
1) Assess the frequency and appropriateness of antipsychotic prescribing in older adults with BPSD referred to the Psychiatry of Old Age service in the West of Ireland (Sligo), by comparison with best-practice guidelines.
2) Address identified deficits via quality improvement initiatives within the department.
Audit standards were set using the draft National Clinical Guidelines and NICE guidelines for prescribing in dementia to develop a study-specific audit tool.
Items assessed included: the frequency of review of antipsychotic use; whether non-pharmacological methods were trialled; whether the benefit of the antipsychotic was assessed and its risks discussed; whether a reduction or discontinuation of the antipsychotic was considered; and whether metabolic monitoring was achieved.
Clinical records for all patients actively under the care of the clinical team with a diagnosis of BPSD were assessed using this tool at the time of the study.
In this period, 49 patients with BPSD were attending the service. Of the entire cohort, 58% (n = 29) were prescribed an antipsychotic, most commonly quetiapine. Patients cared for at home showed the lowest level of antipsychotic use at 50% (n = 18), while those in nursing home care (80%, n = 8) and hospital care (100%, n = 3) showed higher rates, though the sample was too small to demonstrate statistical significance (χ² = 5.12, p = 0.077).
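For illustration, the reported chi-squared statistic can be reproduced from cell counts inferred from the percentages above (18/36 at home, 8/10 in nursing homes, 3/3 in hospital prescribed an antipsychotic); this is a sketch, not the study's analysis code:

```python
def chi_squared(table):
    """Pearson chi-squared statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Rows: prescribed / not prescribed; columns: home, nursing home, hospital.
# Counts inferred from the percentages quoted in the Results.
table = [[18, 8, 3],
         [18, 2, 0]]
chi2 = chi_squared(table)   # matches the reported value of 5.12
```

With small expected cell counts like these (two cells below 5), an exact test would normally be preferred, which is consistent with the cautious interpretation given above.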
Exploration of non-pharmacological management of BPSD, documentation of discussion of the risks of antipsychotic medication (metabolic, cardiovascular, falls, sedation, extrapyramidal), and attempts at dose reduction or antipsychotic withdrawal were each achieved in less than 45% of cases (range 33–45%).
This audit revealed higher than expected rates of antipsychotic prescribing in our BPSD cohort. It also revealed suboptimal documentation around the use of antipsychotics in this population during clinical interactions.
A subsequent intervention to the proforma assessment tool to prompt these discussions improved these behaviours, but there was no impact on the rates of antipsychotic prescribing.
Despite increased attention to the limited benefits of antipsychotic medication in BPSD, their use remains widespread. Due attention must be given to changing this practice in order to protect this vulnerable patient group.
Background: In recent years, the historic declines in the incidence of methicillin-resistant Staphylococcus aureus (MRSA) bloodstream infections (BSIs) in the United States have slowed. We examined trends in the incidence of community-onset (CO) MRSA BSIs among hospitalized persons with and without substance-use diagnoses.

Methods: Using data from >200 US hospitals reporting to the Premier Healthcare Database (PHD) during 2012–2017, we conducted a retrospective study among hospitalized persons aged ≥18 years. MRSA BSIs with substance use were defined as hospitalizations having both a blood culture positive for MRSA and at least 1 International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) or ICD-10-CM diagnostic code for substance use, including opioids, cocaine, amphetamines, or other substances (excluding cannabis, alcohol, and nicotine). MRSA BSIs were considered community onset when a positive blood culture was collected within 3 days of admission. We assessed annual trends and described characteristics of CO MRSA BSI hospitalizations, stratified by substance use.

Results: Of 20,049 MRSA BSIs from 2012 to 2017, 17,634 (88%) were CO. Overall, MRSA BSI incidence decreased 7%, from 178.5 to 166.2 per 100,000 hospitalizations during the study period; however, CO MRSA BSI rates remained stable (152.7 to 149.9 per 100,000 hospitalizations). Among CO MRSA BSIs, 1,838 (10%) were BSIs with substance-use diagnoses; the incidence of CO MRSA BSIs with substance use increased 236% (from 8.2 to 27.6 per 100,000 hospitalizations) and represented a greater proportion of the CO MRSA rate over the study period (Fig. 1). The incidence of CO MRSA BSIs without substance use decreased 15% (from 144.5 to 122.4 per 100,000 hospitalizations). Patients with CO MRSA BSIs with substance use were younger (median, 40 vs 65 years) and more likely to be female (50% vs 40%), white (79% vs 69%), and to leave against medical advice (15% vs 1%). Among patients not leaving against medical advice, CO BSI patients with substance-use diagnoses had longer lengths of stay (median, 11 vs 9 days), lower in-hospital mortality (9% vs 14%), and higher hospitalization costs (median, $22,912 vs $17,468) compared to patients without substance-use diagnoses.

Conclusions: Although the overall CO MRSA BSI rate remained unchanged from 2012 to 2017, infections with substance-use diagnoses increased >3-fold, and infections without substance-use diagnoses decreased. These data suggest that the emergence of MRSA associated with substance-use diagnoses threatens potential progress in reducing the incidence of CO MRSA infections. Additional strategies may be needed to prevent MRSA BSIs in patients with substance-use diagnoses and to maintain national progress in the reduction of MRSA infections overall.
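The percentage changes quoted in the Results follow directly from the first- and last-year rates. As a quick check (rates per 100,000 hospitalizations, taken from the abstract):

```python
def percent_change(start, end):
    """Percentage change from a starting rate to an ending rate."""
    return (end - start) / start * 100

# Rates per 100,000 hospitalizations, 2012 vs 2017 (from the abstract):
overall = percent_change(178.5, 166.2)      # overall MRSA BSI: ~7% decrease
with_su = percent_change(8.2, 27.6)         # CO MRSA BSI with substance use: large increase
without_su = percent_change(144.5, 122.4)   # CO MRSA BSI without substance use: ~15% decrease
```

The computed values (−6.9%, +236.6%, and −15.3%) match the rounded figures reported in the abstract.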
Background: Microbiology data are utilized to quantify epidemiology and trends in pathogens, antimicrobial resistance, and bloodstream infections. Understanding variability and trends in rates of hospital-level blood culture utilization may be important for interpreting these findings.

Methods: We used clinical microbiology results and discharge data to identify monthly blood culture rates from US hospitals participating in the Premier Healthcare Database during 2012–2017. We included all discharges from months in which a hospital reported at least 1 blood culture with microbiology and antimicrobial susceptibility results. Blood cultures drawn on or before day 3 were defined as admission cultures (ACs); blood cultures collected after day 3 were defined as postadmission cultures (PACs). The AC rate was defined as the proportion of all hospitalizations with an AC. The PAC rate was defined as the number of days with a PAC among all patient days. Generalized estimating equation regression models that accounted for hospital-level clustering with an exchangeable correlation matrix were used to measure associations of monthly rates with hospital bed size, teaching status, urban–rural designation, region, month, and year. The AC rates were modeled using logistic regression, and the PAC rates were modeled using a Poisson distribution.

Results: We included 11.7 million hospitalizations from 259 hospitals, accounting for nearly 52 million patient days. The median annual hospital-level AC rate was 27.1%, with interhospital variation ranging from 21.1% (quartile 1) to 35.2% (quartile 3) (Fig. 1). Multivariable models revealed no significant trends over time (P = .74) but statistically significant associations of AC rates with month (P < .001) and region (P = .003); associations with teaching status (P = .063) and urban–rural designation (P = .083) approached statistical significance. There was no association with bed size (P = .38). The median annual hospital-level PAC rate was 11.1 per 1,000 patient days, and interhospital variability ranged from 7.6 (quartile 1) to 15.2 (quartile 3) (Fig. 2). Multivariable models of PAC rates showed no significant trends over time (P = .12). We found associations between PAC rates and month (P = .016), bed size (P = .030), and teaching status (P = .040). PAC rates were not associated with urban–rural designation (P = .52) or region (P = .29).

Conclusions: Blood culture utilization rates in this large cohort of hospitals were unchanged between 2012 and 2017, though substantial interhospital variability was detected. Although both AC and PAC rates vary by time of year and potentially by teaching status, AC rates vary by geographic characteristics whereas PAC rates vary by bed size. These factors are important to consider when comparing rates of bloodstream infections by hospital.
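The two utilization measures defined in the Methods can be sketched as follows, applied to a toy dataset. The records and field names below are illustrative (hypothetical admissions, not Premier Healthcare Database data):

```python
def admission_culture_rate(hospitalizations):
    """Proportion of hospitalizations with a blood culture on or before day 3 (AC)."""
    with_ac = sum(1 for h in hospitalizations
                  if any(day <= 3 for day in h["culture_days"]))
    return with_ac / len(hospitalizations)

def post_admission_culture_rate(hospitalizations):
    """Distinct days with a culture after day 3 (PAC) per 1,000 patient days."""
    pac_days = sum(len({d for d in h["culture_days"] if d > 3})
                   for h in hospitalizations)
    patient_days = sum(h["length_of_stay"] for h in hospitalizations)
    return pac_days / patient_days * 1000

# Toy example: three hospitalizations with blood-culture collection days.
stays = [
    {"culture_days": [1], "length_of_stay": 4},         # AC only
    {"culture_days": [2, 6, 6], "length_of_stay": 10},  # AC plus one PAC day
    {"culture_days": [], "length_of_stay": 6},          # no cultures
]
ac_rate = admission_culture_rate(stays)        # 2 of 3 stays have an AC
pac_rate = post_admission_culture_rate(stays)  # 1 PAC day over 20 patient days
```

Because the AC rate is a per-hospitalization proportion while the PAC rate is a per-patient-day count, the logistic and Poisson model choices described in the Methods follow naturally.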
The U.S. Food and Drug Administration (FDA) traditionally has kept confidential significant amounts of information relevant to the approval or non-approval of specific drugs, devices, and biologics and about the regulatory status of such medical products in FDA’s pipeline.
To develop practical recommendations for FDA to improve its transparency to the public that FDA could implement by rulemaking or other regulatory processes without further congressional authorization. These recommendations would build on the work of FDA’s Transparency Task Force in 2010.
In 2016-2017, we convened a team of academic faculty from Harvard Medical School, Brigham and Women’s Hospital, Yale Medical School, Yale Law School, and Johns Hopkins Bloomberg School of Public Health to develop recommendations through an iterative process of reviewing FDA’s practices, considering the legal and policy constraints on FDA in expanding transparency, and obtaining insights from independent observers of FDA.
The team developed 18 specific recommendations for improving FDA’s transparency to the public. FDA could adopt all these recommendations without further congressional action.
The development of the Blueprint for Transparency at the U.S. Food and Drug Administration was funded by the Laura and John Arnold Foundation.
This study compared the pathogenicity of experimental infection with Angiostrongylus mackerrasae and the closely related species A. cantonensis in mice and guinea pigs. Time-course analyses showed that A. mackerrasae causes eosinophilic meningitis in these hosts, which suggests that the species has the potential to cause meningitis in humans and domestic animals. Both A. mackerrasae and the genetically similar A. cantonensis caused eosinophilic meningitis in mice at two time points, 14 and 21 days post infection (dpi). The brain lesions in mice infected with A. mackerrasae were more granulomatous in nature, and the parasites were more likely to appear degenerate, compared with lesions caused by A. cantonensis. This may indicate that the mouse immune system eliminates A. mackerrasae infection more effectively. The immunologic responses of mice infected with the two Angiostrongylus species were compared by assessing ex vivo stimulated spleen-derived T cells and cytokines, including interferon-gamma, interleukin 4 and interleukin 17, at 14 and 21 dpi. The results were similar for mice infected with A. cantonensis and A. mackerrasae. Serum from animals infected with either A. cantonensis or A. mackerrasae recognized total soluble antigen of A. cantonensis female worms on Western blot.
Disaster response requires rapid, complex action by multiple agencies that may rarely interact during nondisaster periods. Failures in communication and coordination between agencies have been pitfalls in the advancement of disaster preparedness. Recommendations of the Federal Emergency Management Agency address these needs and demonstrate commitment to successful disaster management, but they are challenging for communities to implement. In this article we describe the application of Federal Emergency Management Agency guidelines to the 2008 and 2009 Chicago Marathon and discuss the details of our implementation strategy with a focus on optimizing communication. We believe that it is possible to enhance community disaster preparedness through practical application during mass sporting events.
(Disaster Med Public Health Preparedness. 2011;5:310–315)
Community-based natural resource management (CBNRM) has been on the ascendancy for several decades and plays a leading role in conservation strategies worldwide. Arising out of a desire to rectify the human costs associated with coercive conservation, CBNRM sought to return the stewardship of biodiversity and natural resources to local communities through participation, empowerment and decentralization. Today, however, scholars and practitioners suggest that CBNRM is experiencing a crisis of identity and purpose, with even the most positive examples experiencing only fleeting success due to major deficiencies. Six case studies from around the world offer a history of how and why the global CBNRM narrative has unfolded over time and space. While CBNRM emerged with promise and hope, it often ended in less than ideal outcomes when institutionalized and reconfigured in design and practice. Nevertheless, despite the current crisis, there is scope for refocusing on the original ideals of CBNRM: ensuring social justice, material well-being and environmental integrity.
Objectives: US hospitals are expected to function without external aid for up to 96 hours during a disaster; however, concern exists that there is insufficient capacity in hospitals to absorb large numbers of acute casualties. The aim of the study was to determine the potential for creation of inpatient bed surge capacity from the early discharge (reverse triage) of hospital inpatients at low risk of untoward events for up to 96 hours.
Methods: In a health system with 3 capacity-constrained hospitals that are representative of US facilities (academic, teaching affiliate, community), a variety (N = 50) of inpatient units were prospectively canvassed in rotation using a blocked randomized design for 19 weeks ending in February 2006. Intensive care units (ICUs), nurseries, and pediatric units were excluded. Assuming a disaster occurred on the day of enrollment, patients who did not require any (previously defined) critical intervention for 4 days were deemed suitable for early discharge.
Results: Of 3491 patients, 44% did not require any critical intervention and were suitable for early discharge. Accounting for additional routine patient discharges and full use of staffed and unstaffed licensed beds, gross surge capacity was estimated at 77%, 95%, and 103% for the 3 hospitals. Factoring in the likely continuance of nonvictim emergency admissions, net surge capacity available for disaster victims was estimated at 66%, 71%, and 81%, respectively. Reverse triage accounted for the majority of surge beds (50%, 55%, and 59%). Most realized capacity was available within 24 to 48 hours.
Conclusions: Hospital surge capacity for standard inpatient beds may be greater than previously believed. Reverse triage, if appropriately harnessed, can be a major contributor to surge capacity. (Disaster Med Public Health Preparedness. 2009;3(Suppl 1):S10–S16)
Mastitis is one of the most costly diseases in the dairy farming industry. Conventional antibiotic therapy is often unsatisfactory for successful treatment of mastitis, and alternative treatments are continually under investigation. We have previously demonstrated, in two separate field trials, that a probiotic culture, Lactococcus lactis DPC 3147, was comparable to antibiotic therapy in treating bovine mastitis. To understand the mode of action of this therapeutic, we examined the detailed immune response of the host to delivery of this live strain directly into the mammary gland of six healthy dairy cows. All animals showed signs of udder inflammation 7 h post infusion. At this time, clots were visible in the milk of all animals in the investigation. The most pronounced increase in immune gene expression was observed in interleukin (IL)-1β and IL-8, with the highest expression corresponding to peaks in somatic cell count. Infusion with a live culture of Lc. lactis thus leads to a rapid and considerable innate immune response.
Reported here is the development of a novel evolved gas analysis technique, Sub-Ambient Thermal Volatilization Analysis (SATVA), and its application in characterizing key analyte species from conservation artifacts. In this work, SATVA has been applied to the study of volatile-evolution processes occurring in a number of model conservation artifacts. The evolution of volatile species from cured formaldehyde resin, leather and metallic artifacts has been studied by SATVA. The specific analytes making up the total quantity of evolved material in each case have been separated and identified using sub-ambient differential distillation and a combination of online mass spectrometry, gas-phase IR spectroscopy and GC-MS. The data gathered have been used to provide information on both the degradation processes occurring within the artifacts and the environmental history of the artifacts themselves.
The Hurrian city of Nuzi, in modern Iraq, was an important site during the Mesopotamian Bronze Age. Excavations in the late 1920s and early 1930s yielded a large and important assemblage of glass and other vitreous materials, and smaller but significant assemblages of metals and ceramics. Although the vitreous materials have been widely studied in the past, the other assemblages have received little attention. However, a recent study of some metal artifacts indicated the presence of brass and dirty copper rather than the expected bronze. This study was, however, limited to a few objects, and the proportions of the different alloys were not investigated. Recent analytical studies on the glass beads have highlighted compositional differences between Egyptian and Mesopotamian glass and attempted to link these to the raw materials used. The lack of significant tin or zinc in glasses colored with copper is interesting given the presence of brass and the apparent scarcity of bronze in the copper alloys. The current study involves reassessment of the entire assemblage, concentrating initially on the vitreous materials, glazes and metals. Variations in preservation across the site and within individual buildings are currently being examined. Full characterization of the assemblages will allow relationships between different manufacturing technologies and the raw materials needed to be investigated.
Objectives: The aim of this study was to assess the cost-effectiveness of a class-based exercise program supplementing a home-based program when compared with a home-based program alone. In addition, we estimated the probability that the supplementary class program is cost-effective over a range of values of a decision maker's willingness to pay for an additional quality-adjusted life-year (QALY).
Methods: The resource use and effectiveness data were collected as part of the clinical trial detailed elsewhere. Unit costs were estimated from published sources. The net benefit approach to cost-effectiveness analysis was used to estimate the probability of the intervention being cost-effective.
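The net benefit approach named above reduces to a simple formula: incremental net monetary benefit NMB = λ·ΔQALY − ΔCost, where λ is the decision maker's willingness to pay per QALY. A minimal sketch with hypothetical incremental values (not the trial's estimates):

```python
def net_monetary_benefit(delta_qaly, delta_cost, wtp):
    """Incremental net monetary benefit at willingness-to-pay `wtp` per QALY."""
    return wtp * delta_qaly - delta_cost

# Hypothetical increment: +0.02 QALYs alongside a cost saving of 150 per
# patient (delta_cost is negative). An intervention that both adds QALYs
# and lowers costs is dominant: NMB is positive at every threshold.
nmbs = [net_monetary_benefit(0.02, -150.0, wtp)
        for wtp in (0, 10_000, 20_000, 30_000)]
```

In practice the probability of cost-effectiveness is estimated by computing the NMB across bootstrap replicates of the trial data and reporting the proportion that are positive at each threshold.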
Results: The addition of a supplementary class-based group resulted in an increase in QALYs and lower costs. For all plausible values of a decision maker's willingness to pay for a QALY, the supplementary class group is likely to be cost-effective.
Conclusions: The addition of a class-based exercise program is likely to be cost-effective and, on current evidence, should be implemented.
The feasibility of percutaneous intramuscular functional electrical stimulation (P-FES) in children with cerebral palsy (CP) for immediate improvement of ankle kinematics during gait has not previously been reported. Eight children with CP (six with diplegia, two with hemiplegia; mean age 9 years 1 month [SD 1y 4mo; range 7y 11mo to 11y 10mo]) had percutaneous intramuscular electrodes implanted into the gastrocnemius (GA) and tibialis anterior (TA) muscles of their involved limbs. Stimulation was provided during appropriate phases of the gait cycle in three conditions (GA only, TA only, and GA/TA). Immediately after a week of practice for each stimulation condition, a gait analysis was performed with and without stimulation. A significant improvement in peak dorsiflexion in swing for the more affected extremity and in dorsiflexion at initial contact for the less affected extremity was found in the GA/TA condition. Clinically meaningful trends were evident for improvements in dorsiflexion kinematics for the more and less affected extremities in the TA only and GA/TA conditions. The results suggest that P-FES might immediately improve ankle kinematics in children with CP.