To investigate associations between healthcare-associated Clostridioides difficile infection and patient demographics at an urban safety-net hospital and compare findings with national surveillance statistics.
Study participants were selected with a case-control design from medical records collected between August 2014 and May 2018 at Hahnemann University Hospital in Philadelphia. Controls were frequency matched to cases by age and length of stay. The final sample included 170 cases and 324 controls. Neighborhood-level factors were measured using American Community Survey data. Multilevel models were used to examine infection by census tract, deprivation index, race/ethnicity, insurance type, referral location, antibiotic use, and proton-pump inhibitor use.
After adjusting for all covariables, patients on Medicare had 2.04 times the odds of infection (95% CI, 1.31–3.20) compared with patients on private insurance. Prior antibiotic use was also associated with infection (OR, 2.70; 95% CI, 1.64–4.46), but race/ethnicity and referral location were not. A smaller proportion of hospital cases occurred among white patients (25% vs 44%) and patients over the age of 65 (39% vs 56%) than expected based on national surveillance statistics.
Medicare coverage and prior antibiotic use were associated with Clostridioides difficile infection, but the evidence did not indicate an association with race or ethnicity. This finding diverges from national data, in which infection rates are higher among white people than among nonwhite people. Furthermore, a greater proportion of hospital cases were aged <65 years than expected based on national data. National surveillance statistics on CDI may not be transportable to safety-net hospitals, which often disproportionately serve low-income, nonwhite patients.
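The adjusted odds ratios above come from multilevel models, but the basic arithmetic of an unadjusted odds ratio and its Wald confidence interval can be sketched from a 2×2 table. The counts below are hypothetical, chosen only to illustrate the calculation; they are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(60, 110, 40, 150)
```

The study's published estimates additionally adjust for covariables and for clustering by census tract, so they cannot be reproduced from a single 2×2 table.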
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none were lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%; 95% CI, −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group, 21.2% of patients converted with the infusion.
There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
Introduction: Patients with major bleeding (e.g. gastrointestinal bleeding, and intracranial hemorrhage [ICH]) are commonly encountered in the Emergency Department (ED). A growing number of patients are on either oral or parenteral anticoagulation (AC), but the impact of AC on outcomes of patients with major bleeding is unknown. With regards to oral anticoagulation (OAC), we particularly sought to analyze differences between patients on Warfarin or Direct Oral Anticoagulants (DOACs). Methods: We analyzed a prospectively collected registry (2011-2016) of patients who presented to the ED with major bleeding at two academic hospitals. “Major bleeding” was defined by the International Society on Thrombosis and Haemostasis criteria. The primary outcome, in-hospital mortality, was analyzed using a multivariable logistic regression model. Secondary outcomes included discharge to long-term care among survivors, total hospital length of stay (LOS) among survivors, and total hospital costs. Results: 1,477 patients with major bleeding were included. AC use was found among 215 total patients (14.6%). Among OAC patients (n = 181), 141 (77.9%) had used Warfarin, and 40 (22.1%) had used a DOAC. 484 patients (32.8%) died in-hospital. AC use was associated with higher in-hospital mortality (adjusted odds ratio [OR]: 1.50 [1.17-1.93]). Among survivors to discharge, AC use was associated with higher discharge to long-term care (adjusted OR: 1.73 [1.18-2.57]), prolonged median LOS (19 days vs. 16 days, P = 0.03), and higher mean costs ($69,273 vs. $58,156, P = 0.02). With regards to OAC, a higher proportion of ICH was seen among patients on Warfarin (39.0% vs. 32.5%), as compared to DOACs. No difference in mortality was seen between DOACs and Warfarin (adjusted OR: 0.84 [0.40-1.72]). Patients with major bleeding on Warfarin had longer median LOS (11 days vs. 6 days, P = 0.03) and higher total costs ($51,524 vs. $35,176, P < 0.01) than patients on DOACs. 
Conclusion: AC use was associated with higher mortality among ED patients with major bleeding. Among survivors, AC use was associated with increased LOS, costs, and discharge to long-term care. Among OAC patients, no difference in mortality was found. Warfarin was associated with prolonged LOS and costs, likely secondary to higher incidence of ICH, as compared to DOACs.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%; 95% CI, 0.2 to 9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04).
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
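The trial's absolute difference of 4.8% (95% CI 0.2–9.9) can be approximately reconstructed with a simple Wald interval for the difference of two proportions. The counts below are back-calculated from the reported percentages (97.0% of 198 and 92.2% of 180), and the Wald method is an assumption; the published interval was presumably computed with a different method, so the bounds differ slightly:

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Absolute risk difference between two proportions with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Counts back-calculated from the reported conversion rates (approximate):
diff, lo, hi = risk_difference_ci(192, 198, 166, 180)  # Drug-Shock vs Shock Only
```

With these assumed counts the sketch gives roughly 4.7% (0.2% to 9.3%), close to but not identical with the published figures.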
To assess differences in cognitive function and gross brain structure in children seven years after an episode of severe acute malnutrition (SAM), compared with other Malawian children.
Prospective longitudinal cohort assessing school grade achieved and results of five computer-based (CANTAB) tests, covering three cognitive domains. A subset underwent brain MRI scans which were reviewed using a standardized checklist of gross abnormalities and compared with a reference population of Malawian children.
Children discharged from SAM treatment in 2006 and 2007 (n 320; median age 9·3 years) were compared with controls: siblings closest in age to the SAM survivors and age/sex-matched community children.
SAM survivors were significantly more likely to be in a lower grade at school than controls (adjusted OR = 0·4; 95 % CI 0·3, 0·6; P < 0·0001) and had consistently poorer scores in all CANTAB cognitive tests. Adjusting for HIV and socio-economic status diminished statistically significant differences. There were no significant differences in odds of brain abnormalities and sinusitis between SAM survivors (n 49) and reference children (OR = 1·11; 95 % CI 0·61, 2·03; P = 0·73).
Despite apparent preservation in gross brain structure, persistent impaired school achievement is likely to be detrimental to individual attainment and economic well-being. Understanding the multifactorial causes of lower school achievement is therefore needed to design interventions for SAM survivors to thrive in adulthood. The cognitive and potential economic implications of SAM need further emphasis to better advocate for SAM prevention and early treatment.
Cognitive–behavioural therapy (CBT) is the treatment of choice for generalised anxiety disorder (GAD), yielding significant improvements in approximately 50% of patients. There is significant room for improvement in the outcomes of treatment, especially in recovery.
We aimed to compare metacognitive therapy (MCT) with the gold standard treatment, CBT, in patients with GAD (clinicaltrials.gov identifier: NCT00426426).
A total of 246 patients with long-term GAD were assessed and 81 were randomised into three conditions: CBT (n = 28), MCT (n = 32) and a wait-list control (n = 21). Assessments were made at pre-treatment, post-treatment and at 2 year follow-up.
Both CBT and MCT were effective treatments, but MCT was more effective (mean difference 9.762, 95% CI 2.679–16.845, P = 0.004) and led to significantly higher recovery rates (65% v. 38%). These differences were maintained at 2 year follow-up.
MCT seems to produce recovery rates that exceed those of CBT. These results demonstrate that the effects of treatment cannot be attributed to non-specific therapy factors.
Declaration of interest
A.W. wrote the treatment protocol in MCT and several books on CBT and MCT, and receives royalties from these. T.D.B. wrote the protocol in CBT and has published several articles and chapters on CBT and receives royalties from these. All other authors declare no competing interests.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT and most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
Introduction: Current guideline recommendations for optimal management of non-purulent skin and soft tissue infections (SSTIs) are based on expert consensus. There is currently a lack of evidence to guide emergency physicians on when to select oral versus intravenous antibiotic therapy. The primary objective was to identify risk factors associated with oral antibiotic treatment failure. A secondary objective was to describe the epidemiology of adult emergency department (ED) patients with non-purulent SSTIs. Methods: We performed a health records review of adults (age ≥18 years) with non-purulent SSTIs treated at two tertiary care EDs. Patients were excluded if they had a purulent infection or infected ulcers without surrounding cellulitis. Treatment failure was defined as any of the following after a minimum of 48 hours of oral therapy: (i) hospitalization for SSTI; (ii) change in class of oral antibiotic owing to infection progression; or (iii) change to intravenous therapy owing to infection progression. Multivariable logistic regression was used to identify predictors independently associated with the primary outcome of oral antibiotic treatment failure after a minimum of 48 hours of oral therapy. Results: We enrolled 500 patients (mean age 64 years; 279 (55.8%) male; 126 (25.2%) with diabetes), and the hospital admission rate was 29.6%. The majority of patients (70.8%) received at least one intravenous antibiotic dose in the ED. Of 288 patients who had received a minimum of 48 hours of oral antibiotics, there were 85 oral antibiotic treatment failures (29.5%). Tachypnea at triage (odds ratio [OR]=6.31, 95% CI=1.80 to 22.08), chronic ulcers (OR=4.90, 95% CI=1.68 to 14.27), history of MRSA colonization or infection (OR=4.83, 95% CI=1.51 to 15.44), and cellulitis in the past 12 months (OR=2.23, 95% CI=1.01 to 4.96) were independently associated with oral antibiotic treatment failure.
Conclusion: This is the first study to evaluate potential predictors of oral antibiotic treatment failure for non-purulent SSTIs in the ED. We observed a high rate of treatment failure and hospitalization. Tachypnea at triage, chronic ulcers, history of MRSA colonization or infection and cellulitis within the past year were independently associated with oral antibiotic treatment failure. Emergency physicians should consider these risk factors when deciding on oral versus intravenous antimicrobial therapy for non-purulent SSTIs being managed as outpatients.
Introduction: Two published studies reported natriuretic peptides can aid in risk-stratification of Emergency Department (ED) syncope. We sought to assess the role of N-Terminal pro Brain Natriuretic Peptide (NT pro-BNP) to identify syncope patients at risk for serious adverse events (SAE) within 30 days of the ED visit, and its value above that of the Canadian Syncope Risk Score (CSRS). Methods: We conducted a multicenter prospective cohort study at 6 large Canadian EDs from Nov 2011 to Feb 2015. We enrolled adults who presented within 24 hours of syncope and excluded those with persistent altered mentation, obvious seizure, and intoxication. We collected patient characteristics, nine CSRS predictors (including troponin), ED management and NT pro-BNP levels. Adjudicated serious adverse events (SAE) included death, cardiac SAE (arrhythmias, myocardial infarction, serious structural heart disease) and non-cardiac SAE (pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We used two-tailed t-tests and logistic regression analysis. Results: Of the 1359 patients enrolled (mean age 57.2 years, 54.7% female, 13.3% hospitalized), 148 patients (10.9%; 0.7% deaths, 7.9% cardiac SAE including 6.1% arrhythmia) suffered SAE within 30 days. Compared with patients with no SAE (499.8 ng/L), mean NT pro-BNP was significantly higher among the 56 patients who suffered SAE after ED disposition (3147 ng/L, p=0.001) and among the 35 patients with cardiac SAE after ED disposition (2016.2 ng/L, p=0.02). While there was a trend toward higher levels among patients who suffered arrhythmia after the ED visit, it was not statistically significant (1776.4 ng/L, p=0.07). In a model with the CSRS predictors, the adjusted odds ratio for NT pro-BNP was 8.0 (95% CI 1.8, 35.9) and for troponin was 3.8 (95% CI 1.7, 8.8).
The addition of NT pro-BNP did not significantly improve classification performance (p=0.76): the area under the curve was 0.91 (95% CI 0.88, 0.95) for the CSRS alone and 0.92 (95% CI 0.88, 0.95) for the CSRS with NT pro-BNP. Conclusion: In this multicenter study, mean NT pro-BNP levels were significantly higher among ED syncope patients who suffered SAE, including cardiac SAE, after ED disposition. Though NT pro-BNP was a significant independent predictor of SAE after ED disposition, it did not improve accuracy in ED syncope risk-stratification when compared to the CSRS. Hence, we do not recommend NT pro-BNP measurement for ED syncope management.
Introduction: Emergency department (ED) patients with non-purulent skin and soft tissue infections (SSTIs) requiring intravenous antibiotics may be managed via outpatient parenteral antibiotic therapy (OPAT). To date, there are no prospective studies describing the performance of an ED-to-OPAT clinic program. Furthermore, there are no studies that have examined physician rationale for intravenous therapy, despite this being a critical first step in the decision to refer to an OPAT program. Methods: We conducted a prospective observational cohort study of adults (age ≥18 years) with non-purulent SSTIs receiving parenteral therapy at two tertiary care EDs. Patients were excluded if they had purulent infections or could not provide consent. The emergency physician completed a form documenting rationale for intravenous therapy, infection size, and choice of antimicrobial agent, dose and duration. OPAT treatment failure was defined as hospitalization after a minimum of 48 hours of OPAT for: (i) worsening infection; (ii) peripheral intravenous line complications; or (iii) adverse antibiotic events. Patient satisfaction was assessed at a 14-day telephone follow-up. Results: We enrolled a consecutive sample of 153 patients (mean age 60 years; 82 (53.6%) male; 38 (24.8%) with diabetes). A total of 137 patients (89.5%) attended their clinic appointment. Of the 101 patients prescribed cefazolin, 50.5% received 1000 mg and 48.5% received 2000 mg per day. There were low rates of OPAT treatment failure (3.9%). None of the adverse peripheral intravenous line events (9.8%) or adverse antibiotic events (7.2%) required hospitalization. Patients reported a high degree of satisfaction with timeliness of clinic referral (median score 9 out of 10) and overall care received (median score of 10 out of 10).
The top 5 reasons given by physicians for selecting intravenous therapy were: clinical impression of severity (52.9%); failed oral antibiotic therapy (41.8%); diabetes (17.6%); severe pain (7.8%); and peripheral vascular disease (7.8%). Conclusion: This is the first study to identify physician rationale for the use of intravenous antibiotics for SSTIs. There was significant variability in antibiotic prescribing practices by ED physicians. This prospective study demonstrates that an ED-to-OPAT clinic program for non-purulent SSTIs is safe, has a low rate of treatment failures and results in high patient satisfaction.
We sought to evaluate the role healthcare providers play in carbapenem-resistant Enterobacteriaceae (CRE) acquisition among hospitalized patients.
A 1:4 case-control study with incidence density sampling.
Academic healthcare center with regular CRE perirectal screening in high-risk units.
We included case patients with ≥1 negative CRE test followed by positive culture with a length of stay (LOS) >9 days. For controls, we included patients with ≥2 negative CRE tests and assignment to the same unit set as case patients with a LOS >9 days.
Controls were time-matched to each case patient. Case exposure was evaluated between days 2 and 9 before positive culture and control evaluation was based on maximizing overlap with the case window. Exposure sources were all CRE-colonized or -infected patients. Nonphysician providers were compared between study patients and sources during their evaluation windows. Dichotomous and continuous exposures were developed from the number of source-shared providers and were used in univariate and multivariate regression.
In total, 121 cases and 484 controls were included. Multivariate analysis showed odds of dichotomous exposure (≥1 source-shared provider) of 2.27 (95% confidence interval [CI], 1.25–4.15; P=.006) for case patients compared to controls. Multivariate continuous exposure showed odds of 1.02 (95% CI, 1.01–1.03; P=.009) for case patients compared to controls.
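The continuous-exposure estimate of 1.02 is a per-unit odds ratio: assuming the unit is one additional source-shared provider (an interpretation inferred from the abstract's description of the exposure), the odds of acquisition multiply by about 1.02 for each additional shared provider, so exposure compounds multiplicatively on the logistic scale:

```python
def compounded_or(per_unit_or, n_units):
    """Odds ratio implied by n_units increments of a per-unit odds ratio
    on the multiplicative (logistic regression) scale."""
    return per_unit_or ** n_units

# With OR = 1.02 per shared provider, ~10 additional shared providers
# raise the odds by about 22%, and ~35 roughly double them:
print(round(compounded_or(1.02, 10), 2))  # 1.22
print(round(compounded_or(1.02, 35), 2))  # 2.0
```

This is only the point estimate; the uncertainty in the per-unit estimate (95% CI, 1.01–1.03) compounds in the same way.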
Patients who acquire CRE during hospitalization are more likely to receive care from a provider caring for a patient with CRE than those patients who do not acquire CRE. These data support the importance of hand hygiene and cohorting measures for CRE patients to reduce transmission risk.
Introduction: Most ambulance communication officers receive minimal education on agonal breathing, often leading to unrecognized out-of-hospital cardiac arrest (OHCA). We sought to evaluate the impact of an educational program on cardiac arrest recognition, and on bystander CPR and survival rates. Methods: Ambulance communication officers in Ottawa, Canada received additional training on agonal breathing, while the control site (Windsor, Canada) did not. Sites were compared to their pre-study performance (before-after design), and to each other (concurrent control). Trained investigators used a piloted, standardized data collection tool when reviewing the recordings for all potential OHCA cases submitted. OHCA was confirmed using our local OHCA registry, and we requested 9-1-1 recordings for OHCA cases not initially suspected. Two independent investigators reviewed medical records for non-OHCA cases receiving telephone-assisted CPR in Ottawa. We present descriptive and chi-square statistics. Results: There were 988 confirmed and suspected OHCA in the “before” group (540 Ottawa; 448 Windsor), and 1,076 in the “after” group (689 Ottawa; 387 Windsor). Characteristics of “after” group OHCA patients were: mean age (68.1 Ottawa, 68.2 Windsor); male (68.5% Ottawa, 64.8% Windsor); witnessed (45.0% Ottawa, 41.9% Windsor); and initial rhythm VF/VT (28.9% Ottawa, 22.5% Windsor). Before-after comparisons were: for cardiac arrest recognition (from 65.4% to 71.9% in Ottawa, p=0.03; from 70.9% to 74.1% in Windsor, p=0.37); for bystander CPR rates (from 23.0% to 35.9% in Ottawa, p=0.0001; from 28.2% to 39.4% in Windsor, p=0.001); and for survival to hospital discharge (from 4.1% to 12.5% in Ottawa, p=0.001; from 3.9% to 6.9% in Windsor, p=0.03). “After” group comparisons between Ottawa and Windsor (control) were not statistically different, except survival (p=0.02).
Agonal breathing was common (25.6% Ottawa, 22.4% Windsor) and present in 18.5% of missed cases (15.8% Ottawa, 22.2% Windsor p=0.27). In Ottawa, 31 patients not in OHCA received chest compressions resulting from telephone-assisted CPR instructions. None suffered injury or adverse effects. Conclusion: While all OHCA outcomes improved over time, the educational intervention significantly improved OHCA recognition in Ottawa, and appeared to mitigate the impact of agonal breathing.
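The before-after comparison of bystander CPR in Ottawa (23.0% to 35.9%, p=0.0001) can be approximately checked with a pooled two-proportion z-test. The counts below are back-calculated from the reported rates and group sizes, and treating all confirmed/suspected OHCA as the denominator is an assumption, so this is a rough sketch rather than a reproduction of the study's analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic; z**2 is the 1-df chi-square."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under the null
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Ottawa bystander CPR, "after" (35.9% of 689) vs "before" (23.0% of 540),
# with counts back-calculated from the reported percentages:
z = two_proportion_z(247, 689, 124, 540)
```

A z statistic of roughly 4.9 corresponds to a two-sided p-value well below 0.001, consistent with the reported p=0.0001.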
Intestinal barrier integrity is a prerequisite for homeostasis of mucosal function, which is balanced to maximise absorptive capacity, while maintaining efficient defensive reactions against chemical and microbial challenges. Evidence is mounting that disruption of epithelial barrier integrity is one of the major aetiological factors associated with several gastrointestinal diseases, including infection by pathogens, obesity and diabetes, necrotising enterocolitis, irritable bowel syndrome and inflammatory bowel disease. The notion that specific probiotic bacterial strains can affect barrier integrity fuelled research in which in vitro cell lines, animal models and clinical trials are used to assess whether probiotics can revert the diseased state back to homeostasis and health. This review catalogues and categorises the lines of evidence available in literature for the role of probiotics in epithelial integrity and, consequently, their beneficial effect for the reduction of gastrointestinal disease symptoms.
Recent scholarship in political science identifies emotions as an important antecedent to political behavior. Existing work, however, has focused much more on the political effects of emotions than on their causes. Here, we begin to examine how personality moderates emotional responses to political events. We hypothesized that the personality trait need for affect (NFA) would moderate the emotions evoked by disturbing political news. Drawing data from a survey experiment conducted on a national sample, we find that individuals high in NFA have an especially vivid emotional response to disturbing news—a moderating relationship that has the potential to surpass those associated with symbolic attachments.
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rates ranged from 0·2% to 11·7% among ward and intensive care unit admissions, respectively (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had shorter time to diagnosis (P = 0·02) and initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
To examine barriers to initiation and continuation of mental health treatment among individuals with common mental disorders.
Data were from the World Health Organization (WHO) World Mental Health (WMH) surveys. Representative household samples were interviewed face to face in 24 countries. Reasons to initiate and continue treatment were examined in a subsample (n = 63 678) and analyzed at different levels of clinical severity.
Among those with a DSM-IV disorder in the past 12 months, low perceived need was the most common reason for not initiating treatment and more common among moderate and mild than severe cases. Women and younger people with disorders were more likely to recognize a need for treatment. A desire to handle the problem on one's own was the most common barrier among respondents with a disorder who perceived a need for treatment (63.8%). Attitudinal barriers were much more important than structural barriers to both initiating and continuing treatment. However, attitudinal barriers dominated for mild-moderate cases and structural barriers for severe cases. Perceived ineffectiveness of treatment was the most commonly reported reason for treatment drop-out (39.3%), followed by negative experiences with treatment providers (26.9% of respondents with severe disorders).
Low perceived need and attitudinal barriers are the major barriers to seeking and staying in treatment among individuals with common mental disorders worldwide. Apart from targeting structural barriers, mainly in countries with poor resources, increasing population mental health literacy is an important endeavor worldwide.