Cognitive-behavioural therapy (CBT) is an effective treatment for depressed adults. CBT interventions are complex, as they include multiple content components and can be delivered in different ways. We compared the effectiveness of different types of therapy, different content components and combinations of components, and aspects of delivery used in CBT interventions for adult depression. We conducted a systematic review of randomised controlled trials in adults with a primary diagnosis of depression, which included a CBT intervention. Outcomes were pooled using a component-level network meta-analysis. Our primary analysis classified interventions according to the type of therapy and delivery mode. We also fitted more advanced models to examine the effectiveness of each content component or combination of components. We included 91 studies and found strong evidence that CBT interventions yielded a larger short-term decrease in depression scores compared to treatment-as-usual, with a standardised difference in mean change of −1.11 (95% credible interval −1.62 to −0.60) for face-to-face CBT, −1.06 (−2.05 to −0.08) for hybrid CBT, and −0.59 (−1.20 to 0.02) for multimedia CBT, whereas wait list control showed a detrimental effect of 0.72 (0.09 to 1.35). We found no evidence of specific effects of any content components or combinations of components. Technology is increasingly used in the context of CBT interventions for depression. Multimedia and hybrid CBT might be as effective as face-to-face CBT, although results need to be interpreted cautiously. The effectiveness of specific combinations of content components and delivery formats remains unclear. Wait list controls should be avoided if possible.
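In its simplest two-arm form, the standardised difference in mean change used above is the between-group difference in pre-to-post change divided by a pooled standard deviation. A minimal sketch (the function name and the arm-level numbers are illustrative, not the study's data; the actual analysis pools many such contrasts in a Bayesian network meta-analysis):

```python
import math

def smd_change(mean_change_tx, mean_change_ctrl, sd_tx, sd_ctrl, n_tx, n_ctrl):
    """Standardised difference in mean change between two trial arms.

    Negative values favour the treatment when lower scores mean less
    depression, matching the sign convention of the estimates above.
    """
    # pooled SD of the change scores across both arms
    sd_pooled = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_tx + n_ctrl - 2))
    return (mean_change_tx - mean_change_ctrl) / sd_pooled

# illustrative (made-up) arm-level summaries
print(round(smd_change(-12.0, -4.0, 8.0, 8.0, 50, 50), 2))  # -1.0
```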
Environmental risk factors for dementia are poorly understood. Aluminium and fluorine in drinking water have been linked with dementia but uncertainties remain about this relationship.
In the largest longitudinal study in this context, we set out to explore the individual effects of aluminium and fluoride in drinking water on dementia risk and, as fluorine can increase absorption of aluminium, we also examined any synergistic influence on dementia.
We used Cox models to investigate the association between mean aluminium and fluoride levels in drinking water at participants' residential locations (collected 2005–2012 by the Drinking Water Quality Regulator for Scotland) and dementia in members of the Scottish Mental Survey 1932 cohort who were alive in 2005.
A total of 1972 out of 6990 individuals developed dementia by the linkage date in 2012. Dementia risk was raised with increasing mean aluminium levels in women (hazard ratio per s.d. increase 1.09, 95% CI 1.03–1.15, P < 0.001) and men (1.12, 95% CI 1.03–1.21, P = 0.004). A dose-response pattern of association was observed between mean fluoride levels and dementia in women (1.34, 95% CI 1.28–1.41, P < 0.001) and men (1.30, 95% CI 1.22–1.39, P < 0.001), with dementia risk more than doubled in the highest quartile compared with the lowest. There was no statistical interaction between aluminium and fluoride levels in relation to dementia.
Higher levels of aluminium and fluoride were related to dementia risk in a population of men and women who consumed relatively low drinking-water levels of both.
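As a worked illustration of how per-standard-deviation hazard ratios from a Cox model compound multiplicatively (using the fluoride estimate for women reported above; the exact quartile contrast in s.d. units is not stated in this excerpt, so the exposure spread is an assumption):

```python
import math

hr_per_sd = 1.34  # fluoride, women: hazard ratio per s.d. increase

# Under a log-linear Cox model, risk scales multiplicatively:
# an exposure k standard deviations higher carries HR = hr_per_sd ** k.
hr_two_sd = hr_per_sd ** 2
print(round(hr_two_sd, 2))  # 1.8

# Number of s.d. of exposure needed for the hazard to double,
# consistent with "risk more than doubled" if the top-vs-bottom
# quartile contrast spans more than ~2.4 s.d. (an assumption here).
sds_to_double = math.log(2) / math.log(hr_per_sd)
print(round(sds_to_double, 2))  # 2.37
```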
Healthcare organizations are required to provide workers with respiratory protection (RP) to mitigate hazardous airborne inhalation exposures. This study sought to identify gaps between RP guidance and clinical practice and to understand which issues would benefit from additional research or clarification.
An insect trap constructed using three-dimensional (3D) printing technology was tested in potato (Solanum tuberosum Linnaeus; Solanaceae) fields to determine whether it could substitute for the standard yellow sticky card used to monitor Bactericera cockerelli (Šulc) (Hemiptera: Psylloidea: Triozidae). Sticky cards have shortcomings that prompted a search for a replacement: cards are messy, require weekly replacement, are expensive to purchase, and accumulate large numbers of nontarget insects. Bactericera cockerelli on sticky cards also deteriorate enough that specimens cannot be tested reliably for the presence of vectored plant pathogens. A prototype trap constructed using 3D printing technology for monitoring Diaphorina citri Kuwayama (Hemiptera: Psylloidea: Liviidae) was tested for monitoring B. cockerelli. The trap was designed to attract B. cockerelli visually to the trap and then funnel specimens into preservative-filled vials at the trap bottom. Prototype traps were paired against yellow sticky cards at multiple fields to compare the captures of B. cockerelli between cards and traps. The prototype trap was competitive with sticky cards early in the growing season when B. cockerelli numbers were low. We estimated that two or three prototype traps would collect as many B. cockerelli as one sticky card under these conditions. Efficacy of the prototype declined as B. cockerelli numbers increased seasonally. The prototype trap accumulated nontarget taxa that are common on sticky cards (especially Thysanoptera and Diptera), and was also found to capture taxa of possible interest in integrated pest management research, including predatory insects, parasitic Hymenoptera, and winged Aphididae (Hemiptera), suggesting that the traps could be useful beyond the purpose targeted here. We believe that 3D printing technology has substantial promise for developing monitoring tools that exploit behavioural traits of the targeted insect.
Ongoing work includes the use of this technology to modify the prototype, with a focus on making it more effective at capturing psyllids and less susceptible to capture of nontarget species.
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
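The fixed-effects pooling step described above is standard inverse-variance weighting. A minimal sketch (the per-cohort slopes and standard errors below are invented for illustration; they are not the consortium's data):

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance fixed-effects meta-analysis.

    Returns the pooled estimate, its standard error, and a 95% CI.
    """
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# hypothetical per-cohort slopes (ml FEV1 per nmol/l 25(OH)D) and their SEs
est, se, (lo, hi) = fixed_effect_pool([1.0, 1.2, 1.3], [0.2, 0.3, 0.4])
print(round(est, 2), round(lo, 2), round(hi, 2))  # 1.1 0.8 1.4
```

Note that the most precise cohort (smallest standard error) dominates the pooled estimate, which is why large cohorts drive consortium results.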
Pepper & Nettle use an evolutionary framework to argue that “temporal discounting” is an appropriate response to low socioeconomic status (SES), or deprivation. We suggest some conceptual refinements to their “appropriate-response” perspective, with the hope that it usefully informs future research on and public policy responses to the relationship between deprivation and temporal discounting.
Background: Measurement of cognitive behavioural therapy (CBT) competency is often resource intensive. A popular emerging alternative to independent observers’ ratings is using other perspectives for rating competency. Aims: This pilot study compared ratings of CBT competency from four perspectives – patient, therapist, supervisor and independent observer – using the Cognitive Therapy Scale (CTS). Method: Patients (n = 12, 75% female, mean age 30.5 years) and therapists (n = 5, female, mean age 26.6 years) completed the CTS after therapy sessions, and the clinical supervisor and independent observers rated recordings of the same session. Results: Analyses of variance revealed that therapist average CTS competency ratings were not different from supervisor ratings, and supervisor ratings were not different from independent observer ratings; however, therapist ratings were higher than independent observer ratings and patient ratings were higher than all other raters. Conclusions: Raters differed in competency ratings. Implications for potential use and adaptation of CBT competency measurement methods to enhance training and implementation are discussed.
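The perspective comparison above rests on analyses of variance. A bare-bones one-way F statistic can be sketched in pure Python (the rating values are invented, not the study's data, and the real analysis would also need to respect the repeated-measures structure of ratings on the same sessions):

```python
def one_way_anova_f(groups):
    """One-way ANOVA: F = between-group mean square / within-group mean square."""
    k = len(groups)                                   # number of rater groups
    n = sum(len(g) for g in groups)                   # total observations
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical CTS totals from two rater perspectives
print(round(one_way_anova_f([[40, 42, 44], [30, 32, 34]]), 2))  # 37.5
```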
Burkart et al.'s impressive synthesis will serve as a valuable resource for intelligence research. Despite its strengths, the target article falls short of offering compelling explanations for the evolution of intelligence. Here, we outline its shortcomings, illustrate how these can lead to misguided conclusions about the evolution of intelligence, and suggest ways to address the article's key questions.
Older adults are a potentially medically vulnerable population with increased mortality rates during and after disasters. To evaluate the impact of a natural disaster on this population, we performed a temporal and geospatial analysis of emergency department (ED) use by adults aged 65 years and older in New York City (NYC) following Hurricane Sandy’s landfall.
We used an all-payer claims database to analyze demographics, insurance status, geographic distribution, and health conditions for post-disaster ED visits among older adults. We compared ED patterns of use in the weeks before and after Hurricane Sandy throughout NYC and the most afflicted evacuation zones.
We found significant increases in ED utilization by older adults (disproportionately so among those aged ≥85 years) in the 3 weeks after Hurricane Sandy, especially in NYC evacuation zone one. Primary diagnoses with notable increases included dialysis, electrolyte disorders, and prescription refills. Secondary diagnoses highlighted homelessness and care access issues.
Older adults display heightened risk for worse health outcomes with increased ED visits after a disaster. Our findings suggest the need for dedicated resources and planning for older adults following a natural disaster by ensuring access to medical facilities, prescriptions, dialysis, and safe housing and by optimizing health care delivery needs to reduce the burden of chronic disease. (Disaster Med Public Health Preparedness. 2018;12:184–193)
There is a need for clinical tools to identify cultural issues in diagnostic assessment.
To assess the feasibility, acceptability and clinical utility of the DSM-5 Cultural Formulation Interview (CFI) in routine clinical practice.
Mixed-methods evaluation of field trial data from six countries. The CFI was administered to diagnostically diverse psychiatric out-patients during a diagnostic interview. In post-evaluation sessions, patients and clinicians completed debriefing qualitative interviews and Likert-scale questionnaires. The duration of CFI administration and the full diagnostic session were monitored.
Mixed-methods data from 318 patients and 75 clinicians indicated that the CFI was feasible, acceptable and useful. Clinician feasibility ratings were significantly lower than patient ratings and other clinician-assessed outcomes. After administering one CFI, however, clinician feasibility ratings improved significantly and subsequent interviews required less time.
The CFI was included in DSM-5 as a feasible, acceptable and useful cultural assessment tool.
Woolly distaff thistle is a long-lived winter annual that threatens the ranching and dairy industries within the North Coast counties of California, particularly the organic producers. No peer-reviewed publications have documented effective control options or integrated management approaches for this species. We conducted two experiments, each replicated, in Marin County, California. The first compared several conventional herbicides at two timings and rates, while the second compared a conventional herbicide treatment with organic and integrated organic control methods, including an organic herbicide (mixture of capric and caprylic acids). Results of the conventional herbicide treatments showed most spring applications (March or April) of aminopyralid, aminocyclopyrachlor, clopyralid, and combinations of aminopyralid + triclopyr, or aminocyclopyrachlor + chlorsulfuron had greater than 99% control of woolly distaff thistle with fewer than 1.5 seedlings per 27-m2 plot by the end of the growing season. Higher rates were generally necessary to achieve the same level of control with winter (January) applications. In the organic herbicide treatments, the most consistent treatment was mowing followed by 9% (v/v) of the organic herbicide. This treatment was slightly less effective compared with aminopyralid but still provided better than 95% control of woolly distaff thistle. The results of this study provide control options for both conventional and organic ranching practices where woolly distaff thistle is a problem.
To assess the impact of an emergency intensive care unit (EICU) established concomitantly with a freestanding emergency department (ED) during the aftermath of Hurricane Sandy.
We retrospectively reviewed records of all patients in Bellevue’s EICU from freestanding ED opening (December 10, 2012) until hospital inpatient reopening (February 7, 2013). Temporal and clinical data, disposition upon EICU arrival, and ultimate disposition were evaluated.
Two hundred twenty-seven patients utilized the EICU, representing approximately 1.8% of freestanding ED patients. Ambulance arrival occurred in 31.6% of all EICU patients. Median length of stay was 11.55 hours; this was significantly longer for patients requiring airborne isolation (25.60 versus 11.37 hours, P<0.0001 by Wilcoxon rank sum test). After stabilization and treatment, 39% of EICU patients had an improvement in their disposition status (P<0.0001 by Wilcoxon signed rank test); upon interhospital transfer, the absolute proportion of patients requiring intensive care unit (ICU) and step-down unit (SDU) resources decreased from 37.8% to 27.1% and from 22.2% to 2.7%, respectively.
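The length-of-stay comparison above uses a Wilcoxon rank-sum test. The core statistic (equivalently the Mann-Whitney U) can be sketched in a few lines; `mann_whitney_u` is our illustrative name, and the normal approximation used to obtain the P value is omitted:

```python
def mann_whitney_u(x, y):
    """Rank-sum U statistic for sample x versus y (ties share average ranks)."""
    pooled = sorted(x + y)
    rank_of = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank_of[pooled[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    n1 = len(x)
    r1 = sum(rank_of[v] for v in x)   # rank sum of the first sample
    return r1 - n1 * (n1 + 1) / 2     # U statistic for x

# complete separation: every isolation stay longer than every routine stay,
# using made-up lengths of stay in hours
print(mann_whitney_u([25.6, 26.0], [11.4, 11.5]))  # 4.0 (= n1 * n2, the maximum)
```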
An EICU attached to a freestanding ED achieved significant reductions in resource-intensive medical care. Flexible, adaptable care systems should be explored for implementation in disaster response. (Disaster Med Public Health Preparedness. 2016;10:496–502)
We aimed to characterize the geographic distribution of post-Hurricane Sandy emergency department use in administrative flood evacuation zones of New York City.
Using emergency claims data, we identified significant deviations in emergency department use after Hurricane Sandy. Using time-series analysis, we analyzed the frequency of visits for specific conditions and comorbidities to identify medically vulnerable populations who developed acute postdisaster medical needs.
We found statistically significant decreases in overall post-Sandy emergency department use in New York City but increased utilization in the most vulnerable evacuation zone. In addition to dialysis- and ventilator-dependent patients, we identified that patients who were elderly or homeless or who had diabetes, dementia, cardiac conditions, limitations in mobility, or drug dependence were more likely to visit emergency departments after Hurricane Sandy. Furthermore, patients were more likely to develop drug-resistant infections, require isolation, and present for hypothermia, environmental exposures, or administrative reasons.
Our study identified high-risk populations who developed acute medical and social needs in specific geographic areas after Hurricane Sandy. Our findings can inform coherent and targeted responses to disasters. Early identification of medically vulnerable populations can help to map “hot spots” requiring additional medical and social attention and prioritize resources for areas most impacted by disasters. (Disaster Med Public Health Preparedness. 2016;10:351–361)
Surgical site infections (SSIs) are responsible for significant morbidity and mortality. Preadmission skin antisepsis, while controversial, has gained acceptance as a strategy for reducing the risk of SSI. In this study, we analyze the benefit of an electronic alert system for enhancing compliance with preadmission application of 2% chlorhexidine gluconate (CHG).
DESIGN, SETTING, AND PARTICIPANTS
Following informed consent, 100 healthy volunteers in an academic, tertiary care medical center were randomized to 5 chlorhexidine gluconate (CHG) skin application groups: 1, 2, 3, 4, or 5 consecutive applications. Participants were further randomized into 2 subgroups: with or without electronic alert. Skin surface concentrations of CHG (μg/mL) were analyzed using a colorimetric assay at 5 separate anatomic sites.
INTERVENTION
Preadmission application of chlorhexidine gluconate, 2%
Mean composite skin surface CHG concentrations in volunteer participants receiving the electronic alert (EA) following 1, 2, 3, 4, and 5 applications were 1,040.5, 1,334.4, 1,278.2, 1,643.9, and 1,803.1 µg/mL, respectively, while composite skin surface concentrations in the no-EA group were 913.8, 1,240.0, 1,249.8, 1,194.4, and 1,364.2 µg/mL, respectively (ANOVA, P<.001). Composite ratios (CHG concentration/minimum inhibitory concentration required to inhibit the growth of 90% of organisms [MIC90]) for 1, 2, 3, 4, or 5 applications using the 2% CHG cloth were 208.1, 266.8, 255.6, 328.8, and 360.6, respectively, representing CHG skin concentrations effective against staphylococcal surgical pathogens. The use of the electronic alert system resulted in a significant increase in skin concentrations of CHG in the 4- and 5-application groups (P<.04 and P<.007, respectively).
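The composite ratios above are each mean CHG concentration divided by the MIC90. The reported figures are consistent with an MIC90 of 5 µg/mL (our inference from the arithmetic; the excerpt does not state the value), up to rounding of the published numbers:

```python
MIC90 = 5.0  # µg/mL, inferred from the concentration/ratio pairs above

concentrations = [1040.5, 1334.4, 1278.2, 1643.9, 1803.1]  # EA group, 1-5 applications
reported_ratios = [208.1, 266.8, 255.6, 328.8, 360.6]

for conc, reported in zip(concentrations, reported_ratios):
    ratio = conc / MIC90
    # reproduce each reported ratio to within rounding of the published figures
    assert abs(ratio - reported) <= 0.1, (conc, ratio, reported)
print("all composite ratios reproduced to within 0.1")
```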
The findings of this study support an evidence-based, standardized process that uses an Internet-based electronic alert system to improve patient compliance while maximizing skin surface concentrations effective against MRSA and other staphylococcal surgical pathogens.
Infect. Control Hosp. Epidemiol. 2016;37(3):254–259
Residual herbicides are routinely recommended to aid in control of glyphosate-resistant (GR) Palmer amaranth in cotton. Acetochlor, a chloroacetamide herbicide, applied PRE, controls Palmer amaranth. A microencapsulated (ME) formulation of acetochlor is now registered for PRE application in cotton. Field research was conducted in North Carolina to evaluate cotton tolerance and Palmer amaranth control by acetochlor ME alone and in various combinations. Treatments, applied PRE, consisted of acetochlor ME, pendimethalin, or no herbicide arranged factorially with diuron, fluometuron, fomesafen, diuron plus fomesafen, and no herbicide. The PRE herbicides were followed by glufosinate applied twice POST and diuron plus MSMA directed at layby. Acetochlor ME was less injurious to cotton than pendimethalin. Acetochlor ME alone or in combination with other herbicides reduced early season cotton growth 5 to 8%, whereas pendimethalin alone or in combinations injured cotton 11 to 13%. Early season injury was transitory, and by 65 to 84 d after PRE treatment, injury was no longer noticeable. Before the first POST application of glufosinate, acetochlor ME and pendimethalin controlled Palmer amaranth 84 and 64%, respectively. Control by acetochlor ME was similar to control by diuron plus fomesafen and greater than control by diuron, fluometuron, or fomesafen alone. Greater than 90% control was obtained with acetochlor ME mixed with diuron or fomesafen. Palmer amaranth control was similar with acetochlor ME plus a full or reduced rate of fomesafen. Acetochlor ME controlled large crabgrass and goosegrass at 91 and 100%, respectively, compared with 83 and 91% control by pendimethalin. Following glufosinate, applied twice POST, and diuron plus MSMA, at layby, 96 to 99% control was obtained late in the season by all treatments, and no differences among herbicide treatments were noted for cotton yield.
This research demonstrated that acetochlor ME can be safely and effectively used in cotton weed management programs.