Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not about suicidality itself. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed equivalency of total score correlations and the diagnostic accuracy to detect major depression of the PHQ-8 and PHQ-9.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
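The cutoff analysis described above (sensitivity and specificity of a screening total score against a diagnostic reference standard) can be sketched as follows. This is an illustrative calculation with a hypothetical helper and toy data, not the study's code or data:

```python
def sens_spec(scores, has_depression, cutoff=10):
    """Sensitivity and specificity of a screening score at a given cutoff.

    scores: questionnaire totals (e.g. PHQ-8 or PHQ-9)
    has_depression: parallel booleans from the diagnostic reference standard
    A participant screens positive when score >= cutoff.
    """
    tp = sum(s >= cutoff and d for s, d in zip(scores, has_depression))
    fn = sum(s < cutoff and d for s, d in zip(scores, has_depression))
    tn = sum(s < cutoff and not d for s, d in zip(scores, has_depression))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, has_depression))
    sensitivity = tp / (tp + fn)  # true positives among all cases
    specificity = tn / (tn + fp)  # true negatives among all non-cases
    return sensitivity, specificity

# Toy example: six participants, three with major depression
scores = [12, 9, 15, 4, 11, 3]
cases = [True, True, True, False, False, False]
sens, spec = sens_spec(scores, cases, cutoff=10)
```

The meta-analysis pooled such per-study estimates with bivariate random-effects models; the sketch shows only the within-study step.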
In cluster-randomized trials (CRT), groups rather than individuals are randomized to interventions. The aim of this study was to present critical design, implementation, and analysis issues to consider when planning a CRT in the healthcare setting and to synthesize characteristics of published CRT in the field of healthcare epidemiology.
A systematic review was conducted to identify CRT with infection control outcomes.
We identified the following 7 epidemiological principles: (1) identify design type and justify the use of CRT; (2) account for clustering when estimating sample size and report intraclass correlation coefficient (ICC)/coefficient of variation (CV); (3) obtain consent; (4) define level of inference; (5) consider matching and/or stratification; (6) minimize bias and/or contamination; and (7) account for clustering in the analysis. Among 44 included studies, the most common design was CRT with crossover (n = 15, 34%), followed by parallel CRT (n = 11, 25%) and stratified CRT (n = 7, 16%). Moreover, 22 studies (50%) offered justification for their use of CRT, and 20 studies (45%) demonstrated that they accounted for clustering at the design phase. Only 15 studies (34%) reported the ICC, CV, or design effect. Also, 15 studies (34%) obtained waivers of consent, and 7 (16%) sought consent at the cluster level. Only 17 studies (39%) matched or stratified at randomization, and 10 studies (23%) did not report efforts to mitigate bias and/or contamination. Finally, 29 studies (88%) accounted for clustering in their analyses.
We must continue to improve the design and reporting of CRT to better evaluate the effectiveness of infection control interventions in the healthcare setting.
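Principle (2) above, accounting for clustering when estimating sample size, rests on the standard design effect, DE = 1 + (m − 1) × ICC for clusters of size m. A minimal sketch (illustrative function names and toy numbers, not drawn from the reviewed studies):

```python
import math

def design_effect(cluster_size, icc):
    """Design effect DE = 1 + (m - 1) * ICC for equal cluster sizes m."""
    return 1.0 + (cluster_size - 1) * icc

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design effect."""
    return math.ceil(n_individual * design_effect(cluster_size, icc))

# Toy example: 500 patients needed under individual randomization,
# clusters of 11 patients, ICC of 0.1 -> DE = 2.0 -> 1000 patients
n_needed = crt_sample_size(500, cluster_size=11, icc=0.1)
```

Even a small ICC can double the required sample size when clusters are large, which is why the review checks whether studies reported the ICC, CV, or design effect.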
Psychologists have identified multiple distinct forms of conflict, such as information processing conflict and goal conflict. As such, there is a need to examine the similarities and differences in the neural correlates of each form of conflict. To address this, we conducted a comprehensive electroencephalogram (EEG) analysis of Shadli, Glue, McIntosh, and McNaughton’s calibrated stop-signal task (SST), a goal-conflict task. Specifically, we examined changes in scalp-wide current source density (CSD) power and coherence across a wide range of frequency bands during the calibrated SST (n = 34). We assessed differences in EEG between the high and low goal-conflict conditions using hierarchical analyses of variance (ANOVAs). We also related goal-conflict EEG to trait anxiety, neuroticism, Behavioural Inhibition System (BIS)-anxiety and revised BIS (rBIS) using regression analyses. We found that changes in CSD power during goal conflict were limited to increased midfrontocentral theta. Conversely, coherence increased across 23 scalp-wide theta region pairs and one frontal delta region pair. Finally, scalp-wide theta significantly predicted trait neuroticism but not trait anxiety, BIS-anxiety or rBIS. We conclude that goal conflict involves increased midfrontocentral CSD theta power and scalp-wide theta-dominated coherence. Therefore, compared with information processing conflict, goal conflict displays a similar EEG power profile of midfrontocentral theta but a much wider coherence profile. Furthermore, the increases in theta during goal conflict are characteristic of BIS-driven activity. Therefore, future research should confirm whether these goal-conflict effects are driven by the BIS by examining whether the effects are attenuated by anxiolytic drugs. Overall, we have identified a unique network of goal-conflict EEG during the calibrated SST.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before doffing and exiting the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and environment was 2.78 (95% confidence interval [CI], 0.99–7.77) and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of AstraZeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data.
Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals.
Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used.
Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons.
While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions.
Aging is associated with declines in physical capability; however, some individuals demonstrate high well-being despite this decline, i.e. they are “resilient.” We examined socioeconomic position (SEP) and resilience and the influence of potentially modifiable behavioral resources, i.e. social support and leisure time physical activity (LTPA), on these relationships.
Data came from the Medical Research Council National Survey of Health and Development, a nationally representative birth cohort study. Resilience–vulnerability at age 60–64 years (n = 1,756) was operationalized as the difference between observed and expected levels of well-being, captured by the Warwick–Edinburgh Mental Well-being Scale (WEMWBS), given the level of performance-based physical capability. SEP was assessed by father's and own social class, parental education, and intergenerational social mobility. LTPA and structural/functional social support were reported at ages 53 years and 60–64 years. Path analysis was used to examine mediation of SEP and resilience–vulnerability through LTPA and social support.
Participants in the highest social class had scores on the resilience to vulnerability continuum that were an average of 2.3 units (β = 0.46, 95% CI 0.17, 0.75) higher than those in the lowest social class. Greater LTPA (β = 0.58, 95% CI 0.31, 0.85) and social support (β = 3.27, 95% CI 2.90, 3.63) were associated with greater resilience; LTPA partly mediated participant social class and resilience (23.4% of variance).
Adult socioeconomic advantage was associated with greater resilience. Initiatives to increase LTPA may contribute to reducing socioeconomic inequalities in this form of resilience in later life.
Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank.
Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
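The standardized infection ratio used to rank hospitals above is the count of observed infections divided by the number expected under the risk model. A minimal sketch, with a hypothetical function name and toy risks rather than the study's model output:

```python
def standardized_infection_ratio(observed, predicted_risks):
    """SIR = observed infections / sum of model-predicted per-patient risks.

    observed: number of CLABSIs actually identified at the hospital
    predicted_risks: per-patient infection probabilities from the
    risk-adjustment model (e.g. ICU type plus comorbidities)
    """
    expected = sum(predicted_risks)
    return observed / expected

# Toy example: model predicts 2.0 expected infections, 3 observed -> SIR 1.5
sir = standardized_infection_ratio(3, [0.5, 0.5, 1.0])
```

An SIR above 1 means more infections than the model predicts for that hospital's patient mix, which is why adding comorbidities to the model can change hospital rankings.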
Campylobacter spp. are a globally significant cause of gastroenteritis. Although rates of infection in Australia are among the highest in the industrialized world, studies describing campylobacteriosis incidence in Australia are lacking. Using national disease notification data from 1998 to 2013, we examined Campylobacter infections by gender, age group, season and state and territory. Negative binomial regression was used to estimate incidence rate ratios (IRRs), including trends by age group over time, with post-estimation commands used to obtain adjusted incidence rates. The incidence rate for males was significantly higher than for females [IRR 1·20, 95% confidence interval (CI) 1·18–1·21], while a distinct seasonality was demonstrated with higher rates in both spring (IRR 1·18, 95% CI 1·16–1·20) and summer (IRR 1·17, 95% CI 1·16–1·19). Examination of trends in age-specific incidence over time showed declines in incidence in those aged <40 years combined with contemporaneous increases in older age groups, notably those aged 70–79 years (IRR 1998–2013: 1·75, 95% CI 1·63–1·88). While crude rates continue to be highest in children, our findings suggest the age structure for campylobacteriosis in Australia is changing, carrying significant public health implications for older Australians.
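An incidence rate ratio of the kind estimated above compares two incidence rates (cases per person-time); the regression model additionally adjusts for covariates, but the crude quantity is simple. A sketch with an illustrative function name and invented numbers, not the notification data:

```python
def incidence_rate_ratio(cases_a, person_years_a, cases_b, person_years_b):
    """Crude IRR: (rate in group A) / (rate in group B),
    where each rate is cases divided by person-years at risk."""
    rate_a = cases_a / person_years_a
    rate_b = cases_b / person_years_b
    return rate_a / rate_b

# Toy example: 120 cases over 1000 person-years vs 100 over 1000 -> IRR 1.2
irr = incidence_rate_ratio(120, 1000, 100, 1000)
```

Negative binomial regression, as used in the study, generalizes this to rates adjusted for age group, season and jurisdiction while allowing for overdispersed counts.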
Heteroepitaxial growth of high-quality II-VI-alloy materials on Si substrates is a well-established commercial growth process for infrared (IR) detector devices. However, it has only recently been recognized that these same processes may have important applications for production of high-efficiency photovoltaic devices. This submission reviews the process developments that have enabled effective heteroepitaxy of II-VI alloy materials on lattice-mismatched Si for IR detectors as a foundation to describe recent efforts to apply these insights to the fabrication of multijunction Si/CdZnTe devices with ultimate conversion efficiencies >40%. Reviewed photovoltaic studies include multijunction Si/CdZnTe devices with conversion efficiency of ∼17%, analysis of structural and optoelectronic quality of undoped CdTe epilayer films on Si, and the effect that a Te-rich growth environment has on the structural and optoelectronic quality of both undoped and As-doped heteroepitaxial CdTe.
Severe youth antisocial behaviour has been associated with increased risk of premature mortality in high-risk samples for many years, and some evidence now points to similar effects in representative samples. We set out to assess the prospective association between adolescent conduct problems and premature mortality in a population-based sample of men and women followed to the age of 65 years.
A total of 4158 members of the Medical Research Council National Survey of Health and Development (the British 1946 birth cohort) were assessed for conduct problems at the ages of 13 and 15 years. Follow-up to the age of 65 years via the UK National Health Service Central Register provided data on date and cause of death.
Dimensional measures of teacher-rated adolescent conduct problems were associated with increased hazards of death from cardiovascular disease by the age of 65 years in men [hazard ratio (HR) 1.17, 95% confidence interval (CI) 1.04–1.32], and of all-cause and cancer mortality by the age of 65 years in women (all-cause HR 1.16, 95% CI 1.07–1.25). Adjustment for childhood cognition and family social class did little to attenuate these risks. Adolescent conduct problems were not associated with increased risks of unnatural/substance-related deaths in men or women in this representative sample.
Whereas previous studies of high-risk delinquent or offender samples have highlighted increased risks of unnatural and alcohol- or substance abuse-related deaths in early adulthood, we found marked differences in mortality risk from other causes emerging later in the life course among women as well as men.
Methicillin-resistant Staphylococcus aureus (MRSA) infection is known to increase in-hospital mortality, but little is known about its association with long-term health. Two hundred and thirty-seven deaths occurred among 707 patients with MRSA infection and/or nasal colonization at the time of hospitalization, who were followed for almost 4 years after discharge from the Atlanta Veterans Affairs Medical Center, USA. The crude mortality rate in patients with an infection and colonization (23·57/100 person-years) was significantly higher than the rate in patients with only colonization (15·67/100 person-years, P = 0·037). MRSA infection, hospitalization within past 6 months, and histories of cancer or haemodialysis were independent risk factors. Adjusted mortality rates in patients with infection were almost twice as high compared to patients who were only colonized: patients infected and colonized [hazard ratio (HR) 1·93, 95% confidence interval (CI) 1·31–2·84]; patients infected but not colonized (HR 1·96, 95% CI 1·22–3·17). Surviving MRSA infection adversely affects long-term mortality, underscoring the importance of infection control in healthcare settings.
In the absence of a healthcare budget enabling the import of ready-made aural grommets, Myanmar ENT surgeons have devised an ingenious ‘home-grown’ solution. We describe how grommets are made from raw materials bought from the local market.
The underlying mechanism of predisposition to Ascaris infection is not yet understood but host genetics are thought to play a fundamental role. We investigated the association between the Intelectin-2 gene and resistance in F2 mice derived from mouse strains known to be susceptible and resistant to infection. Ascaris larvae were isolated from murine lungs and the number of copies of the Intelectin-2 gene was determined in F2 mice. Intelectin-2 gene copy number was not significantly linked to larval burden. In a pilot experiment, the response to infection in parental mice of both sexes was observed in order to address the suitability of female F2 mice. No overall significant sex effect was detected. However, a divergence in resistance/susceptibility status was observed between male and female hybrid offspring. The responsiveness to Ascaris in mice is likely to be controlled by multiple genes and, despite a unique absence from the susceptible C57BL/6j strain, the Intelectin-2 gene does not play a significant role in resistance. The observed intra-strain variation in larval burden requires further investigation but we hypothesize that it stems from social/dominance hierarchies created by the presence of female mice and possibly subsequent hormonal perturbations that modify the intensity of the immune response.
We report the case of a recurrent familial malignant carotid body tumour presenting with metastasis to local ipsilateral lymph nodes; the rarity of both recurrence combined with nodal spread is emphasised in this article.
We present a case report, and a review of the world literature concerning the diagnosis and management of carotid body tumours in the familial setting.
A woman with a family history of succinate dehydrogenase complex subunit B gene mutation presented with right vocal fold palsy. A causative carotid body tumour was excised. Fifteen years later, the patient developed a right-sided swelling in the jugulo-digastric region, together with shooting pains towards her right ear. Imaging revealed right posterior triangle lymphadenopathy. Fine needle aspiration cytology of the node was equivocal. Computed tomography of her neck revealed, in addition, a mass within the right side of the larynx. Excision biopsy of the lymph node demonstrated metastatic paraganglioma. A carotid angiogram revealed a right-sided carotid body tumour. This was embolised prior to neck exploration and excision of the carotid body tumour with en bloc resection of adjacent nodes. Histological analysis confirmed the presence of lymph nodes containing metastatic paraganglioma.
This case report demonstrates the need for extra vigilance to enable early disease detection in the familial setting of carotid body tumour, in order to reduce the surgical morbidity associated with disease progression. In addition, our report highlights the atypical aspects of presentation in the familial setting, together with the difficulty and lack of standardisation regarding monitoring of the disease.
Mid-pregnancy shearing has consistently been shown to increase lamb birth weight, which can lead to an increase in lamb survival rates. However, shearing ewes during the winter months and under outdoor pastoral farming conditions can expose the recently shorn ewe to a greater risk of hypothermia. The aim of this study was to determine if exposure of ewes to repeated stressors, in mid- and late pregnancy, would result in an increase in lamb birth weight. This information may assist in the elucidation of the mechanism for the birth weight response to mid-pregnancy shearing, which in turn could assist in the design of management options to increase lamb birth weight without placing the ewe at risk. One hundred and forty-four twin-bearing Romney ewes were allocated to one of six mid-pregnancy treatments: control, isolation on 2 or 10 occasions, sham-shearing on 10 occasions, intramuscular cortisol injection on 10 occasions or shearing. Isolation, sham-shearing and cortisol treatments were conducted twice a week beginning, on average, on day 74 of pregnancy, and shearing occurred on day 76. During pregnancy, ewe treatment had no effect on ewe live weight. However, average ewe body condition scores were higher in the shorn group than in the sham-shorn or cortisol groups (P < 0.05). Intramuscular injections of cortisol had a greater effect on ewe plasma cortisol concentrations than all other treatments (P < 0.05). Shearing produced a greater plasma cortisol response than isolation × 10 and sham-shearing (P < 0.05). Ewe plasma cortisol responses decreased during the 5 weeks of isolation and sham-shearing but cortisol injections produced a greater response during the fifth treatment than the first or ninth treatments (P < 0.05). Lambs born to shorn ewes were heavier and had longer crown–rump, forelimb and hind limb lengths than all other lambs (P < 0.05).
In addition, lambs born to ewes in the cortisol treatment were lighter than lambs born to control, isolation × 2, isolation × 10 and shorn ewes (P < 0.05). The plasma cortisol concentrations observed for ewes injected with cortisol were far greater than those observed in all other groups, which is likely to explain the low birth weights of lambs born to ewes in that group. These results indicate that the mechanism by which mid-pregnancy shearing increases lamb birth weight is unlikely to involve repeated stressors.
Listeriosis is a foodborne disease associated with significant mortality. This study attempts to identify risk factors for sporadic listeriosis in Australia. Information on underlying illnesses was obtained from cases' treating doctors and other risk factors were elicited from the patient or a surrogate. We attempted to recruit two controls per case matched on age and primary underlying immune condition. Between November 2001 and December 2004 we recruited 136 cases and 97 controls. Among perinatal cases, living in a household where a language other than English was spoken was the main risk factor associated with listeriosis (OR 11·3, 95% CI 1·5–undefined). Among non-perinatal cases, we identified the following risk factors for listeriosis: prior hospitalization (OR 4·3, 95% CI 1·0–18·3), use of gastric acid inhibitors (OR 9·4, 95% CI 2·4–37·4), and consumption of camembert (OR 4·7, 95% CI 1·1–20·6). Forty percent of cases with prior hospitalization were exposed to high-risk foods during hospitalization.
Intravenous amphotericin or intravenous voriconazole, both followed by oral voriconazole, have previously been given to treat invasive aspergillosis of the skull base.
Exclusively oral voriconazole was used in an immunocompetent patient with biopsy-proven, invasive aspergillosis. She had a large, erosive lesion extending from the central skull base to the right orbit and ethmoid sinus, and displacing the right internal carotid artery. After four months of oral treatment as an out-patient, a repeat computed tomography scan showed a fully treated infection with post-infectious changes only, and treatment was terminated. Two years later, there had been no recurrence.
Substantial cost savings were made by using exclusively oral treatment, compared with the use of intravenous voriconazole or amphotericin, or a switch strategy.