Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of PHQ-8 and PHQ-9 total scores and their diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
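For readers who want to see what the cutoff-based accuracy comparison involves, the sketch below shows how sensitivity and specificity are computed for a screening score at a given cutoff against a reference-standard diagnosis. It is a minimal illustration with invented scores; it does not reproduce the study data or the bivariate random-effects meta-analytic pooling.

```python
# Minimal sketch: sensitivity and specificity of a screening score at a cutoff,
# compared against a reference-standard diagnosis. The arrays below are invented
# for illustration; they are not the study data.
import numpy as np

def sens_spec(scores, reference, cutoff=10):
    """Classify score >= cutoff as screen-positive; reference = 1 means major depression."""
    scores = np.asarray(scores)
    reference = np.asarray(reference).astype(bool)
    screen_pos = scores >= cutoff
    sensitivity = (screen_pos & reference).sum() / reference.sum()
    specificity = (~screen_pos & ~reference).sum() / (~reference).sum()
    return sensitivity, specificity

# Hypothetical PHQ-8 and PHQ-9 totals for the same participants
phq9 = [3, 12, 9, 18, 10, 5, 22, 7, 15, 2]
phq8 = [3, 11, 8, 17, 9, 5, 20, 6, 14, 2]   # item 9 removed, so totals are <= PHQ-9
diagnosis = [0, 1, 0, 1, 1, 0, 1, 0, 1, 0]   # reference-standard major depression

for name, scores in [("PHQ-9", phq9), ("PHQ-8", phq8)]:
    se, sp = sens_spec(scores, diagnosis, cutoff=10)
    print(f"{name}: sensitivity={se:.2f}, specificity={sp:.2f}")
```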
Cosmopolitan habitat-forming taxa of algae such as the genus Corallina provide an opportunity to compare patterns of biodiversity over wide geographic scales. Nematode assemblages inhabiting Corallina turves were compared between the south coasts of the British Isles and South Korea. A fully nested design was used with three regions in each country, two shores in each region and replicate samples taken from three patches on each shore to compare differences in the taxonomic and biological trait composition of nematode assemblages across scales. A biological traits approach, based on functional diversity of nematodes, was used to make comparisons between countries, among regions, between shores and among patches. The taxonomic and biological trait compositions of nematode assemblages were significantly different across all spatial scales (patches, shores, regions and countries). There was greater variation amongst nematode assemblages at the scale of shore than at other spatial scales. Nematode assemblage structure and functional traits are influenced by the local environmental factors on each shore, including sea-surface temperature, the amount of sediment trapped in Corallina spp. and tidal range. Sea-surface temperature and the amount of sediment trapped in Corallina spp. were the predominant factors determining nematode abundance, assemblage composition and functional diversity.
Background: Canadian Stroke Guidelines recommend that Transient Ischemic Attack (TIA) patients at highest risk of stroke recurrence should undergo immediate vascular imaging. Computed tomography angiography (CTA) of the head and neck is recommended over carotid Doppler because it allows for enhanced visualization of the intracranial and posterior circulation vasculature. Imaging while patients are in the emergency department (ED) is optimal for high-risk patients because the risk of stroke recurrence is highest in the first 48 hours. Aim Statement: At our hospital, a designated stroke centre, less than 5% of TIA patients meet national recommendations by undergoing CTA in the ED. We sought to increase the rate of CTA in high-risk ED TIA patients from less than 5% to at least 80% in 10 months. Measures & Design: We used a multi-faceted approach to improve our adherence to guidelines, including: 1) education for staff ED physicians; 2) agreements between ED and radiology to facilitate rapid access to CTA; 3) agreements between ED and neurology for consultations regarding patients with abnormal CTA; and 4) the creation of an electronic decision support tool to guide ED physicians as to which patients require CTA. We measured the rate of CTA in high-risk patients through biweekly retrospective chart review of patients referred to the TIA clinic from the ED. As a balancing measure, we also measured the rate of CTA in non-high-risk patients. Evaluation/Results: Data collection is ongoing. An interim run chart at 19 weeks shows a complete shift above the median after implementation, with CTA rates between 70% and 100%. At the time of submission, we had no downward trends below 80%, showing sustained improvement. The CTA rate in non-high-risk patients also increased. Discussion/Impact: After 19 weeks of our intervention, 112 (78.9%) of high-risk TIA patients had a CTA, compared with 10 (9.8%) in the 19 weeks prior to our intervention. On average, 10-15% of high-risk patients will have an identifiable lesion on CTA, leading to an immediate change in management (at minimum, an inpatient consultation with neurology). Our multi-faceted approach could be replicated in any ED with the engagement and availability of the same multi-disciplinary team (ED, radiology, and neurology), access to CTA, and electronic orders.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Positive genetic correlation was observed between MD and AD (rg(MD-AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (rg(AD-AC quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (rg(MD-AC quantity) = +0.14, P = 2.9 × 10⁻⁷), while there was negative correlation of AC-frequency with MD (rg(MD-AC frequency) = −0.17, P = 1.5 × 10⁻¹⁰) and a non-significant result with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
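As a pointer to how a Mendelian randomization estimate of this kind is typically formed, the sketch below implements the standard inverse-variance-weighted (IVW) estimator from per-SNP summary statistics. It is a generic illustration with invented effect sizes, not the authors' pipeline, which also involved pleiotropy-robust analyses.

```python
# Minimal sketch of an inverse-variance-weighted (IVW) Mendelian randomization
# estimate from instrument-level summary statistics. The SNP effects below are
# invented; this is not the study's actual analysis code.
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """IVW causal estimate of exposure on outcome from per-SNP summary stats."""
    bx = np.asarray(beta_exposure)
    by = np.asarray(beta_outcome)
    w = 1.0 / np.asarray(se_outcome) ** 2          # inverse-variance weights
    beta_ivw = np.sum(w * bx * by) / np.sum(w * bx ** 2)
    se_ivw = np.sqrt(1.0 / np.sum(w * bx ** 2))
    return beta_ivw, se_ivw

# Hypothetical SNP effects on MD (exposure) and AD (outcome)
bx = [0.021, 0.015, 0.032, 0.018]
by = [0.006, 0.004, 0.010, 0.005]
se = [0.002, 0.003, 0.004, 0.002]
b, s = ivw_mr(bx, by, se)
print(f"IVW beta = {b:.3f} (SE {s:.3f})")
```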
Objectives: Insomnia is associated with neuropsychological dysfunction. Evidence points to the role of nocturnal light exposure in disrupted sleep patterns, particularly blue light emitted through smartphones and computers used before bedtime. This study aimed to test whether blocking nocturnal blue light improves neuropsychological function in individuals with insomnia symptoms. Methods: This study used a randomized, placebo-controlled crossover design. Participants were randomly assigned to a 1-week intervention with amber lenses worn in wrap-around frames (to block blue light) or a 1-week intervention with clear lenses (control) and switched conditions after a 4-week washout period. Neuropsychological function was evaluated with tests from the NIH Toolbox Cognition Battery at three time points: (1) baseline (BL), (2) following the amber lenses intervention, and (3) following the clear lenses intervention. Within-subjects general linear models contrasted neuropsychological test performance following the amber lenses and clear lenses conditions with BL performance. Results: Fourteen participants (mean (SD) age = 46.5 (11.4) years) with symptoms of insomnia completed the protocol. Compared with BL, individuals performed better on the List Sorting Working Memory task after the amber lenses intervention, but similarly after the clear lenses intervention (F = 5.16; p = 0.014; η² = 0.301). A similar pattern emerged on the Pattern Comparison Processing Speed test (F = 7.65; p = 0.002; η² = 0.370). Consideration of intellectual ability indicated that treatment with amber lenses “normalized” performance on each test from approximately 1 SD below expected performance to expected performance. Conclusions: Using a randomized, placebo-controlled crossover design, we demonstrated improvement in processing speed and working memory with a nocturnal blue light blocking intervention among individuals with insomnia symptoms. (JINS, 2019, 25, 668–677)
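For illustration of the within-subjects comparison described above, the sketch below runs a standard repeated-measures ANOVA across the three conditions. It is a simplified stand-in for the within-subjects general linear models reported in the abstract, and the data frame is invented.

```python
# Minimal sketch of a within-subjects contrast across three conditions
# (baseline, amber lenses, clear lenses) using a repeated-measures ANOVA.
# The scores below are hypothetical; this is not the study's analysis code.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one working-memory score per subject per condition
df = pd.DataFrame({
    "subject":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "condition": ["baseline", "amber", "clear"] * 4,
    "score":     [95, 103, 96, 88, 99, 90, 92, 101, 93, 90, 98, 91],
})

result = AnovaRM(df, depvar="score", subject="subject", within=["condition"]).fit()
print(result)  # F-test for the condition effect, analogous to the reported F and p
```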
Ehrlichiosis is a zoonotic illness caused by Ehrlichia pathogens transmitted by ticks. Case data from 1999 to 2015, provided by the Missouri Department of Health and Senior Services (DHSS), were used to compare the seasonality and the change in incidence over time of ehrlichiosis infection in two Missouri ecoregions, Eastern Temperate Forest (ETF) and Great Plains (GP). Although the number of cases has increased over time in both ecoregions, the rate of change was significantly faster in the ETF region. There was no significant difference in seasonality of ehrlichiosis between ecoregions. In Missouri, the estimated ehrlichiosis season begins, on average, in mid-March, peaks in June, and concludes in mid-October. Our results show that the exposure and risk season for ehrlichiosis in Missouri is at least 7 months long.
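One common way to characterise this kind of seasonality is a harmonic (sine/cosine) regression on month of onset; the sketch below shows that approach with invented monthly counts. It is an illustration only and is not necessarily the seasonality model the authors used.

```python
# Illustrative sketch: harmonic regression to locate the seasonal peak of
# monthly case counts. The counts below are invented, and this is not
# necessarily the authors' method.
import numpy as np

months = np.arange(1, 13)
cases = np.array([1, 2, 6, 14, 30, 42, 38, 25, 14, 6, 2, 1])  # hypothetical monthly cases

# Design matrix with an annual harmonic (intercept, sine, cosine)
X = np.column_stack([
    np.ones_like(months, dtype=float),
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
])
coef, *_ = np.linalg.lstsq(X, np.log(cases), rcond=None)

# Month at which the fitted seasonal curve peaks
fitted = X @ coef
print("peak month:", months[np.argmax(fitted)])
```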
This study evaluated tumour necrosis factor-α, interleukins 10 and 12, and interferon-γ levels, peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 expression in unilateral sudden sensorineural hearing loss.
Twenty-four patients with unilateral sudden sensorineural hearing loss, and 24 individuals with normal hearing and no history of sudden sensorineural hearing loss (who were attending the clinic for other problems), were enrolled. Peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 were isolated and analysed. Plasma and supernatant levels of tumour necrosis factor-α, interferon-γ, and interleukins 10 and 12 were measured.
There were no significant differences with respect to age and gender. Monocyte population, mean tumour necrosis factor-α level and cluster of differentiation 86 expression were significantly increased in the study group compared to the control group. However, interferon-γ and interleukin 12 levels were significantly decreased. The difference in mean interleukin 10 level was not significant.
Increases in tumour necrosis factor-α level and monocyte population might play critical roles in sudden sensorineural hearing loss. This warrants detailed investigation and further studies on the role of dendritic cells in sudden sensorineural hearing loss.
Given the challenges in accurately identifying unexposed controls in case–control studies of diarrhoea, we examined diarrhoea incidence, subclinical enteric infections and growth stunting within a reference population in the Global Enteric Multicenter Study, Kenya site. Within ‘control’ children (0–59 months old without diarrhoea in the 7 days before enrolment, n = 2384), we examined surveys at enrolment and 60-day follow-up, stool at enrolment and a 14-day post-enrolment memory aid for diarrhoea incidence. At enrolment, 19% of controls had ⩾1 enteric pathogen associated with moderate-to-severe diarrhoea (‘MSD pathogens’) in stool; following enrolment, many reported diarrhoea (27% in 7 days, 39% in 14 days). Controls with and without reported diarrhoea had similar carriage of MSD pathogens at enrolment; however, controls reporting diarrhoea were more likely to report visiting a health facility for diarrhoea (27% vs. 7%) or fever (23% vs. 16%) at follow-up than controls without diarrhoea. Odds of stunting differed by both MSD and ‘any’ (including non-MSD pathogens) enteric pathogen carriage, but not diarrhoea, suggesting control classification may warrant modification when assessing long-term outcomes. High diarrhoea incidence following enrolment and prevalent carriage of enteric pathogens have implications for sequelae associated with subclinical enteric infections and for design and interpretation of case–control studies examining diarrhoea.
A three-dimensional wavelet multi-resolution analysis of direct numerical simulations of a turbulent premixed flame is performed in order to investigate the spatially localized spectral transfer of kinetic energy across scales in the vicinity of the flame front. A formulation is developed that addresses the compressible spectral dynamics of the kinetic energy in wavelet space. The wavelet basis enables the examination of local energy spectra, along with inter-scale and subfilter-scale (SFS) cumulative energy fluxes across a scale cutoff, all quantities being available either unconditioned or conditioned on the local instantaneous value of the progress variable across the flame brush. The results include the quantification of mean spectral values and associated spatial variabilities. The energy spectra undergo, in most locations in the flame brush, a precipitous drop that starts at scales of the same order as the characteristic flame scale and continues to smaller scales, even though the corresponding decrease of the mean spectra is much more gradual. The mean convective inter-scale flux indicates that convection increases the energy of small scales, although it does so in a non-conservative manner due to the high aspect ratio of the grid, which limits the maximum scale level that can be used in the wavelet transform, and to the non-periodic boundary conditions, which exchange energy through surface forces, as explicitly elucidated by the formulation. The mean pressure-gradient inter-scale flux extracts energy from intermediate scales of the same order as the characteristic flame scale, and injects energy in the smaller and larger scales. The local SFS-cumulative contribution of the convective and pressure-gradient mechanisms of energy transfer across a given cutoff scale imposed by a wavelet filter is analysed. The local SFS-cumulative energy flux is such that the subfilter scales upstream from the flame always receive energy on average. Conversely, within the flame brush, energy is drained on average from the subfilter scales by convective and pressure-gradient effects most intensely when the filter cutoff is larger than the characteristic flame scale.
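To make the scale decomposition concrete, the sketch below applies a discrete 3-D wavelet multi-resolution transform to a single velocity component and sums squared coefficients per level as a proxy for energy content by scale. It is a simplified, unconditioned illustration with PyWavelets on a random synthetic field; it does not reproduce the paper's compressible, flame-conditioned wavelet formulation or its inter-scale flux terms.

```python
# Minimal sketch of a 3-D discrete wavelet decomposition and the energy
# (sum of squared coefficients) retained at each scale level. The field is
# synthetic random data, not DNS data, and this is not the paper's formulation.
import numpy as np
import pywt

rng = np.random.default_rng(0)
u = rng.standard_normal((64, 64, 64))     # stand-in for one velocity component

coeffs = pywt.wavedecn(u, wavelet="db2", level=3)
approx, detail_levels = coeffs[0], coeffs[1:]

print("approximation (coarsest-scale) energy:", np.sum(approx ** 2))

# Detail energy per decomposition level, ordered from coarsest to finest
for lvl, details in enumerate(detail_levels, start=1):
    energy = sum(np.sum(d ** 2) for d in details.values())
    print(f"detail level {lvl} energy: {energy:.1f}")
```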
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
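As a small aside on how the reported effect sizes are formed, the sketch below converts a fitted log-odds coefficient (for example, the coefficient on interview type in a binomial generalised linear mixed model) into an odds ratio with a Wald 95% confidence interval. The coefficient and standard error are invented; no model is fitted here.

```python
# Minimal sketch: odds ratio and Wald 95% CI from a log-odds coefficient.
# The values below are hypothetical, purely for illustration.
import numpy as np

def odds_ratio_ci(log_odds, se, z=1.96):
    """Return OR and 95% CI limits from a coefficient on the log-odds scale."""
    return np.exp(log_odds), np.exp(log_odds - z * se), np.exp(log_odds + z * se)

b, se = 0.74, 0.31               # hypothetical log-odds estimate and its SE
or_, lo, hi = odds_ratio_ci(b, se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```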
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of AstraZeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorship and honoraria for giving lectures and providing consultancy, and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; or the decision to submit the manuscript for publication.
Introduction: Patients with heart failure (HF) experience frequent decompensation necessitating multiple emergency department (ED) visits and hospitalizations. If patients are able to receive timely interventions and optimize self-management, recurrent ED visits may be reduced. In this feasibility study, we piloted the application of home telemonitoring to support the discharge of HF patients from hospital to home. We hypothesized that TEC4Home would decrease ED revisits and hospital admissions and improve patient health outcomes. Methods: Upon discharge from the ED or hospital, patients with HF received a blood pressure cuff, weight scale, pulse oximeter, and a touchscreen tablet. Participants submitted measurements and answered questions on the tablet about their HF symptoms daily for 60 days. Data were reviewed by a monitoring nurse. From November 2016 to July 2017, 69 participants were recruited from Vancouver General Hospital (VGH), St. Paul's Hospital (SPH) and Kelowna General Hospital (KGH). Participants completed pre-surveys at enrolment and post-surveys 30 days after monitoring finished. Administrative data related to ED visits and hospital admissions were reviewed. Interviews were conducted with the monitoring nurses to assess the impact of monitoring on patient health outcomes. Results: A preliminary analysis was conducted on a subsample of participants (n=22) enrolled across all 3 sites by March 31, 2017. At VGH and SPH (n=14), 25% fewer patients required an ED visit in the post-survey reporting period compared with pre-survey. During the monitoring period, the monitoring nurse observed seven likely avoided ED admissions due to early intervention. In total, admissions were reduced by 20% and total hospital length of stay was reduced by 69%. At KGH (n=8), 43% fewer patients required an ED visit in the post-survey reporting period compared with pre-survey. Hospital admissions were reduced by 20% and total hospital length of stay was reduced by 50%. Overall, TEC4Home participants from all sites showed a significant improvement in health-related quality of life and in self-care behaviour from pre-monitoring to 90 days post-monitoring. A full analysis of the 69 patients will be completed in February 2018. Conclusion: Preliminary findings indicate that home telemonitoring for HF patients can decrease ED revisits and improve patient experience. The length of stay data may also suggest the potential for early discharge of ED patients with home telemonitoring to avoid or reduce hospitalization. A stepped-wedge randomized controlled trial of TEC4Home in 22 BC communities will be conducted in 2018 to generate evidence and scale up the service in urban, regional and rural communities. This work is submitted on behalf of the TEC4Home Healthcare Innovation Community.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
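To illustrate what a deterministic match between two surveillance registries looks like in practice, the sketch below joins two toy data sets on exact key fields. The field names and records are hypothetical; each jurisdiction's actual matching keys and software are not described in the abstract.

```python
# Illustrative sketch of deterministic record linkage between two surveillance
# data sets via an exact join on key fields. All fields and records are invented.
import pandas as pd

hiv = pd.DataFrame({
    "last_name":   ["Lee", "Ortiz", "Kim"],
    "dob":         ["1980-04-02", "1975-11-19", "1990-06-30"],
    "hiv_dx_year": [2010, 2008, 2014],
})
hcv = pd.DataFrame({
    "last_name":   ["Ortiz", "Kim", "Nguyen"],
    "dob":         ["1975-11-19", "1990-06-30", "1983-01-05"],
    "hcv_dx_year": [2012, 2015, 2011],
})

# Exact (deterministic) match on the chosen key fields
coinfected = hiv.merge(hcv, on=["last_name", "dob"], how="inner")
print(coinfected)
print(f"{len(coinfected) / len(hiv):.1%} of HIV records matched an HCV record")
```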
This study evaluated the annual prevalence of anogenital warts (AGW) caused by human papillomavirus (HPV) and analysed the trend in annual per cent changes (APC) by using national claims data from the Health Insurance Review and Assessment Service of Korea, 2007–2015. We also estimated the socio-economic burden and co-morbidities of AGW. All analyses were performed based on data for primary A63.0, the specific diagnosis code for AGW. The socio-economic cost of AGW was calculated based on the direct medical cost, direct non-medical cost and indirect cost. The overall AGW prevalence and socio-economic burden have increased during the last 9 years. However, the prevalence of AGW differed significantly by sex. The female prevalence increased until 2012 and decreased thereafter (APC + 3·6%); it is expected to fall further after the introduction of routine HPV vaccination, principally for females, in Korea. The male prevalence increased continuously over time (APC + 11·6%), especially in those aged 20–49 years. Given the increasing AGW prevalence and its disease burden, active surveillance, control and prevention of HPV infection in males are worth consideration.
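For reference, the annual per cent change is conventionally estimated by a log-linear regression of the rate on calendar year, with APC = 100 × (exp(slope) − 1). The sketch below shows that calculation on invented prevalence values; it is not the study data.

```python
# Minimal sketch: annual per cent change (APC) from a log-linear trend,
# APC = 100 * (exp(slope) - 1). Prevalence values below are hypothetical.
import numpy as np

years = np.arange(2007, 2016)
prevalence = np.array([30.1, 33.5, 37.2, 41.8, 46.0, 52.3, 57.9, 64.5, 72.0])  # per 100 000, invented

slope, intercept = np.polyfit(years, np.log(prevalence), 1)
apc = 100 * (np.exp(slope) - 1)
print(f"APC = {apc:+.1f}% per year")
```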
The Dacron bag technique has been widely used to estimate degradation in the rumen. Its drawbacks, such as variation in rinsing losses and the inability to correct for microbial contamination, are well known. However, these effects also suggest the potential to use the technique in studies of microbial colonisation. Other studies in our laboratory have investigated the use of odd-chain fatty acids (C15:0 and C17:0) as markers of rumen microbial activity (Fievez et al., 2003) because they are generally rare or absent from feeds. The objective of this work was to use multivariate statistical analysis to explore the relationships between plant and microbial fatty acids in ingested herbage.
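As one example of such a multivariate exploration, the sketch below runs a principal component analysis on standardised fatty-acid profiles. The abstract does not state which multivariate method was used, and the profile data are invented; this is purely illustrative.

```python
# Illustrative sketch: PCA of fatty-acid profiles as one possible multivariate
# approach. Method choice and all data below are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows = ingested-herbage samples; columns = fatty acid proportions (g/100 g total FA)
fa_names = ["C15:0", "C17:0", "C16:0", "C18:2", "C18:3"]
profiles = np.array([
    [0.4, 0.3, 18.2, 14.1, 55.0],
    [0.6, 0.4, 17.5, 13.2, 56.3],
    [0.3, 0.2, 19.0, 15.0, 53.1],
    [0.7, 0.5, 16.9, 12.8, 57.4],
    [0.5, 0.3, 18.0, 13.9, 55.6],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(profiles))
print(scores)  # sample coordinates on the first two principal components
```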
As grazing ruminants rely almost entirely on mastication to disrupt plant tissues, a series of processes (mastication, bolus formation and ingestion) will impact on the viability and number of cells that remain intact, and consequently alive, after ingestion (Kingston-Smith and Theodorou, 2000). Preliminary work in our group has shown substantial variation in the degree of cell damage during mastication and ingestion between grass species, resulting in differences in the rate of release of cell contents (protein, sugars and lipids) into the rumen (E.J. Kim, unpublished). These differences may affect nutrient utilisation by ruminal micro-organisms. The aim of this study was to compare the extent of nutrient release from three contrasting grass species following ingestion of the fresh forage by dairy cows.
Rapid breakdown of herbage proteins in the rumen and inefficient capture of nitrogen (N) by the rumen microbial populations are a major source of N loss and pollution in pasture-based ruminant agriculture. Degree of cell damage during mastication and ingestion varies between grass species with consequences for release of cell contents (protein, sugars and lipids) into the rumen (Kim et al., 2008). Consequently, grazing cattle on different grass species may provide an opportunity to manipulate N efficiency. The purpose of this study was to compare N utilisation efficiency by dairy cattle grazing grass species differing in chemical and morphological characteristics.
The overall objective of our work is to assess the relative contributions of plant enzymes and rumen microbes to rumen degradation of freshly ingested herbage. In situ techniques have been used extensively to compare rumen degradation characteristics of feeds, though there remain technical problems associated with microbial contamination of residues after incubation. We hypothesised that techniques to study microbial contamination might also provide insights into microbial colonisation. Our earlier studies (Lee et al., 1999) identified distinctive odd-chain fatty acids that could be used as microbial markers. A Dacron bag study was conducted to examine the influence of Dacron bag rinsing techniques on DM disappearance and microbial contamination in residues from fresh grass, assessed using odd-chain fatty acids as markers.