We describe elongate, wet, subglacial bedforms in the shear margins of the NE Greenland Ice Stream and place some constraints on their formation. Lateral shear margin moraines have been observed across the previously glaciated landscape, but little is known about the ice-flow conditions necessary to form these bedforms. Here we describe in situ sediment bedforms under the NE Greenland Ice Stream shear margins that are observed in active-source seismic and ground-penetrating radar surveys. We find bedforms in the shear margins that are ~500 m wide, ~50 m tall, and elongated nearly parallel to ice flow, including what we believe to be the first subglacial observation of a shear margin moraine. Acoustic impedance analysis of the bedforms shows that they are composed of unconsolidated, deformable, water-saturated till. We use these geophysical observations to place constraints on the possible formation mechanism of these subglacial features.
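For context on the method: acoustic impedance is the product of bulk density and P-wave velocity, and it is constrained by the amplitude of the reflection from the ice-bed interface through the normal-incidence reflection coefficient. These are standard relations in glaciological seismology, not equations taken from this study:

```latex
Z = \rho\, v_p, \qquad
R = \frac{Z_{\text{bed}} - Z_{\text{ice}}}{Z_{\text{bed}} + Z_{\text{ice}}}
```

A bed impedance that is low relative to consolidated sediment or bedrock is what identifies material such as unconsolidated, water-saturated till.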
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of the PHQ-8 and PHQ-9 in terms of total score correlation and diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
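The cutoff analysis above selects the score that maximizes sensitivity + specificity (Youden's J statistic). A minimal sketch of that selection procedure, using made-up illustrative scores and case labels rather than data from the meta-analysis:

```python
# Sketch: choosing a questionnaire cutoff by maximizing sensitivity + specificity
# (Youden's J). The scores and case labels below are illustrative only.

def sens_spec(scores, cases, cutoff):
    """Sensitivity and specificity when scores >= cutoff are classed positive."""
    tp = sum(1 for s, c in zip(scores, cases) if c and s >= cutoff)
    fn = sum(1 for s, c in zip(scores, cases) if c and s < cutoff)
    tn = sum(1 for s, c in zip(scores, cases) if not c and s < cutoff)
    fp = sum(1 for s, c in zip(scores, cases) if not c and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, cases, candidates):
    """Return the candidate cutoff with the largest sensitivity + specificity."""
    return max(candidates, key=lambda k: sum(sens_spec(scores, cases, k)))

# Illustrative data: participants with major depression tend to score higher.
scores = [3, 5, 6, 8, 9, 11, 12, 14, 16, 18, 2, 4, 7, 10, 13]
cases  = [0, 0, 0, 0, 0,  1,  1,  1,  1,  1, 0, 0, 0,  1,  0]
print(best_cutoff(scores, cases, range(0, 28)))
```

With these toy labels the selected cutoff happens to be 10; in the actual analysis the same criterion was applied to pooled participant-level data within a bivariate random-effects framework.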
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
Biologic asymmetry is present in all bilaterally symmetric organisms as a result of normal developmental instability. However, fossilized organisms, which have undergone distortion due to burial, may have additional asymmetry as a result of taphonomic processes. To investigate this issue, we evaluated the magnitude of shape variation resulting from taphonomy on vertebrate bone using a novel application of fluctuating asymmetry. We quantified the amount of total variance attributed to asymmetry in a taphonomically distorted fossil taxon and compared it with that of three extant taxa. The fossil taxon had an average of 27% higher asymmetry than the extant taxa. In spite of the high amount of taphonomic input, the major axes of shape variation were not greatly altered by removal of the asymmetric component of shape variation. This presents the possibility that either underlying biologic trends drive the principal directions of shape change irrespective of asymmetric taphonomic distortion or that the symmetric taphonomic component is large enough that removing only the asymmetric component is inadequate to restore fossil shape. Our study is the first to present quantitative data on the relative magnitude of taphonomic shape change and presents a new method to further explore how taphonomic processes impact our interpretation of the fossil record.
The objective of this WSSA Weed Loss Committee report is to provide quantitative data on the potential yield loss in sugar beet due to weed interference from the major sugar beet growing areas of the United States and Canada. Researchers and extension specialists who conducted research on weed control in sugar beet in the United States and Canada provided quantitative data on sugar beet yield loss due to weed interference in their regions. Specifically, data were requested from weed control studies in sugar beet from up to 10 individual studies per calendar year over a 15-yr period between 2002 and 2017. Data collected indicated that if weeds are left uncontrolled under optimal agronomic practices, growers in Idaho, Michigan, Minnesota, Montana, Nebraska, North Dakota, Ontario, Oregon, and Wyoming would potentially lose an average of 79%, 61%, 66%, 68%, 63%, 75%, 83%, 78%, and 77% of the sugar beet yield. The corresponding monetary loss would be approximately US$234, US$122, US$369, US$43, US$40, US$211, US$12, US$14, and US$32 million, respectively. The average yield loss due to weed interference for the primary sugar beet growing areas of North America was estimated to be 70%. Thus, if weeds are not controlled, growers in the United States would lose approximately 22.4 million tonnes of sugar beet yield valued at approximately US$1.25 billion, and growers in Canada would lose approximately 0.5 million tonnes of sugar beet yield valued at approximately US$25 million. The high return on investment in weed management highlights the importance of continued weed science research for sustaining high crop yield and profitability of sugar beet production in North America.
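As a quick back-of-the-envelope check, the regional figures above can be recombined in a few lines. This is an unweighted illustration; the report's 70% average is weighted by production across regions, so the numbers differ slightly:

```python
# Regional potential yield losses (%) and monetary losses (US$ millions),
# as reported above for Idaho, Michigan, Minnesota, Montana, Nebraska,
# North Dakota, Ontario, Oregon and Wyoming (in that order).
yield_loss_pct = [79, 61, 66, 68, 63, 75, 83, 78, 77]
monetary_loss_musd = [234, 122, 369, 43, 40, 211, 12, 14, 32]

simple_avg = sum(yield_loss_pct) / len(yield_loss_pct)  # unweighted mean
total_musd = sum(monetary_loss_musd)                    # sum of regional losses

print(f"unweighted average loss: {simple_avg:.1f}%")
print(f"summed regional losses: US${total_musd} million")
```

The unweighted mean comes out near 72%, close to the report's production-weighted 70%; the regional monetary figures sum to roughly US$1.08 billion against the combined US and Canada totals quoted above.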
Cir X-1 is a young X-ray binary exhibiting X-ray flux changes of four orders of magnitude over several decades. It has been observed many times since the launch of the Chandra X-ray Observatory with the High Energy Transmission Grating Spectrometer, and each time the source gave us a vastly different look. At its very lowest X-ray flux we found a single 1.7 keV blackbody spectrum with an emission radius of 0.5 km. Since the neutron star in Cir X-1 is only a few thousand years old, we identify this as emission from an accretion column, since at this youth the neutron star is assumed to be highly magnetized. At an X-ray flux of 1.8×10⁻¹¹ erg cm⁻² s⁻¹ this implies a moderate magnetic field of a few times 10¹¹ G. The photoionized X-ray emission line properties at this low flux are consistent with the wind of a B5-type companion. We suggest that Cir X-1 is a very young Be-star binary.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed, and binomial generalised linear mixed models were fitted.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of Astra Zeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
There is limited empirical information on service-level outcome domains and indicators for the large number of people with intellectual disabilities being treated in forensic psychiatric hospitals.
This study identified and developed the domains that should be used to measure treatment outcomes for this population.
A systematic review of the literature identified 60 studies that met eligibility criteria; these were synthesised using content analysis. The findings were refined through consultation and consensus exercises with carers, patients and experts.
The final framework encompassed three a priori superordinate domains: (a) effectiveness, (b) patient safety and (c) patient and carer experience. Within each of these, further sub-domains emerged from our systematic review and consultation exercises. These included severity of clinical symptoms, offending behaviours, reactive and restrictive interventions, quality of life and patient satisfaction.
To index recovery, services need to measure treatment outcomes using this framework.
Xpert MTB/RIF (Xpert) is the preferred first-line test for all persons with tuberculosis (TB) symptoms in South Africa in line with a diagnostic algorithm. This study evaluates pre- and post-implementation trends in diagnostic practices for drug-sensitive, pulmonary TB in adults in an operational setting, following the introduction of the Xpert-based algorithm. We retrospectively analysed data from the national TB database for Greater Tzaneen sub-district, Limpopo Province. Trends in the number of cases, diagnoses and outcomes, and characteristics associated with death, are reported. A total of 8407 cases were treated from 2008 until 2015, with annual cases registered decreasing by 31·7% over that time period (from 1251 to 855 per year). After implementation of Xpert, 69·9% of cases were diagnosed by Xpert, 29·4% clinically, 0·6% by smear microscopy and 0·1% by culture. Cases with a recorded microbiological test increased from 76·2% to 96·4%. Cases started on treatment empirically despite a negative microbiological test increased from 7·1% to 25·7%. Case fatality decreased from 15·0% to 9·8%, remaining consistently higher in empirically treated groups, regardless of HIV status. Implementation of the algorithm coincided with a reduced number of TB cases treated and improved coverage of microbiological testing; however, a substantial proportion of cases continued to start treatment empirically.
Urban slums provide suitable conditions for infestation by rats, which harbour and shed a wide diversity of zoonotic pathogens including helminths. We aimed to identify risk factors associated with the probability and intensity of infection of helminths of the digestive tract in an urban slum population of Rattus norvegicus. Among 299 rats, eleven species/groups of helminths were identified, of which Strongyloides sp., Nippostrongylus brasiliensis and the human pathogen Angiostrongylus cantonensis were the most frequent (97, 41 and 39%, respectively). Sex interactions highlighted behavioural differences between males and females: for example, males were more likely to be infected with N. brasiliensis where rat signs were present, and males presented more intense infections of Strongyloides sp. Moreover, rats in poor body condition had higher intensities of N. brasiliensis. We describe a high global richness of parasites in R. norvegicus, including five species known to cause disease in humans. Among these, A. cantonensis was found in high prevalence and was ubiquitous in the study area, knowledge which is of public health importance. A variety of environmental, demographic and body condition variables were associated with helminth species infection of rats, suggesting a comparable variety of risk factors for humans.
Prospective data on the associations between vitamin D intake and risk of CVD and all-cause mortality are limited and inconclusive. The aim of the present study was to investigate the associations between vitamin D intake and CVD risk and all-cause mortality in the Caerphilly Prospective Cohort Study.
The associations of vitamin D intake with CVD risk markers were examined cross-sectionally at baseline and longitudinally at 5-year, 10-year and >20-year follow-ups. In addition, the predictive value of vitamin D intake for CVD events and all-cause mortality after >20 years of follow-up was examined. Logistic regression and general linear regression were used for data analysis.
Participants in the UK.
Men (n 452) who were free from CVD and type 2 diabetes at recruitment.
Higher vitamin D intake was associated with increased HDL cholesterol (P=0·003) and pulse pressure (P=0·04) and decreased total cholesterol:HDL cholesterol (P=0·008) cross-sectionally at baseline, but the associations were lost during follow-up. Furthermore, higher vitamin D intake was associated with decreased concentration of plasma TAG at baseline (P=0·01) and at the 5-year examination (P=0·01), but not at the 10-year examination. After >20 years of follow-up, vitamin D was not associated with stroke (n 72), myocardial infarctions (n 142), heart failure (n 43) or all-cause mortality (n 281), but was positively associated with diastolic blood pressure (P=0·03).
The study supports associations of higher vitamin D intake with lower fasting plasma TAG and higher diastolic blood pressure.
Scale-up of antiretroviral therapy (ART) for human immunodeficiency virus (HIV) infection has reduced the incidence of pulmonary tuberculosis (PTB) in South Africa. Despite the strong association of HIV infection with extrapulmonary tuberculosis (EPTB), the effect of ART on the epidemiology of EPTB remains undocumented. We conducted a retrospective record review of patients initiated on treatment for EPTB in 2009 (ART coverage <5%) and 2013 (ART coverage 41%) at four public hospitals in rural Mopani District, South Africa. Data were obtained from TB registers and patients’ clinical records. There was a 13% decrease in the overall number of TB cases, which was similar for cases registered as EPTB (n = 399 in 2009 vs. 336 in 2013; P < 0·01) and for PTB (1031 vs. 896; P < 0·01). Among EPTB cases, the proportion of miliary TB and disseminated TB decreased significantly (both P < 0·01), TB meningitis and TB of bones increased significantly (P < 0·01 and P = 0·02, respectively) and TB pleural effusion and lymphadenopathy remained the same. This study shows a reduction of EPTB cases that is similar to that of PTB in the context of the ART scale-up. The changing profile of EPTB warrants attention of healthcare workers.
Introduction of antiretroviral therapy (ART) has dramatically reduced the incidence of infectious ocular diseases in human immunodeficiency virus (HIV)-infected individuals. However, the effects of long-term ART and chronic HIV infection on the eye are ill-defined. This study determined the occurrence and severity of ocular diseases among 342 participants in a rural South African setting: HIV-naïve (n = 105), HIV-infected ART-naïve (n = 16), HIV-infected on ART for <12 months (short-term ART; n = 56) and HIV-infected individuals on ART for >36 months (long-term ART; n = 165). More HIV-infected participants presented with an external eye condition, in particular blepharitis, than HIV-naïve individuals (18% vs. 7%; age-adjusted odds ratio (aOR) = 2·8, P < 0·05). Anterior segment conditions (particularly keratoconjunctivitis sicca and pterygium) were also more common (50% vs. 27%; aOR = 2·4; P < 0·01). Compared with individuals on short-term ART, participants receiving long-term ART were more likely to have clinically detectable cataract (57% vs. 38%; aOR = 2·2, P = 0·01) and posterior segment diseases, especially HIV retinopathy (30% vs. 11%; aOR = 3·4, P < 0·05). Finally, long-term ART was significantly associated with presence of HIV retinopathy (P < 0·01). These data indicate that ocular disease is more common and of more diverse etiology among HIV-infected individuals, especially those on long-term ART, and suggest that regular ophthalmological monitoring of HIV-infected individuals on ART is warranted.
The study aims to assess whether supplementation with the probiotic Lactobacillus rhamnosus HN001 (HN001) can reduce the prevalence of gestational diabetes mellitus (GDM). A double-blind, randomised, placebo-controlled parallel trial was conducted in New Zealand (NZ) (Wellington and Auckland). Pregnant women with a personal or partner history of atopic disease were randomised at 14–16 weeks’ gestation to receive HN001 (6×10⁹ colony-forming units) (n 212) or placebo (n 211) daily. GDM at 24–30 weeks was assessed using the definition of the International Association of Diabetes and Pregnancy Study Groups (IADPSG) (fasting plasma glucose ≥5·1 mmol/l, or 1 h post 75 g glucose level at ≥10 mmol/l or at 2 h ≥8·5 mmol/l) and the NZ definition (fasting plasma glucose ≥5·5 mmol/l or 2 h post 75 g glucose at ≥9 mmol/l). All analyses were intention-to-treat. A total of 184 (87 %) women took HN001 and 189 (90 %) women took placebo. There was a trend towards lower relative rates (RR) of GDM (IADPSG definition) in the HN001 group, 0·59 (95 % CI 0·32, 1·08) (P=0·08). HN001 was associated with lower rates of GDM in women aged ≥35 years (RR 0·31; 95 % CI 0·12, 0·81, P=0·009) and women with a history of GDM (RR 0·00; 95 % CI 0·00, 0·66, P=0·004). These rates did not differ significantly from those of women without these characteristics. Using the NZ definition, GDM prevalence was significantly lower in the HN001 group, 2·1 % (95 % CI 0·6, 5·2), v. 6·5 % (95 % CI 3·5, 10·9) in the placebo group (P=0·03). HN001 supplementation from 14 to 16 weeks’ gestation may reduce GDM prevalence, particularly among older women and those with previous GDM.
Background: Traditionally, the delivery of dedicated neurocritical care (NCC) occurs in distinct NCC units and is associated with improved outcomes. Institution-specific logistical challenges pose barriers to the development of distinct NCC units; therefore, we developed a consultancy NCC service, coupled with the implementation of invasive multimodal neuromonitoring, within a medical-surgical intensive care unit. Our objective was to evaluate the effect of a consultancy NCC program on neurologic outcomes in severe traumatic brain injury patients. Methods: We conducted a single-center quasi-experimental uncontrolled pre- and post-NCC study in severe traumatic brain injury patients (Glasgow Coma Scale ≤8). The NCC program includes consultation with a neurointensivist and neurosurgeon and multimodal neuromonitoring. Demographic data, injury severity metrics, neurophysiologic data, and therapeutic interventions were collected. Glasgow Outcome Scale (GOS) at 6 months was the primary outcome. Multivariable ordinal logistic regression was used to model the association between NCC implementation and GOS at 6 months. Results: A total of 113 patients were identified: 76 pre-NCC and 37 post-NCC. Mean age was 39 years (standard deviation [SD], 2), and 87 of 113 (77%) patients were male. Median admission motor score was 3 (interquartile range, 1-4). Daily mean arterial pressure was higher post-NCC than pre-NCC (95 mmHg [SD, 10] vs. 88 mmHg [SD, 10], p<0.001), and daily mean core body temperature was lower (36.6°C [SD, 0.90] vs. 37.2°C [SD, 1.0], p=0.001). Multivariable regression modelling revealed that the NCC program was associated with 2.5-fold increased odds (odds ratio, 2.5; 95% confidence interval, 1.1-5.3; p=0.022) of improved 6-month GOS. Conclusions: Implementation of an NCC program is associated with improved 6-month GOS in severe TBI patients.