Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of PHQ-8 and PHQ-9 total scores and compared their diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
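To make the cutoff-based comparison concrete, here is a minimal sketch of how PHQ-8 and PHQ-9 totals and their sensitivity and specificity at the standard cutoff of 10 could be computed. All item scores and diagnoses below are fabricated for illustration; this is not the study's data or analysis code.

```python
# Sketch: PHQ-8 vs PHQ-9 screening accuracy at a cutoff (fabricated data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
items = rng.integers(0, 4, size=(n, 9))     # 9 PHQ items, each scored 0-3
major_depression = rng.random(n) < 0.12     # reference-standard diagnosis (invented)

phq9 = items.sum(axis=1)                    # PHQ-9 total: all 9 items
phq8 = items[:, :8].sum(axis=1)             # PHQ-8 total: omits item 9

def sens_spec(score, truth, cutoff=10):
    """Sensitivity and specificity of `score >= cutoff` against `truth`."""
    positive = score >= cutoff
    sens = (positive & truth).sum() / truth.sum()
    spec = (~positive & ~truth).sum() / (~truth).sum()
    return sens, spec

for name, score in [("PHQ-9", phq9), ("PHQ-8", phq8)]:
    sens, spec = sens_spec(score, major_depression)
    print(f"{name}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```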
In 2013, the national surveillance case definition for West Nile virus (WNV) disease was revised to remove fever as a criterion for neuroinvasive disease and require at most subjective fever for non-neuroinvasive disease. The aims of this project were to determine how often afebrile WNV disease occurs and assess differences among patients with and without fever. We included cases with laboratory evidence of WNV disease reported from four states in 2014. We compared demographics, clinical symptoms and laboratory evidence for patients with and without fever and stratified the analysis by neuroinvasive and non-neuroinvasive presentations. Among 956 included patients, 39 (4%) had no fever; this proportion was similar among patients with and without neuroinvasive disease symptoms. For neuroinvasive and non-neuroinvasive patients, there were no differences in age, sex, or laboratory evidence between febrile and afebrile patients, but hospitalisations were more common among patients with fever (P < 0.01). The only significant difference in symptoms was for ataxia, which was more common in neuroinvasive patients without fever (P = 0.04). Only 5% of non-neuroinvasive patients did not meet the WNV case definition due to lack of fever. The evidence presented here supports the changes made to the national case definition in 2013.
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognizing CBRNE science as a distinct competency and establishing the CBRNE medical operations science support expert role inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
The early village at Çatalhöyük (7100–6150 BC) provides important evidence for the Neolithic and Chalcolithic people of central Anatolia. This article reports on the use of lipid biomarker analysis to identify human coprolites from midden deposits, and microscopy to analyse these coprolites and soil samples from human burials. Whipworm (Trichuris trichiura) eggs are identified in two coprolites, but the pelvic soil samples are negative for parasites. Çatalhöyük is one of the earliest Eurasian sites to undergo palaeoparasitological analysis to date. The results inform how intestinal parasitic infection changed as humans modified their subsistence strategies from hunting and gathering to settled farming.
The USA is currently enduring an opioid crisis. Identifying cost-effective, easy-to-implement behavioral measures that predict treatment outcomes in opioid misusers is a crucial scientific, therapeutic, and epidemiological goal.
The current study used a mixed cross-sectional and longitudinal design to test whether a behavioral choice task, previously validated in stimulant users, was associated with increased opioid misuse severity at baseline, and whether it predicted change in opioid misuse severity at follow-up. At baseline, data from 100 prescription opioid-treated chronic pain patients were analyzed; at follow-up, data were analyzed in 34 of these participants who were non-misusers at baseline. During the choice task, participants chose under probabilistic contingencies whether to view opioid-related images in comparison with affectively pleasant, unpleasant, and neutral images. Following previous procedures, we also assessed insight into choice behavior, operationalized as whether (yes/no) participants correctly self-reported the image category they chose most often.
At baseline, a higher choice to view opioid images in direct comparison with pleasant images was associated with opioid misuse and with impaired insight into choice behavior; the combination of the two produced especially elevated opioid-related choice behavior. In longitudinal analyses of individuals who were initially non-misusers, higher baseline opioid v. pleasant choice behavior predicted more opioid misuse behaviors at follow-up.
These results indicate that greater relative allocation of behavior toward opioid stimuli and away from stimuli depicting natural reinforcement is associated with concurrent opioid misuse and portends vulnerability toward future misuse. The choice task may provide important medical information to guide opioid-prescribing practices.
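As an illustration of the two task measures described above (relative opioid v. pleasant choice, and the yes/no insight check), the following sketch uses fabricated trial data; the category labels mirror the abstract, but none of this is the study's materials.

```python
# Sketch: choice-task outcome measures with one participant's invented choices.
from collections import Counter

trials = ["opioid", "pleasant", "opioid", "neutral", "opioid",
          "unpleasant", "pleasant", "opioid"]          # hypothetical trial log

counts = Counter(trials)
# Relative allocation toward opioid vs. pleasant images (the key contrast).
opioid_vs_pleasant = counts["opioid"] / (counts["opioid"] + counts["pleasant"])

# Insight: did the participant correctly name their most-chosen category?
self_report = "opioid"                                  # participant's answer
most_chosen = counts.most_common(1)[0][0]
has_insight = self_report == most_chosen

print(f"opioid vs. pleasant choice proportion: {opioid_vs_pleasant:.2f}")
print(f"insight into choice behavior: {has_insight}")
```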
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Our objective was to measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict probability of CRO and/or CPO carriage.
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, decision tree models overall predicted CRO and CPO colonization poorly (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, the models accurately identified patients with recent CRO-positive cultures who used proton-pump inhibitors as having a high likelihood of CRO colonization.
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
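The abstract names decision tree learning evaluated with a C statistic (ROC AUC); the sketch below shows that general workflow in scikit-learn on synthetic stand-in data, not the study's EMR variables. Random features yield a C statistic near 0.5, echoing the weak discrimination reported.

```python
# Sketch: decision tree model of colonization probability, scored by C statistic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.random((2878, 125))                 # synthetic preadmission features
y = rng.random(2878) < 0.075                # ~7.5% CRO-positive, as reported

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
prob = tree.predict_proba(X_test)[:, 1]     # predicted colonization probability
print(f"C statistic: {roc_auc_score(y_test, prob):.2f}")  # ~0.5 on random data
```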
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA.
Methods: 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within the normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences in neurocognitive status and demographics. Within PLWH, neurocognitive status groups were compared on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status.
Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher employment, and better health-related quality of life than non-SA participants.
Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
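A rough sketch of the multinomial step: predicting the three-level neurocognitive status (SA/CN/CI) from covariates like those named above. All values are fabricated and the model is illustrative only, not the study's analysis.

```python
# Sketch: multinomial logistic regression on fabricated clinical covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 734
X = np.column_stack([
    rng.integers(50, 65, n),            # age (years)
    rng.normal(105, 10, n),             # verbal IQ
    rng.random(n) < 0.15,               # diabetes (yes/no)
    rng.integers(0, 27, n),             # depressive symptom score
])
status = rng.choice(["SA", "CN", "CI"], size=n, p=[0.17, 0.38, 0.45])

# With 3 classes and the lbfgs solver, scikit-learn fits a multinomial model.
model = LogisticRegression(max_iter=1000).fit(X, status)
print(model.predict_proba(X[:3]))       # per-class probabilities for 3 patients
```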
Over the past 25 years, numerous studies utilizing both X-ray diffraction (XRD) and differential scanning calorimetry (DSC) have been reported in the literature. Generally, conventional high-temperature X-ray data identify solid-state transitions, which are then correlated with thermal events observed by the calorimeter. Since changes occur in the sample during such studies, separate portions of the sample must be used for the XRD and DSC experiments. When comparing results of the two experiments, questions arise concerning sample homogeneity as well as temperature and environmental differences. In fact, no conventional high-temperature X-ray diffraction instrument can give the precise control over temperature and heating rate available with a DSC. The problems of sample inhomogeneities and instrumental differences could be avoided if X-ray diffraction and DSC could be performed simultaneously on one sample.
Minimizing the negative ecological impacts of exotic plant invasions is one goal of land management. Using selective herbicides is one strategy to achieve this goal; however, the unintended consequences of this strategy are not always fully understood. The recently introduced herbicide indaziflam has a mode of action not previously used in non-crop weed management. Thus, there is limited information about the impacts of this active ingredient when applied alone or in combination with other non-crop herbicides. The objective of this research was to evaluate native species tolerance to indaziflam and imazapic applied alone and with other broadleaf herbicides. Replicated field plots were established at two locations in Colorado with a diverse mix of native forbs and grasses. Species richness and abundance were compared between the nontreated control plots and plots where indaziflam and imazapic were applied alone and in combination with picloram and aminocyclopyrachlor. Species richness and abundance did not decrease when indaziflam or imazapic were applied alone; however, species abundance was reduced by treatments containing picloram and aminocyclopyrachlor. Species richness was only impacted at one site 1 yr after treatment (YAT) by these broadleaf herbicides. Decreases in abundance were mainly due to reductions in forbs that resulted in a corresponding increase in grass cover. Our data suggest that indaziflam will control downy brome (Bromus tectorum L.) for multiple years without reduction in perennial species richness or abundance. If B. tectorum is present with perennial broadleaf weeds requiring the addition of herbicides like picloram or aminocyclopyrachlor, forb abundance could be reduced, and in some cases there could be a temporary reduction in perennial species richness.
We evaluated provider adherence to practice guidelines for inpatients diagnosed with Clostridioides difficile infection (CDI) before and after implementation of a best practice alert (BPA) linking a positive test result to guideline-based orders. After implementation of the BPA, guideline-based prescribing increased from 39.4% in 2013 to 67.7% in 2016 (P = .014).
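The before/after comparison of prescribing proportions is the kind of result a two-proportion chi-square test produces. A sketch follows with invented denominators (the abstract reports only percentages), so the computed P value will not match the published one.

```python
# Sketch: chi-square test on hypothetical pre/post-BPA prescribing counts.
from scipy.stats import chi2_contingency

# [guideline-based, not guideline-based] counts; denominators are invented.
pre_bpa  = [26, 40]    # ~39.4% guideline-based prescribing in 2013
post_bpa = [44, 21]    # ~67.7% in 2016

chi2, p, dof, expected = chi2_contingency([pre_bpa, post_bpa])
print(f"chi-square={chi2:.2f}, P={p:.3f}")
```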
This research addresses dementia and driving cessation, a major life event for affected individuals and an immense challenge in primary care. In Australia, as in many other countries, it is primarily general practitioners (GPs) who identify changes in cognitive functioning and monitor driving issues with their patients with dementia. Qualitative evidence from studies with family members and other health professionals shows it is a complicated area of practice. However, we still know little from GPs about how they manage these challenges with their patients and the strategies they use to facilitate driving cessation.
Data were collected through five focus groups with 29 GPs at their primary care practices in metropolitan and regional Queensland, Australia. A semi-structured topic guide was used to direct questions addressing decision factors and management strategies. Discussions were audio recorded, transcribed verbatim and thematically analyzed.
Four key themes emerged regarding the challenges of raising driving cessation: (i) considering the individual; (ii) the GP-patient relationship may hinder or help; (iii) resources to support raising driver retirement; and (iv) ethical dilemmas and ethical considerations. The impact of discussing driving cessation on GPs is also examined.
The findings of this study contribute to a further understanding of the experiences and needs of primary care physicians in managing driving retirement with their patients with dementia. The results support the need for programs on identifying and assessing fitness to drive, for upskilling health professionals (particularly GPs) to manage the complex issues around dementia and driving cessation, and for exploring cost-effective, timely ways to deliver such support to patients.
Total laryngectomy is considered the primary treatment modality for advanced laryngeal carcinoma. This study assessed the quality of life in patients after total laryngectomy, and ascertained whether quality of life is affected by socioeconomic status.
Forty-seven patients (20 state- and 27 private-sector) who underwent total laryngectomy between 1998 and 2014 responded to the University of Washington Quality of Life Questionnaire, the Voice-Related Quality of Life Questionnaire and the Brief Illness Perception Questionnaire.
Significant differences were found in socioeconomic status between state- and private-sector patients (p < 0.001). There was no significant difference in overall quality of life between groups (p = 0.210). State-sector patients had significantly higher Voice-Related Quality of Life Questionnaire scores (p = 0.043). Perception of illness did not differ significantly between groups.
Overall quality of life after total laryngectomy appears to be similar in patients from different socioeconomic backgrounds. However, patients from lower socioeconomic circumstances have better voice-related quality of life. The results illustrate the importance of including socioeconomic status when reporting voice outcomes in total laryngectomy patients.
To evaluate the efficacy of multiple ultraviolet (UV) light decontamination devices in a radiology procedure room.
We compared the efficacy of 8 UV decontamination devices with a 4-minute UV exposure time in reducing recovery of methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), and Clostridium difficile spores on steel disk carriers placed at 5 sites on a computed tomography patient table. Analysis of variance was used to compare reductions for the different devices. A spectrometer was used to obtain irradiance measurements for the devices.
Four standard vertical-tower low-pressure mercury devices achieved reductions of 2 log10 CFU or greater in VRE and MRSA and ~1 log10 CFU reductions in C. difficile spores, whereas a pulsed-xenon device resulted in smaller reductions in the pathogens (P < .001). In comparison to the vertical-tower low-pressure mercury devices, equal or greater reductions in the pathogens were achieved by 3 nonstandard low-pressure mercury devices that included either adjustable bulbs that could be oriented directly over the exam table, a robotic base allowing movement along the side of the table during operation, or 3 vertical towers operated simultaneously. The low-pressure mercury devices produced primarily UV-C light, whereas the pulsed-xenon device produced primarily UV-A and UV-B light. The time required to move the devices from the corner of the room and set up for operation varied from 18 to 59 seconds.
Many currently available UV devices could provide an effective and efficient adjunct to manual cleaning and disinfection in radiology procedure rooms.
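As a sketch of the outcome metric and comparison used in this study, the snippet below computes log10 CFU reductions per device and compares devices with one-way ANOVA. All CFU counts and device names are fabricated; the study tested 8 devices at 5 carrier sites.

```python
# Sketch: log10 CFU reduction per device, compared with one-way ANOVA.
import numpy as np
from scipy.stats import f_oneway

baseline_cfu = 1e5                                # inoculum recovered with no UV

# Post-exposure CFU counts at 5 carrier sites for three hypothetical devices.
device_counts = {
    "tower_mercury": np.array([900, 1200, 700, 1000, 850]),
    "robotic_mercury": np.array([400, 600, 350, 500, 450]),
    "pulsed_xenon": np.array([30000, 25000, 40000, 35000, 28000]),
}

log_reductions = {name: np.log10(baseline_cfu / cfu)
                  for name, cfu in device_counts.items()}
for name, red in log_reductions.items():
    print(f"{name}: mean log10 reduction = {red.mean():.2f}")

stat, p = f_oneway(*log_reductions.values())      # compare devices
print(f"ANOVA: F={stat:.1f}, P={p:.4f}")
```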
With the recent discovery of a dozen dusty star-forming galaxies and around 30 quasars at z > 5 that are hyper-luminous in the infrared (μL_IR > 10^13 L⊙, where μ is a lensing magnification factor), the possibility has opened up for SPICA, the proposed ESA M5 mid-/far-infrared mission, to extend its spectroscopic studies toward the epoch of reionisation and beyond. In this paper, we examine the feasibility and scientific potential of such observations with SPICA’s far-infrared spectrometer SAFARI, which will probe a spectral range (35–230 μm) that will be unexplored by ALMA and JWST. Our simulations show that SAFARI is capable of delivering good-quality spectra for hyper-luminous infrared galaxies at z = 5–10, allowing us to sample spectral features in the rest-frame mid-infrared and to investigate a host of key scientific issues, such as the relative importance of star formation versus AGN, the hardness of the radiation field, the level of chemical enrichment, and the properties of the molecular gas. From a broader perspective, SAFARI offers the potential to open up a new frontier in the study of the early Universe, providing access to uniquely powerful spectral features for probing first-generation objects, such as the key cooling lines of low-metallicity or metal-free forming galaxies (fine-structure and H2 lines) and emission features of solid compounds freshly synthesised by Population III supernovae. Ultimately, SAFARI’s ability to explore the high-redshift Universe will be determined by the availability of sufficiently bright targets (whether intrinsically luminous or gravitationally lensed). With its launch expected around 2030, SPICA is ideally positioned to take full advantage of upcoming wide-field surveys such as LSST, SKA, Euclid, and WFIRST, which are likely to provide extraordinary targets for SAFARI.
In 4 hospitals, we demonstrated frequent dispersal of fluorescent tracer and fluoroquinolone-resistant gram-negative bacilli from sink drains to sink bowls and to surfaces outside the bowl. Fluorescent tracer dispersal correlated inversely with the depth of the sink bowl. Modifications in sink design could substantially reduce the risk for pathogen dissemination.
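The reported inverse correlation between bowl depth and dispersal could be quantified as in this minimal sketch; the depths and dispersal counts are invented for illustration only.

```python
# Sketch: inverse association between sink-bowl depth and tracer dispersal.
from scipy.stats import spearmanr

bowl_depth_cm = [10, 13, 15, 18, 20, 23, 25, 28]      # hypothetical sink depths
dispersal_events = [14, 12, 11, 9, 7, 5, 4, 2]        # tracer dispersals observed

rho, p = spearmanr(bowl_depth_cm, dispersal_events)
print(f"Spearman rho={rho:.2f}, P={p:.3f}")           # negative rho = inverse link
```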