Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
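A minimal sketch of the core PRS step described above, assuming hypothetical genotype dosages and per-SNP weights: score each individual as a weighted sum of allele counts and check whether the score separates BPD from MDD cases. This is not the consortium's pipeline; all names and data are synthetic and purely illustrative.

```python
# Sketch only: a PRS as a weighted sum of allele dosages, evaluated for
# case-case discrimination (BPD vs MDD). All inputs below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_ind, n_snp = 1_000, 5_000
dosages = rng.binomial(2, 0.3, size=(n_ind, n_snp)).astype(float)  # 0/1/2 allele counts
weights = rng.normal(0, 0.01, size=n_snp)   # per-SNP effect sizes (e.g. log odds ratios)
is_bpd = rng.integers(0, 2, size=n_ind)     # 1 = BPD case, 0 = MDD case (toy labels)

prs = dosages @ weights                     # polygenic risk score per individual
prs = (prs - prs.mean()) / prs.std()        # standardise across individuals

clf = LogisticRegression().fit(prs.reshape(-1, 1), is_bpd)
auc = roc_auc_score(is_bpd, clf.predict_proba(prs.reshape(-1, 1))[:, 1])
print(f"case-case discrimination AUC: {auc:.2f}")
```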
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily along a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
The purpose of the current study was to understand the prevalence and patterns of cannabinoid use among LTC residents across Canada. We gathered data on cannabinoid prescriptions among LTC residents for one year before and after recreational cannabis legalization. Multi-level modelling was used to examine the effects of demographic and diagnostic characteristics on rates of cannabinoid prescription over time. All prescriptions were for nabilone. There was a significant increase in the proportion of residents prescribed nabilone following the legalization of recreational cannabis in Canada. Residents with relatively more severe pain (based on the Minimum Data Set pain scale), a diagnosis of depression, or a diagnosis of an anxiety disorder were more likely to have received a nabilone prescription. Our results provide valuable information regarding the increasing use of synthetic cannabinoids in LTC. The implications for clinical practice and policy decision-makers are discussed.
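One way to set up the multi-level analysis described above is a mixed-effects logistic model with a random intercept per LTC home. The sketch below uses statsmodels' Bayesian mixed GLM as one such implementation; the column names and data are hypothetical and this is not the study's actual model.

```python
# Sketch only: a multilevel (mixed-effects) logistic model for a binary
# prescription outcome, with residents nested within LTC homes. Synthetic data,
# hypothetical column names; not the study's specification.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(8)
n = 2_000
df = pd.DataFrame({
    "prescribed": rng.integers(0, 2, n),     # nabilone prescription (0/1)
    "post_legal": rng.integers(0, 2, n),     # before vs after legalization
    "pain":       rng.integers(0, 4, n),     # MDS pain scale score
    "depression": rng.integers(0, 2, n),
    "anxiety":    rng.integers(0, 2, n),
    "home":       rng.integers(0, 50, n),    # LTC home identifier
})

model = BinomialBayesMixedGLM.from_formula(
    "prescribed ~ post_legal + pain + depression + anxiety",
    {"home": "0 + C(home)"},                 # random intercept by LTC home
    df,
)
result = model.fit_vb()                      # variational Bayes fit
print(result.summary())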
Legionellosis is a disease caused by the bacterium Legionella that most commonly presents as Legionnaires’ disease (LD), a severe form of pneumonia. From 2015 to 2019, an average of 438 LD cases per year were reported in Canada. However, it is believed that the actual number of cases is much higher, since LD may be underdiagnosed and underreported. The purpose of this study was to develop an estimate of the true incidence of illnesses, hospitalizations, and deaths associated with LD in Canada. Values were derived using a stochastic model, based on Canadian surveillance data from 2015 to 2019, which were scaled up to account for underdiagnosis and underreporting. Overall, there were an estimated 1,113 (90% CrI: 737–1,730) illnesses, 1,008 (90% CrI: 271–2,244) hospitalizations, and 34 (90% CrI: 4–86) deaths due to domestically acquired waterborne LD annually in Canada from 2015 to 2019. It was further estimated that only 36% of illnesses and 39% of hospitalizations and deaths were captured in surveillance, and that 22% of illnesses were caused by Legionella serogroups and species other than Legionella pneumophila serogroup 1 (non-Lp1). This study highlights the true burden and areas for improvement in Canada’s surveillance and detection of LD.
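A Monte Carlo sketch of the scale-up idea behind these estimates: divide reported cases by uncertain probabilities of being diagnosed and of being reported, then summarise the simulated distribution with a 90% credible interval. The priors below are assumptions chosen only to be roughly consistent with the abstract, not the study's parameters.

```python
# Monte Carlo sketch of under-diagnosis/under-reporting scale-up (assumed
# priors, not the study's model).
import numpy as np

rng = np.random.default_rng(1)
n_sim = 100_000
reported_cases = 438                      # average annual reported LD cases, 2015-2019

p_diagnosed = rng.beta(22, 28, n_sim)     # assumed prior, centred near 0.44
p_reported  = rng.beta(45, 5, n_sim)      # assumed prior, centred near 0.90

true_cases = reported_cases / (p_diagnosed * p_reported)
lo, med, hi = np.percentile(true_cases, [5, 50, 95])
print(f"estimated true annual illnesses: {med:.0f} (90% CrI {lo:.0f}-{hi:.0f})")
```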
Background: All critically ill patients are at risk for hospital-acquired bloodstream infection (HABSI). At any time, however, there is heterogeneity among patients in the ICU; some patients have the added complexity of end-of-life discussions. We sought to better understand the patients in our medical intensive care unit (MICU) with HABSIs that do and do not meet the NHSN definition for a central-line–associated bloodstream infection (CLABSI) event by evaluating for the presence of a do-not-resuscitate (DNR) order. Methods: The study was conducted at our 66-bed MICU at the Cleveland Clinic Main Campus between January 2021 and September 2022. Surveillance for HABSI, including determination of CLABSI, is performed prospectively according to the NHSN definition. The electronic health record was queried for each patient with a HABSI for the presence of a DNR order. DNR orders were categorized as follows: prevalent (DNR orders present at the time of admission to the MICU), incident (orders entered after admission to the MICU), or no DNR (for patients without an order at any time during their MICU stay). For incident orders, the time from order to HABSI was recorded. Time to event was calculated as days from ICU admission to HABSI. Results: During the observation period there were 36,477 MICU patient days and 4,815 admissions. There were 112 HABSIs, of which 48 (43%) were CLABSIs. Overall, 65 patients were categorized as incident DNR, 7 were categorized as prevalent DNR, and 40 were categorized as no DNR. For patients with an incident DNR order, 50 HABSIs occurred on the date of or before the order and 15 occurred after the order. In patients in whom HABSI occurred after the incident DNR order, the median number of days between DNR order and HABSI was 11 days (range, 1–69). Discussion: In our MICU, >50% of HABSIs and 60% of CLABSIs occurred in patients with a DNR order incident to their MICU stay. Interventions to prevent hospital-acquired bloodstream infection and the analysis of these events are inextricably linked to issues of end-of-life care for critically ill patients. Further exploration of patient characteristics easily obtainable from the EHR, such as DNR orders, is necessary to inform best practices for prevention and risk adjustment of bloodstream infection rates.
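The DNR categorisation described in the Methods can be expressed as a simple data-wrangling step. The sketch below uses hypothetical column names and toy dates; it only illustrates the prevalent/incident/no-DNR logic and the days-from-order-to-HABSI calculation.

```python
# Sketch only: classify DNR orders as prevalent (present at MICU admission),
# incident (entered after admission), or no DNR, and compute days from an
# incident order to the HABSI. Hypothetical columns, toy data.
import pandas as pd

df = pd.DataFrame({
    "icu_admit":  pd.to_datetime(["2021-02-01", "2021-03-10", "2021-04-05"]),
    "dnr_date":   pd.to_datetime(["2021-02-01", "2021-03-20", pd.NaT]),
    "habsi_date": pd.to_datetime(["2021-02-12", "2021-04-02", "2021-04-15"]),
})

def dnr_category(row):
    if pd.isna(row.dnr_date):
        return "no DNR"
    return "prevalent DNR" if row.dnr_date <= row.icu_admit else "incident DNR"

df["dnr_category"] = df.apply(dnr_category, axis=1)
incident = df["dnr_category"] == "incident DNR"
df.loc[incident, "days_dnr_to_habsi"] = (df.habsi_date - df.dnr_date).dt.days
print(df[["dnr_category", "days_dnr_to_habsi"]])
```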
Infants and children born with CHD are at significant risk for neurodevelopmental delays and abnormalities. Individualised developmental care is widely recognised as best practice to support early neurodevelopment for medically fragile infants born premature or requiring surgical intervention after birth. However, wide variability in clinical practice is consistently demonstrated in units caring for infants with CHD. The Cardiac Newborn Neuroprotective Network, a Special Interest Group of the Cardiac Neurodevelopmental Outcome Collaborative, formed a working group of experts to create an evidence-based developmental care pathway to guide clinical practice in hospital settings caring for infants with CHD. The clinical pathway, “Developmental Care Pathway for Hospitalized Infants with Congenital Heart Disease,” includes recommendations for standardised developmental assessment, parent mental health screening, and the implementation of a daily developmental care bundle, which incorporates individualised assessments and interventions tailored to meet the needs of this unique infant population and their families. Hospitals caring for infants with CHD are encouraged to adopt this developmental care pathway and track metrics and outcomes using a quality improvement framework.
This article is a clinical guide that discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations set out herein. They are based on empirical evidence and expert opinion; this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy, while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Background: The NHSN parameter estimate for predicted number of central-line–associated bloodstream infection (CLABSI) is the same for gastroenterology wards as other specialty wards, such as behavioral health and gerontology. We conducted this study to contribute to the body of knowledge surrounding the risk for hospital-acquired bloodstream infection (HABSI) in patients with and without hepatic failure. The Cleveland Clinic is a 1,200-bed, multispecialty hospital with a solid-organ transplant service. Patients with hepatic failure who do not require critical care are housed on 36-bed unit A. On unit A, 43% of patients are under the hepatology or gastroenterology service, while 51% are under general internal medicine. Overall, unit A has a high incidence of HABSI. Methods: Surveillance for HABSI and CLABSI is performed at the Cleveland Clinic per NHSN protocol. All patients with a midnight stay on unit A from January 2019 through September 2021 were dichotomized as having hepatic failure (yes or no) if they ever received the International Classification of Diseases Tenth Revision code for “hepatic failure, not elsewhere classified.” We joined the diagnostic code to patient days and central-line-days databases and summarized the data using Microsoft Excel software. We stratified the number of patients, patient days, device days, infection classification, and hospital length of stay by whether the patient had hepatic failure, and we compared the incidence of HABSI and CLABSI between the 2 groups using OpenEpi version 3.01 software. Results: We identified 72 HABSIs among 4,285 patients who stayed on unit A for 30,910 patient days during the study period. The incidences of HABSI in patients with and without hepatic failure were 39.0 and 13.9 per 10,000 patient days, respectively (P < .001). The incidence of CLABSI was 5.4 and 1.9 per 1,000 line days, respectively (P = .01). Patients with hepatic failure stayed longer (11.5 vs 5.9 days), yet the central-line utilization ratios were not substantially different (0.25 vs 0.24). Enterococcus was the most common pathogen involved in CLABSI in both groups (Table 2). Conclusions: Patients with hepatic failure experienced CLABSI more frequently than patients without hepatic failure, stayed longer in the hospital, and were less likely to have HABSI attributed to another primary focus of infection according to NHSN definitions. Although hepatic failure may be among the most severe conditions among patients in a gastroenterology ward, we have demonstrated that these units house a population uniquely susceptible to HABSI and CLABSI.
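The incidence comparison above can be reproduced in a few lines. In the sketch below, the split of events and patient days is back-calculated approximately from the reported rates (39.0 vs 13.9 per 10,000 patient days) and is illustrative only, not the study's raw data; the interval is a simple Wald confidence interval for the rate ratio.

```python
# Illustrative only: HABSI incidence per 10,000 patient days in two groups,
# with a Wald CI for the rate ratio. Event/patient-day split is approximate.
import numpy as np
from scipy.stats import norm

events  = np.array([45, 27])              # HABSIs: hepatic failure, no hepatic failure
pt_days = np.array([11_570, 19_340])      # patient days in each group

rates = events / pt_days * 10_000
log_rr = np.log(rates[0] / rates[1])
se = np.sqrt(1 / events[0] + 1 / events[1])
ci = np.exp(log_rr + np.array([-1.0, 1.0]) * norm.ppf(0.975) * se)
print(f"rates per 10,000 patient days: {rates.round(1)}; "
      f"rate ratio 95% CI: {ci.round(2)}")
```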
As clinical trials were rapidly initiated in response to the COVID-19 pandemic, Data and Safety Monitoring Boards (DSMBs) faced unique challenges overseeing trials of therapies never tested in a disease not yet characterized. Traditionally, individual DSMBs do not interact or have the benefit of seeing data from other accruing trials for an aggregated analysis to meaningfully interpret safety signals of similar therapeutics. In response, we developed a compliant DSMB Coordination (DSMBc) framework to allow the DSMB from one study investigating the use of SARS-CoV-2 convalescent plasma to treat COVID-19 to review data from similar ongoing studies for the purpose of safety monitoring.
Methods:
The DSMBc process included engagement of DSMB chairs and board members, execution of contractual agreements, secure data acquisition, generation of harmonized reports utilizing statistical graphics, and secure report sharing with DSMB members. Detailed process maps, a secure portal for managing DSMB reports, and templates for data sharing and confidentiality agreements were developed.
Results:
Four trials participated. Data from one trial were successfully harmonized with those of an ongoing trial. Harmonized reports allowing for visualization and drill-down into the data were presented to the ongoing trial’s DSMB. While DSMB deliberations are confidential, the Chair confirmed successful review of the harmonized report.
Conclusion:
It is feasible to coordinate DSMB reviews of multiple independent studies of a similar therapeutic in similar patient cohorts. The materials presented mitigate challenges to DSMBc and will help expand these initiatives so DSMBs may make more informed decisions with all available information.
Whole-genome sequencing (WGS) shotgun metagenomics (metagenomics) attempts to sequence the entire genetic content of a sample directly. Its diagnostic advantages lie in the ability to detect unsuspected, uncultivable, or very slow-growing organisms.
Objective:
To evaluate the clinical and economic effects of using WGS and metagenomics for outbreak management in a large metropolitan hospital.
Design:
Cost-effectiveness study.
Setting:
Intensive care unit and burn unit of large metropolitan hospital.
Patients:
Simulated intensive care unit and burn unit patients.
Methods:
We built a complex simulation model to estimate pathogen transmission, associated hospital costs, and quality-adjusted life years (QALYs) during a 32-month outbreak of carbapenem-resistant Acinetobacter baumannii (CRAB). Model parameters were determined using microbiology surveillance data, genome sequencing results, hospital admission databases, and local clinical knowledge. The model was calibrated to the actual pathogen spread within the intensive care unit and burn unit (scenario 1) and compared with early use of WGS (scenario 2) and early use of WGS and metagenomics (scenario 3) to determine their respective cost-effectiveness. Sensitivity analyses were performed to address model uncertainty.
Results:
On average compared with scenario 1, scenario 2 resulted in 14 fewer patients with CRAB, 59 additional QALYs, and $75,099 cost savings. Scenario 3, compared with scenario 1, resulted in 18 fewer patients with CRAB, 74 additional QALYs, and $93,822 in hospital cost savings. The likelihoods that scenario 2 and scenario 3 were cost-effective were 57% and 60%, respectively.
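The incremental comparison reported above, and the way a "likelihood of being cost-effective" can be read off a probabilistic analysis, are sketched below. The willingness-to-pay threshold and the simulated spread are assumptions for illustration, not outputs of the study's model.

```python
# Illustrative only: incremental analysis of scenario 2 vs scenario 1, and a
# probability of cost-effectiveness from simulated (cost, QALY) pairs via net
# monetary benefit. Threshold and spread are assumed, not study values.
import numpy as np

rng = np.random.default_rng(2)
wtp = 50_000                                  # assumed willingness to pay per QALY

delta_qalys, delta_costs = 59.0, -75_099.0    # scenario 2 vs 1: QALYs gained, cost change
if delta_qalys > 0 and delta_costs < 0:
    print("scenario 2 dominates the baseline (more QALYs at lower cost)")
else:
    print(f"ICER: ${delta_costs / delta_qalys:,.0f} per QALY gained")

# probabilistic sensitivity analysis with a wide, illustrative spread
sim_dq = rng.normal(delta_qalys, 120, 10_000)
sim_dc = rng.normal(delta_costs, 6_000_000, 10_000)
nmb = wtp * sim_dq - sim_dc                   # net monetary benefit per simulation
print(f"probability cost-effective at WTP ${wtp:,}/QALY: {np.mean(nmb > 0):.2f}")
```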
Conclusions:
The use of WGS and metagenomics in infection control processes was predicted to produce favorable economic and clinical outcomes.
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers 270 deg² of an area covered by the Dark Energy Survey, reaching a depth of 25–30 μJy beam⁻¹ rms at a spatial resolution of ~11–18 arcsec, resulting in a catalogue of ~220 000 sources, of which ~180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources, and also a high sensitivity to low surface brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
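A synthetic-data sketch of the kind of PGS association reported above: regress age at onset on a standardised polygenic score, so the slope is expressed in years per standard deviation of the score. The simulated effect size (−0.34) is borrowed from the abstract purely for illustration.

```python
# Synthetic-data sketch: age at onset regressed on a standardised PGS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 12_977
pgs = rng.normal(size=n)                        # standardised polygenic score
aao = 25 - 0.34 * pgs + rng.normal(0, 8, n)     # age at onset with the assumed effect

fit = sm.OLS(aao, sm.add_constant(pgs)).fit()
print(f"beta = {fit.params[1]:.2f} years per SD of PGS (s.e. = {fit.bse[1]:.2f})")
```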
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
China's National Reimbursement Drug List (NRDL) covers medicines that are included in national health insurance schemes. NRDL updates take into account evidence and recommendations of experts from the fields of medicine, health economics, pharmacy and health policy. A negotiation mechanism between the government and manufacturers was introduced in 2017 to include a more detailed evaluation and negotiation for high cost drugs. However, the values that are considered in NRDL decision making are not well-understood. This study aims to investigate the influence of available evidence and other factors on coverage decisions.
Methods
Outcomes of the 2017 and 2018 NRDL negotiations were analyzed. Logistic regression was used to investigate factors associated with listing decisions. Ordinary least squares and Tobit regression were used to investigate factors associated with negotiated price discounts. Independent variables were the presence of a published cost-effectiveness analysis (CEA), the incremental cost-effectiveness ratio (ICER), disease area, burden of disease (disability-adjusted life years), company ownership (domestic or foreign) and regulatory approval year.
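A sketch of the two regressions described above, using hypothetical variable names and synthetic data: a logistic model for the listing decision and OLS for the price discount. A Tobit model for the censored discount would require a dedicated censored-regression implementation and is omitted here.

```python
# Sketch only: logistic regression for listing, OLS for price discount.
# Synthetic data, hypothetical variable names; not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 62
df = pd.DataFrame({
    "listed":               rng.integers(0, 2, n),     # 1 = listed on the NRDL
    "discount":             rng.uniform(0.1, 0.7, n),  # negotiated price discount
    "has_cea":              rng.integers(0, 2, n),     # any published CEA study
    "oncology":             rng.integers(0, 2, n),
    "foreign_owned":        rng.integers(0, 2, n),
    "years_since_approval": rng.integers(1, 10, n),
})

listing_fit = smf.logit(
    "listed ~ has_cea + oncology + foreign_owned + years_since_approval", data=df
).fit(disp=0)
discount_fit = smf.ols(
    "discount ~ has_cea + oncology + foreign_owned + years_since_approval", data=df
).fit()
print(listing_fit.params.round(2))
print(discount_fit.params.round(2))
```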
Results
Twenty-eight out of sixty-two negotiated drugs had one or more published CEA studies in the English or Chinese language, although neither the presence of a study nor the central ICER estimates were predictive of price discount or listing. A longer time since regulatory approval was a significant predictor of listing (p < 0.05). Disease area (oncology) and ownership (foreign) were significant predictors of a higher price discount (p < 0.01).
Conclusions
The NRDL plays a key role in providing access to healthcare for the 95 percent of China's population that is covered by public insurance. We found several factors that were associated with reimbursement decisions. Many of the medicines in the NRDL negotiation have CEA evidence, although the role of CEA in reimbursement decision making in China remains inconclusive.
Chronic obstructive pulmonary disease (COPD) is a leading cause of morbidity and mortality in China. However, early identification of patients with COPD in the community is challenging. This study used a real-world survey of the Chinese urban adult population to estimate the prevalence of COPD diagnosis or COPD-risk, examine the health outcomes and healthcare resource use of these groups, and investigate the sociodemographic factors associated with these statuses.
Methods
Respondents to the 2017 National Health and Wellness Survey in China (n = 19,994) were classified into: COPD (diagnosed), COPD-risk (undiagnosed), and control (undiagnosed, not at-risk) using their self-reported diagnosis and Lung Function Questionnaire (LFQ) score. These groups were compared by healthcare resource use and health outcomes (EuroQol [EQ-5D] and Work Productivity and Activity Impairment questionnaires). Factors associated with being in these groups were investigated using pairwise comparisons (t-tests and chi-square tests) and multivariable logistic regression.
Results
In total, 3,320 respondents (16.6%) had a suspected risk of COPD but did not report receiving a diagnosis. This was projected to 105.3 million people (16.9% of urban adults). Relative to the controls, COPD-risk and COPD-diagnosed respondents had higher healthcare resource use, lower productivity, and lower health-related quality of life (HRQoL) (p < 0.05). Age, smoking, alcohol consumption, weight, exercise, comorbidities, gender, education, employment, and air pollution were associated with increased odds of COPD-risk relative to the controls (p < 0.05).
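The projection step above (16.6% of respondents corresponding to 16.9% of urban adults) reflects survey weighting. The sketch below uses synthetic weights, and the urban adult total of roughly 623 million is simply what is implied by the abstract's own figures (105.3 million at 16.9%); both are assumptions.

```python
# Sketch only: projecting a weighted survey proportion to a population count.
import numpy as np

rng = np.random.default_rng(7)
n_respondents = 19_994
at_risk = rng.random(n_respondents) < 0.166          # 16.6% flagged by the LFQ (toy data)
weights = rng.uniform(0.5, 1.5, n_respondents)       # hypothetical post-stratification weights

weighted_prop = np.average(at_risk, weights=weights)
urban_adults = 623e6                                 # implied population base (assumption)
print(f"weighted at-risk proportion: {weighted_prop:.3f}; "
      f"projected: {weighted_prop * urban_adults / 1e6:.1f} million adults")
```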
Conclusions
A substantial group of individuals, undiagnosed, but with a risk of COPD, have impaired HRQoL, lower productivity, and elevated healthcare resource use. A range of sociodemographic factors are predictive of COPD risk, which may support targeted screening. Case-detection tools such as the LFQ may offer a convenient approach for identifying individuals for further definitive testing and appropriate treatment in China.
At its late Pleistocene maximum, the Laurentide Ice Sheet was the largest ice mass on Earth and a key player in the modulation of global climate and sea level. At the same time, this temperate ice sheet was itself sensitive to climate, and high-magnitude fluctuations in ice extent, reconstructed from relict glacial deposits, reflect past changes in atmospheric temperature. Here, we present a cosmogenic 10Be surface-exposure chronology for the Berlin moraines in the White Mountains of northern New Hampshire, USA, which supports the model that deglaciation of New England was interrupted by a pronounced advance of ice during the Bølling-Allerød. Together with recalculated 10Be ages from the southern New England coast, the expanded White Mountains moraine chronology also brackets the timing of ice sheet retreat in this sector of the Laurentide. In conjunction with existing chronological data, the moraine ages presented here suggest that deglaciation was widespread during the Heinrich Stadial 1 event (~18–14.7 ka) despite apparently cold marine conditions in the adjacent North Atlantic. As part of the White Mountains moraine system, the Berlin chronology also places a new terrestrial constraint on the former glacial configuration during the marine incursion of the St. Lawrence River valley north of the White Mountains.
We evaluated the impact of an electronic health record-based 72-hour antimicrobial time-out (ATO) on antimicrobial utilization. We observed that 6 hours after the ATO, 21% of empiric antimicrobials were discontinued or de-escalated. There was a significant reduction in the duration of antimicrobial therapy but no impact on overall antimicrobial usage metrics.
In the area of electromagnetic metrology, binary coded excitation signals are becoming increasingly important, and various binary coded sequences are available. The measurement approach is to assess the impulse response function of a device under test by correlating the response signal with the excitation signal. To achieve high measurement reproducibility as well as a high dynamic range, the generated binary coded signals must exhibit low noise. In this contribution, a low-noise signal generator realized with a field-programmable gate array is presented. The performance of different kinds of binary coded excitation signals and of different correlation concepts has been investigated in practice. With a chip rate of 5 Gchip/s, the generator can be utilized for ultra-wideband applications. To allow for low-noise and long-term stable signal generation, a new clock generator concept is presented and results of phase noise measurements are shown. Furthermore, an algorithm for fast and precise shifting of the time lag between two binary coded signals, used to correlate excitation and response signals with a hardware correlator, is presented. Finally, the realized demonstrator system is tested using two commonly used types of binary coded sequences.
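A software analogue of the correlation-based measurement principle described above (not the paper's FPGA hardware correlator): excite a device under test with a maximum-length binary sequence and recover its impulse response by circular cross-correlation. The "device" here is a made-up 4-tap FIR filter plus noise.

```python
# Sketch only: impulse-response estimation by correlating the response of a
# (hypothetical) device under test with an MLS excitation signal.
import numpy as np
from scipy.signal import max_len_seq

seq, _ = max_len_seq(10)                  # 1023-chip MLS with values 0/1
x = 2.0 * seq - 1.0                       # map to a +/-1 excitation signal

h_true = np.array([1.0, 0.6, 0.25, 0.1])  # hypothetical impulse response
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h_true, len(x))))  # circular convolution
y += np.random.default_rng(5).normal(0, 0.01, len(y))                 # measurement noise

# Circular cross-correlation via FFT; dividing by the sequence length recovers
# the impulse response up to a small DC offset from the MLS autocorrelation.
h_est = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x)))) / len(x)
print(h_est[:4].round(3), "vs", h_true)
```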
Contaminated hands of healthcare workers (HCWs) are an important source of transmission of healthcare-associated infections. Alcohol-based hand sanitizers, while effective, do not provide sustained antimicrobial activity. The objective of this study was to compare the immediate and persistent activity of 2 hand hygiene products (ethanol [61% w/v] plus chlorhexidine gluconate [CHG; 1.0% solution] and ethanol only [70% v/v]) when used in an intensive care unit (ICU).
DESIGN
Prospective, randomized, double-blinded, crossover study
SETTING
Three ICUs at a large teaching hospital
PARTICIPANTS
In total, 51 HCWs involved in direct patient care were enrolled in and completed the study.
METHODS
All HCWs were randomized 1:1 to either product. Hand prints were obtained immediately after the product was applied and again after spending 4–7 minutes in the ICU common areas prior to entering a patient room or leaving the area. The numbers of aerobic colony-forming units (CFU) were compared for the 2 groups after log transformation. Each participant tested the alternative product after a 3-day washout period.
RESULTS
On bare hands, use of ethanol plus CHG was associated with significantly lower recovery of aerobic CFU, both immediately after use (0.27 ± 0.05 and 0.88 ± 0.08 log10 CFU; P = .035) and after spending time in ICU common areas (1.81 ± 0.07 and 2.17 ± 0.05 log10 CFU; P < .0001). Both antiseptics were well tolerated by HCWs.
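A synthetic sketch of the comparison described in the Methods: CFU counts are log10-transformed before the two products are compared. Because the crossover design gives each HCW a value under both products, a paired test is the natural choice assumed here; the study's exact statistical model may differ.

```python
# Sketch only: log10-transform CFU counts and compare the two products with a
# paired t-test (assumed analysis). Counts below are synthetic.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(6)
n_hcw = 51
cfu_chg     = rng.lognormal(mean=0.6, sigma=0.5, size=n_hcw)   # ethanol + CHG arm
cfu_ethanol = rng.lognormal(mean=2.0, sigma=0.5, size=n_hcw)   # ethanol-only arm

log_chg, log_eth = np.log10(cfu_chg), np.log10(cfu_ethanol)
t_stat, p_value = ttest_rel(log_chg, log_eth)
print(f"mean log10 CFU: {log_chg.mean():.2f} vs {log_eth.mean():.2f}, p = {p_value:.3g}")
```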
CONCLUSIONS
In comparison to the ethanol-only product, the ethanol plus CHG sanitizer was associated with significantly lower aerobic bacterial counts on hands of HCWs, both immediately after use and after spending time in ICU common areas.