Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 Joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145, 350]$ MHz.
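For readers who want to apply the reported fit, here is a minimal sketch (not from the paper) that evaluates the best-fit average attenuation length at a given frequency and propagates the quoted uncertainties assuming the intercept and slope errors are independent, an assumption, since the fit covariance is not given in the abstract.

```python
import math

A, SIGMA_A = 1154.0, 121.0   # intercept (m) and its quoted uncertainty
B, SIGMA_B = 0.81, 0.14      # slope (m per MHz) and its quoted uncertainty

def attenuation_length(freq_mhz: float) -> tuple[float, float]:
    """Return (<L_alpha> in m, 1-sigma uncertainty) within the fitted band."""
    if not 145.0 <= freq_mhz <= 350.0:
        raise ValueError("fit is reported only for 145-350 MHz")
    value = A - B * freq_mhz
    # Uncorrelated error propagation; the intercept-slope covariance is
    # not reported here, so this likely overstates the true uncertainty.
    sigma = math.hypot(SIGMA_A, SIGMA_B * freq_mhz)
    return value, sigma

print(attenuation_length(200.0))  # ~ (992.0 m, +/- ~124 m)
```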
This study examines trends in our head and neck cancer patient population over the past 5 years, with an emphasis on the past 2 years, to evaluate how the coronavirus disease 2019 (COVID-19) pandemic has affected disparities in, and availability of, care for patients, especially those living in rural areas. An additional aim is to identify existing disparities at our institution in the treatment of head and neck cancer patients and determine solutions to improve patient care.
Materials and Methods:
A retrospective chart review was performed to identify patients who were consulted and subsequently treated with at least one fraction of radiation therapy at our institution with palliative or curative intent. Patient demographic information was collected, including hometown, distance from the cancer centre based on zip codes, insurance information, and type of appointment (in-person or telehealth). Rural–urban continuum codes were used to determine rurality.
Results:
A total of 490 head and neck cancer patients (n = 490) were treated from 2017 to 2021. When broken down by year, there were no significant trends in the patient population regarding travel distance or rurality. Roughly 20–30% of our patients live in rural areas, and about 30% have a commute of more than 50 miles for radiation treatment. A majority of our patients rely on public insurance (68%), and a small percentage are uninsured (4%). Telehealth visits were rare prior to 2019, rising to 5 visits in 2020 and 2 in 2021.
Conclusions:
Head and neck cancer patients, regardless of rurality or distance from a cancer centre, may present with symptoms alarming enough to prompt them to seek medical attention despite limitations and difficulties, even during the COVID-19 pandemic in 2020. However, providers must be aware of the potential disparities that exist in the rural population and seek to address them.
To determine associations of alcohol use with cognitive aging among middle-aged men.
Method:
In total, 1,608 male twins (mean age 57 years at baseline) participated in up to three visits over 12 years, from 2003–2007 to 2016–2019. Participants were classified into six groups based on current and past self-reported alcohol use: lifetime abstainers, former drinkers, very light (1–4 drinks in past 14 days), light (5–14 drinks), moderate (15–28 drinks), and at-risk drinkers (>28 drinks in past 14 days). Linear mixed-effects regressions modeled cognitive trajectories by alcohol group, with time-based models evaluating rate of decline as a function of baseline alcohol use, and age-based models evaluating age-related differences in performance by current alcohol use. Analyses used standardized cognitive domain factor scores and adjusted for sociodemographic and health-related factors.
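As an illustration of the grouping logic, the sketch below encodes the six categories described above; the function name and input fields are hypothetical, and the handling of current drinkers reporting zero drinks in the past 14 days is not specified in the abstract.

```python
# Hypothetical sketch of the six-group classification described above.
# Thresholds follow the abstract (drinks in the past 14 days).

def classify_drinker(current_drinker: bool, ever_drank: bool,
                     drinks_past_14_days: int = 0) -> str:
    if not current_drinker:
        return "former drinker" if ever_drank else "lifetime abstainer"
    if drinks_past_14_days <= 4:
        return "very light"   # 1-4 drinks
    if drinks_past_14_days <= 14:
        return "light"        # 5-14 drinks
    if drinks_past_14_days <= 28:
        return "moderate"     # 15-28 drinks
    return "at-risk"          # >28 drinks

print(classify_drinker(True, True, 20))  # "moderate"
```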
Results:
Performance decreased over time in all domains. Relative to very light drinkers, former drinkers showed worse verbal fluency performance, by –0.21 SD (95% CI –0.35, –0.07), and at-risk drinkers showed faster working memory decline, by 0.14 SD (95% CI 0.02, –0.20) per decade. There was no evidence of protective associations of light/moderate drinking on rate of decline. In age-based models, light drinkers displayed better memory performance at advanced ages than very light drinkers (+0.14 SD per 10 years of older age; 95% CI 0.02, 0.20), likely attributable to residual confounding or reverse association.
Conclusions:
Alcohol consumption showed minimal associations with cognitive aging among middle-aged men. Stronger associations of alcohol with cognitive aging may become apparent at older ages, when cognitive abilities decline more rapidly.
To successfully address large-scale public health threats such as the novel coronavirus outbreak, policymakers need to limit feelings of fear that threaten social order and political stability. We study how policy responses to an infectious disease affect mass fear using data from a survey experiment conducted on a representative sample of the adult population in the USA (N = 5,461). We find that fear is affected strongly by the final policy outcome, mildly by the severity of the initial outbreak, and minimally by policy response type and rapidity. These results hold across alternative measures of fear and various subgroups of individuals regardless of their level of exposure to coronavirus, knowledge of the virus, and several other theoretically relevant characteristics. Remarkably, despite accumulating evidence of intense partisan conflict over pandemic-related attitudes and behaviors, we show that effective government policy reduces fear among Democrats, Republicans, and Independents alike.
We quantified hospital-acquired COVID-19 during the early phases of the pandemic, and we evaluated solely temporal determinations of hospital acquisition.
Design:
Retrospective observational study during the early phases of the COVID-19 pandemic, March 1–November 30, 2020. We identified laboratory-detected SARS-CoV-2 from 30 days before admission through discharge. All episodes detected after hospital day 5 were categorized by chart review as community-acquired or unlikely hospital-acquired, or as possible or probable hospital-acquired.
Setting:
Two acute-care hospitals in Chicago, IL.
Patients:
All hospitalized patients, including those in an inpatient rehabilitation unit.
Interventions:
Each hospital implemented infection-control precautions soon after identifying COVID-19 cases, including patient- and staff-cohorting, universal masking, and restricted visitation policies.
Results:
Among 2,667 patients with SARS-CoV-2, detection before hospital day 6 was most common (n = 2,612; 98%); detection on days 6–14 was uncommon (n = 43; 1.6%); and detection after day 14 was rare (n = 16; 0.6%). By chart review, most episodes after day 5 were categorized as community-acquired, usually because SARS-CoV-2 had been detected at a prior healthcare facility (68% of cases on days 6–14; 53% of cases after day 14). Incidence of possible and probable hospital-acquired cases, per 10,000 patient-days, was similar for ICU and non-ICU patients at Hospital A (1.2 vs 1.3, difference = 0.1; 95% CI, −2.8 to 3.0) and Hospital B (2.8 vs 1.2, difference = 1.6; 95% CI, −0.1 to 4.0).
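To make the rate arithmetic explicit, a small sketch follows; the case and patient-day counts are hypothetical, chosen only to reproduce one of the reported rates.

```python
def incidence_per_10k(cases: int, patient_days: int) -> float:
    """Cases per 10,000 patient-days."""
    return 10_000 * cases / patient_days

# Hypothetical counts: 6 possible/probable hospital-acquired cases
# over 50,000 ICU patient-days -> 1.2 per 10,000 patient-days.
print(incidence_per_10k(6, 50_000))  # 1.2
```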
Conclusions:
Most patients were protected by early and sustained application of infection-control precautions, modified to reduce COVID-19 transmission. Using solely temporal criteria to discriminate hospital- vs community-acquisition would have misclassified many “late-onset” SARS-CoV-2 positive episodes.
To evaluate a relatively new half face-piece powered air-purifying respirator (PAPR) device called the HALO (CleanSpace). We assessed its communication performance, its degree of respiratory protection, and its usability and comfort.
Design and setting:
This simulation study was conducted at the simulation center of the Royal Melbourne Hospital.
Participants:
In total, 8 volunteer healthcare workers participated in the study: 4 women and 4 men, comprising 3 nursing staff and 5 medical staff.
Methods:
We performed the modified rhyme test, outlined by the National Institute for Occupational Safety and Health (NIOSH), for the communication assessment. We conducted quantitative fit testing and simulated workplace protection factor studies to assess the degree of respiratory protection for participants at rest, during, and immediately after performing chest compressions. We also invited the participants to complete a usability and comfort survey.
Results:
The HALO PAPR met the NIOSH minimum standard for speech intelligibility, which was significantly improved with the addition of wireless communication headsets. The HALO provided a consistent and adequate level of respiratory protection at rest, during, and after chest compressions, regardless of the device power mode. It was rated favorably for its usability and comfort. However, participants criticized the difficulty of doffing and perceived communication interference.
Conclusions:
The HALO device can be considered an alternative to a filtering face-piece respirator. Thorough doffing training and mitigation planning to improve the device's communication performance are recommended. Further research is required to examine its clinical outcomes and barriers that may potentially affect patient or healthcare worker safety.
OBJECTIVES/GOALS: The Informatics Program in the Wake Forest CTSI is experiencing rapid growth. To accommodate an influx of both staff and clinical investigators, this program (1) invests resources in self-service tools to increase researcher capabilities, (2) automates resource-intensive activities, and (3) creates transparency of operational processes for researchers. METHODS/STUDY POPULATION: Self-service tools (immediate/automated): the i2b2 tool queries clinical data for feasibility numbers and cohort identification and provides demographic breakdowns of patient sets; the Data Puller tool pulls identified patient data (with IRB approval); and the SKAN NLP tool pulls aggregate numbers from over 3 million clinical notes. Automation: a custom-built tracking system automates parts of tracking requests for data and checking IRB protocols. Operational transparency: the Data Request Dashboard shows requesters information about their request and where it is in the process of being fulfilled, and the Data Quote tool, constructed leveraging the integrated CTSA informatics network, uses details of the request to estimate how long it will take to complete. RESULTS/ANTICIPATED RESULTS: i2b2 has had over 300 unique users each year; 80% are faculty or research staff, and 20% are clinicians or students. From 2017 to 2021 there were an average of 300 i2b2 queries and 45 Data Puller pulls each month. SKAN has had 58 unique users since its implementation in late 2020, averaging 5 new users per month. The automated data request tracking system took approximately 30 staff hours to create and saves an average of 4 hours of staff time per week. It also decreases human error by pulling/pushing information directly between systems. The Informatics program has received positive feedback from researchers who use the Data Request Dashboard. The Data Quote tool is being used to give standardized quotes to researchers. DISCUSSION/SIGNIFICANCE: Investing resources in developing and implementing self-service tools and operational transparency ultimately reduces overall resource consumption, saving staff and investigator time and effort. This enables the Informatics program to maintain a high standard of service while experiencing rapid growth.
This paper provides a large-scale, per Major League Baseball (MLB) game analysis of foul ball (FB) injury data and provides estimates of injury frequency and severity.
Objective:
This study’s goal was to quantify and describe the rate and type of FB injuries at MLB games.
Design:
This was a retrospective review of medical care reports for patients evaluated by on-site health care providers (HCPs) over a non-contiguous 11-year period (2005-2016). Data were obtained using Freedom of Information Act (FOIA) requests.
Setting:
Data were received from three US-based MLB stadiums.
Results:
The review reported 0.42–0.55 FB injuries per game that were serious enough to warrant presentation at a first aid center. This translated to a patients per 10,000 fans rate (PPTT) of 0.13–0.23. The transport to hospital rate (TTHR) was 0.02–0.39. FB injuries frequently required analgesics but were overwhelmingly minor and occurred less often than non-FB traumatic injuries (5.2% versus 42%–49%). However, FB-injured fans were more likely to need higher levels of care and transport to hospital (TH) than people suffering other traumatic injuries at the ballpark. Contusions and head injuries were common, and FB-injured fans were often hit in the abdomen, upper extremity, face, or head. Finally, FB injuries appeared to increase over time, an increase that aligns with the sudden rise in the popularity of smartphones in the United States.
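A brief sketch of the per-10,000-fans rate used above follows; the attendance and injury counts are hypothetical, as the abstract reports only the resulting PPTT and TTHR ranges.

```python
def per_10k_fans(events: int, attendance: int) -> float:
    """Events per 10,000 fans in attendance."""
    return 10_000 * events / attendance

# Hypothetical: 65 foul-ball injuries over 5,000,000 total fans
# gives PPTT = 0.13; the same formula with hospital transports
# as the numerator gives TTHR.
print(per_10k_fans(65, 5_000_000))  # 0.13
```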
Conclusions and Relevance:
These data suggest that in roughly every two or three MLB games, a foul ball causes a serious enough injury that a fan seeks medical attention. This rate is high enough to warrant attention, but is comparable in frequency to other diagnostic categories. Assessing the risk to fans from FBs remains difficult, but with access to uniform data, researchers could answer persistent questions that would lead to actionable changes and help guide public policy towards safer stadiums.
Research on complications with peripherally inserted central catheter (PICC) lines that are placed for the treatment of prosthetic joint infection (PJI) after total hip arthroplasty (THA) and total knee arthroplasty (TKA) is scarce. We investigated the timing, frequency, and risk factors for PICC complications during treatment of PJI after THA and TKA.
Methods:
We retrospectively queried an institutional database for THA and TKA patients from January 2015 through December 2020 who developed a PJI and required PICC placement at an academic, tertiary-care referral center.
Results:
The study included 889 patients (48.3% female) with a mean age of 64.6 years (range, 18.7–95.2) who underwent 435 THAs and 454 TKAs that were revised for PJI. The cohort had 275 ED visits within 90 days (30.9%), of which 51 (18.5%) were PICC-related. The average time from discharge to a PICC-related ED visit was 26.2 days (range, 0.3–89.4). The most common reasons for a 90-day ED visit were issues related to the joint replacement or wound site (musculoskeletal, or MSK; n = 116, 42.2%) and PICC complaints (n = 51, 18.5%). A multivariable logistic regression demonstrated that non-White race (odds ratio [OR], 2.24; 95% confidence interval [CI], 1.24–4.04; P = .007) and younger age (OR, 0.98; 95% CI, 0.95–1.00; P = .035) were associated with PICC-related ED visits. Malposition/readjustment (41.2%) and occlusion (35.3%) were the most common PICC complications leading to ED presentation.
Conclusions:
PICC complications are common after PJI treatment, accounting for nearly 20% of 90-day ED visits.
Alzheimer’s disease (AD) is highly heritable, and AD polygenic risk scores (AD-PRSs) have been derived from genome-wide association studies. However, the nature of genetic influences very early in the disease process is still not well understood. Here we tested the hypothesis that AD-PRSs would be associated with changes in episodic memory and executive function across late midlife in men who were cognitively unimpaired at their baseline midlife assessment.
Method:
We examined 1168 men in the Vietnam Era Twin Study of Aging (VETSA) who were cognitively normal (CN) at their first of up to three assessments across 12 years (mean ages 56, 62, and 68). Latent growth models of episodic memory and executive function were based on 6–7 tests/subtests. AD-PRSs were based on Kunkle et al. (Nature Genetics, 51, 414–430, 2019), at the p < 5 × 10⁻⁸ threshold.
Results:
AD-PRSs were correlated with linear slopes of change for both cognitive abilities. Men with higher AD-PRSs had steeper declines in both memory (r = −.19, 95% CI [−.35, −.03]) and executive functioning (r = −.27, 95% CI [−.49, −.05]). Associations appeared driven by a combination of APOE and non-APOE genetic influences.
Conclusions:
Memory is most characteristically impaired in AD, but executive functions are among the first cognitive abilities to decline in midlife in normal aging. This study is among the first to demonstrate that this early decline also relates to AD genetic influences, even in men who were CN at baseline.
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about their spectrum of psychopathology in adulthood.
Methods
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
Results
The frequencies of psychopathology were consistently higher for the CNV group compared to general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%) and personality disorders (73%), as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs (odds ratio [OR] = 4.67, 95% confidence interval [CI] 1.32–16.51; p = 0.017) and psychotic disorders (OR = 6.8, 95% CI 1.3–36.3; p = 0.025) occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), and musculoskeletal, immunological, and endocrine diseases.
Conclusions
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
In contrast to the well-described effects of early intervention (EI) services for youth-onset psychosis, the potential benefits of the intervention for adult-onset psychosis are uncertain. This paper aims to examine the effectiveness of EI on functioning and symptomatic improvement in adult-onset psychosis, and the optimal duration of the intervention.
Methods
In total, 360 psychosis patients aged 26–55 years were randomized to receive either standard care (SC, n = 120) or case management for 2 years (2-year EI, n = 120) or 4 years (4-year EI, n = 120) in a 4-year rater-masked, parallel-group, superiority, randomized controlled trial of treatment effectiveness (Clinicaltrials.gov: NCT00919620). Primary (i.e. social and occupational functioning) and secondary outcomes (i.e. positive and negative symptoms, and quality of life) were assessed at baseline, at 6 months, and yearly for 4 years.
Results
Compared with SC, patients with 4-year EI had better Role Functioning Scale (RFS) immediate [interaction estimate = 0.008, 95% confidence interval (CI) = 0.001–0.014, p = 0.02] and extended social network (interaction estimate = 0.011, 95% CI = 0.004–0.018, p = 0.003) scores. Specifically, these improvements were observed in the first 2 years. Compared with the 2-year EI group, the 4-year EI group had better RFS total (p = 0.01), immediate (p = 0.01), and extended social network (p = 0.05) scores at the fourth year. Meanwhile, the 4-year (p = 0.02) and 2-year EI (p = 0.004) groups had less severe symptoms than the SC group at the first year.
Conclusions
Specialized EI treatment for psychosis patients aged 26–55 should be provided for at least the initial 2 years of illness. Further treatment up to 4 years conferred little additional benefit in this age range over the course of the study.
Liquid-electron microscopy (EM), the room-temperature correlate to cryo-EM, is a rapidly growing field providing high-resolution insights into macromolecules in solution. Here, we describe how liquid-EM experiments can incorporate automated tools to propel the field to new heights. We demonstrate fresh workflows for specimen preparation, data collection, and computing processes to assess biological structures in liquid. Adeno-associated virus (AAV) and the SARS-CoV-2 nucleocapsid (N) protein were used as model systems to highlight the technical advances. These complexes were selected based on their major differences in size and natural symmetry. AAV is a highly symmetric, icosahedral assembly with a particle diameter of ~25 nm. At the other end of the spectrum, N protein is an asymmetric monomer or dimer with dimensions of approximately 5–7 nm, depending upon its oligomerization state. Equally important, both AAV and N protein are popular subjects in biomedical research due to their high value in vaccine development and therapeutic efforts against COVID-19. Overall, we demonstrate how automated practices in liquid-EM can be used to decode molecules of interest for human health and disease.
Antibiotic overuse is high in patients hospitalized with coronavirus disease 2019 (COVID-19) despite a low documented prevalence of bacterial infections in many studies. In this study evaluating 65 COVID-19 patients in the intensive care unit, empiric broad-spectrum antibiotics were often overutilized with an inertia to de-escalate despite negative culture results.
Approximately one-third of individuals in a major depressive episode will not achieve sustained remission despite multiple, well-delivered treatments. These patients experience prolonged suffering and disproportionately utilize mental and general health care resources. The recently proposed clinical heuristic of ‘difficult-to-treat depression’ (DTD) aims to broaden our understanding and focus attention on the identification, clinical management, treatment selection, and outcomes of such individuals. Clinical trial methodologies developed to detect short-term therapeutic effects in treatment-responsive populations may not be appropriate in DTD. This report reviews three essential challenges for clinical intervention research in DTD: (1) how to define and subtype this heterogeneous group of patients; (2) how, when, and by what methods to select, acquire, compile, and interpret clinically meaningful outcome metrics; and (3) how to choose among alternative clinical trial design options to promote causal inference and generalizability. The boundaries of DTD are uncertain, and an evidence-based taxonomy and reliable assessment tools are preconditions for clinical research and subtyping. Traditional outcome metrics in treatment-responsive depression may not apply to DTD, as they largely reflect only short-term symptomatic change and do not incorporate durability of benefit, side-effect burden, or sustained impact on quality of life or daily function. Trial methodology will also require modification, as trials will likely be of longer duration to examine sustained impact, raising complex issues regarding control group selection, blinding and its integrity, and concomitant treatments.
Academic discovery in biomedicine is a growing enterprise with tens of billions of dollars in research funding available to universities and hospitals. Protecting and optimizing the resultant intellectual property is required in order for the discoveries to have an impact on society. To achieve that, institutions must create a multidisciplinary, collaborative system of review and support, and utilize connections to industry partners. In this study, we outline the efforts of Case Western Reserve University, coordinated through its Clinical and Translational Science Collaborative (CTSC), to promote entrepreneurial culture, and achieve goals of product development and startup formation for biomedical and population health discoveries arising from the academic ecosystem in Cleveland. The CTSC Office of Translation and Innovation, with the university’s Technology Transfer Office (TTO), helps identify and derisk promising IP while building interdisciplinary project teams to optimize the assets through key preclinical derisking steps. The benefits of coordinating funding across multiple programs, assuring dedicated project management to oversee optimizing the IP, and ensuring training to help improve proposals and encourage an entrepreneurial culture, are discussed in the context of a case study of therapeutic assets, the Council to Advance Human Health. This case study highlights best practices in academic innovation.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
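As a flavor of what such a model involves, here is a minimal, self-contained sketch (not the authors' implementation) that estimates a daily discrete-time transition matrix between illustrative clinical states from per-patient state sequences; the state names and demo data are assumptions for illustration only.

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

STATES = ["ward", "icu", "discharged", "dead"]

def transition_matrix(sequences: list[list[str]]) -> dict:
    """Estimate P(next state | current state) from day-by-day sequences."""
    counts = Counter()
    for seq in sequences:
        for a, b in pairwise(seq):  # consecutive daily assessments
            counts[(a, b)] += 1
    totals = Counter()
    for (a, _), n in counts.items():
        totals[a] += n
    # Normalize transition counts by the total transitions out of each state.
    return {(a, b): n / totals[a] for (a, b), n in counts.items()}

# Hypothetical demo: two patients' daily trajectories.
demo = [["ward", "ward", "icu", "icu", "ward", "discharged"],
        ["ward", "icu", "dead"]]
print(transition_matrix(demo))
```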
Yarkoni's analysis clearly articulates a number of concerns limiting the generalizability and explanatory power of psychological findings, many of which are compounded in infancy research. ManyBabies addresses these concerns via a radically collaborative, large-scale and open approach to research that is grounded in theory-building, committed to diversification, and focused on understanding sources of variation.
We interviewed 1,208 healthcare workers with positive SARS-CoV-2 tests between October 2020 and June 2021 to determine likely exposure sources. Overall, 689 (57.0%) had community exposures (479 from household members), 76 (6.3%) had hospital exposures (64 from other employees including 49 despite masking), 11 (0.9%) had community and hospital exposures, and 432 (35.8%) had no identifiable source of exposure.