We present a re-discovery of G278.94+1.35a as possibly one of the largest known Galactic supernova remnants (SNRs), which we name Diprotodon. While previously established as a Galactic SNR, Diprotodon is visible in our new Evolutionary Map of the Universe (EMU) and GaLactic and Extragalactic All-sky MWA (GLEAM) radio continuum images at an angular size of $3{.\!\!^\circ}33\times3{.\!\!^\circ}23$, much larger than previously measured. At the previously suggested distance of 2.7 kpc, this implies a diameter of 157$\times$152 pc. This size would qualify Diprotodon as the largest known SNR and pushes our estimates of SNR sizes to the upper limits. We investigate the environment in which the SNR is located and examine various scenarios that might explain such a large and relatively bright SNR appearance. We find that Diprotodon is most likely at a much closer distance of $\sim$1 kpc, implying a diameter of 58$\times$56 pc and that it is in the radiative evolutionary phase. We also present a new Fermi-LAT data analysis that confirms the angular extent of the SNR in gamma rays. The origin of the high-energy emission remains somewhat puzzling, and the scenarios we explore reveal new puzzles, given this unexpected and unique observation of a seemingly evolved SNR having a hard GeV spectrum with no breaks. We explore both leptonic and hadronic scenarios, as well as the possibility that the high-energy emission arises from the leftover particle population of a historic pulsar wind nebula.
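The quoted diameters follow from the small-angle relation $D \approx d\,\theta$, with $\theta$ in radians. A minimal sketch, purely illustrative, that reproduces the numbers above:

```python
import numpy as np

def physical_diameter(angular_deg: float, distance_pc: float) -> float:
    """Small-angle conversion of an angular size to a physical diameter."""
    return distance_pc * np.deg2rad(angular_deg)

# At the previously suggested 2.7 kpc distance:
print(physical_diameter(3.33, 2700), physical_diameter(3.23, 2700))  # ~157 x 152 pc

# At the revised ~1 kpc distance:
print(physical_diameter(3.33, 1000), physical_diameter(3.23, 1000))  # ~58 x 56 pc
```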
Diagnosis of acute ischemia typically relies on evidence of ischemic lesions on magnetic resonance imaging (MRI), a limited diagnostic resource. We aimed to determine associations of clinical variables and acute infarcts on MRI in patients with suspected low-risk transient ischemic attack (TIA) and minor stroke and to assess their predictive ability.
Methods:
We conducted a post-hoc analysis of the Diagnosis of Uncertain-Origin Benign Transient Neurological Symptoms (DOUBT) study, a prospective, multicenter cohort study investigating the frequency of acute infarcts in patients with low-risk neurological symptoms. The primary outcome was defined as diffusion-weighted imaging (DWI)-positive lesions on MRI. Logistic regression analysis was performed to evaluate associations of clinical characteristics with MRI-DWI-positivity. Model performance was evaluated by Harrell's c-statistic.
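As a rough illustration of this modeling approach (not the DOUBT analysis code; the file and column names are hypothetical), a covariate-based logistic regression with Harrell's c-statistic, which for a binary outcome equals the ROC AUC, might look like:

```python
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

df = pd.read_csv("doubt_cohort.csv")  # hypothetical file and column names

model = smf.logit(
    "dwi_positive ~ age + female + motor_symptoms + speech_symptoms"
    " + no_previous_identical_event + dizziness_gait + normal_exam + resolved_symptoms",
    data=df,
).fit()
print(model.summary())  # exponentiate coefficients for odds ratios

# For a binary outcome, Harrell's c-statistic is the ROC AUC of predicted probabilities
c_stat = roc_auc_score(df["dwi_positive"], model.predict(df))
print(f"c-statistic: {c_stat:.2f}")
```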
Results:
In 1028 patients, age (Odds Ratio (OR) 1.03, 95% Confidence Interval (CI) 1.01–1.05), motor (OR 2.18, 95% CI 1.27–3.65) or speech symptoms (OR 2.53, 95% CI 1.28–4.80), and no previous identical event (OR 1.75, 95% CI 1.07–2.99) were positively associated with MRI-DWI-positivity. Female sex (OR 0.47, 95% CI 0.32–0.68), dizziness and gait instability (OR 0.34, 95% CI 0.14–0.69), normal exam (OR 0.55, 95% CI 0.35–0.85) and resolved symptoms (OR 0.49, 95% CI 0.30–0.78) were negatively associated. Symptom duration and any additional symptoms/symptom combinations were not associated. Predictive ability of the model was moderate (c-statistic 0.72, 95% CI 0.69–0.77).
Conclusion:
Detailed clinical information is helpful in assessing the risk of ischemia in patients with low-risk neurological events, but a predictive model had only moderate discriminative ability. Patients with clinically suspected low-risk TIA or minor stroke require MRI to confirm the diagnosis of cerebral ischemia.
Although behavioral mechanisms in the association among depression, anxiety, and cancer are plausible, few studies have empirically studied mediation by health behaviors. We aimed to examine the mediating role of several health behaviors in the associations among depression, anxiety, and the incidence of various cancer types (overall, breast, prostate, lung, colorectal, smoking-related, and alcohol-related cancers).
Methods
Two-stage individual participant data meta-analyses were performed based on 18 cohorts within the Psychosocial Factors and Cancer Incidence consortium that had a measure of depression or anxiety (N = 319 613, cancer incidence = 25 803). Health behaviors included smoking, physical inactivity, alcohol use, body mass index (BMI), sedentary behavior, and sleep duration and quality. In stage one, path-specific regression estimates were obtained in each cohort. In stage two, cohort-specific estimates were pooled using random-effects multivariate meta-analysis, and natural indirect effects (i.e. mediating effects) were calculated as hazard ratios (HRs).
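For readers unfamiliar with stage two, here is a minimal sketch of random-effects pooling of cohort-specific log-hazard-ratio estimates (DerSimonian-Laird); the input values are hypothetical, and the consortium's multivariate meta-analysis is more general than this univariate toy:

```python
import numpy as np

def dersimonian_laird(log_hr, se):
    """Pool cohort-specific log-HRs under a random-effects model."""
    w = 1.0 / se**2                               # inverse-variance weights
    fixed = np.sum(w * log_hr) / np.sum(w)
    q = np.sum(w * (log_hr - fixed) ** 2)         # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(log_hr) - 1)) / c)  # between-cohort variance
    w_re = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_re * log_hr) / np.sum(w_re)
    return pooled, np.sqrt(1.0 / np.sum(w_re))

# Hypothetical stage-one natural indirect effects from three cohorts:
pooled, pooled_se = dersimonian_laird(np.log(np.array([1.05, 1.08, 1.03])),
                                      np.array([0.02, 0.03, 0.025]))
print(f"pooled indirect-effect HR: {np.exp(pooled):.3f}")
```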
Results
Smoking (HRs range 1.04–1.10) and physical inactivity (HRs range 1.01–1.02) significantly mediated the associations among depression, anxiety, and lung cancer. Smoking was also a mediator for smoking-related cancers (HRs range 1.03–1.06). There was mediation by health behaviors, especially smoking, physical inactivity, alcohol use, and a higher BMI, in the associations among depression, anxiety, and overall cancer or other types of cancer, but effects were small (HRs generally below 1.01).
Conclusions
Smoking constitutes a mediating pathway linking depression and anxiety to lung cancer and smoking-related cancers. Our findings underline the importance of smoking cessation interventions for persons with depression or anxiety.
Blood-based biomarkers represent a scalable and accessible approach for the detection and monitoring of Alzheimer’s disease (AD). Plasma phosphorylated tau (p-tau) and neurofilament light (NfL) are validated biomarkers for the detection of tau and neurodegenerative brain changes in AD, respectively. There is now an emphasis on expanding beyond these markers to detect and provide insight into the pathophysiological processes of AD. To this end, a reactive astrocytic marker, namely plasma glial fibrillary acidic protein (GFAP), has been of interest. Yet, little is known about the relationship between plasma GFAP and AD. Here, we examined the association between plasma GFAP, diagnostic status, and neuropsychological test performance. Diagnostic accuracy of plasma GFAP was compared with plasma measures of p-tau181 and NfL.
Participants and Methods:
This sample included 567 participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC) Longitudinal Clinical Core Registry, including individuals with normal cognition (n=234), mild cognitive impairment (MCI) (n=180), and AD dementia (n=153). The sample included all participants who had a blood draw. Participants completed a comprehensive neuropsychological battery (sample sizes across tests varied due to missingness). Diagnoses were adjudicated during multidisciplinary diagnostic consensus conferences. Plasma samples were analyzed using the Simoa platform. Binary logistic regression analyses tested the association between GFAP levels and diagnostic status (i.e., cognitively impaired due to AD versus unimpaired), controlling for age, sex, race, education, and APOE e4 status. Area under the curve (AUC) statistics from receiver operating characteristic (ROC) analyses, using predicted probabilities from binary logistic regression, examined the ability of plasma GFAP to discriminate diagnostic groups compared with plasma p-tau181 and NfL. Linear regression models tested the association between plasma GFAP and neuropsychological test performance, accounting for the above covariates.
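A schematic version of the discrimination comparison (hypothetical file and column names, with categorical covariates assumed numerically encoded; in-sample AUCs from predicted probabilities, as described above):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("adrc_registry.csv")  # hypothetical file and column names
covars = ["age", "sex", "race", "education", "apoe_e4"]
y = df["impaired"]  # cognitively impaired due to AD vs. unimpaired

def panel_auc(markers):
    """AUC of a covariate-adjusted logistic model containing the given markers."""
    X = df[covars + markers]
    probs = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    return roc_auc_score(y, probs)

for panel in (["gfap"], ["ptau181"], ["nfl"], ["gfap", "ptau181", "nfl"]):
    print(panel, f"AUC = {panel_auc(panel):.2f}")
```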
Results:
The mean (SD) age of the sample was 74.34 (7.54) years, 319 (56.3%) were female, 75 (13.2%) were Black, and 223 (39.3%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds of cognitive impairment (GFAP z-score transformed: OR=2.233, 95% CI [1.609, 3.099], p<0.001; non-z-transformed: OR=1.004, 95% CI [1.002, 1.006], p<0.001). ROC analyses including GFAP and the above covariates showed that plasma GFAP discriminated the cognitively impaired from the unimpaired (AUC=0.75) and was similar to, though slightly better than, plasma p-tau181 (AUC=0.74) and plasma NfL (AUC=0.74). A joint panel of the plasma markers had the greatest discrimination accuracy (AUC=0.76). Linear regression analyses showed that higher GFAP levels were associated with worse performance on neuropsychological tests assessing global cognition, attention, executive functioning, episodic memory, and language abilities (ps<0.001), as well as higher CDR Sum of Boxes (p<0.001).
Conclusions:
Higher plasma GFAP levels differentiated participants with cognitive impairment from those with normal cognition and were associated with worse performance on all neuropsychological tests assessed. GFAP had similar accuracy in detecting those with cognitive impairment compared with p-tau181 and NfL; however, a panel of all three biomarkers was optimal. These results support the utility of plasma GFAP in AD detection and suggest the pathological processes it represents might play an integral role in the pathogenesis of AD.
Blood-based biomarkers offer a more feasible approach than current in vivo measures to Alzheimer’s disease (AD) detection, management, and the study of disease mechanisms. Given their novelty, these plasma biomarkers must be assessed against postmortem neuropathological outcomes for validation. Research has shown the utility of plasma markers of the proposed AT(N) framework; however, recent studies have stressed the importance of expanding this framework to include other pathways. There are promising data supporting the usefulness of plasma glial fibrillary acidic protein (GFAP) in AD, but GFAP-to-autopsy studies are limited. Here, we tested the association between plasma GFAP and AD-related neuropathological outcomes in participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC).
Participants and Methods:
This sample included 45 participants from the BU ADRC who had a plasma sample within 5 years of death and donated their brain for neuropathological examination. The most recent plasma samples were analyzed using the Simoa platform. Neuropathological examinations followed the National Alzheimer’s Coordinating Center procedures and diagnostic criteria. The NIA-Reagan Institute criteria were used for the neuropathological diagnosis of AD. Measures of GFAP were log-transformed. Binary logistic regression analyses tested the association between GFAP and autopsy-confirmed AD status, as well as with semi-quantitative ratings of regional atrophy (none/mild versus moderate/severe). Ordinal logistic regression analyses tested the association between plasma GFAP and Braak stage and CERAD neuritic plaque score. Area under the curve (AUC) statistics from receiver operating characteristic (ROC) analyses, using predicted probabilities from binary logistic regression, examined the ability of plasma GFAP to discriminate autopsy-confirmed AD status. All analyses controlled for sex, age at death, years between last blood draw and death, and APOE e4 status.
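The ordinal models for Braak stage and CERAD score could be sketched as follows (hypothetical file and column names; statsmodels' OrderedModel is one common proportional-odds implementation, not necessarily the software used here):

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("gfap_autopsy.csv")  # hypothetical file and column names
exog = df[["log_gfap", "sex", "age_at_death", "years_blood_to_death", "apoe_e4"]]

# Ordinal logistic model of Braak stage on log GFAP plus covariates
model = OrderedModel(df["braak_stage"], exog, distr="logit").fit(method="bfgs")
print(model.summary())  # exponentiate the log_gfap coefficient for the ordinal OR
```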
Results:
Of the 45 brain donors, 29 (64.4%) had autopsy-confirmed AD. The mean (SD) age of the sample at the time of blood draw was 80.76 (8.58) years, and there were 2.80 (1.16) years between the last blood draw and death. The sample included 20 (44.4%) females; 41 (91.1%) were White, and 20 (44.4%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds of autopsy-confirmed AD (OR=14.12, 95% CI [2.00, 99.88], p=0.008). ROC analysis showed plasma GFAP accurately discriminated those with and without autopsy-confirmed AD on its own (AUC=0.75), and discrimination strengthened as the above covariates were added to the model (AUC=0.81). Increases in GFAP levels corresponded to increases in Braak stage (OR=2.39, 95% CI [0.71, 4.07], p=0.005), but not CERAD ratings (OR=1.24, 95% CI [0.004, 2.49], p=0.051). Higher GFAP levels were associated with greater temporal lobe atrophy (OR=10.27, 95% CI [1.53, 69.15], p=0.017), but this was not observed for any other region.
Conclusions:
The current results show that antemortem plasma GFAP is associated with non-specific AD neuropathological changes at autopsy. Plasma GFAP could be a useful and practical biomarker for assisting in the detection of AD-related changes, as well as for study of disease mechanisms.
Medical-legal partnerships connect legal advocates to healthcare providers and settings. Maintaining effectiveness of medical-legal partnerships and consistently identifying opportunities for innovation and adaptation takes intentionality and effort. In this paper, we discuss ways in which our use of data and quality improvement methods have facilitated advocacy at both patient (client) and population levels as we collectively pursue better, more equitable outcomes.
Disaster Medicine (DM) is the clinical specialty whose expertise includes the care and management of patients and populations outside conventional care protocols. While traditional standards of care assume the availability of adequate resources, DM practitioners operate in situations where resources are not adequate, necessitating a modification in practice. While prior academic efforts have succeeded in developing a list of core disaster competencies for emergency medicine residency programs, international fellowships, and affiliated health care providers, no official standardized curriculum or consensus has yet been published for DM fellowship programs based in the United States.
Study Objective:
The objective of this work is to define the core curriculum for DM physician fellowships in the United States, drawing consensus among existing DM fellowship directors.
Methods:
A panel of DM experts was created from the members of the Council of Disaster Medicine Fellowship Directors. This council is an independent group of DM fellowship directors in the United States that has met annually at the American College of Emergency Physicians (ACEP)’s Scientific Assembly for the last eight years, with meeting support from the Disaster Preparedness and Response Committee. Using a modified Delphi technique, the panel members revised and expanded on the existing Society of Academic Emergency Medicine (SAEM) DM fellowship curriculum, with the final draft being ratified by an anonymous vote. Multiple publications were reviewed during the process to ensure all potential topics were identified.
Results:
The results of this effort produced the foundational curriculum, the 2023 Model Core Content of Disaster Medicine.
Conclusion:
Members from the Council of Disaster Medicine Fellowship Directors have developed the 2023 Model Core Content for Disaster Medicine in the United States. This living document defines the foundational curriculum for DM fellowships, providing the basis of a standardized experience, contributing to the development of a board-certified subspecialty, and informing fellowship directors and DM practitioners of content and topics that may appear on future certification examinations.
Emergency departments are high-risk settings for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) surface contamination. Environmental surface samples were obtained in rooms with patients suspected of having COVID-19 who did or did not undergo aerosol-generating procedures (AGPs). SARS-CoV-2 RNA surface contamination was most frequent in rooms occupied by coronavirus disease 2019 (COVID-19) patients who received no AGPs.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Over the past 2 decades, several categorizations have been proposed for the abnormalities of the aortic root. These schemes have mostly been devoid of input from specialists of congenital cardiac disease. The aim of this review is to provide a classification, from the perspective of these specialists, based on an understanding of normal and abnormal morphogenesis and anatomy, with emphasis placed on the features of clinical and surgical relevance. We contend that the description of the congenitally malformed aortic root is simplified when approached in a fashion that recognizes the normal root to be made up of 3 leaflets, supported by their own sinuses, with the sinuses themselves separated by the interleaflet triangles. The malformed root, usually found in the setting of 3 sinuses, can also be found with 2 sinuses, and very rarely with 4 sinuses. This permits description of trisinuate, bisinuate, and quadrisinuate variants, respectively. This feature then provides the basis for classification of the anatomical and functional number of leaflets present. By offering standardized terms and definitions, we submit that our classification will be suitable for those working in all cardiac specialties, whether pediatric or adult. It is of equal value in the settings of acquired or congenital cardiac disease. Our recommendations will serve to amend and/or add to the existing International Paediatric and Congenital Cardiac Code, along with the Eleventh iteration of the International Classification of Diseases provided by the World Health Organization.
Invasive emergent and floating macrophytes can have detrimental impacts on aquatic ecosystems. Management of these aquatic weeds frequently relies upon foliar application of aquatic herbicides. However, there is inherent variability of overspray (herbicide loss) for foliar applications into waters within and adjacent to the targeted treatment area. The spray retention (tracer dye captured) of four invasive broadleaf emergent species (water hyacinth, alligatorweed, creeping water primrose, and parrotfeather) and two emergent grass-like weeds (cattail and torpedograss) was evaluated. For all species, spray retention was simulated using foliar applications of rhodamine WT (RWT) dye as a herbicide surrogate under controlled mesocosm conditions. Spray retention of the broadleaf species was first evaluated at a greenhouse (GH) scale using a CO2-pressurized spray chamber over dense vegetation growth or no plants (positive control). Broadleaf species and grass-like species were then evaluated in larger outdoor mesocosms (OM). These applications were made using a CO2-pressurized backpack sprayer. Evaluation metrics included species-wise canopy cover and height influence on in-water RWT concentration using image analysis and modeling techniques. Results indicated spray retention was greatest for water hyacinth (GH, 64.7 ± 7.4; OM, 76.1 ± 3.8). Spray retention values were similar among the three sprawling marginal species: alligatorweed (GH, 37.5 ± 4.5; OM, 42 ± 5.7), creeping water primrose (GH, 54.9 ± 7.2; OM, 52.7 ± 5.7), and parrotfeather (GH, 48.2 ± 2.3; OM, 47.2 ± 3.5). Canopy cover and height were strongly correlated with spray retention for broadleaf species and less strongly correlated for grass-like species. Although torpedograss and cattail were similar in percent foliar coverage, they differed in percent spray retention (OM, 8.5 ± 2.3 and 28.9 ± 4.1, respectively). The upright leaf architecture of the grass-like species likely contributed to their lower spray retention values in comparison to the broadleaf species.
HIV and severe wasting are associated with post-discharge mortality and hospital readmission among children with complicated severe acute malnutrition (SAM); however, the reasons remain unclear. We assessed body composition at hospital discharge, stratified by HIV and oedema status, in a cohort of children with complicated SAM in three hospitals in Zambia and Zimbabwe. We measured skinfold thicknesses and bioelectrical impedance analysis (BIA) to investigate whether fat and lean mass were independent predictors of time to death or readmission. Cox proportional hazards models were used to estimate the association between death/readmission and discharge body composition. Mixed effects models were fitted to compare longitudinal changes in body composition over 1 year. At discharge, 284 and 546 children had complete BIA and skinfold measurements, respectively. Low discharge lean and peripheral fat mass were independently associated with death/hospital readmission. Each unit Z-score increase in impedance index and triceps skinfolds was associated with 48 % (adjusted hazard ratio 0·52, 95 % CI (0·30, 0·90)) and 17 % (adjusted hazard ratio 0·83, 95 % CI (0·71, 0·96)) lower hazard of death/readmission, respectively. HIV-positive v. HIV-negative children had lower gains in sum of skinfolds (mean difference −1·49, 95 % CI (−2·01, −0·97)) and impedance index Z-scores (–0·13, 95 % CI (−0·24, −0·01)) over 52 weeks. Children with non-oedematous v. oedematous SAM had lower mean changes in the sum of skinfolds (–1·47, 95 % CI (−1·97, −0·97)) and impedance index Z-scores (–0·23, 95 % CI (−0·36, −0·09)). Risk stratification to identify children at risk for mortality or readmission, and interventions to increase lean and peripheral fat mass, should be considered in the post-discharge care of these children.
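A skeleton of the survival model described above (hypothetical file and column names; lifelines is one common implementation of Cox proportional hazards, not necessarily the software used in the study):

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("sam_discharge.csv")  # hypothetical file and column names

cph = CoxPHFitter()
cph.fit(
    df[["weeks_to_event", "event", "impedance_index_z", "triceps_skinfold_z",
        "hiv_positive", "oedema"]],
    duration_col="weeks_to_event",
    event_col="event",  # death or hospital readmission
)
cph.print_summary()  # hazard ratios per unit Z-score increase
```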
We aimed to understand which non-household activities increased infection odds and contributed most to SARS-CoV-2 infections following the lifting of public health restrictions in England and Wales.
Procedures
We undertook multivariable logistic regressions assessing the contribution to infections of activities reported by adult Virus Watch Community Cohort Study participants. We calculated adjusted weighted population attributable fractions (aPAF) estimating which activity contributed most to infections.
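As a reminder of how an attributable fraction relates an odds ratio to exposure prevalence, here is a minimal sketch using Miettinen's case-based formula, with the adjusted OR standing in for the relative risk; the numbers are hypothetical, and the study's weighted adjustment is more involved:

```python
def attributable_fraction(p_exposed_cases: float, a_or: float) -> float:
    """Miettinen's case-based PAF, treating the adjusted OR as an approximate RR."""
    return p_exposed_cases * (a_or - 1.0) / a_or

# Hypothetical: if 60% of infected participants left home for work and the
# adjusted OR is 1.35, roughly 16% of infections are attributable to that activity.
print(f"{attributable_fraction(0.60, 1.35):.0%}")
```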
Findings
Among 11 413 participants (493 infections), infection was associated with: leaving home for work (aOR 1.35 (1.11–1.64), aPAF 17%), public transport (aOR 1.27 (1.04–1.57), aPAF 12%), shopping once a week (aOR 1.83 (1.36–2.45)) vs. more than three times a week, indoor leisure (aOR 1.24 (1.02–1.51), aPAF 10%) and indoor hospitality (aOR 1.21 (0.98–1.48), aPAF 7%). We found no association for outdoor hospitality (1.14 (0.94–1.39), aPAF 5%) or outdoor leisure (1.14 (0.82–1.59), aPAF 1%).
Conclusion
Essential activities (work and public transport) carried the greatest risk and were the dominant contributors to infections. Non-essential indoor activities (hospitality and leisure) increased risk but contributed less. Outdoor activities carried no statistical risk and contributed to fewer infections. As countries aim to ‘live with COVID’, mitigating transmission in essential and indoor venues becomes increasingly relevant.
Cognitive impairments are well-established features of psychotic disorders and are present when individuals are at ultra-high risk for psychosis. However, few interventions target cognitive functioning in this population.
Aims
To investigate whether omega-3 polyunsaturated fatty acid (n−3 PUFA) supplementation improves cognitive functioning among individuals at ultra-high risk for psychosis.
Method
Data (N = 225) from an international, multi-site, randomised controlled trial (NEURAPRO) were analysed. Participants were given omega-3 supplementation (eicosapentaenoic acid and docosahexaenoic acid) or placebo over 6 months. Cognitive functioning was assessed with the Brief Assessment of Cognition in Schizophrenia (BACS). Mixed two-way analyses of variance were computed to compare the change in cognitive performance between omega-3 supplementation and placebo over 6 months. An additional biomarker analysis explored whether change in erythrocyte n−3 PUFA levels predicted change in cognitive performance.
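A sketch of the mixed two-way ANOVA on long-format data (hypothetical file and column names; pingouin's mixed_anova is one convenient implementation, reporting partial eta squared as np2):

```python
import pandas as pd
import pingouin as pg

df = pd.read_csv("neurapro_bacs.csv")  # hypothetical file: one row per participant per timepoint

aov = pg.mixed_anova(
    data=df,
    dv="bacs_composite",   # cognitive outcome
    within="timepoint",    # baseline vs. 6 months
    between="group",       # omega-3 vs. placebo
    subject="participant_id",
)
print(aov[["Source", "F", "p-unc", "np2"]])  # np2 = partial eta squared
```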
Results
The placebo group showed a modestly greater improvement over time than the omega-3 supplementation group for motor speed ($\eta_p^2 = 0.09$) and BACS composite score ($\eta_p^2 = 0.21$). After repeating the analyses without individuals who transitioned to psychosis, motor speed was no longer significant ($\eta_p^2 = 0.02$), but the composite score remained significant ($\eta_p^2 = 0.02$). Change in erythrocyte n−3 PUFA levels did not predict change in cognitive performance over 6 months.
Conclusions
We found no evidence to support the use of omega-3 supplementation to improve cognitive functioning in ultra-high risk individuals. The biomarker analysis suggests that this finding is unlikely to be attributed to poor adherence or consumption of non-trial n−3 PUFAs.
Posttraumatic stress symptoms (PTSS) are common following traumatic stress exposure (TSE). Identification of individuals with PTSS risk in the early aftermath of TSE is important to enable targeted administration of preventive interventions. In this study, we used baseline survey data from two prospective cohort studies to identify the most influential predictors of substantial PTSS.
Methods
Self-identifying black and white American women and men (n = 1546) presenting to one of 16 emergency departments (EDs) within 24 h of motor vehicle collision (MVC) TSE were enrolled. Individuals with substantial PTSS (⩾33, Impact of Events Scale – Revised) 6 months after MVC were identified via follow-up questionnaire. Sociodemographic, pain, general health, event, and psychological/cognitive characteristics were collected in the ED and used in prediction modeling. Ensemble learning methods and Monte Carlo cross-validation were used for feature selection and to determine prediction accuracy. External validation was performed on a hold-out sample (30% of total sample).
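A schematic of the validation design (hypothetical file and column names; regularized logistic regression stands in here for the paper's regularized linear model, since the outcome is binary):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import ShuffleSplit, train_test_split

df = pd.read_csv("mvc_ed_cohort.csv")  # hypothetical file and column names
X, y = df.drop(columns="ptss_6mo"), df["ptss_6mo"]

# 30% hold-out sample reserved for external validation, as in the abstract
X_dev, X_hold, y_dev, y_hold = train_test_split(X, y, test_size=0.30, random_state=0)

# Monte Carlo cross-validation: repeated random train/test splits of the development set
aucs = []
for train, test in ShuffleSplit(n_splits=50, test_size=0.2, random_state=0).split(X_dev):
    fit = LogisticRegressionCV(max_iter=5000).fit(X_dev.iloc[train], y_dev.iloc[train])
    aucs.append(roc_auc_score(y_dev.iloc[test], fit.predict_proba(X_dev.iloc[test])[:, 1]))

# External check on the hold-out sample
final = LogisticRegressionCV(max_iter=5000).fit(X_dev, y_dev)
print(roc_auc_score(y_hold, final.predict_proba(X_hold)[:, 1]))
```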
Results
Twenty-five percent (n = 394) of individuals reported PTSS 6 months following MVC. Regularized linear regression was the top performing learning method. The top 30 factors together showed good reliability in predicting PTSS in the external sample (Area under the curve = 0.79 ± 0.002). Top predictors included acute pain severity, recovery expectations, socioeconomic status, self-reported race, and psychological symptoms.
Conclusions
These analyses add to a growing literature indicating that influential predictors of PTSS can be identified and risk for future PTSS estimated from characteristics easily available/assessable at the time of ED presentation following TSE.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Relapse and recurrence of depression are common, contributing to the overall burden of depression globally. Accurate prediction of relapse or recurrence while patients are well would allow the identification of high-risk individuals and may effectively guide the allocation of interventions to prevent relapse and recurrence.
Aims
To review prognostic models developed to predict the risk of relapse, recurrence, sustained remission, or recovery in adults with remitted major depressive disorder.
Method
We searched the Cochrane Library (current issue); Ovid MEDLINE (1946 onwards); Ovid Embase (1980 onwards); Ovid PsycINFO (1806 onwards); and Web of Science (1900 onwards) up to May 2021. We included development and external validation studies of multivariable prognostic models. We assessed risk of bias of included studies using the Prediction model Risk Of Bias ASsessment Tool (PROBAST).
Results
We identified 12 eligible prognostic model studies (11 unique prognostic models): 8 model development-only studies, 3 model development and external validation studies, and 1 external validation-only study. Multiple estimates of performance measures were not available, so meta-analysis was not possible. Eleven of the 12 included studies were assessed as being at high overall risk of bias, and none examined clinical utility.
Conclusions
Due to the high risk of bias of the included studies, poor predictive performance, and limited external validation of the models identified, presently available clinical prediction models for relapse and recurrence of depression are not yet sufficiently developed for deployment in clinical settings. There is a need for improved prognosis research in this clinical area, and future studies should conform to best practice methodological and reporting guidelines.
The inaugural data from the first systematic program of sea-ice observations in Kotzebue Sound, Alaska, in 2018 coincided with the first winter in living memory when the Sound was not choked with ice. The following winter of 2018–19 was even warmer and characterized by even less ice. Here we discuss the mass balance of landfast ice near Kotzebue (Qikiqtaġruk) during these two anomalously warm winters. We use in situ observations and a 1-D thermodynamic model to address three research questions developed in partnership with an Indigenous Advisory Council. In doing so, we improve our understanding of connections between landfast ice mass balance, marine mammals and subsistence hunting. Specifically, we show: (i) ice growth stopped unusually early due to a strong vertical ocean heat flux, which also likely contributed to an early start to bearded seal hunting; (ii) unusually thin ice contributed to widespread surface flooding, and the associated snow ice formation partly offset the reduced ice growth, but the flooding likely had a negative impact on ringed seal habitat; (iii) sea ice near Kotzebue during the winters of 2017–18 and 2018–19 was likely the thinnest since at least 1945, driven by a combination of warm air temperatures and a persistent ocean heat flux.
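To make the growth-shutdown mechanism in (i) concrete, here is a minimal quasi-steady (Stefan-type) 1-D growth sketch with an ocean heat flux term; all parameter values are illustrative assumptions, not the Kotzebue observations or the paper's model:

```python
RHO_I, L_F, K_I = 917.0, 334e3, 2.2  # ice density (kg/m^3), latent heat (J/kg), conductivity (W/m/K)

def grow(h0, t_surface, f_ocean, days, dt=3600.0):
    """Integrate dh/dt = [k_i*(T_f - T_s)/h - F_ocean] / (rho_i * L_f)."""
    h, t_f = h0, -1.8  # freezing point of seawater (deg C)
    for _ in range(int(days * 86400 / dt)):
        conductive = K_I * (t_f - t_surface) / h  # upward conduction through the ice (W/m^2)
        h += dt * (conductive - f_ocean) / (RHO_I * L_F)
    return h

# A strong vertical ocean heat flux halts growth at a thinner equilibrium thickness,
# where conduction balances the ocean heat supply: h_eq = k_i*(T_f - T_s)/F_ocean.
print(grow(0.2, -20.0, 5.0, 120))   # weak ocean heat flux: ice keeps thickening
print(grow(0.2, -20.0, 40.0, 120))  # strong ocean heat flux: growth stops near ~1 m
```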
Retrospective self-report is typically used for diagnosing previous pediatric traumatic brain injury (TBI). A new semi-structured interview instrument (New Mexico Assessment of Pediatric TBI; NewMAP TBI) was used to investigate test–retest reliability of TBI characteristics, both for the TBI that qualified for study inclusion and for lifetime history of TBI.
Method:
One hundred and eighty-four patients with mTBI (aged 8–18), 156 matched healthy controls (HC), and their parents completed the NewMAP TBI within 11 days (subacute; SA) and 4 months (early chronic; EC) of injury, with a subset returning at 1 year (late chronic; LC).
Results:
The test–retest reliability of common TBI characteristics [loss of consciousness (LOC), post-traumatic amnesia (PTA), retrograde amnesia, confusion/disorientation] and post-concussion symptoms (PCS) were examined across study visits. Aside from PTA, binary reporting (present/absent) for all TBI characteristics exhibited acceptable (≥0.60) test–retest reliability for both Qualifying and Remote TBIs across all three visits. In contrast, reliability for continuous data (exact duration) was generally unacceptable, with LOC and PCS meeting acceptable criteria at only half of the assessments. Transforming continuous self-report ratings into discrete categories based on injury severity resulted in acceptable reliability. Reliability was not strongly affected by the parent completing the NewMAP TBI.
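For binary present/absent reports, Cohen's kappa is one common test–retest statistic to compare against the ≥0.60 acceptability threshold used above; a toy sketch with made-up ratings:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical LOC (present=1/absent=0) reports from the same children at the
# subacute (SA) and early chronic (EC) visits
loc_sa = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
loc_ec = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(f"kappa = {cohen_kappa_score(loc_sa, loc_ec):.2f}")  # 0.80 here, above the 0.60 cut-off
```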
Conclusions:
Categorical reporting of TBI characteristics in children and adolescents can aid clinicians in retrospectively obtaining reliable estimates of TBI severity up to a year post-injury. However, test–retest reliability is strongly impacted by the initial data distribution, selected statistical methods, and potentially by patient difficulty in distinguishing among conceptually similar medical concepts (i.e., PTA vs. confusion).
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.