The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify key risk factors and examine their effects.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (Patient Health Questionnaire 9-item scale (PHQ-9): −0.43 points) and anxiety (Generalised Anxiety Disorder 7-item scale (GAD-7): −0.33 points) and small increases in PTSD (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Herbicide-resistant (HR) crops are widely grown throughout the United States and Canada. These crop-trait technologies can enhance weed management and therefore can be an important component of integrated weed management (IWM) programs. Concomitantly, evolution of HR weed populations has become ubiquitous in agricultural areas where HR crops are grown. Nevertheless, crop cultivars with new or combined (stacked) HR traits continue to be developed and commercialized. This review, based on a symposium held at the Western Society of Weed Science annual meeting in 2021, examines the impact of HR crops on HR weed management in the U.S. Great Plains, U.S. Pacific Northwest, and the Canadian Prairies over the past 25 yr and their past and future contributions to IWM. We also provide an industry perspective on the future of HR crop development and the role of HR crops in resistance management. Expanded options for HR traits in both major and minor crops are expected. With proper stewardship, HR crops can reduce herbicide-use intensity and help reduce selection pressure on weed populations. However, their proper deployment in cropping systems must be carefully planned by considering a diverse crop rotation sequence with multiple HR and non-HR crops and maximizing crop competition to effectively manage HR weed populations. Based on past experiences in the cultivation of HR crops and associated herbicide use in the western United States and Canada, HR crops have been important determinants of both the selection and management of HR weeds.
To investigate the effect of cariprazine on cognitive symptom change across bipolar I disorder and schizophrenia.
Methods
Post hoc analyses of 3- to 8-week pivotal studies in bipolar I depression and mania were conducted; one schizophrenia trial including the Cognitive Drug Research System attention battery was also analyzed. Outcomes of interest were the Montgomery-Åsberg Depression Rating Scale (MADRS), the Functioning Assessment Short Test (FAST), and the Positive and Negative Syndrome Scale (PANSS). Least squares mean differences (LSMDs) in change from baseline to end of study were reported in the overall intent-to-treat population and in patient subsets with specified levels of baseline cognitive symptoms or performance.
Results
In patients with bipolar depression and at least mild cognitive symptoms, LSMDs were statistically significant for cariprazine vs placebo on MADRS item 6 (3 studies; 1.5 mg/d=−0.5 [P<.001]; 3 mg/d=−0.2 [P<.05]) and on the FAST Cognitive subscale (1 study; 1.5 mg/d=−1.4; P=.0039). In patients with bipolar mania and at least mild cognitive symptoms, the LSMD in PANSS Cognitive subscale score was statistically significant for cariprazine vs placebo (3 studies; −2.1; P=.001). In patients with schizophrenia and high cognitive impairment, improvement in power of attention was observed for cariprazine 3 mg/d vs placebo (P=.0080), but not for cariprazine 6 mg/d; improvement in continuity of attention was observed for cariprazine 3 mg/d (P=.0012) and 6 mg/d (P=.0073).
Conclusion
These post hoc analyses provide preliminary evidence of greater improvements for cariprazine vs placebo across cognitive measures in patients with bipolar I depression and mania, and schizophrenia, suggesting potential benefits for cariprazine in treating cognitive symptoms.
This study aimed to assess the current literature on the safety and impact of in-office biopsy on cancer waiting times as well as review evidence regarding cost-efficacy and patient satisfaction.
Method
A search of Cinahl, Cochrane Library, Embase, Medline, Prospero, PubMed and Web of Science was conducted for papers relevant to this study. Included articles were quality assessed and critically appraised.
Results
Of 19 741 identified studies, 22 articles were included. Lower costs were consistently reported for in-office biopsy compared with operating room biopsy. Four complications requiring intervention were documented. In-office biopsy is well tolerated, with a procedure abandonment rate of less than 1 per cent. When compared with operating room biopsy, it is associated with significantly reduced time-to-diagnosis and time-to-treatment initiation. It is also linked to improved overall three-year survival.
Conclusion
In-office biopsy is a safe procedure that may help certain patients avoid a general anaesthetic. It was shown to significantly reduce time-to-diagnosis and time-to-treatment initiation when compared with operating room biopsy, which may have important implications for oncological outcomes. In-office biopsy requires fewer resources and is likely to be cost saving five years after its introduction. With high rates of sensitivity and specificity, in-office biopsy should be considered the first-line procedure for achieving tissue diagnosis.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Suicide accounts for 2.2% of all years of life lost worldwide. We aimed to establish whether infectious epidemics are associated with any changes in the incidence of suicide or the period prevalence of self-harm, or thoughts of suicide or self-harm, with a secondary objective of establishing the frequency of these outcomes.
Methods
In this systematic review and meta-analysis, MEDLINE, Embase, PsycINFO and AMED were searched from inception to 9 September 2020. Studies of infectious epidemics reporting outcomes of (a) death by suicide, (b) self-harm or (c) thoughts of suicide or self-harm were identified. A random-effects model meta-analysis for the period prevalence of thoughts of suicide or self-harm was conducted.
Results
In total, 1354 studies were screened, of which 57 met eligibility criteria: 7 described death by suicide, 9 described self-harm, and 45 described thoughts of suicide or self-harm. The observation period ranged from 1910 to 2020 and included epidemics of Spanish influenza, severe acute respiratory syndrome (SARS), human monkeypox, Ebola virus disease and coronavirus disease 2019 (COVID-19). Regarding death by suicide, data with a clear longitudinal comparison group were available for only two epidemics: SARS in Hong Kong, where an increase in suicides among the elderly was found, and COVID-19 in Japan, where no change in suicides among children and adolescents was found. In terms of self-harm, five studies examined emergency department attendances in epidemic and non-epidemic periods: four found no difference and one showed a reduction during the epidemic. In studies of thoughts of suicide or self-harm, one large survey showed a substantial increase in period prevalence compared with non-epidemic periods, but smaller studies showed no difference. As a secondary objective, a meta-analysis of thoughts of suicide and self-harm found a pooled prevalence of 8.0% overall (95% confidence interval (CI) 5.2–12.0%; 14 820 of 99 238 cases in 24 studies) over time periods of between seven days and six months. The quality assessment found 42 studies to be of low quality, nine of moderate quality and six of high quality.
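The pooled period prevalence above comes from a random-effects meta-analysis. As a hedged illustration (not the review's code, and with invented study counts), a minimal DerSimonian–Laird pooling of logit-transformed prevalences can be sketched as:

```python
import math


def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of logit prevalences.

    events/totals: per-study case counts and sample sizes (illustrative
    values only; the review's study-level data are not reproduced here).
    Returns (pooled proportion, 95% CI lower bound, 95% CI upper bound).
    """
    # Logit-transform each study prevalence; variance is 1/e + 1/(n - e).
    ys = [math.log(e / (n - e)) for e, n in zip(events, totals)]
    vs = [1.0 / e + 1.0 / (n - e) for e, n in zip(events, totals)]

    # Fixed-effect weights give Cochran's Q, then the DL tau^2 estimate.
    w = [1.0 / v for v in vs]
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)

    # Random-effects weights fold in the between-study variance tau^2.
    wr = [1.0 / (v + tau2) for v in vs]
    mu = sum(wi * yi for wi, yi in zip(wr, ys)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))

    inv_logit = lambda x: 1.0 / (1.0 + math.exp(-x))
    return inv_logit(mu), inv_logit(mu - 1.96 * se), inv_logit(mu + 1.96 * se)
```

Because the pooled logit is a weighted average of the study logits, the back-transformed estimate always lies within the range of the individual study prevalences, and the CI widens as tau² grows.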
Conclusions
There is little robust evidence on the association of infectious epidemics with suicide, self-harm and thoughts of suicide or self-harm. There was an increase in suicides among the elderly in Hong Kong during SARS and no change in suicides among young people in Japan during COVID-19, but it is unclear how far these findings may be generalised. The development of up-to-date self-harm and suicide statistics to monitor the effect of the current pandemic is an urgent priority.
We present a detailed analysis of the radio galaxy PKS $2250{-}351$, a giant of 1.2 Mpc projected size, its host galaxy, and its environment. We use radio data from the Murchison Widefield Array, the upgraded Giant Metre-wavelength Radio Telescope, the Australian Square Kilometre Array Pathfinder, and the Australia Telescope Compact Array to model the jet power and age. Optical and IR data come from the Galaxy And Mass Assembly (GAMA) survey and provide information on the host galaxy and environment. GAMA spectroscopy confirms that PKS $2250{-}351$ lies at $z=0.2115$ in the irregular, and likely unrelaxed, cluster Abell 3936. We find its host is a massive, ‘red and dead’ elliptical galaxy with negligible star formation but with a highly obscured active galactic nucleus dominating the mid-IR emission. Assuming it lies on the local $M$–$\sigma$ relation, it has an Eddington accretion rate of $\lambda_{\rm EDD}\sim 0.014$. We find that the lobe-derived jet power (a time-averaged measure) is an order of magnitude greater than the hotspot-derived jet power (an instantaneous measure). We propose that over the lifetime of the observed radio emission (${\sim} 300\,$Myr), the accretion has switched from an inefficient advection-dominated mode to a thin-disc efficient mode, consistent with the decrease in jet power. We also suggest that the asymmetric radio morphology is due to its environment, with the host of PKS $2250{-}351$ lying to the west of the densest concentration of galaxies in Abell 3936.
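For context, the quoted Eddington ratio follows from the standard definitions below (a sketch of the usual relations, not the paper's specific derivation; the black-hole mass is assumed to come from the $M$–$\sigma$ relation as stated):

```latex
% Eddington luminosity for a black hole of mass M_BH, and the Eddington ratio:
L_{\rm Edd} \simeq 1.26 \times 10^{38}
  \left(\frac{M_{\rm BH}}{M_{\odot}}\right)\ \mathrm{erg\,s^{-1}},
\qquad
\lambda_{\rm EDD} \equiv \frac{L_{\rm bol}}{L_{\rm Edd}}.
```

A value of $\lambda_{\rm EDD}\sim 0.014$ therefore places the source well below the Eddington limit, in the regime where a transition between accretion modes is plausible.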
CVD and associated metabolic diseases are linked to chronic inflammation, which can be modified by diet. The objective of the present study was to determine whether there is a difference in inflammatory markers, blood metabolic and lipid panels and lymphocyte gene expression in response to a high-fat dairy food challenge with or without milk fat globule membrane (MFGM). Participants consumed a dairy product-based meal containing whipping cream (WC) high in saturated fat with or without the addition of MFGM, following a 12 h fasting blood draw. Inflammatory markers including IL-6 and C-reactive protein, lipid and metabolic panels and lymphocyte gene expression fold changes were measured using multiplex assays, clinical laboratory services and TaqMan real-time RT-PCR, respectively. Fold changes in gene expression were determined using the Pfaffl method. Response variables were converted into incremental AUC, tested for differences, and corrected for multiple comparisons. The postprandial insulin response was significantly lower following the meal containing MFGM (P < 0·01). The gene encoding soluble epoxide hydrolase (EPHX2) was shown to be more up-regulated in the absence of MFGM (P = 0·009). Secondary analyses showed that participants with higher baseline cholesterol:HDL-cholesterol ratio (Chol:HDL) had a greater reduction in gene expression of cluster of differentiation 14 (CD14) and lymphotoxin β receptor (LTBR) with the WC+MFGM meal. The protein and lipid composition of MFGM is thought to be anti-inflammatory. These exploratory analyses suggest that addition of MFGM to a high-saturated fat meal modifies postprandial insulin response and offers a protective role for those individuals with higher baseline Chol:HDL.
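Gene-expression fold changes in the study were computed with the Pfaffl method, which corrects the relative expression ratio for the amplification efficiencies of the target and reference genes. A minimal sketch (with invented Ct values and efficiencies, not the study's data):

```python
def pfaffl_ratio(e_target, e_ref, dct_target, dct_ref):
    """Pfaffl relative expression ratio for real-time RT-PCR.

    e_target, e_ref : amplification efficiencies (2.0 = perfect doubling)
    dct_target/ref  : delta-Ct = Ct(control) - Ct(treated) for each gene
    ratio = E_target**dCt_target / E_ref**dCt_ref
    """
    return (e_target ** dct_target) / (e_ref ** dct_ref)


# Hypothetical example: the target gene amplifies 3 cycles earlier after
# treatment, the reference gene 1 cycle earlier, both with E = 2.
fold_change = pfaffl_ratio(2.0, 2.0, 3.0, 1.0)  # 2**3 / 2**1 = 4.0
```

Normalising to a reference gene in this way cancels differences in input cDNA between samples, which is why a stable reference gene matters more than absolute Ct values.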
Objectives: This study examined the effects of anodal transcranial direct current stimulation (a-tDCS) on sentence and word comprehension in healthy adults. Methods: Healthy adult participants, aged between 19 and 30 years, received either a-tDCS over the left inferior frontal gyrus (n=18) or sham stimulation (n=18). Participants completed sentence comprehension and word comprehension tasks before and during stimulation. Accuracy and reaction times (RTs) were recorded as participants completed both tasks. Results: a-tDCS was found to significantly decrease RT on the sentence comprehension task compared to baseline. There was no change in RT following sham stimulation. a-tDCS was not found to have a significant effect on accuracy. Also, a-tDCS did not affect accuracy or RTs on the word comprehension task. Conclusions: The study provides evidence that non-invasive anodal electrical stimulation can modulate sentence comprehension in healthy adults, at least compared to their baseline performance. (JINS, 2019, 25, 331–335)
Around the world, large quantities of sludge wastes derived from nuclear energy production are currently kept in storage facilities. In the UK, the British government has marked sludge removal as a top priority as these facilities near the end of their operational lifetimes. A chemical understanding of uranium uptake in Mg-rich sludge is therefore critical for successful remediation strategies. Previous studies have explored uranium uptake by the calcium carbonate minerals calcite and aragonite under conditions applicable to both natural and anthropogenically perturbed systems. However, studies of uptake by Mg-rich minerals such as brucite [Mg(OH)2], nesquehonite [MgCO3·3H2O] and hydromagnesite [Mg5(CO3)4(OH)2·4H2O] have not previously been conducted. Such experiments will improve our understanding of the mobility of uranium and other actinides in natural lithologies as well as provide key information applicable to nuclear waste repository strategies involving Mg-rich phases. Experiments with mineral powders were used to determine the partition coefficients (Kd) and coordination of UO22+ during adsorption and co-precipitation with brucite, nesquehonite and hydromagnesite. The Kd values for the selected Mg-rich minerals were comparable to or greater than those published for calcium carbonates. Extended X-ray absorption fine structure analysis showed that the structure of the uranyl-triscarbonato [UO2(CO3)3] species was maintained after surface attachment and that uptake of uranyl ions took place mainly via mineral surface reactions.
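A batch-sorption partition coefficient of the kind reported here is conventionally computed from the drop in solution concentration after equilibration with the solid. A minimal sketch (with invented concentrations and masses, not the study's measurements):

```python
def partition_coefficient(c0, ceq, volume_l, mass_g):
    """Batch-sorption distribution coefficient Kd in mL/g.

    c0, ceq  : initial and equilibrium solution concentrations (same units)
    volume_l : solution volume in litres
    mass_g   : mass of mineral powder in grams

    Kd = (amount sorbed per gram of solid) / (equilibrium concentration),
    i.e. (c0 - ceq) * V * 1000 / (m * ceq), giving mL/g.
    """
    sorbed_per_gram = (c0 - ceq) * volume_l * 1000.0 / mass_g
    return sorbed_per_gram / ceq


# Hypothetical brucite batch: solution drops from 10 to 2 units/L in
# 0.05 L of solution equilibrated with 0.5 g of powder.
kd = partition_coefficient(10.0, 2.0, 0.05, 0.5)  # 400.0 mL/g
```

Higher Kd means stronger retention by the solid, which is why comparable-or-greater Kd values for the Mg-rich phases imply uranium mobility at least as limited as for the calcium carbonates.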
The Rockefeller Clinical Scholars (KL2) program began in 1976 and transitioned into a 3-year Master’s degree program in 2006, when Rockefeller joined the National Institutes of Health Clinical and Translational Science Award program. The program consists of ∼15 trainees supported by the Clinical and Translational Science Award KL2 award and University funds. It is designed to provide an optimal environment for junior translational investigators to develop team science and leadership skills by designing and performing a human subjects protocol under the supervision of a distinguished senior investigator mentor and a team of content-expert educators. This is complemented by a tutorial focused on important translational skills.
Results
Since 2006, 40 Clinical Scholars have graduated from the program and gone on to careers in academia (72%), government service (5%), industry (15%), and private medical practice (3%); 2 (5%) remain in training programs. Of the 40 graduates, 39 remain in translational research careers, with 23 National Institutes of Health awards totaling $23 million, foundation and philanthropic support of $20.3 million, and foreign government and foundation support of $6 million. They have made wide-ranging scientific discoveries and have endeavored to translate those discoveries into improved human health.
Conclusion
The Rockefeller Clinical Scholars (KL2) program provides one model for translational science training.
Field research was conducted during the summers of 1981 and 1982 to determine relative infection and population increase of lesion nematodes (Pratylenchus spp.) on seven weed species that commonly occur in field-bean (Phaseolus vulgaris L.) fields in western Nebraska. Weeds were grown at three densities with and without fieldbeans. A representative sample of the root systems from plants in each plot was removed in August, and the nematodes were extracted and counted. No difference in nematode infection rate was found among weed population levels. Nematodes per gram of dry root did not differ between weeds grown with or without fieldbeans. Weeds grown with fieldbeans had smaller root systems, and consequently total nematodes per root system were fewer than in weeds grown in the absence of fieldbeans. There was a significant difference among most weed species in nematodes per gram of dry root. Hairy nightshade (Solanum sarachoides Sendt. ♯ SOLSA) and barnyardgrass [Echinochloa crus-galli (L.) ♯ ECHCG] supported the highest numbers of nematodes per gram of oven-dry root, redroot pigweed (Amaranthus retroflexus L. ♯ AMARE) and common cocklebur (Xanthium pensylvanicum Wallr. ♯ XANPE) had the lowest numbers, and infestation levels on other weed species were variable but generally intermediate.
The Learning Health System Network clinical data research network includes academic medical centers, health-care systems, public health departments, and health plans, and is designed to facilitate outcomes research, pragmatic trials, comparative effectiveness research, and evaluation of population health interventions.
Methods
The Learning Health System Network is 1 of 13 clinical data research networks assembled to create, in partnership with 20 patient-powered research networks, a National Patient-Centered Clinical Research Network.
Results and Conclusions
Herein, we describe the Learning Health System Network as an emerging resource for translational research, providing details on the governance and organizational structure of the network, the key milestones of the current funding period, and challenges and opportunities for collaborative science leveraging the network.
Stratigraphic, sedimentologic, and pedologic studies of beach ridge and lacustrine deposits indicate that up to five times during the Holocene, shallow lakes covered Silver Lake playa in southeastern California for periods of years to decades. The two youngest lacustrine events (at about 390 ± 90 yr B.P. and 3620 ± 70 yr B.P.) coincide with the early and late Neoglacial episodes of North America. Increasing evidence in recent years from other nonglaciated areas leads us to conclude that the effects of these climatic episodes were much more widespread than previously thought. The climate during these episodes was characterized by an increased frequency of winter storms in the southwestern United States, causing wetter conditions that affected diverse, hyperarid environments in the Mojave Desert and adjacent regions. We propose that this wide areal coverage was caused by large-scale, winter atmospheric circulation patterns, which are probably related to changes in sea-surface temperatures and oceanic circulation in the eastern North Pacific Ocean.
Well-preserved shorelines in Estancia basin and a relatively simple hydrologic setting have prompted several inquiries into the basin's hydrologic balance for the purpose of estimating regional precipitation during the late Pleistocene. Estimates have ranged from 86% to 150% of modern, the disparity largely the result of assumptions about past temperatures. In this study, we use an array of models for surface-water runoff, groundwater flow, and lake energy balance to examine previously proposed scenarios for late Pleistocene climate. Constraints imposed by geologic evidence of past lake levels indicate that precipitation for the Last Glacial Maximum (LGM) may have doubled relative to modern values during brief episodes of colder and wetter climate and that annual runoff was as much as 15% of annual precipitation during these episodes.
The objective of this study was to assess the public’s experience, expectations, and perceptions related to Emergency Medical Services (EMS).
Methods
A population-based telephone interview of adults in the United States was conducted. The survey instrument consisted of 112 items. Demographic variables including age, race, political beliefs, and household income were collected. Data collection was performed by trained interviewers from Kent State University’s Social Research Laboratory (Kent, Ohio USA). Descriptive statistics were calculated. Comparative analyses were conducted between those who used EMS at least once in the past five years and those who did not, using χ2 and t tests.
Results
A total of 2,443 phone calls were made and 1,348 individuals agreed to complete the survey (55.2%). There were 297 individuals who requested to drop out of the survey during the phone interview, leaving a total of 1,051 (43.0%) full responses. Participants ranged in age from 18 to 94 years with an average age of 57.5 years. Most were Caucasian or white (83.0%), married (62.8%), and held conservative political beliefs (54.8%). Three-fourths of all respondents believed that at least 40% of patients survive cardiac arrest when EMS services are received. Over half (56.7%) believed that Emergency Medical Technician (EMT)-Basics and EMT-Paramedics provide the same level of care. The estimated median hours of training required for EMT-Basics was 100 hours (IQR: 40-200 hours), while the vast majority of respondents estimated that EMT-Paramedics are required to take fewer than 1,000 clock hours of training (99.3%). The majority believed EMS professionals should be screened for illegal drug use (97.0%), criminal background (95.9%), mental health (95.2%), and physical fitness (91.3%). Over one-third (37.6%) had used EMS within the past five years. Of these individuals, over two-thirds (69.6%) rated their most recent experience as “excellent.” More of those who used EMS at least once in the past five years reported a willingness to consent to participate in EMS research compared with those who had not used EMS (69.9% vs. 61.4%, P=.005).
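The comparison of willingness to consent between EMS users and non-users (69.9% vs 61.4%, P=.005) is a standard 2×2 χ2 test. As a hedged sketch with illustrative cell counts (the abstract reports only percentages, so the counts below are back-calculated assumptions, not reported data):

```python
import math


def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test (df=1, no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (statistic, p-value).

    For df=1 the chi-square survival function reduces to erfc(sqrt(x/2)).
    """
    n = a + b + c + d
    # Shortcut form of the Pearson statistic using the table margins.
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p


# Hypothetical counts consistent with the abstract's percentages:
# 276/395 EMS users vs 403/656 non-users willing to consent to research.
stat, p = chi_square_2x2(276, 395 - 276, 403, 656 - 403)
```

With these assumed counts the p-value lands near the reported .005; a table with identical row proportions gives a statistic of zero and p = 1.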
Conclusions
Most respondents who had used EMS services rated their experience as excellent. Nevertheless, expectations related to survival after cardiac arrest in the out-of-hospital setting were not realistic. Furthermore, much of the public was unaware of the differences in training hour requirements and level of care provided by EMT-Basics and EMT-Paramedics.
Crowe RP, Levine R, Rodriguez S, Larrimore AD, Pirrallo RG. Public Perception of Emergency Medical Services in the United States. Prehosp Disaster Med. 2016;31(Suppl. 1):s112–s117.