Here we present stringent low-frequency (185 MHz) limits on coherent radio emission associated with a short-duration gamma-ray burst (SGRB). Our observations of the short gamma-ray burst (GRB) 180805A were taken with the upgraded Murchison Widefield Array (MWA) rapid-response system, which triggered within 20 s of receiving the transient alert from the Swift Burst Alert Telescope, corresponding to 83.7 s post-burst. The SGRB was observed for a total of 30 min, resulting in a persistent flux density upper limit of 40.2 mJy beam⁻¹. Transient searches were conducted at the Swift position of this GRB on 0.5 s, 5 s, 30 s, and 2 min timescales, resulting in limits of 570–1 830, 270–630, 200–420, and 100–200 mJy beam⁻¹, respectively. We also performed a dedispersion search for prompt signals at the position of the SGRB with a temporal and spectral resolution of 0.5 s and 1.28 MHz, respectively, resulting in a
fluence upper-limit range from 570 Jy ms to 1 750 Jy ms over the searched dispersion measures (DM; in pc cm⁻³), corresponding to the known redshift range of SGRBs. We compare the fluence prompt emission limit and the persistent upper limit to SGRB coherent emission models assuming the merger resulted in a stable magnetar remnant. Our observations were not sensitive enough to detect prompt emission associated with the alignment of magnetic fields of a binary neutron star just prior to the merger, from the interaction between the relativistic jet and the interstellar medium (ISM), or persistent pulsar-like emission from the spin-down of the magnetar. However, in the case of a more powerful SGRB (a gamma-ray fluence an order of magnitude higher than GRB 180805A and/or a brighter X-ray counterpart), our MWA observations may be sensitive enough to detect coherent radio emission from the jet–ISM interaction and/or the magnetar remnant. Finally, we demonstrate that of all current low-frequency radio telescopes, only the MWA has the sensitivity and response times capable of probing prompt emission models associated with the initial SGRB merger event.
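The dedispersion search described above exploits the frequency-dependent arrival delay of a cold-plasma-dispersed radio signal. A minimal sketch of that delay calculation follows; the dispersion-constant formula is standard, but the band edges and DM used here are illustrative inputs, not values taken from the observation itself.

```python
# Cold-plasma dispersion delay across a low-frequency band (sketch).
K_DM = 4.149  # dispersion constant, in ms GHz^2 pc^-1 cm^3

def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Delay (ms) of the low-frequency band edge relative to the high edge
    for a signal with dispersion measure `dm` (pc cm^-3)."""
    return K_DM * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

# Example: a 30.72 MHz band centred at 185 MHz, DM = 500 pc cm^-3 (illustrative)
f_lo = (185 - 30.72 / 2) / 1000.0  # band bottom, GHz
f_hi = (185 + 30.72 / 2) / 1000.0  # band top, GHz
delay = dispersion_delay_ms(500.0, f_lo, f_hi)
print(f"in-band delay: {delay / 1000:.1f} s")  # roughly 20 s at this DM
```

A delay of this size spans many 0.5 s time bins, which is why a dedispersion search over a grid of trial DMs is needed rather than a single image-plane search.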
Coronavirus disease (COVID-19) has been identified as an acute respiratory illness leading to severe acute respiratory distress syndrome. As the disease spread, demands on health care systems increased, specifically the need to expand hospital capacity. Alternative care hospitals (ACHs) have been used to mitigate these issues; however, establishing an ACH has many challenges. The goal of this session was to perform systems testing, using a simulation-based evaluation to identify areas in need of improvement.
Four simulation cases were designed to depict common and high acuity situations encountered in the ACH, using a high technology simulator and standardized patient. A multidisciplinary observer group was given debriefing forms listing the objectives, critical actions, and specific areas to focus their attention. These forms were compiled for data collection.
Logistical, operational, and patient safety issues were identified during the simulation and compiled into a simulation event report. Proposed solutions and protocol changes were made in response to the identified issues.
Simulation was successfully used for systems testing, supporting efforts to maximize patient care and provider safety in a rapidly developed ACH. The simulation event report identified operational deficiencies and safety concerns directly resulting in equipment modifications and protocol changes.
On October 10, 2020, the Memorial Sloan Kettering Cancer Center Supportive Care Service hosted their first-ever United States (US) World Hospice and Palliative Care Day (WHPCD) Celebration. The purpose of this article is to describe the US inaugural event in alignment with the broader goals of WHPCD and provide lessons learned in anticipation of the second annual conference to be held on October 5–6, 2021.
Description of the inaugural event in the context of COVID-19 and WHPCD, co-planning conference team reflection, and attendee survey responses.
The Worldwide Hospice Palliative Care Alliance initially launched WHPCD in 2005 as an annual unified day of action to celebrate and support hospice and palliative care around the world. The US-based innovative virtual conference featured 23 interprofessional hospice and palliative care specialists and patient and family caregiver speakers across nine diverse sessions addressing priorities at the intersection of COVID-19, social injustice, and the global burden of serious health-related suffering. Two primary aims guided the event: community building and wisdom sharing. Nearly 270 registrants from at least 16 countries and one dozen states across the US joined the free program focused on both personal and professional development.
Significance of results
Unlike many other academic conferences and professional gatherings that were relegated to online forums due to pandemic-related restrictions, the US WHPCD Celebration was intentionally established to create a virtual coming together for collective reflection on the barriers and facilitators of palliative care delivery amid vast societal change. The goal to ensure a globally relevant and culturally inclusive agenda will continue to draw increased participation at an international level during future annual events. Finally, the transparent and respectful sharing of palliative care team experiences in the year preceding the conference established a safe environment for both individual expression and scholarly discussion.
Perceived discrimination is associated with worse mental health. Few studies have assessed whether perceived discrimination (i) is associated with the risk of psychotic disorders and (ii) contributes to an increased risk among minority ethnic groups relative to the ethnic majority.
We used data from the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions Work Package 2, a population-based case−control study of incident psychotic disorders in 17 catchment sites across six countries. We calculated odds ratios (OR) and 95% confidence intervals (95% CI) for the associations between perceived discrimination and psychosis using mixed-effects logistic regression models. We used stratified and mediation analyses to explore differences for minority ethnic groups.
Reporting any perceived experience of major discrimination (e.g. unfair treatment by police, not getting hired) was higher in cases than controls (41.8% v. 34.2%). Pervasive experiences of discrimination (≥3 types) were also higher in cases than controls (11.3% v. 5.5%). In fully adjusted models, the odds of psychosis were 1.20 (95% CI 0.91–1.59) for any discrimination and 1.79 (95% CI 1.19–1.59) for pervasive discrimination compared with no discrimination. In stratified analyses, the magnitude of association for pervasive experiences of discrimination appeared stronger for minority ethnic groups (OR = 1.73, 95% CI 1.12–2.68) than the ethnic majority (OR = 1.42, 95% CI 0.65–3.10). In exploratory mediation analysis, pervasive discrimination minimally explained excess risk among minority ethnic groups (5.1%).
Pervasive experiences of discrimination are associated with slightly increased odds of psychotic disorders and may minimally help explain excess risk for minority ethnic groups.
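The odds ratios reported above come from logistic-regression coefficients estimated on the log-odds scale and then exponentiated, with Wald confidence limits. A minimal sketch of that conversion, using a hypothetical coefficient and standard error rather than the study's actual estimates:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a log-odds coefficient and its Wald interval bounds
    to get an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative coefficient and standard error (hypothetical values):
or_, lo, hi = odds_ratio_ci(0.58, 0.22)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the interval is symmetric on the log scale, the exponentiated CI is asymmetric around the odds ratio, which is why published CIs such as those above are not centred on the point estimate.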
This study examined post-traumatic stress disorder (PTSD) symptoms in 13 049 survivors of suspected or confirmed COVID-19, from the UK general population, as a function of severity and hospital admission status. Compared with mild COVID-19, significantly elevated rates of PTSD symptoms were identified in those requiring medical support at home (effect size 0.178 s.d., P = 0.0316), those requiring hospital admission without ventilation (effect size 0.234 s.d., P = 0.0064) and those requiring hospital admission with ventilator support (effect size 0.454 s.d., P < 0.001). Intrusive images were the most prominent elevated symptom. Adequate psychiatric provision for such individuals will be of paramount importance.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.1
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
The contribution of neonatal cyanosis, inherent to cyanotic congenital heart disease, to the magnitude of neurologic injury during deep hypothermic circulatory arrest has not been fully delineated. This study investigates the impact of cyanosis and deep hypothermic circulatory arrest on brain injury.
Neonatal piglets were randomised to placement of a pulmonary artery to left atrium shunt to create cyanosis or sham thoracotomy. At day 7, animals were randomised to undergo deep hypothermic circulatory arrest or sham. Arterial oxygen tension and haematocrit were obtained. Neurobehavioural performance was serially assessed. The animals were sacrificed on day 14. Brain tissue was assessed for neuronal necrosis using a 5-point histopathologic score.
Four experimental groups were analysed (sham, n = 10; sham + deep hypothermic circulatory arrest, n = 8; shunt, n = 9; shunt + deep hypothermic circulatory arrest, n = 7). Cyanotic piglets had significantly higher haematocrit and lower partial pressure of oxygen at day 14 than non-cyanotic piglets. There were no statistically significant differences in neurobehavioural scores at day 1. However, shunt + deep hypothermic circulatory arrest piglets had evidence of greater neuronal injury than sham animals (median (range): 2 (0–4) versus 0 (0–0), p = 0.02).
Cyanotic piglets undergoing deep hypothermic circulatory arrest had increased neuronal injury compared to sham animals. Significant injury was not seen for either cyanosis or deep hypothermic circulatory arrest alone relative to shams. These findings suggest an interaction between cyanosis and deep hypothermic circulatory arrest and may partially explain the suboptimal neurologic outcomes seen in children with cyanotic heart disease who undergo deep hypothermic circulatory arrest.
To test the effectiveness of a social network intervention (SNI) to improve children’s healthy drinking behaviours.
A three-arm cluster randomised controlled trial design was used. In the SNI, a subset of children were selected and trained as ‘influence agents’ to promote water consumption, as an alternative to sugar-sweetened beverages (SSB), among their peers. In the active control condition, all children were simultaneously exposed to the benefits of water consumption. The control condition received no intervention.
Eleven schools in the Netherlands.
Four hundred and fifty-one children (mean age = 10·74 years, sd = 0·97; 50·8% girls).
Structural path models showed that children exposed to the SNI consumed 0·20 less SSB per day compared to those in the control condition (β = 0·25, P = 0·035). There was a trend showing that children exposed to the SNI consumed 0·17 less SSB per day than those in the active control condition (β = 0·20, P = 0·061). No differences were found between conditions for water consumption. However, the moderation effects of descriptive norms (β = –0·12, P = 0·028) and injunctive norms (β = 0·11–0·14, both P = 0·050) indicated that norms are more strongly linked to water consumption in the SNI condition compared to the active control and control conditions.
These findings suggest that an SNI promoting healthy drinking behaviours may prevent children from consuming more SSB. Moreover, for water consumption, the prevailing social norms in the context play an important role in moderating the effectiveness of the SNI.
North Carolina Central University (NCCU) and Duke Cancer Institute implemented an NCI-funded Translational Cancer Disparities Research Partnership to enhance translational cancer research, increase the pool of underrepresented racial and ethnic group (UREG) researchers in the translational and clinical research workforce, and equip UREG trainees with skills to increase diversity in clinical trials. The Cancer Research Education Program (C-REP) provided training for UREG graduate students and postdoctoral fellows at Duke and NCCU. An innovative component of C-REP is the Translational Immersion Experience (TIE), which enabled Scholars to gain knowledge across eight domains of clinical and translational research (clinical trials operations, data monitoring, regulatory affairs, UREG accrual, biobanking, community engagement, community outreach, and high-throughput drug screening). Program-specific evaluative metrics were created for three broad domains (clinical operations, basic science/lab research, and population-based science) and eight TIE domains. Two cohorts (n = 13) completed pre- and post-surveys to determine program impact and identify recommendations for program improvement. Scholars reported statistically significant gains in knowledge across three broad domains of biomedical research and seven distinct areas within TIE. Training in translational research incorporating immersions in clinical trials operation, biobanking, drug development, and community engagement adds value to career development of UREG researchers.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Prenatal diethylstilbestrol (DES) exposure is associated with increased risk of hormonally mediated cancers and other medical conditions. We evaluated the association between DES and risk of pancreatic cancer and pancreatic disorders, type 2 diabetes, and gallbladder disease, conditions that may be involved in this malignancy. Our analyses used follow-up data from the US National Cancer Institute DES Combined Cohort Study. Cox proportional hazards models estimated hazard ratios (HRs) and 95% confidence intervals (CIs), adjusted for age, sex, cohort, body mass index, smoking, and alcohol, for the association between prenatal DES exposure and type 2 diabetes, gallbladder disease (mainly cholelithiasis), pancreatic disorders (mainly pancreatitis), and pancreatic cancer among 5667 exposed and 3315 unexposed individuals followed from 1990 to 2017. Standardized incidence ratios (SIRs) for pancreatic cancer were based on age-, race-, and calendar year-specific general population cancer incidence rates. In women and men combined, the hazards of total pancreatic disorders and pancreatitis were greater in the prenatally DES exposed than the unexposed (HR = 11, 95% CI 2.6–51 and HR = 7.0, 95% CI 1.5–33, respectively). DES was not associated overall with gallbladder disease (HR = 1.2, 95% CI 0.88–1.5) or diabetes (HR = 1.1, 95% CI 0.9–1.2). In women, but not in men, DES exposure was associated with increased risk of pancreatic cancer compared with the unexposed (HR = 4.1, 95% CI 0.84–20) or the general population (SIR = 1.9, 95% CI 1.0–3.2). Prenatal DES exposure may increase the risk of pancreatic disorders, including pancreatitis, in women and men. The data suggested elevated pancreatic cancer risk in DES-exposed women, but not in exposed men.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors that are believed to contribute or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
Prolonged survival of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on environmental surfaces and personal protective equipment may lead to these surfaces transmitting this pathogen to others. We sought to determine the effectiveness of a pulsed-xenon ultraviolet (PX-UV) disinfection system in reducing the load of SARS-CoV-2 on hard surfaces and N95 respirators.
Chamber slides and N95 respirator material were directly inoculated with SARS-CoV-2 and were exposed to different durations of PX-UV.
For hard surfaces, disinfection for 1, 2, and 5 minutes resulted in 3.53 log10, >4.54 log10, and >4.12 log10 reductions in viral load, respectively. For N95 respirators, disinfection for 5 minutes resulted in >4.79 log10 reduction in viral load. PX-UV significantly reduced SARS-CoV-2 on hard surfaces and N95 respirators.
With the potential to rapidly disinfect environmental surfaces and N95 respirators, PX-UV devices are a promising technology for reducing environmental and personal protective equipment bioburden and enhancing both healthcare worker and patient safety by reducing the risk of exposure to SARS-CoV-2.
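The log10 reductions reported above are computed from the ratio of viable viral titre before and after exposure (a 4-log reduction corresponds to a 99.99% kill). A minimal sketch of the calculation, with illustrative titre values rather than the study's measurements:

```python
import math

def log10_reduction(initial_titer, final_titer):
    """Log10 reduction in viral load between pre- and post-exposure titres."""
    return math.log10(initial_titer / final_titer)

# Example: titre falls from 1e5 to ~30 infectious units (illustrative numbers)
r = log10_reduction(1e5, 30)
print(f"{r:.2f} log10 reduction")
```

When the post-exposure titre falls below the assay's detection limit, only a lower bound can be reported, which is why several of the values above are given as "greater than" figures.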
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. 
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
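Calibration as described above maps a measured 14C age onto calendar age through the calibration curve. The sketch below illustrates only the core lookup step, using an invented toy curve that is monotonic for simplicity; real calibrations use the full IntCal20 curve with its uncertainties, via tools such as OxCal, CALIB, or the rcarbon package.

```python
import bisect

# Toy calibration curve: (calendar age BP, 14C age BP) pairs.
# These numbers are invented for illustration, not IntCal20 values.
curve = [(0, 0), (1000, 950), (2000, 1900), (3000, 2950), (4000, 3850)]

def calibrate(c14_age):
    """Linearly interpolate the calendar age for a measured 14C age,
    clamping to the ends of the toy curve."""
    c14s = [c for _, c in curve]
    i = bisect.bisect_left(c14s, c14_age)
    if i == 0:
        return curve[0][0]
    if i == len(curve):
        return curve[-1][0]
    (cal0, c0), (cal1, c1) = curve[i - 1], curve[i]
    return cal0 + (c14_age - c0) * (cal1 - cal0) / (c1 - c0)

cal = calibrate(2400)
print(f"14C age 2400 BP -> ~{cal:.0f} cal BP on the toy curve")
```

The real curve is not monotonic: atmospheric 14C wiggles mean a single 14C age can intersect the curve at several calendar ages, producing the multi-modal calibrated distributions that proper calibration software reports.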