The Repugnant Conclusion is an implication of some approaches to population ethics. It states, in Derek Parfit's original formulation,
For any possible population of at least ten billion people, all with a very high quality of life, there must be some much larger imaginable population whose existence, if other things are equal, would be better, even though its members have lives that are barely worth living. (Parfit 1984: 388)
The Old Copper Complex (OCC) refers to the production of heavy copper-tool technology by Archaic Native American societies in the Lake Superior region. To better define the timing of the OCC, we evaluated 53 (eight new and 45 published) radiocarbon (14C) dates associated with copper artifacts and mines. We compared these dates to six lake sediment-based chronologies of copper mining and annealing in the Michigan Copper District. 14C dates grouped by archaeological context show that cremation remains, and wood and cordage embedded in copper artifacts have ages that overlap with the timing of high lead (Pb) concentrations in lake sediment. In contrast, dates in stratigraphic association and from mines are younger than those from embedded and cremation materials, suggesting that the former groups reflect the timing of processes that occurred post-abandonment. The comparatively young dates obtained from copper mines therefore likely reflect abandonment and infill of the mines rather than active use. Excluding three anomalously young samples, the ages of embedded organic material associated with 15 OCC copper artifacts range from 8500 to 3580 cal BP, confirming that the OCC is among the oldest known metalworking societies in the world.
This reflection article presents insights on conducting fieldwork during and after COVID-19 from a diverse collection of political scientists—from department heads to graduate students based at public and private universities in the United States and abroad. Many of them contributed to a newly published volume, Stories from the Field: A Guide to Navigating Fieldwork in Political Science (Krause and Szekely 2020). As in the book, these contributors draw on their years of experience in the field to identify the unique ethical and logistical challenges posed by COVID-19 and offer suggestions for how to adjust and continue research in the face of the pandemic’s disruptions. Key themes include how contingency planning must now be a central part of our research designs; how cyberspace has increasingly become “the field” for the time being; and how scholars can build lasting, mutually beneficial partnerships with “field citizens,” now and in the future.
Clarifying the relationship between depression symptoms and cardiometabolic and related health conditions could identify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
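The effect sizes above are reported as odds ratios with 95% confidence intervals from logistic models. As a generic illustration of how such intervals relate to the underlying log-odds coefficients (this is illustrative arithmetic, not the authors' analysis code; the coefficient and standard error below are hypothetical values chosen so the output echoes the reported diabetes result, OR 1.29, 95% CI 1.07–1.57):

```python
import math

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient (log-odds) and its
    standard error into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: beta = ln(1.29) makes the OR match the reported 1.29,
# and an SE of ~0.098 approximately reproduces the reported 1.07-1.57 interval.
or_, lo, hi = odds_ratio_with_ci(beta=math.log(1.29), se=0.098)
```

Because the exponential transform is monotone, the CI bounds are simply the exponentiated bounds of the coefficient's Wald interval, which is why odds-ratio intervals are asymmetric around the point estimate.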
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.1
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
In December 2015, the 21st Conference of Parties (COP21) to the UN Framework Convention on Climate Change invited the Intergovernmental Panel on Climate Change (IPCC) to provide a special report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways. In October 2018, the IPCC issued a Special Report on the impacts of global warming of 1.5°C above pre-industrial levels. This was the first IPCC report to employ the concept of the Anthropocene in its climate-change assessments – referring to it as the ‘overarching context’ and a ‘boundary concept’ that provides the ‘unifying lens’ through which to acknowledge ‘profound, differential but increasingly geologically significant human influences on the Earth system as a whole’.
Functional neurological disorder (FND) encompasses a complex and heterogeneous group of neuropsychiatric syndromes commonly encountered in clinical practice. Patients with FND may present with a myriad of neurological symptoms and frequently have comorbid medical, neurological, and psychiatric disorders. Over the past decade, important advances have been made in understanding the pathophysiology of FND within a biopsychosocial framework. Many challenges remain in addressing the stigma associated with this diagnosis, refining diagnostic criteria, and providing access to evidence-based treatments. This paper outlines FND treatment approaches, emphasizing the importance of respectful communication and comprehensive explanation of the diagnosis to patients as a critical first step to enhance engagement, adherence, self-agency, and treatment outcomes. We then provide a brief review of evidence-based treatments for psychogenic non-epileptic seizures and functional movement disorder, a guide for designing future treatment trials for FND, and a proposal for a treatment research agenda, to aid in advancing the field and developing and implementing treatments for patients with FND.
The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Recent studies suggest that close-range blast exposure (CBE), regardless of acute concussive symptoms, may have negative long-term effects on brain health and cognition; however, these effects are highly variable across individuals. One potential genetic risk factor that may impact recovery and explain the heterogeneity of blast injury’s long-term cognitive outcomes is the inheritance of an apolipoprotein (APOE) ε4 allele, a well-known genetic risk factor for Alzheimer’s disease. We hypothesized that APOE ε4 carrier status would moderate the impact of CBE on long-term cognitive outcomes.
To test this hypothesis, we examined 488 post-9/11 veterans who completed assessments of neuropsychological functioning, psychiatric diagnoses, history of blast exposure, military and non-military mild traumatic brain injuries (mTBIs), and available APOE genotypes. We separately examined the effects of CBE on attention, memory, and executive functioning in individuals with and without the APOE ε4 allele.
As predicted, we observed a differential impact of CBE status on cognition as a function of APOE ε4 status: ε4 carriers with CBE displayed significantly worse neuropsychological performance, specifically in the domain of memory. These results persisted after adjusting for clinical, demographic, and genetic factors and were not observed when examining other neurotrauma variables (i.e., lifetime or military mTBI, distant blast exposure), though these variables displayed similar trends.
These results suggest APOE ε4 carriers are more vulnerable to the impact of CBE on cognition and highlight the importance of considering genetic risk when studying cognitive effects of neurotrauma.
The San Pedro de Atacama oases, located in northern Chile’s hyperarid Atacama Desert, have been occupied for at least 3000 years. Here, we examine cemetery use in the oases, with emphasis on the Middle Period (ca. AD 400–1000). By modeling a large corpus (n = 243) of radiocarbon dates, over 90% of which are direct AMS assays of human bone collagen, we attempt to establish a temporal framework by which to explore the establishment of formalized social inequality in this period. Modeling these dates at three locally defined scales (all ayllus, inter-ayllu, and intra-ayllu) permits heretofore unavailable insights into the chronological and spatial dimensions of life and mortuary activity in the oases and allows us to better contextualize patterns of social inequality during the dynamic Middle Period. The results of this modeling indicate two distinct peaks of occupation during the Middle Period in San Pedro and document significant temporal variability in cemetery-use patterns on both inter- and intra-ayllu scales. These results stress the importance of local social and environmental factors to the occupation of the oases and provide crucial chronological structure for future archaeological and bioarchaeological research in the region.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Rapid spread of coronavirus disease 2019 (COVID-19) has affected people with intellectual disability disproportionately. Existing data do not provide enough information to understand the factors associated with increased deaths in those with intellectual disability. Establishing who is at high risk is important in developing prevention strategies, given that risk factors or comorbidities in people with intellectual disability may differ from those in the general population.
To identify comorbidities, demographic and clinical factors of those individuals with intellectual disability who have died from COVID-19.
An observational descriptive case series looking at deaths because of COVID-19 in people with intellectual disability was conducted. Along with established risk factors observed in the general population, possible specific risk factors and comorbidities in people with intellectual disability for deaths related to COVID-19 were examined. Comparisons between mild and moderate-to-profound intellectual disability subcohorts were undertaken.
Data on 66 deaths in individuals with intellectual disability were analysed. This group was younger (mean age 64 years) than those who died because of COVID-19 in the general population. High rates of moderate-to-profound intellectual disability (n = 43), epilepsy (n = 29), mental illness (n = 29), dysphagia (n = 23), Down syndrome (n = 20) and dementia (n = 15) were observed.
This is the first study exploring associations between possible risk factors and comorbidities found in COVID-19 deaths in people with intellectual disability. Our data provide insight into possible factors for deaths in people with intellectual disability. Some of the factors varied between the mild and moderate-to-profound intellectual disability groups. This highlights an urgent need for further systemic inquiry and study of the possible cumulative impact of these factors and comorbidities, given the possibility of COVID-19 resurgence.
Background: The burden of C. difficile infection (CDI) on healthcare facilities is well recognized. However, studies focusing on inpatient settings, in addition to ascertainment bias in general, have led to a paucity of data on the true burden of CDI across whole healthcare economies. Methods: Sites testing both inpatient and community samples were recruited from 12 European countries (1 site per 3 million population). On 2 selected days, all diarrheal fecal samples (regardless of the tests requested) were sent to the European Coordinating Laboratory (ECL) for C. difficile toxin testing and culture. The CDI results and tests not requested at each submitting site were compared with the ECL results to determine the number of missed CDIs. Contemporaneous C. difficile isolates from food and animal sources were collected. All isolates underwent PCR ribotyping and toxinotyping; prevalences of ribotypes among regions of Europe and reservoir settings were compared. Results: Overall, 3,163 diarrheal fecal samples were received from 119 sites. The burden of CDI varied by country (positivity rates, 0–15.8%) and by European region; the highest positivity rate was in Eastern Europe (13.1%). The testing and positivity rates in community samples were 29.6% and 1.4%, vs 74.9% and 5.0% in hospital samples; 16% and 55% of samples positive for CDI at the ECL were not diagnosed in hospitals and the community, respectively. The most common C. difficile ribotypes from hospital samples were 027 (11%), 181 (12%), and 014 (8%), although prevalence varied by country. The highest prevalence of toxinotype IIIb (ribotypes 027, 181, and 176) was seen in Eastern Europe (55% of all isolates), which also had the lowest testing rate. For hospital samples, the proportion of toxinotype IIIb was inversely related to the testing rate (r = −0.79) (Fig. 1). The most common ribotypes from food sources were 078 (23%) and 126 (13%) (toxinotype V), and the most common ribotypes from community samples were 078 (9%) and 039 (9%).
Overall, 106 different ribotypes were identified: 25 in both the hospital and community and 16 in the hospital, community, and food chain. Conclusions: The diagnosed burden of CDI varies markedly among countries in both hospital and community settings. Reduced sampling/testing in Eastern Europe is inversely related to the proportion of toxinotype IIIb strains identified, suggesting that lack of suspicion leads to underdiagnosis and outbreaks of infection. The proportion of missed CDIs in the community was ~3.5× higher than in hospitals, indicating major underrecognition in the former setting. There were marked differences in ribotypes in different reservoir settings, emphasizing the complex epidemiology of C. difficile.
Funding: COMBACTE-CDI is an EU-funded (Horizon 2020) consortium of academic and EFPIA partners (bioMérieux, GSK, Sanofi Pasteur, AstraZeneca, Pfizer, Da Volterra), with additional funding from the EFPIA partners.
Disclosures: Submitter: Kerrie Davies; the work presented is funded via the EU and EFPIA (commercial) partners in a consortium.
Establishment of alfalfa by interseeding it with corn planted for silage can enhance crop productivity, but weed management is a challenge to adoption of the practice. Although a simple and effective approach to weed management would be to apply a glyphosate-based herbicide, concerns about herbicide resistance and limitations in available alfalfa varieties exist. Field experiments were conducted to compare the efficacy and selectivity of PRE, POST, and PRE followed by POST herbicide programs to a glyphosate-only strategy when interseeding alfalfa with corn. Experiment 1 compared PRE applications of acetochlor, mesotrione, S-metolachlor, metribuzin, and flumetsulam. Results indicate that acetochlor, metribuzin, and S-metolachlor used at a rate of 1.1 kg ai ha−1 were the most effective and selective PRE herbicides 4 wk after treatment (WAT), but each resulted in greater overall weed cover than glyphosate by 8 WAT. Experiment 2 evaluated applications of bentazon, bromoxynil, 2,4-DB, and mesotrione at early and late POST timings. Several herbicides used POST exhibited similar effectiveness and selectivity to glyphosate, including early applications of bromoxynil (0.14 kg ai ha−1) and 2,4-DB (0.84 or 1.68 kg ai ha−1), as well as late applications of bromoxynil (0.42 kg ai ha−1), 2,4-DB (0.84 kg ai ha−1), and mesotrione (0.05 or 0.11 kg ai ha−1). A third experiment compared applications of acetochlor PRE, bromoxynil POST, and a combination of acetochlor PRE with bromoxynil POST. All treatments were effective and safe for use in this interseeded system, although interseeded alfalfa provided 65% to 70% weed suppression in corn planted for silage without any herbicide. Herbicide treatments had no observable impacts on corn and alfalfa yields, so weed management was likely of limited economic importance.
However, weed competitiveness can vary based on several different factors including weed species, density, and site-specific factors, and so further investigations under different environments and conditions are needed.
The inclusion of students with autism spectrum disorder (ASD) is increasing, but there have been no longitudinal studies of included students in Australia. Interview data reported in this study concern primary school children with ASD enrolled in mainstream classes in South Australia and New South Wales, Australia. In order to examine perceived facilitators and barriers to inclusion, parents, teachers, and principals were asked to comment on the facilitators and barriers to inclusion relevant to each child. Data are reported about 60 students, comprising a total of 305 parent interviews, 208 teacher interviews, and 227 principal interviews collected at 6-monthly intervals over 3.5 years. The most commonly mentioned facilitator was teacher practices. The most commonly mentioned barrier was intrinsic student factors. Other factors not directly controllable by school staff, such as resource limitations, were also commonly identified by principals and teachers. Parents were more likely to mention school- or teacher-related barriers. Many of the current findings were consistent with previous studies but some differences were noted, including limited reporting of sensory issues and bullying as barriers. There was little change in the pattern of facilitators and barriers identified by respondents over time. A number of implications for practice and directions for future research are discussed.
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and nondrug addictive behaviors, due to a dopamine deficiency, “hypodopaminergia.” There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment of RDS via precision, genetically guided KB220 variants, called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM 6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Prolonged survival of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on environmental surfaces and personal protective equipment may lead to these surfaces transmitting this pathogen to others. We sought to determine the effectiveness of a pulsed-xenon ultraviolet (PX-UV) disinfection system in reducing the load of SARS-CoV-2 on hard surfaces and N95 respirators.
Chamber slides and N95 respirator material were directly inoculated with SARS-CoV-2 and were exposed to different durations of PX-UV.
For hard surfaces, disinfection for 1, 2, and 5 minutes resulted in 3.53 log10, >4.54 log10, and >4.12 log10 reductions in viral load, respectively. For N95 respirators, disinfection for 5 minutes resulted in >4.79 log10 reduction in viral load. PX-UV significantly reduced SARS-CoV-2 on hard surfaces and N95 respirators.
With the potential to rapidly disinfect environmental surfaces and N95 respirators, PX-UV devices are a promising technology for reducing environmental and personal protective equipment bioburden and enhancing both healthcare worker and patient safety by reducing the risk of exposure to SARS-CoV-2.
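The log10 reductions reported above can be translated into fractions of virus inactivated; a 3.53 log10 reduction, for example, corresponds to inactivating roughly 99.97% of the starting viral load. A minimal sketch of this conversion (illustrative arithmetic only, not part of the study's methods):

```python
import math

def log10_reduction(initial_titer: float, final_titer: float) -> float:
    """Log10 reduction in viral load, as reported in disinfection studies."""
    return math.log10(initial_titer / final_titer)

def fraction_inactivated(log_reduction: float) -> float:
    """Fraction of the starting viral load removed by a given log10 reduction."""
    return 1.0 - 10.0 ** (-log_reduction)

# Using the 1-minute hard-surface result (3.53 log10) from the abstract:
inactivated = fraction_inactivated(3.53)
```

Note that results reported with a ">" sign (e.g., >4.79 log10 for N95 respirators) indicate the surviving virus fell below the assay's detection limit, so the true reduction is at least that large.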