Childhood exposure to interpersonal violence (IPV) may be linked to distinct manifestations of mental illness, yet the nature of these altered presentations remains poorly understood. Network analysis can provide unique insights by contrasting the interrelatedness of symptoms underlying psychopathology across exposed and non-exposed youth, with potential clinical implications for a treatment-resistant population. We anticipated marked differences in symptom associations among IPV-exposed youth, particularly in terms of ‘hub’ symptoms holding outsized influence over the network, as well as the formation and influence of communities of highly interconnected symptoms.
Participants from a population-representative sample of youth (n = 4433; ages 11–18 years) completed a comprehensive structured clinical interview assessing mental health symptoms, diagnostic status, and history of violence exposure. Network analytic methods were used to model the pattern of associations between symptoms, quantify differences across diagnosed youth with (IPV+) and without (IPV–) IPV exposure, and identify transdiagnostic ‘bridge’ symptoms linking multiple disorders.
Symptoms organized into six ‘disorder’ communities (e.g. Intrusive Thoughts/Sensations, Depression, Anxiety) that exhibited considerably greater interconnectivity in IPV+ youth. Five symptoms emerged in IPV+ youth as highly trafficked ‘bridges’ between symptom communities (compared with 11 in IPV– youth).
IPV exposure may alter mutually reinforcing symptom co-occurrence in youth, thus contributing to greater psychiatric comorbidity and treatment resistance. The presence of a condensed and unique set of bridge symptoms suggests trauma-enriched nodes which could be therapeutically targeted to improve outcomes in violence-exposed youth.
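In network psychometrics, ‘bridge’ symptoms of the kind described above are typically identified with bridge-centrality measures. As a minimal sketch, and using an invented toy symptom network rather than anything from the study itself, bridge strength sums the weights of a symptom's edges to communities other than its own:

```python
# Toy symptom network: undirected edges with hypothetical association weights.
# All node names, weights, and community assignments are invented for illustration.
edges = {
    ("sad_mood", "anhedonia"): 0.45,      # within Depression community
    ("sad_mood", "fatigue"): 0.30,
    ("worry", "restlessness"): 0.40,      # within Anxiety community
    ("worry", "irritability"): 0.25,
    ("sleep_problems", "worry"): 0.30,    # within Anxiety community
    ("sleep_problems", "fatigue"): 0.35,  # crosses into Depression
    ("sleep_problems", "sad_mood"): 0.20, # crosses into Depression
}

community = {
    "sad_mood": "Depression", "anhedonia": "Depression", "fatigue": "Depression",
    "worry": "Anxiety", "restlessness": "Anxiety", "irritability": "Anxiety",
    "sleep_problems": "Anxiety",
}

def bridge_strength(edges, community):
    """For each node, sum the weights of its edges into other communities."""
    strength = {node: 0.0 for node in community}
    for (a, b), w in edges.items():
        if community[a] != community[b]:
            strength[a] += w
            strength[b] += w
    return strength

scores = bridge_strength(edges, community)
top = max(scores, key=scores.get)  # the most 'bridge-like' symptom
```

In this toy graph, sleep_problems accumulates the largest cross-community weight (0.35 + 0.20), so it would be flagged as a candidate bridge; the study's actual estimator (e.g. on a regularized partial-correlation network) may differ.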
Coronavirus disease 2019 (COVID-19) has migrated to regions that were initially spared, and it is likely that different populations are currently at risk for illness. Herein, we present our observations of the change in characteristics and resource use of COVID-19 patients over time in a national system of community hospitals to help inform those managing surge planning, operational management, and future policy decisions.
To determine risk factors for mortality among COVID-19 patients admitted to a system of community hospitals in the United States.
Retrospective analysis of patient data collected from the routine care of COVID-19 patients.
System of >180 acute-care facilities in the United States.
All admitted patients with positive identification of COVID-19 and a documented discharge as of May 12, 2020.
Determination of demographic characteristics, vital signs at admission, patient comorbidities, and recorded discharge disposition in this population to construct a logistic regression estimating the odds of mortality, particularly for those patients characterized as not being critically ill at admission.
In total, 6,180 COVID-19+ patients were identified as of May 12, 2020. Most COVID-19+ patients (4,808, 77.8%) were admitted directly to a medical-surgical unit with no documented critical care or mechanical ventilation within 8 hours of admission. After adjusting for demographic characteristics, comorbidities, and vital signs at admission in this subgroup, the largest driver of the odds of mortality was patient age (OR, 1.07; 95% CI, 1.06–1.08; P < .001). Decreased oxygen saturation at admission was associated with increased odds of mortality (OR, 1.09; 95% CI, 1.06–1.12; P < .001) as was diabetes (OR, 1.57; 95% CI, 1.21–2.03; P < .001).
The identification of factors observable at admission that are associated with mortality in COVID-19 patients who are initially admitted to non-critical care units may help care providers, hospital epidemiologists, and hospital safety experts better plan for the care of these patients.
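Odds ratios from a logistic regression such as the one above are multiplicative on the odds scale, so a per-year odds ratio of 1.07 for age compounds over an age gap. A small sketch of that arithmetic (the 5% baseline probability is an invented illustration, not a figure from the study):

```python
def compounded_or(per_unit_or, units):
    """Odds ratios multiply on the odds scale: OR^k for a k-unit difference."""
    return per_unit_or ** units

def apply_or(p_baseline, odds_ratio):
    """Convert probability -> odds, scale by the OR, convert back."""
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# The reported per-year OR of 1.07 implies, across a 10-year age gap:
or_10yr = compounded_or(1.07, 10)   # ~ 1.97, i.e. roughly doubled odds
# Applied to a hypothetical 5% baseline mortality probability:
p_older = apply_or(0.05, or_10yr)   # ~ 0.094
```

Note that an odds ratio applied to a probability does not simply multiply the probability; the round trip through odds is what keeps the result below 1.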
The pharmacotherapy of epilepsy is a complex process guided by evidence-based research and clinical experience. Some patients achieve seizure freedom upon treatment with the first anti-seizure medication (ASM) prescribed, whereas others may be treated with two or three medications before one (or a combination) is found that reduces seizure frequency and/or severity with minimal side effects. Many patients demonstrate a partial response to treatment, leading to reduced seizure frequency and/or severity, but do not become completely seizure free. It is often stated that ~30% of epilepsy patients have seizures that cannot be controlled pharmacologically, and these patients are defined as having medication-resistant epilepsy (MRE). The International League Against Epilepsy (ILAE) published the following definition of MRE: ‘drug resistant epilepsy may be defined as failure of adequate trials of two tolerated and appropriately chosen and used ASM schedules (whether as monotherapies or in combination) to achieve sustained seizure freedom’. Treatment success or sustained seizure freedom is defined as one year without seizures or three times the inter-seizure interval (whichever is longer). The ILAE definition provides a useful standard from which to work, and MRE can be clinically identified in patients who fail to achieve seizure freedom after multiple ASM trials. However, the ILAE definition of successful treatment does not account for partial response to pharmacotherapy. Indeed, many partial responders have improved quality of life, even if they are not seizure-free for one year or more.
Early administration of blood products to patients with hemorrhagic shock has a positive impact on morbidity and mortality. Smaller hospitals may have a limited supply of blood, and air medical systems may not carry blood. The primary objective was to quantify the number of patients meeting established physiologic criteria for blood product administration and to identify which patients received blood and which did not because it was unavailable locally.
Electronic patient care records were used to identify a retrospective cohort of patients undergoing emergent air medical transport in Ontario, Canada, who were likely to require blood. Presenting problems associated with blood product administration were identified. Physiologic data were extracted, and transfusion criteria were used to identify patients in whom blood product administration was indicated.
There were 11,520 emergent patient transports during the study period, including 842 (7.3%) in which blood product administration was considered. Of these, 290 met established physiologic criteria for blood products, with 167 receiving blood, of whom 57 received it at a hospital with a limited supply. The mean number of units administered per patient was 3.5. The remaining 123 patients meeting criteria did not receive blood products because none were available.
Indications for blood product administration are present in 2.5% of patients undergoing time-sensitive air medical transport. Air medical services can enhance access to potentially lifesaving therapy in patients with hemorrhagic shock by carrying blood products, as blood may be unavailable or in limited supply locally for the majority of patients in whom it is indicated.
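The cohort figures above fit together as simple arithmetic; a short sketch reproducing the reported counts and the 2.5% indication rate:

```python
# Cohort counts as reported in the abstract.
transports = 11_520          # emergent transports in the study period
considered = 842             # transports where blood administration was considered
met_criteria = 290           # met physiologic criteria for blood products
received = 167               # actually received blood
received_limited_supply = 57 # received it at a hospital with limited supply

# Patients meeting criteria who went without blood (none available locally):
not_received = met_criteria - received        # 123

# Share of all emergent transports in which blood was indicated:
pct_indicated = 100 * met_criteria / transports  # ~2.5%
```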
The role of air medical and land-based critical care transport services is not always clear amongst traditional emergency medical service providers or hospital-based health care practitioners. Some of this uncertainty is historical, dating from when air medical services were in their infancy and their role within the broader health care system was limited. Despite their evolution within the regionalized health care system, some myths remain regarding air medical services in Canada. Our goal is to clarify several commonly held but erroneous beliefs regarding the role, impact, and practices of air medical transport.
Catatonia is a psychomotor dysregulation syndrome of diverse aetiology, increasingly recognised as a prominent feature of N-methyl-d-aspartate receptor antibody encephalitis (NMDARE) in adults. No study to date has systematically assessed the prevalence and symptomatology of catatonia in children with NMDARE. We analysed 57 paediatric patients with NMDARE from the literature using the Bush-Francis Catatonia Rating Scale. Catatonia was common (occurring in 86% of patients), manifesting as complex clusters of positive and negative features within individual patients. It was both underrecognised and undertreated. Immunotherapy was the only effective intervention, highlighting the importance of prompt recognition and treatment of the underlying cause of catatonia.
We present English translations of two French documents to show that the main reason for the rejection of Semmelweis's theory of the cause of childbed (puerperal) fever was that his proof relied on the post hoc ergo propter hoc fallacy, and not that Joseph Skoda referred only to cadaveric particles as the cause in his lecture to the Academy of Science on Semmelweis's discovery. Friedrich Wieger (1821–1890), an obstetrician from Strasbourg, published an accurate account of Semmelweis's theory six months before Skoda's lecture, and reported a case in which the causative agent originated from a source other than cadavers. Wieger also presented data showing that chlorine hand disinfection reduced the annual maternal mortality rate from childbed fever (MMR) from more than 7 per cent for the years 1840–1846 to 1.27 per cent in 1848, the first full year in which chlorine hand disinfection was practised. But an editorial in the Gazette médicale de Paris rejected the data as proof of the effectiveness of chlorine hand disinfection, stating that the fact that the MMR fell after chlorine hand disinfection was implemented did not mean that this innovation had caused the MMR to fall. This previously unrecognized objection to Semmelweis's proof was also the reason why Semmelweis's chief rejected Semmelweis's evidence.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
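The weighted root-mean-square residual quoted above is conventionally computed with inverse-variance weights, so that the most precisely measured arrival times dominate. A small sketch with invented time-of-arrival residuals (the PPTA pipeline itself is far more involved):

```python
import math

def weighted_rms(residuals, uncertainties):
    """Weighted RMS with inverse-variance weights w_i = 1 / sigma_i^2:
    sqrt( sum(w_i * r_i^2) / sum(w_i) )."""
    weights = [1.0 / s ** 2 for s in uncertainties]
    num = sum(w * r ** 2 for w, r in zip(weights, residuals))
    return math.sqrt(num / sum(weights))

# Hypothetical timing residuals and their 1-sigma uncertainties (microseconds):
r = [0.8, -1.2, 0.5, -0.3]
sigma = [0.5, 1.0, 0.5, 0.4]
wrms = weighted_rms(r, sigma)
```

Relative to a plain RMS, points with large uncertainties (like the -1.2 residual above) are down-weighted rather than allowed to inflate the statistic.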
The deviation from thermodynamic equilibrium of the ion velocity distribution functions (VDFs), as measured by the Magnetospheric Multiscale (MMS) mission in the Earth’s turbulent magnetosheath, is quantitatively investigated. Making use of unprecedentedly high-resolution MMS ion data together with Vlasov–Maxwell simulations, this analysis investigates the relationship between deviation from Maxwellian equilibrium and typical plasma parameters. Correlations of the non-Maxwellian features with plasma quantities such as electric fields, ion temperature, current density and ion vorticity are found to be similar in magnetosheath data and numerical experiments. The correlation between distortions of ion VDFs and current density is poor, evidence that questions whether VDFs depart from Maxwellian at the current density peaks. Moreover, strong correlation has been observed with the magnitude of the electric field in the turbulent magnetosheath, while some degree of correlation has been found in the numerical simulations and during a magnetopause crossing by MMS. This work could help shed light on the influence of electrostatic waves on the distortion of the ion VDFs in turbulent space plasmas.
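A common way to quantify the deviation described above is to compare each measured VDF with the Maxwellian that shares its density, bulk velocity, and thermal speed, normalising the L2 distance by density. A one-dimensional sketch under those assumptions (the exact estimator used in such studies may differ, and real VDFs are three-dimensional):

```python
import math

def non_maxwellianity(v, f):
    """1-D sketch: build the Maxwellian with the same density, bulk
    velocity and thermal speed as f, then return the density-normalised
    L2 distance between f and that Maxwellian."""
    dv = v[1] - v[0]
    n = sum(f) * dv                                          # density moment
    u = sum(vi * fi for vi, fi in zip(v, f)) * dv / n        # bulk velocity
    vth2 = sum((vi - u) ** 2 * fi for vi, fi in zip(v, f)) * dv / n
    vth = math.sqrt(vth2)                                    # thermal speed
    f_max = [n / (math.sqrt(2 * math.pi) * vth)
             * math.exp(-(vi - u) ** 2 / (2 * vth2)) for vi in v]
    return math.sqrt(sum((fi - fm) ** 2 for fi, fm in zip(f, f_max)) * dv) / n

# Velocity grid (in units of the thermal speed) and two test distributions:
v = [-8 + 0.01 * i for i in range(1601)]
maxw = [math.exp(-vi ** 2 / 2) / math.sqrt(2 * math.pi) for vi in v]
beams = [0.5 * (math.exp(-(vi - 2) ** 2 / 2) + math.exp(-(vi + 2) ** 2 / 2))
         / math.sqrt(2 * math.pi) for vi in v]

eps_maxw = non_maxwellianity(v, maxw)    # ~0: an exact Maxwellian
eps_beams = non_maxwellianity(v, beams)  # clearly non-zero: counter-beams
```

The measure vanishes for a true Maxwellian and grows with kinetic features such as beams, so it gives a single scalar that can be correlated against electric field, vorticity, and current density as in the abstract.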
Giant miscanthus has the potential to move beyond cultivated fields and invade noncrop areas, but this risk can be overshadowed by its aesthetic appeal and monetary value as a biofuel crop. Most research on giant miscanthus has focused on herbicide tolerance for establishment and production rather than on terminating an existing stand. This study was conducted to evaluate herbicide options for controlling or terminating a stand of giant miscanthus. In 2013 and 2014, field experiments were conducted on established stands of the giant miscanthus cultivars ‘Nagara’ and ‘Freedom.’ Herbicides evaluated in both years included glyphosate, hexazinone, imazapic, imazapyr, clethodim, fluazifop, and glyphosate plus fluazifop. All treatments were applied in summer (June or July) and in September. In both years, biomass reduction ranged from 85% to 100% when glyphosate was applied in June or July at 4.5 or 7.3 kg ae ha⁻¹. No other treatment applied at this timing provided more than 50% giant miscanthus biomass reduction 1 yr after application. September applications of glyphosate were not consistent: treatments in 2013 reduced biomass by 40% or less, whereas in 2014 all rates provided at least 78% biomass reduction. Glyphosate applied in June or July was the only treatment that provided effective and consistent control of giant miscanthus 1 yr after treatment.
The evolution of agriculture improved food security and enabled significant increases in the size and complexity of human groups. Despite these positive effects, some societies never adopted these practices, became only partially reliant on them, or even reverted to foraging after temporarily adopting them. Given the critical importance of climate and biotic interactions for modern agriculture, it seems likely that ecological conditions could have played a major role in determining the degree to which different societies adopted farming. However, this seemingly simple proposition has been surprisingly difficult to prove and is currently controversial. Here, we investigate how recent agricultural practices relate both to contemporary ecological opportunities and the suitability of local environments for the first species domesticated by humans. Leveraging a globally distributed dataset on 1,291 traditional societies, we show that after accounting for the effects of cultural transmission and more current ecological opportunities, levels of reliance on farming continue to be predicted by the opportunities local ecologies provided to the first human domesticates even after centuries of cultural evolution. Based on the details of our models, we conclude that ecology probably helped shape the geography of agriculture by biasing both human movement and the human-assisted dispersal of domesticates.
Considerable progress in explaining cultural evolutionary dynamics has been made by applying rigorous models from the natural sciences to historical and ethnographic information collected and accessed using novel digital platforms. Initial results have clarified several long-standing debates in cultural evolutionary studies, such as population origins, the role of religion in the evolution of complex societies and the factors that shape global patterns of language diversity. However, future progress requires recognition of the unique challenges posed by cultural data. To address these challenges, standards for data collection, organisation and analysis must be improved and widely adopted. Here, we describe some major challenges to progress in the construction of large comparative databases of cultural history, including recognising the critical role of theory, selecting appropriate units of analysis, data gathering and sampling strategies, winning expert buy-in, achieving reliability and reproducibility in coding, and ensuring interoperability and sustainability of the resulting databases. We conclude by proposing a set of practical guidelines to meet these challenges.
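One concrete aspect of the "reliability and reproducibility in coding" challenge raised above is chance-corrected inter-rater agreement. A minimal sketch using Cohen's kappa on invented codings (real comparative databases would use many coders, variables, and more elaborate statistics):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(coder_a)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if both coders assigned labels independently:
    p_exp = sum(counts_a[k] * counts_b[k]
                for k in set(counts_a) | set(counts_b)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Two hypothetical coders rating presence/absence of a trait in 4 records:
a = ["present", "present", "absent", "absent"]
b = ["present", "absent", "absent", "absent"]
kappa = cohens_kappa(a, b)  # 0.5: moderate agreement beyond chance
```

Raw percent agreement here is 75%, but kappa discounts the agreement two independent coders would reach by chance, which is why it is the more defensible reliability statistic for database coding.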
Fossil crayfish are typically rare, worldwide. In Australia, the strictly Southern Hemisphere clade Parastacidae, while ubiquitous in modern freshwater systems, is known only from sparse fossil occurrences from the Aptian–Albian of Victoria. We expand this record to the Cenomanian of northern New South Wales, where opalized bio-gastroliths (temporary calcium storage bodies found in the foregut of pre-moult crayfish) form a significant proportion of the fauna of the Griman Creek Formation. Crayfish bio-gastroliths are exceedingly rare in the fossil record but here form a remarkable supplementary record for crayfish, whose body and trace fossils are otherwise unknown from the Griman Creek Formation. The new specimens indicate that parastacid crayfish were widespread in eastern Australia by middle Cretaceous time, occupying a variety of freshwater ecosystems from the Australian–Antarctic rift valley in the south, to the near-coastal floodplains surrounding the epeiric Eromanga Sea further to the north.
Early-life stress (ELS) has previously been identified as a risk factor for cognitive decline, but this work has predominantly focused on clinical groups and indexed traditional cognitive domains. It therefore remains unclear whether ELS is related to cognitive function in healthy community-dwelling older adults, or whether any effects of ELS also extend to social cognition. To address these questions, the Childhood Trauma Questionnaire (CTQ) was administered to 484 older adults along with a comprehensive neuropsychological test battery and a well-validated test of social cognitive function. The results revealed no differences in global cognition according to overall experiences of ELS. However, a closer examination of the different ELS subscales showed that global cognition was poorer in those who had experienced physical neglect (relative to those who had not). Social cognitive function did not differ according to experiences of ELS. These results indicate that the relationship between ELS and cognition in older age may depend on the nature of the trauma experienced.
How landscapes respond to, and evolve from, large jökulhlaups (glacial outburst floods) is poorly constrained due to limited observations and detailed monitoring. We investigate how melt of glacier ice transported and deposited by multiple jökulhlaups during the 2010 eruption of Eyjafjallajökull, Iceland, modified the volume and surface elevation of jökulhlaup deposits. Jökulhlaups generated by the eruption deposited large volumes of sediment and ice, causing significant geomorphic change in the Gígjökull proglacial basin over a 4-week period. Observation of these events enabled robust constraints on the physical properties of the floods, which informs our understanding of the deposits. Using ground-based LiDAR, GPS observations and the satellite-image-derived ArcticDEMs, we quantify the post-depositional response of the 60 m-thick Gígjökull sediment package to the meltout of buried ice and other geomorphic processes. Between 2010 and 2016, total deposit volume decreased by 0.95 × 10⁶ m³ a⁻¹, with significant surface lowering of up to 1.88 m a⁻¹. Surface lowering and volumetric loss of the deposits are attributed to three factors: (i) meltout of ice deposited by the jökulhlaups; (ii) rapid melting of the buried Gígjökull glacier snout; and (iii) incision of the proglacial meltwater system into the jökulhlaup deposits.
While constant change characterises ecology, subtidal ecologists seem set to take a deep dive into the biological processes that accelerate, or compensate for, environmental change. Similar to the technological and collaborative progress that benefited the present generation of authors, continuing progress may help future generations of subtidal ecologists understand why kelp forests are characterised by global mosaics of long-term loss, gain and stasis. Where and how might kelp decline, flourish, or simply persist under future ocean change? Our review takes a biogeographic perspective to synthesise ecological patterns and the processes that create them. On this basis, we consider how oceans undergoing physical and chemical change will modify ecological processes and, as a result, consider the future ecology of kelp forests. We find that future oceans will exceed the environmental tolerances of kelp on many coasts, but not all coasts will become inhospitable to kelp. Consequently, this review provides a signpost for future research into the future decline, persistence, or even expansion of kelp forests.