Spatially and temporally unpredictable rainfall patterns presented food production challenges to small-scale agricultural communities, requiring multiple risk-mitigating strategies to increase food security. Although site-based investigations of the relationship between climate and agricultural production offer insights into how individual communities may have created long-term adaptations to manage risk, the inherent spatial variability of climate-driven risk makes a landscape-scale perspective valuable. In this article, we model risk by evaluating how the spatial structure of ancient climate conditions may have affected the reliability of three major strategies used to reduce risk: drawing upon social networks in time of need, hunting and gathering of wild resources, and storing surplus food. We then explore how climate-driven changes to this reliability may relate to archaeologically observed social transformations. We demonstrate the utility of this methodology by comparing the Salinas and Cibola regions in the prehispanic U.S. Southwest to understand the complex relationship among climate-driven threats to food security, risk-mitigation strategies, and social transformations. Our results suggest key differences in how communities buffered against risk in the Cibola and Salinas study regions, with the structure of precipitation influencing the range of strategies to which communities had access through time.
To identify predictors of remission with placebo treatment in double-blind randomized controlled trials (RCTs) in major depressive disorder (MDD) based on baseline characteristics.
989 placebo-treated MDD subjects with baseline Hamilton Depression Rating Scale total score (HAMDT17) ≥15 who completed 7-8 weeks of treatment from 8 duloxetine RCTs with placebo lead-in were included. Remission was defined as HAMDT17 endpoint score ≤7. Stepwise logistic regression and classification and regression tree (CART) methods were used to identify predictors of remission. Data were randomly split into training data (N=791, 80%) for model selection and test data (N=198, 20%) for validation. Predictive quality of models was assessed by ROC curves.
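The split-and-validate workflow described here (80/20 split, logistic regression on baseline predictors, ROC evaluation on held-out data) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's code; the predictor ranges and the synthetic outcome model are assumptions.

```python
# Sketch of the study's validation workflow: random 80/20 split,
# logistic regression, and held-out ROC AUC. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 989  # number of placebo-treated subjects in the abstract
X = np.column_stack([
    rng.integers(15, 35, n),   # baseline HAMD-17 total score (>=15 per inclusion)
    rng.integers(18, 75, n),   # age in years
    rng.integers(0, 5, n),     # HAMA item 14 (behavior at interview)
    rng.integers(2, 52, n),    # length of current episode, weeks
])
# Synthetic remission outcome, loosely tied to baseline severity for
# demonstration only; the true relationship is what the study estimated.
p = 1.0 / (1.0 + np.exp(0.15 * (X[:, 0] - 22)))
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"held-out AUC: {auc:.2f}")
```

Note that with n = 989 and test_size=0.2, the split reproduces the abstract's N = 791 training / N = 198 test partition.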
In the logistic regression analysis, out of >50 potential pre-treatment predictors, the HAMDT17 score, age, Hamilton Anxiety Scale item 14 (HAMA14: behavior at interview) and length of current MDD episode (length) were found to be most predictive of remission on placebo. These variables were also identified as top predictors by the CART method, which identified 2 subgroups: “HAMA14 = 0 OR (HAMDT17 < 22 AND length < 18 weeks)” (45% remitted) and “HAMA14 > 0 AND (HAMDT17 ≥ 22 OR length ≥ 18 weeks)” (21% remitted). However, the predictive power was weak for both methods, with areas under the ROC curve (test data) of 67% and 56%, respectively.
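The two CART subgroups amount to a single decision rule (the second group is the logical complement of the first), which can be encoded directly. The thresholds come from the abstract; the function name and group labels are illustrative choices of ours.

```python
# Encoding of the CART-derived subgroups as a decision rule.
# Thresholds are as reported; labels are illustrative.
def cart_subgroup(hama14: int, hamd17: int, length_weeks: float) -> str:
    """Return the CART subgroup for a placebo-treated subject.

    "higher-remission": HAMA14 = 0 OR (HAMD-17 < 22 AND episode < 18 weeks)
                        (45% remitted in the study)
    "lower-remission":  the complement, HAMA14 > 0 AND (HAMD-17 >= 22 OR
                        episode >= 18 weeks) (21% remitted)
    """
    if hama14 == 0 or (hamd17 < 22 and length_weeks < 18):
        return "higher-remission"
    return "lower-remission"
```
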
Baseline risk factors for non-remission after 7-8 weeks of placebo treatment were older age, more severe depressive symptoms, apparent anxiety, and a longer duration of current MDD episode. These results are consistent with previous findings and may aid in the design of RCTs.
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predicts poor outcomes.
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or a SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
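The two criteria versions, including the "unclassifiable" case that arises when the required laboratory values are missing, might be encoded as below. This is a simplified sketch, not the study's algorithm: the SCr thresholds are from the abstract, but the numeric leukocytosis cutoff is an assumption (the abstract says only "leukocytosis").

```python
# Simplified encoding of the 2010 vs. 2018 SHEA/IDSA CDI severity criteria.
# The WBC cutoff is assumed; the abstract states only "leukocytosis".
from typing import Optional

LEUKOCYTOSIS_WBC = 15.0  # x10^9 cells/L (assumed cutoff)

def severe_cdi(wbc: Optional[float], scr: Optional[float],
               baseline_scr: Optional[float], version: int) -> Optional[bool]:
    """Classify a CDI episode as severe (True), non-severe (False), or
    unclassifiable (None) when required labs are missing."""
    leuko = None if wbc is None else wbc >= LEUKOCYTOSIS_WBC
    if version == 2010:
        # 2010: renal criterion needs a baseline SCr for comparison.
        renal = (None if scr is None or baseline_scr is None
                 else scr >= 1.5 * baseline_scr)
    elif version == 2018:
        # 2018: absolute SCr threshold, no baseline required.
        renal = None if scr is None else scr >= 1.5  # mg/dL
    else:
        raise ValueError("version must be 2010 or 2018")
    if leuko or renal:
        return True          # either criterion met -> severe
    if leuko is None or renal is None:
        return None          # missing labs -> cannot rule severity out
    return False
```

The 2010 version's dependence on a baseline SCr is one reason a larger share of subacute-care episodes were unclassifiable under it.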
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
Cover crops (CCs) play an important role in integrated weed management. Data necessary to evaluate the role of CCs in weed management at the watershed scale with topographic positions are lacking. We evaluated the effects of cereal rye and hairy vetch CCs on weed suppression at different topographic positions (shoulder, backslope, and footslope) at a watershed scale. Watersheds with a CC treatment followed a crop rotation of corn–cereal rye–soybean–hairy vetch, whereas watersheds without a CC (no-CC) had a crop rotation of corn–winter fallow–soybean–winter fallow. A negative relationship was present between CCs and weed biomass at the shoulder, backslope, and footslope topographic landscape positions, with R2 values of 0.40, 0.48, and 0.50, respectively. In 2016, a cereal rye CC reduced weed biomass 46% to 50% at footslope and shoulder positions compared to no CC. In 2018, a cereal rye CC reduced weed biomass between 52% and 85% at all topographic positions in CC treatment watersheds compared to no-CC watersheds. Hairy vetch in 2017 reduced weed biomass 62% to 72% at footslope and shoulder topographic positions in CC watersheds compared to no-CC. The C:N ratio of weed biomass in CC treatment watersheds was generally higher compared to watersheds without CCs. In this study, several significant interactions were found between the topographic positions and CC treatments. Cover crop–induced weed suppression at different topographic positions can lead to developing better site-specific weed control strategies. Therefore, CC interactions with topography, weed germination potential, and the role of soil moisture at the watershed scale should be further evaluated.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
Laser-based compact MeV X-ray sources are useful for a variety of applications such as radiography and active interrogation of nuclear materials. MeV X rays are typically generated by impinging an intense laser onto a ~mm-thick high-Z foil. Here, we have characterized such a MeV X-ray source from 120 TW (80 J, 650 fs) laser interaction with a 1 mm-thick tantalum foil. Our measurements show an X-ray temperature of 2.5 MeV, a flux of 3 × 10¹² photons/sr/shot, a beam divergence of ~0.1 sr, a conversion efficiency of ~1% (that is, ~1 J of MeV X rays out of 80 J incident laser), and a source size of 80 μm. Our measurements also show that MeV X-ray yield and temperature are largely insensitive to nanosecond laser contrasts up to 10⁻⁵. Also, preliminary measurements of a similar MeV X-ray source using a double-foil scheme, where laser-driven hot electrons from a thin foil undergoing relativistic transparency impinge onto a second high-Z converter foil separated by 50–400 μm, show a MeV X-ray yield more than an order of magnitude lower compared with the single-foil results.
To evaluate whole-genome sequencing (WGS) as a molecular typing tool for MRSA outbreak investigation.
Investigation of MRSA colonization/infection in a neonatal intensive care unit (NICU) over 3 years (2014–2017).
Single-center level IV NICU.
NICU infants and healthcare workers (HCWs).
Infants were screened for MRSA using a swab of the anterior nares, axilla, and groin, initially by targeted (ring) screening, and later by universal weekly screening. Clinical cultures were collected as indicated. HCWs were screened once using swabs of the anterior nares. MRSA isolates were typed using WGS with core-genome multilocus sequence typing (cgMLST) analysis and by pulsed-field gel electrophoresis (PFGE). Colonized and infected infants and HCWs were decolonized. Control strategies included reinforcement of hand hygiene, use of contact precautions, cohorting, enhanced environmental cleaning, and remodeling of the NICU.
We identified 64 MRSA-positive infants: 53 (83%) by screening and 11 (17%) by clinical cultures. Of 85 screened HCWs, 5 (6%) were MRSA positive. WGS of MRSA isolates identified 2 large clusters (WGS groups 1 and 2), 1 small cluster (WGS group 3), and 8 unrelated isolates. PFGE failed to distinguish WGS group 2 and 3 isolates. WGS groups 1 and 2 were codistributed over time. HCW MRSA isolates were primarily in WGS group 1. New infant MRSA cases declined after implementation of the control interventions.
We identified 2 contemporaneous MRSA outbreaks alongside sporadic cases in a NICU. WGS was used to determine strain relatedness at a higher resolution than PFGE and was useful in guiding efforts to control MRSA transmission.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
While our fascination with understanding the past is sufficient to warrant an increased focus on synthesis, solutions to important problems facing modern society require understandings based on data that only archaeology can provide. Yet, even as we use public monies to collect ever-greater amounts of data, modes of research that can stimulate emergent understandings of human behavior have lagged behind. Consequently, a substantial amount of archaeological inference remains at the level of the individual project. We can more effectively leverage these data and advance our understandings of the past in ways that contribute to solutions to contemporary problems if we adapt the model pioneered by the National Center for Ecological Analysis and Synthesis to foster synthetic collaborative research in archaeology. We propose the creation of the Coalition for Archaeological Synthesis coordinated through a U.S.-based National Center for Archaeological Synthesis. The coalition will be composed of established public and private organizations that provide essential scholarly, cultural heritage, computational, educational, and public engagement infrastructure. The center would seek and administer funding to support collaborative analysis and synthesis projects executed through coalition partners. This innovative structure will enable the discipline to address key challenges facing society through evidentially based, collaborative synthetic research.
The present study examined the impacts of training on nutrition, hygiene and food safety designed by the Nutrition Working Group, Child Survival Collaborations and Resources Group (CORE).
Adapted from the 21d Positive Deviance/Hearth model, mothers were trained on the subjects of appropriate complementary feeding, water, sanitation and hygiene (WASH) practices, and aflatoxin contamination in food. To assess the impacts on child undernutrition, a randomised controlled trial was implemented on a sample of 179 mothers and their children (<2 years old) in two districts of Malawi, namely Mzimba and Balaka.
A 21d intensive learning-by-doing process using the positive deviance approach.
Malawian children and mothers.
Difference-in-difference panel regression analysis revealed that the impacts of the comprehensive training on the Z-scores for wasting and underweight were positive and statistically significant, with the effects increasing steadily over the 21d time frame. As for stunting, the coefficients were not statistically significant during the 21d programme, although the level of significance began to increase after 2 weeks, suggesting that stunting could also be alleviated over a slightly longer time horizon.
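The difference-in-difference logic underlying this analysis can be illustrated on synthetic data: compare the before–after change in a Z-score for trained mothers against the same change for controls. Variable names and the assumed effect size are ours, not the study's.

```python
# Difference-in-difference sketch on synthetic panel data (2 periods).
# "waz" stands in for a weight-for-age Z-score; all values are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 179  # sample size of mother-child pairs in the abstract
df = pd.DataFrame({
    "child":   np.repeat(np.arange(n), 2),
    "treated": np.repeat(rng.integers(0, 2, n), 2),  # training group flag
    "post":    np.tile([0, 1], n),                   # before/after training
})
# Simulated outcome with an assumed +0.3 Z-score effect of training
df["waz"] = (-1.0 + 0.3 * df["treated"] * df["post"]
             + rng.normal(0.0, 0.5, 2 * n))

# DiD estimate: (treated after - treated before) - (control after - control before)
means = df.groupby(["treated", "post"])["waz"].mean()
did = (means[1, 1] - means[1, 0]) - (means[0, 1] - means[0, 0])
print(f"difference-in-difference estimate: {did:.2f}")
```

In a regression framing, the same estimate is the coefficient on the treated-by-post interaction term, which is how the panel version in the study generalizes to multiple time points.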
The study clearly suggests that comprehensive training immediately guides mothers into improved dietary and hygiene practices, and that improved practices take immediate and progressive effects in ameliorating children’s undernutrition.
Although dendrochronological methods have the potential to provide precise calendar dates, they are virtually absent in Mesoamerican archaeological research. This absence is due to several long-standing, but erroneous, assumptions: that tree rings in this region do not reflect annual growth and environmental variability, that an adequate number of samples do not exist, and that tree-ring measurements cannot be useful without modern trees to link prehispanic chronologies. In this article we present data from the sites of La Quemada and Los Pilarillos, located in the Malpaso Valley, Zacatecas, to demonstrate that suitable archaeologically derived samples of dendrochronologically useful species do exist, that the samples from these sites are measurable and cross-datable, and that the tree rings can yield precise calendar dates using a method that “wiggle-matches” radiocarbon dates on tree-ring sequences. The work demonstrates the potential of these methods to address chronological, and, in the future, climatic questions, which have so far eluded archaeological work in the region.
Habitat preferences and response to habitat conversion remain under-studied for many groups in the tropics, limiting our understanding of how environmental and anthropogenic factors may interact to shape patterns of diversity. To help fill this knowledge gap, we surveyed nocturnal birds such as owls, nightjars and potoos through auditory transect surveys in 22 forest fragments (2.7 to 33.6 ha) in north-west Ecuador. We assessed the relative effect of habitat characteristics (e.g. canopy height and openness, and density of large trees) and fragment attributes (e.g. area, altitude and proportion of surrounding forest cover) on species richness and community composition. Based on our previous work, we predicted that nocturnal bird richness would be highest in relatively larger fragments with more surrounding forest cover. We recorded 11 total species with an average ± SD of 3.4 ± 1.4 (range = 2–7) species per fragment, with higher richness in fragments that were larger, at lower altitudes, and characterized by more open canopies. Nocturnal bird community similarity was not significantly correlated with any measured environmental variable. These results indicate that both landscape (e.g. altitude) and fragment-specific (e.g. size, forest structure) attributes are likely to interact to shape patterns of diversity among this poorly known but ecologically important guild in fragmented tropical landscapes.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Reports in the literature of treatment with recombinant tissue plasminogen activator following cardiac surgery are limited. We reviewed our experience to provide a case series of the therapeutic use of tissue plasminogen activator for the treatment of venous thrombosis in children after cardiac surgery. The data describe the morbidity, mortality, and clinical outcomes of tissue plasminogen activator administration for treatment of venous thrombosis in children following cardiac surgery.
The study was designed as a retrospective case series.
The study was carried out in a 25-bed cardiac intensive care unit in an academic, free-standing paediatric hospital.
All children who received tissue plasminogen activator for venous thrombosis within 60 days of cardiac surgery, a total of 13 patients, were included.
Data were collected, collated, and analysed as part of this study.
Measurements and main results
Patients treated with tissue plasminogen activator were principally young infants (median 0.2 years, IQR 0.07–0.58) who had recently (median 22 days, IQR 12.5–27.3) undergone cardiac surgery. Hospital mortality was high in this patient group (38%), but no deaths occurring within 72 hours of administration were attributable to tissue plasminogen activator. There was one major haemorrhagic complication that may be attributable to tissue plasminogen activator. Complete or partial resolution of venous thrombosis was confirmed by imaging in 10 of 13 patients (77%), and tissue plasminogen activator administration was associated with resolution of chylous drainage, with no drainage through chest tubes at 10 days after treatment, in seven of nine patients who had upper-compartment venous thrombosis-associated chylothorax.
On the basis of our experience with administration of tissue plasminogen activator in children after cardiac surgery, tissue plasminogen activator is both safe and effective for resolution of venous thrombosis in this high-risk population.
In recent years, mass-casualty incidents (MCIs) have become more frequent and deadly, while emergency department (ED) crowding has grown steadily worse and more widespread. The ability of hospitals to implement an effective mass-casualty surge plan, immediately and expertly, has therefore never been more important. Yet, mass-casualty exercises tend to be highly choreographed, pre-scheduled events that provide limited insight into hospitals’ true capacity to respond to a no-notice event under real-world conditions. To address this gap, the US Department of Health and Human Services (Washington, DC USA), Office of the Assistant Secretary for Preparedness and Response (ASPR), sponsored development of a set of tools meant to allow any hospital to run a real-time, no-notice exercise, focusing on the first hour and 15 minutes of a hospital’s response to a sudden MCI, with the goals of minimizing burden, maximizing realism, and providing meaningful, outcome-oriented metrics to facilitate self-assessment. The resulting exercise, which was iteratively developed, piloted at nine hospitals nationwide, and completed in 2015, is now freely available for anyone to use or adapt. This report demonstrates the feasibility of implementing a no-notice exercise in the hospital setting and describes insights gained during the development process that might be helpful to future exercise developers. It also introduces the use of ED “immediate bed availability (IBA)” as an objective, dynamic measure of an ED’s physical capacity for new arrivals.
Waxman DA, Chan EW, Pillemer F, Smith TWJ, Abir M, Nelson C. Assessing and Improving Hospital Mass-Casualty Preparedness: A No-Notice Exercise. Prehosp Disaster Med. 2017;32(6):662–666.