The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Sea-level science has seen many recent developments in observations and modelling of the different contributions and the total mean sea-level change. In this overview, we discuss (1) the evolution of the Intergovernmental Panel on Climate Change (IPCC) projections, (2) how the projections compare to observations and (3) the outlook for further improving projections. We start by discussing how the model projections of 21st century sea-level change have changed from the IPCC AR5 report (2013) to SROCC (2019) and AR6 (2021), highlighting similarities and differences in the methodologies and comparing the global mean and regional projections. This shows that there is good agreement in the median values, but also highlights some differences. In addition, we discuss how the different reports included high-end projections. We then show how the AR5 projections (from 2007 onwards) compare against the observations and find that they are highly consistent with each other. Finally, we discuss how to further improve sea-level projections using high-resolution ocean modelling and recent vertical land motion estimates.
Personality traits (e.g. neuroticism) and the social environment predict risk for internalizing disorders and suicidal behavior. Studying these characteristics together and prospectively within a population confronted with high stressor exposure (e.g. U.S. Army soldiers) has not been done, yet could uncover unique and interactive predictive effects that may inform prevention and early intervention efforts.
Five broad personality traits and social network size were assessed via self-administered questionnaires among experienced soldiers preparing for deployment (N = 4645) and new soldiers reporting for basic training (N = 6216). Predictive models examined associations of baseline personality and social network variables with recent distress disorders or suicidal behaviors assessed 3- and 9-months post-deployment and approximately 5 years following enlistment.
Among the personality traits, elevated neuroticism was consistently associated with increased mental health risk following deployment. Small social networks were also associated with increased mental health risk following deployment, beyond the variance accounted for by personality. Limited support was found for social network size moderating the association between personality and mental health outcomes. Small social networks also predicted distress disorders and suicidal behavior 5 years following enlistment, whereas unique effects of personality traits on these more distal outcomes were rare.
Heightened neuroticism and small social networks predict a greater risk for negative mental health sequelae, especially following deployment. Social ties may mitigate adverse impacts of personality traits on psychopathology in some contexts. Early identification and targeted intervention for these distinct, modifiable factors may decrease the risk of distress disorders and suicidal behavior.
To describe the epidemiology of patients with nonintestinal carbapenem-resistant Enterobacterales (CRE) colonization and to compare clinical outcomes of these patients to those with CRE infection.
A secondary analysis of Consortium on Resistance Against Carbapenems in Klebsiella and other Enterobacteriaceae 2 (CRACKLE-2), a prospective observational cohort.
A total of 49 US short-term acute-care hospitals.
Patients hospitalized with CRE isolated from clinical cultures from April 30, 2016, through August 31, 2017.
We described characteristics of patients in CRACKLE-2 with nonintestinal CRE colonization and assessed the impact of site of colonization on clinical outcomes. We then compared outcomes of patients defined as having nonintestinal CRE colonization to all those defined as having infection. The primary outcome was a desirability of outcome ranking (DOOR) at 30 days. Secondary outcomes were 30-day mortality and 90-day readmission.
Of 547 patients with nonintestinal CRE colonization, 275 (50%) were from the urinary tract, 201 (37%) were from the respiratory tract, and 71 (13%) were from a wound. Patients with urinary tract colonization were more likely to have a more desirable clinical outcome at 30 days than those with respiratory tract colonization, with a DOOR probability of better outcome of 61% (95% confidence interval [CI], 53%–71%). When compared to 255 patients with CRE infection, patients with CRE colonization had a similar overall clinical outcome, as well as 30-day mortality and 90-day readmission rates when analyzed in aggregate or by culture site. Sensitivity analyses demonstrated similar results using different definitions of infection.
Patients with nonintestinal CRE colonization had outcomes similar to those with CRE infection. Clinical outcomes may be influenced more by culture site than classification as “colonized” or “infected.”
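The DOOR probability of a better outcome reported above is, in essence, the chance that a randomly selected patient from one group has a more desirable ranked outcome than one from the comparison group, with ties counted as half. A minimal sketch of that calculation, using hypothetical ranks rather than the study's data:

```python
from itertools import product

def door_probability(group_a, group_b):
    """Probability that a random patient from group_a has a better
    (lower) DOOR rank than one from group_b; ties count as 0.5."""
    wins = ties = 0
    for a, b in product(group_a, group_b):
        if a < b:
            wins += 1
        elif a == b:
            ties += 1
    return (wins + 0.5 * ties) / (len(group_a) * len(group_b))

# Hypothetical DOOR ranks (1 = best) for two colonization sites
urinary = [1, 1, 2, 3]
respiratory = [2, 3, 3, 4]
print(door_probability(urinary, respiratory))  # → 0.84375
```

A value above 0.5 favors the first group, mirroring the 61% probability of better outcome reported for urinary versus respiratory tract colonization.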
The neural mechanisms contributing to the social problems of pediatric brain tumor survivors (PBTS) are unknown. Face processing is important to social communication, social behavior, and peer acceptance. Research with other populations with social difficulties, namely autism spectrum disorder, suggests atypical brain activation in areas important for face processing. This case-controlled functional magnetic resonance imaging (fMRI) study compared brain activation during face processing in PBTS and typically developing (TD) youth.
Participants included 36 age-, gender-, and IQ-matched youth (N = 18 per group). PBTS were at least 5 years from diagnosis and 2 years from the completion of tumor therapy. fMRI data were acquired during a face identity task and a control condition. Groups were compared on activation magnitude within the fusiform gyrus for the faces condition compared to the control condition. Correlational analyses evaluated associations between neuroimaging metrics and indices of social behavior for PBTS participants.
Both groups demonstrated face-specific activation within the social brain for the faces condition compared to the control condition. PBTS showed significantly decreased activation for faces in the medial portions of the fusiform gyrus bilaterally compared to TD youth, ps ≤ .004. Higher peak activity in the left fusiform gyrus was associated with better socialization (r = .53, p < .05).
This study offers initial evidence of atypical activation in a key face processing area in PBTS. Such atypical activation may underlie some of the social difficulties of PBTS. Social cognitive neuroscience methodologies may elucidate the neurobiological bases for PBTS social behavior.
Electroencephalographic (EEG) abnormalities are greater in mild cognitive impairment (MCI) with Lewy bodies (MCI-LB) than in MCI due to Alzheimer’s disease (MCI-AD) and may anticipate the onset of dementia. We aimed to assess whether quantitative EEG (qEEG) slowing would predict a higher annual hazard of dementia in MCI across these etiologies. MCI patients (n = 92) and healthy comparators (n = 31) provided qEEG recording and underwent longitudinal clinical and cognitive follow-up. Associations between qEEG slowing, measured by increased theta/alpha ratio, and clinical progression from MCI to dementia were estimated with a multistate transition model to account for death as a competing risk, while controlling for age, cognitive function, and etiology classified by an expert consensus panel.
Over a mean follow-up of 1.5 years (SD = 0.5), 14 cases of incident dementia and 5 deaths were observed. Increased theta/alpha ratio on qEEG was associated with increased annual hazard of dementia (hazard ratio = 1.84, 95% CI: 1.01–3.35). This extends previous findings that MCI-LB features early functional changes, showing that qEEG slowing may anticipate the onset of dementia in prospectively identified MCI.
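The theta/alpha ratio used as the qEEG slowing measure is a band-power ratio. A simple periodogram-based sketch, using conventional band edges (4–8 Hz theta, 8–13 Hz alpha; the study's exact bands and spectral estimator may differ) and a synthetic signal:

```python
import numpy as np

def theta_alpha_ratio(signal, fs):
    """Theta (4-8 Hz) to alpha (8-13 Hz) band-power ratio from a raw
    EEG trace, estimated with a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    theta = psd[(freqs >= 4) & (freqs < 8)].sum()
    alpha = psd[(freqs >= 8) & (freqs < 13)].sum()
    return theta / alpha

# Synthetic trace dominated by a 10 Hz alpha rhythm
fs = 256
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 6 * t)
print(theta_alpha_ratio(eeg, fs))  # well below 1: alpha dominates
```

In a slowed EEG, power shifts from the alpha band into theta, pushing this ratio upward, which is the direction associated with increased dementia hazard here.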
The present study reports the validity of multiple assessment methods for tracking changes in body composition over time and quantifies the influence of unstandardised pre-assessment procedures. Resistance-trained males underwent 6 weeks of structured resistance training alongside a hyperenergetic diet, with four total body composition evaluations. Pre-intervention, body composition was estimated in standardised (i.e. overnight fasted and rested) and unstandardised (i.e. no control over pre-assessment activities) conditions within a single day. The same assessments were repeated post-intervention, and body composition changes were estimated from all possible combinations of pre-intervention and post-intervention data. Assessment methods included dual-energy X-ray absorptiometry (DXA), air displacement plethysmography, three-dimensional optical imaging, single- and multi-frequency bioelectrical impedance analysis, bioimpedance spectroscopy and multi-component models. Data were analysed using equivalence testing, Bland–Altman analysis, Friedman tests and validity metrics. Most methods demonstrated meaningful errors when unstandardised conditions were present pre- and/or post-intervention, resulting in blunted or exaggerated changes relative to true body composition changes. However, some methods – particularly DXA and select digital anthropometry techniques – were more robust to a lack of standardisation. In standardised conditions, methods exhibiting the highest overall agreement with the four-component model were other multi-component models, select bioimpedance technologies, DXA and select digital anthropometry techniques. Although specific methods varied, the present study broadly demonstrates the importance of controlling and documenting standardisation procedures prior to body composition assessments across distinct assessment technologies, particularly for longitudinal investigations. 
Additionally, there are meaningful differences in the ability of common methods to track longitudinal body composition changes.
It is uncertain if long-term levels of low-density lipoprotein-cholesterol (LDL-C) affect cognition in middle age. We examined the association of LDL-C levels over 25 years with cognitive function in a prospective cohort of black and white US adults.
Lipids were measured at baseline (1985–1986; age: 18–30 years) and at serial examinations conducted over 25 years. Time-averaged cumulative LDL-C was calculated using the area under the curve for 3,328 participants with ≥3 LDL-C measurements and a cognitive function assessment. Cognitive function was assessed at the Year 25 examination with the Digit Symbol Substitution Test [DSST], Rey Auditory Visual Learning Test [RAVLT], and Stroop Test. A brain magnetic resonance imaging (MRI) sub-study (N = 707) was also completed at Year 25 to assess abnormal white matter tissue volume (AWMV) and gray matter cerebral blood flow volume (GM-CBFV) as secondary outcomes.
Overall, 15.6%, 32.9%, 28.9%, and 22.6% of participants had time-averaged cumulative LDL-C <100 mg/dL, 101–129 mg/dL, 130–159 mg/dL, and ≥160 mg/dL, respectively. Standardized differences in all cognitive function test scores ranged from 0.16 SD lower to 0.09 SD higher across time-averaged LDL-C categories in comparison to those with LDL-C < 100 mg/dL. After covariate adjustment, participants with higher versus lower time-averaged LDL-C had a lower RAVLT score (p-trend = 0.02), but no differences were present for DSST, Stroop Test, AWMV, or GM-CBFV.
Cumulative LDL-C was associated with small differences in memory, as assessed by RAVLT scores, but not other cognitive or brain MRI measures over 25 years of follow-up.
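The time-averaged cumulative LDL-C exposure described above amounts to the trapezoidal area under each participant's LDL-C curve divided by the follow-up time. A minimal sketch with hypothetical exam years and values (not the study's data):

```python
def time_averaged_ldl(years, ldl_values):
    """Time-averaged cumulative LDL-C: trapezoidal area under the
    LDL-C-versus-time curve divided by total follow-up time."""
    auc = 0.0
    for i in range(1, len(years)):
        dt = years[i] - years[i - 1]
        auc += dt * (ldl_values[i] + ldl_values[i - 1]) / 2
    return auc / (years[-1] - years[0])

# Hypothetical exams at years 0, 5, 10, 25; LDL-C in mg/dL
print(time_averaged_ldl([0, 5, 10, 25], [110, 120, 130, 140]))  # → 129.0
```

Averaging the area rather than the raw measurements weights each value by how long it plausibly persisted between exams, which matters when exam intervals are uneven.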
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking using very wide field-of-view imagers that have relatively low astrometric precision on the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to get OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to enable triangulation of observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of OSIRIS-REx trajectory, demonstrating the capability of a networked approach to Space Surveillance and Tracking.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
Over the past 30 years, the number of US doctoral anthropology graduates has increased by about 70%, but there has not been a corresponding increase in the availability of new faculty positions. Consequently, doctoral degree-holding archaeologists face more competition than ever before when applying for faculty positions. Here we examine where US and Canadian anthropological archaeology faculty originate and where they ultimately end up teaching. Using data derived from the 2014–2015 AnthroGuide, we rank doctoral programs whose graduates in archaeology have been most successful in the academic job market; identify long-term and ongoing trends in doctoral programs; and discuss gender division in academic archaeology in the US and Canada. We conclude that success in obtaining a faculty position upon graduation is predicated in large part on where one attends graduate school.
It is increasingly essential for medical researchers to be literate in statistics, but the requisite degree of literacy is not the same for every statistical competency in translational research. Statistical competency can range from ‘fundamental’ (necessary for all) to ‘specialized’ (necessary for only some). In this study, we determine the degree to which each competency is fundamental or specialized.
We surveyed members of 4 professional organizations, targeting doctorally trained biostatisticians and epidemiologists who taught statistics to medical research learners in the past 5 years. Respondents rated 24 educational competencies on a 5-point Likert scale anchored by ‘fundamental’ and ‘specialized.’
There were 112 responses. Nineteen of 24 competencies were fundamental. The competencies considered most fundamental were assessing sources of bias and variation (95%), recognizing one's own limits with regard to statistics (93%), and identifying the strengths and limitations of study designs (93%). The least endorsed items were meta-analysis (34%) and stopping rules (18%).
We have identified the statistical competencies needed by all medical researchers. These competencies should be considered when designing statistical curricula for medical researchers and should inform which topics are taught in graduate programs and evidence-based medicine courses where learners need to read and understand the medical research literature.
Auxinic herbicides, such as 2,4-D and dicamba, that act as plant growth regulators are commonly used for broadleaf weed control in cereal crops (e.g., wheat, barley), grasslands, and noncroplands. If applied at late growth stages, while cereals are developing reproductive parts, the herbicides can reduce seed production. We tested whether growth regulators have this same effect on the invasive annual grass Japanese brome. The herbicides 2,4-D, dicamba, and picloram were applied at typical field use rates to Japanese brome at various growth stages in a greenhouse. Picloram reduced seed production by nearly 100% when applied at the internode elongation, boot, or heading stages of growth, whereas dicamba appeared to be slightly less effective and 2,4-D was much less effective. Our results indicate it may be possible to control Japanese brome by using growth regulator herbicides to reduce its seed production, thereby depleting its short-lived seed bank.
This paper tests whether the most common fossil brachiopod, gastropod, and bivalve genera also have intrinsically more durable shells. Commonness was quantified using occurrence frequency of the 450 most frequently occurring genera of these groups in the Paleobiology Database (PBDB). Durability was scored for each taxon on the basis of shell size, thickness, reinforcement (ribs, folds, spines), mineralogy, and microstructural organic content. Contrary to taphonomic expectation, common genera in the PBDB are as likely to be small, thin-shelled, and unreinforced as large, thick-shelled, ribbed, folded, or spiny. In fact, only six of the 30 tests we performed showed a statistically significant relationship between durability and occurrence frequency, and these six tests were equally divided in supporting or contradicting the taphonomic expectation. Thus, for the most commonly occurring genera in these three important groups, taphonomic effects are either neutral with respect to durability or compensated for by other factors (e.g., less durable taxa were more common in the original communities). These results suggest that biological information is retained in the occurrence frequency patterns of our target groups.
Phanerozoic trends in shell and life habit traits linked to postmortem durability were evaluated for the most common fossil brachiopod, gastropod, and bivalve genera in order to test for changes in taphonomic bias. Using the Paleobiology Database, we tabulated occurrence frequencies of genera for 48 intervals of ∼11 Myr duration. The most frequently occurring genera, cumulatively representing 40% of occurrences in each time bin, were scored for intrinsic durability on the basis of shell size, reinforcement (ribs, folds, and spines), life habit, and mineralogy.
Shell durability is positively correlated with the number of genera in a time bin, but durability traits exhibit different temporal patterns across higher taxa, with notable offsets in the timing of changes in these traits. We find no evidence for temporal decreases in durability that would indicate taphonomic bias at the Phanerozoic scale among commonly occurring genera. Also, all three groups show a remarkable stability in mean shell size through the Phanerozoic, an unlikely pattern if strong size-filtering taphonomic megabiases were affecting the fossil record of shelly faunas. Moreover, small shell sizes are attained in the early Paleozoic in brachiopods and in the latest Paleozoic in gastropods but are steady in bivalves; unreinforced shells are common to all groups across the entire Phanerozoic; organophosphatic and aragonitic shells dominate only the oldest and youngest time bins; and microstructures having high organic content are most common in the oldest time bins.
In most cases, the timing of changes in durability-related traits is inconsistent with a late Mesozoic Marine Revolution. The post-Paleozoic increase in mean gastropod reinforcement occurs in the early Triassic, suggesting either an earlier appearance and expansion of durophagous predators or other drivers. Increases in shell durability hypothesized to be the result of increased predation in the late Mesozoic are not evident in the common genera examined here. Infaunal life habit does increase in the late Mesozoic, but it does not become more common than levels already attained during the Paleozoic, and only among bivalves does the elevated late Mesozoic level persist through the Holocene.
These temporal patterns suggest control on the occurrence of durability-related traits by individual evolutionary histories rather than taphonomic megabiases. Our findings do not mean taphonomic biases are absent from the fossil record, but rather that their effects apparently have had little net effect on the relative occurrence of shell traits generally thought to confer higher preservation potential over long time scales.
Ronald Mason’s hypothesis from the 1960s that the southeastern United States possesses greater Paleoindian projectile-point diversity than other regions is regularly cited, and often assumed to be true, but in fact has never been quantitatively tested. Even if valid, however, the evolutionary meaning of this diversity is contested. Point diversity is often linked to Clovis “origins,” but point diversity could also arise from group fissioning and drift, admixture, adaptation, or multiple founding events, among other possibilities. Before archaeologists can even begin to discuss these scenarios, it is paramount to ensure that what we think we know is representative of reality. To this end, we tested Mason’s hypothesis for the first time, using a sample of 1,056 Paleoindian points from eastern North America and employing paradigmatic classification and rigorous statistical tools used in the quantification of ecological biodiversity. Our first set of analyses, which compared the Southeast to the Northeast, showed that the Southeast did indeed possess significantly greater point-class richness. Although this result was consistent with Mason’s hypothesis, our second set of analyses, which compared the Upper Southeast to the Lower Southeast and the Northeast, showed that in terms of point-class richness the Upper Southeast > Lower Southeast > Northeast. Given current chronometric evidence, we suggest that this latter result is consistent with the suggestion that the area of the Ohio, Cumberland, and Tennessee River valleys, as well as the mid-Atlantic coastal plain, were possible initial and secondary “staging areas” for colonizing Paleoindian foragers moving from western to eastern North America.
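Comparing point-class richness between regions with unequal sample sizes is typically done by rarefaction, one of the ecological biodiversity tools alluded to above: repeatedly subsample each region's assemblage to a common size and compare the mean number of distinct classes recovered. A minimal sketch with illustrative class counts (not the study's data):

```python
import random

def rarefied_richness(classes, n, trials=1000, seed=1):
    """Mean number of distinct classes in random subsamples of size n,
    allowing richness comparisons at a common sampling effort."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += len(set(rng.sample(classes, n)))
    return total / trials

# Illustrative assemblages: class labels repeated per specimen count
southeast = ["A"] * 30 + ["B"] * 20 + ["C"] * 10 + ["D"] * 5 + ["E"] * 2
northeast = ["A"] * 40 + ["B"] * 25 + ["C"] * 2
print(rarefied_richness(southeast, 20))
print(rarefied_richness(northeast, 20))
```

Because raw richness counts rise with sample size, rarefying to a shared subsample size is what makes a Southeast-versus-Northeast richness comparison fair.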