Behavioral features of binge eating disorder (BED) suggest abnormalities in reward processing and inhibitory control, and studies of adult populations report functional abnormalities in the corresponding brain networks. Although behavioral markers often develop in childhood, the neurobiology of pediatric BED remains unstudied.
Data from 58 pre-adolescent children (aged 9–10 years) with BED (mBMI = 25.05; s.d. = 5.40) and 66 age-, BMI- and developmentally matched control children (mBMI = 25.78; s.d. = 0.33) were extracted from the 3.0 baseline (Year 0) release of the Adolescent Brain Cognitive Development (ABCD) Study. We investigated group differences in resting-state functional MRI functional connectivity (FC) within and between reward and inhibitory control networks. A seed-based approach was employed to assess nodes in the reward [orbitofrontal cortex (OFC), nucleus accumbens, amygdala] and inhibitory control [dorsolateral prefrontal cortex (dlPFC), anterior cingulate cortex (ACC)] networks via hypothesis-driven seed-to-seed analyses and secondary seed-to-voxel analyses.
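The seed-to-seed FC analysis described above reduces, at the participant level, to correlating mean seed time series and Fisher-transforming the result for group statistics. A minimal sketch with simulated time series (the region names and 300-timepoint length are illustrative assumptions, not taken from the study's processing pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean BOLD time series (300 timepoints) for two seed regions
# of one participant; real data would come from the preprocessed ABCD scans.
ts_dlpfc = rng.standard_normal(300)
ts_amygdala = rng.standard_normal(300)

def seed_to_seed_fc(ts_a, ts_b):
    """Seed-to-seed functional connectivity: Pearson correlation of two
    seed time series, Fisher r-to-z transformed for group-level statistics."""
    r = np.corrcoef(ts_a, ts_b)[0, 1]
    return np.arctanh(r)

z = seed_to_seed_fc(ts_dlpfc, ts_amygdala)  # one FC value per participant/pair
```

Group differences (BED vs. controls) would then be tested on these z-values, one per participant and seed pair.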
Findings revealed reduced FC between the dlPFC and amygdala, and between the ACC and OFC, in pre-adolescent children with BED relative to controls. These findings, indicating aberrant connectivity between nodes of the inhibitory control and reward networks, were corroborated by the whole-brain FC analyses.
Early-onset BED may be characterized by diffuse abnormalities in the functional synergy between reward and cognitive control networks, without perturbations within either network itself. The decreased capacity to regulate a reward-driven pursuit of hedonic foods, which is characteristic of BED, may rest in part on this dysconnectivity between reward and inhibitory control networks.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping both early-season management and HWSC.
Otitis media (OM) is a common reason for children to be prescribed antibiotics and undergo surgery but a thorough understanding of disease mechanisms is lacking. We evaluate the evidence of a dysregulated immune response in the pathogenesis of OM.
A comprehensive systematic review of the literature was conducted using the search terms [otitis media OR glue ear OR AOM OR OME] OR [middle ear AND (infection OR inflammation)], run through Medline and Embase via Ovid and including both human and animal studies. In total, 82 955 studies underwent automated filtering followed by manual screening. One hundred studies were included in the review.
Most studies were based on in vitro or animal work. Abnormalities in pathogen detection pathways, such as Toll-like receptors, have confirmed roles in OM. The aetiology of OM, its chronic subgroups (chronic OM, persistent OM with effusion) and recurrent acute OM is complex; however, inflammatory signalling mechanisms are frequently implicated. Host epithelium likely plays a crucial role, but the characterisation of human middle ear tissue lags behind that of other anatomical subsites.
Translational research for OM presently falls far behind its clinical importance. This has likely hindered the development of new diagnostic and treatment modalities. Further work is urgently required, particularly to disentangle the respective immune pathologies in the clinically observed phenotypes and thereby work towards more personalised treatments.
To study the airflow, transmission, and clearance of aerosols in the clinical spaces of a hospital ward that had been used to care for patients with coronavirus disease 2019 (COVID-19) and to examine the impact of portable air cleaners on aerosol clearance.
A single ward of a tertiary-care public hospital in Melbourne, Australia.
Glycerin-based aerosol was used as a surrogate for respiratory aerosols. The transmission of aerosols from a single patient room into corridors and a nurses’ station in the ward was measured. The rate of clearance of aerosols was measured over time from the patient room, nurses’ station and ward corridors with and without air cleaners [ie, portable high-efficiency particulate air (HEPA) filters].
Aerosols rapidly travelled from the patient room into other parts of the ward. Air cleaners were effective in increasing the clearance of aerosols from the air in clinical spaces and reducing their spread to other areas. With 2 small domestic air cleaners in a single patient room of a hospital ward, 99% of aerosols could be cleared within 5.5 minutes.
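The reported "99% cleared within 5.5 minutes" maps onto a first-order (well-mixed exponential) decay model, which is the usual way such clearance figures are converted to equivalent air changes per hour. The exponential-decay assumption is mine, not stated in the abstract:

```python
import math

def clearance_rate(fraction_cleared, minutes):
    """First-order clearance rate constant (per minute), assuming the
    aerosol concentration decays exponentially: C(t) = C0 * exp(-k*t)."""
    return -math.log(1.0 - fraction_cleared) / minutes

def equivalent_ach(fraction_cleared, minutes):
    """Equivalent air changes per hour implied by the observed clearance."""
    return clearance_rate(fraction_cleared, minutes) * 60.0

# Reported result: 99% of aerosols cleared within 5.5 minutes
k = clearance_rate(0.99, 5.5)    # ~0.84 per minute
ach = equivalent_ach(0.99, 5.5)  # ~50 equivalent air changes per hour
```

Under this model, the two domestic air cleaners delivered roughly 50 equivalent air changes per hour in the single patient room.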
Air cleaners may be useful in clinical spaces to help reduce the risk of acquisition of respiratory viruses that are transmitted via aerosols. They are easy to deploy and are likely to be cost-effective in a variety of healthcare settings.
Online grocery shopping could improve access to healthy food, but it may not be equally accessible to all populations – especially those at higher risk for food insecurity. The current study aimed to compare the socio-demographic characteristics of families who ordered groceries online v. those who only shopped in-store.
We analysed enrollment survey data and 44 weeks of individually linked grocery transaction data. We used univariate χ2 and t-tests and logistic regression to assess differences in socio-demographic characteristics between households that only shopped in-store and those that shopped online with curbside pickup (online only or online and in-store).
Two Maine supermarkets.
863 parents or caregivers of children under 18 years old enrolled in two fruit and vegetable incentive trials.
Participants had a total of 32 757 transactions. In univariate assessments, online shoppers had higher incomes (P < 0·0001), were less likely to participate in the Special Supplemental Nutrition Program for Women, Infants, and Children or the Supplemental Nutrition Assistance Program (SNAP; P < 0·0001) and were more likely to be female (P = 0·04). Most online shoppers were 30–39 years old, and few were 50 years or older (P = 0·003). After controlling for age, gender, race/ethnicity, number of children, number of adults, income and SNAP participation, female primary shoppers (OR = 2·75, P = 0·003), number of children (OR = 1·27, P = 0·04) and income (OR = 3·91 for 186–300 % federal poverty line (FPL) and OR = 6·92 for >300 % FPL, P < 0·0001) were significantly associated with likelihood of shopping online.
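The odds ratios above come from an adjusted logistic regression; the underlying univariate logic is the 2×2-table odds ratio. A minimal sketch with hypothetical counts (the numbers below are illustrative only, not the study's data):

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Approximate 95% CI for the OR via the log-odds standard error."""
    lo = log(odds_ratio(a, b, c, d))
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return exp(lo - 1.96 * se), exp(lo + 1.96 * se)

# Hypothetical counts: higher-income households shopping online vs in-store only
est = odds_ratio(40, 60, 30, 120)       # (40*120)/(60*30) ~ 2.67
ci_low, ci_high = or_ci95(40, 60, 30, 120)
```

The adjusted model additionally conditions on age, gender, race/ethnicity, household composition, income and SNAP participation, which is why the reported ORs differ from any single unadjusted table.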
In the current study of Maine families, low-income shoppers were significantly less likely to utilise online grocery ordering with curbside pickup. Future studies could focus on elucidating barriers and developing strategies to improve access.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: The epidemic NAP1/027 Clostridioides difficile strain (MLST1, ST1) that emerged in the mid-2000s is on the decline. The current distribution of C. difficile strain types and their transmission dynamics are poorly defined. We performed whole-genome sequencing (WGS) of C. difficile isolates in 2 regions to identify the predominant multilocus sequence types (MLSTs) in community- and healthcare-associated cases and potential transmission between cases using whole-genome single-nucleotide polymorphism (SNP) analysis. Methods: Isolates were collected through the CDC Emerging Infections Program population-based surveillance for C. difficile infections (CDI) for 3 months between 2016 and 2017 in 5 Minnesota counties and 1 New York county. Isolates were limited to incident cases (CDI in a county resident with no positive C. difficile test in the preceding 8 weeks). Cases were classified as healthcare associated (HA-CDI) or community associated (CA-CDI) based on healthcare exposures as previously described. WGS was performed on an Illumina MiSeq. The CFSAN (FDA) pipeline was used to compute whole-genome SNPs, SPAdes was used for assembly, and MLST was assigned according to www.pubmlst.org. Results: Of 431 isolates, 269 originated from New York and 162 from Minnesota; 203 cases were classified as CA-CDI and 221 as HA-CDI. The proportion of CA-CDI cases was higher in Minnesota than in New York: 62% vs 38%. The predominant MLSTs across both sites were ST42 (9%), ST8 (8%), and ST2 (8%). MLSTs more frequently encountered in HA-CDI than CA-CDI included ST1 (note that this ST includes PCR Ribotype 027; 76% HA-CDI), ST53 (84% HA-CDI), and ST43 (80% HA-CDI). In contrast, ST110 (63% CA-CDI) and ST3 (67% CA-CDI) were more commonly isolated from CA-CDI cases. ST1 accounted for 7.6% of circulating strains, was more common in New York than Minnesota (10% vs 3%), and was concentrated among New York HA-CDI cases.
In total, 412 isolates (1 per patient) were included in the final whole-genome SNP analysis. Of these, only 12 pairs were separated by 0–3 SNPs, indicating potential transmission, and most involved HA-CDI cases. ST1, ST17, and ST46 accounted for 8 of 12 pairs, with ST17 and ST46 potentially forming small clusters. Conclusions: This analysis provides a snapshot of the current genomic epidemiology of C. difficile across 2 geographically and epidemiologically distinct regions of the United States and supports other studies suggesting that the role of direct transmission in the spread of CDI may be limited.
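The "0–3 SNPs indicates potential transmission" criterion is a simple threshold over pairwise whole-genome SNP distances (which the study computed with the CFSAN pipeline). A sketch of that thresholding step, with hypothetical isolate names and distances:

```python
# Hypothetical pairwise whole-genome SNP distances between isolates;
# in the study these came from the CFSAN (FDA) SNP pipeline.
snp_dist = {
    ("iso1", "iso2"): 2,    # very close: candidate transmission pair
    ("iso1", "iso3"): 45,
    ("iso2", "iso3"): 47,
    ("iso4", "iso5"): 0,    # identical: candidate transmission pair
}

def candidate_transmission_pairs(distances, threshold=3):
    """Flag isolate pairs within `threshold` SNPs of each other as
    potential recent transmission (0-3 SNPs in this study)."""
    return sorted(pair for pair, d in distances.items() if d <= threshold)

pairs = candidate_transmission_pairs(snp_dist)
```

In practice such pairs are then reviewed against epidemiological data (shared facilities, overlapping admission dates) before being called transmission events.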
The popular approach of assuming a control policy and then finding the largest region of attraction (ROA) (e.g., sum-of-squares optimization) may lead to conservative estimates of the ROA, especially for highly nonlinear systems. We present a sampling-based approach that starts by assuming an ROA and then finds the necessary control policy by performing trajectory optimization on sampled initial conditions. Our method works with black-box models, produces a relatively large ROA, and ensures exponential convergence of the initial conditions to the periodic motion. We demonstrate the approach on a model of hopping and include extensive verification and robustness checks.
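The core idea of the sampling-based approach is to fix a candidate ROA, draw initial conditions from it, and check that every resulting trajectory contracts toward the target motion at a geometric rate. A minimal sketch under assumptions of mine (a generic black-box `step` function, distance-to-orbit contraction as the convergence test; the real method uses trajectory optimization to find the controls):

```python
import numpy as np

rng = np.random.default_rng(2)

def rollout(x0, policy, step, n_steps=200):
    """Black-box simulation: apply the policy at each state and record
    the resulting state trajectory."""
    traj = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        traj.append(step(traj[-1], policy(traj[-1])))
    return np.array(traj)

def verify_roa_by_sampling(sample_x0, policy, step, distance_to_orbit,
                           n_samples=100, rate=0.95):
    """Sample initial conditions from an assumed ROA and check that the
    distance to the target (periodic) orbit contracts exponentially,
    i.e. d(x_{k+1}) <= rate * d(x_k) along every sampled trajectory."""
    for _ in range(n_samples):
        traj = rollout(sample_x0(), policy, step)
        d = np.array([distance_to_orbit(x) for x in traj])
        if not np.all(d[1:] <= rate * d[:-1] + 1e-9):
            return False
    return True

# Toy check: a stable linear map with the origin as the "orbit";
# every sample in the box [-1, 1]^2 should contract at rate 0.5 < 0.95.
ok = verify_roa_by_sampling(
    sample_x0=lambda: rng.uniform(-1.0, 1.0, 2),
    policy=lambda x: 0.0 * x,            # trivial policy for the toy system
    step=lambda x, u: 0.5 * x + u,       # black-box dynamics stand-in
    distance_to_orbit=np.linalg.norm,
)
```

For the hopping model in the paper, `step` would be the black-box simulator and the controls along each trajectory would come from per-sample trajectory optimization rather than a fixed policy.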
Procedural sedation and analgesia (PSA) is a core competency for emergency physicians (EP) that is commonly practiced.1–4 PSA entails suppressing a patient’s level of consciousness with sedative or dissociative agents to alleviate pain, anxiety, and suffering to enhance medical procedure performance and patient experience (Table 22.1).1,5
Radio observations allow us to identify a wide range of active galactic nuclei (AGN), which play a significant role in the evolution of galaxies. Amongst AGN at low radio-luminosities is the ‘radio-quiet’ quasar (RQQ) population, but how they contribute to the total radio emission is under debate, with previous studies arguing that it is predominantly through star formation. In this talk, SVW summarised the results of recent papers on RQQs, including the use of far-infrared data to disentangle the radio emission from the AGN and that from star formation. This provides evidence that black-hole accretion, instead, dominates the radio emission in RQQs. In addition, we find that this accretion-related emission is correlated with the optical luminosity of the quasar, whilst a weaker luminosity-dependence is evident for the radio emission connected with star formation. What remains unclear is the process by which this accretion-related emission is produced. Understanding this for RQQs will then allow us to investigate how this type of AGN influences its surroundings. Such studies have important implications for modelling AGN feedback, and for determining the accretion and star-formation histories of the Universe.
The primary objective of this study was to examine the impact of an electronic medical record (EMR)–driven intensive care unit (ICU) antimicrobial stewardship (AMS) service on clinician compliance with face-to-face AMS recommendations. AMS recommendations were defined by an internally developed “5 Moments of Antimicrobial Prescribing” metric: (1) escalation, (2) de-escalation, (3) discontinuation, (4) switch, and (5) optimization. The secondary objectives included measuring the impact of this service on (1) antibiotic appropriateness, and (2) use of high-priority target antimicrobials.
A prospective review was undertaken of the implementation and compliance with a new ICU-AMS service that utilized EMR data coupled with face-to-face recommendations. Additional patient data were collected when an AMS recommendation was made. The impact of the ICU-AMS round on antimicrobial appropriateness was evaluated using point-prevalence survey data.
For the 202 patients, 412 recommendations were made in accordance with the “5 Moments” metric. The most common recommendation made by the ICU-AMS team was moment 3 (discontinuation), which comprised 173 of 412 recommendations (42.0%), with an acceptance rate of 83.8% (145 of 173). Data collected for point-prevalence surveys showed an increase in prescribing appropriateness from 21 of 45 (46.7%) preintervention (October 2016) to 30 of 39 (76.9%) during the study period (September 2017).
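The pre/post appropriateness proportions (21/45 vs 30/39) can be compared with a standard two-proportion z-test. The abstract reports no significance test for this comparison, so the sketch below is purely illustrative:

```python
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns the z statistic and the
    two-sided p-value from the normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    z = (p2 - p1) / sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    # Normal CDF via erf: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

# Appropriateness: 21/45 (46.7%) pre-intervention vs 30/39 (76.9%) during
z_stat, p_val = two_proportion_z(21, 45, 30, 39)
```

With these counts the improvement is large relative to its standard error, consistent with the before/after contrast the abstract emphasizes.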
The integration of EMR with an ICU-AMS program allowed us to implement a new AMS service, which was associated with high clinician compliance with recommendations and improved antibiotic appropriateness. Our “5 Moments of Antimicrobial Prescribing” metric provides a framework for measuring AMS recommendation compliance.
The increased use of insecticide seed treatments in rice has raised many questions about the potential benefits of these products. In 2014 and 2015, a field experiment was conducted near Stuttgart and Lonoke, AR, to evaluate whether an insecticide seed treatment could lessen injury from acetolactate synthase (ALS)–inhibiting herbicides in imidazolinone-resistant (IR) rice. Two IR cultivars were tested (a hybrid, ‘CLXL745’, and an inbred, ‘CL152’), with and without an insecticide seed treatment (thiamethoxam). Four different herbicide combinations were evaluated: a nontreated control, two applications of bispyribac-sodium (hereafter bispyribac), two applications of imazethapyr, and two applications of imazethapyr plus bispyribac. The first herbicide application was to two- to three-leaf rice, and the second immediately prior to flooding (one- to two-tiller). At both 2 and 4 wk after final treatment (WAFT), the sequential applications of imazethapyr or bispyribac plus imazethapyr were more injurious to CLXL745 than CL152. This increased injury led to decreased groundcover 3 WAFT. Rice treated with thiamethoxam was less injured than nontreated rice and had improved groundcover and greater canopy heights. Even with up to 32% injury, the rice plants recovered by the end of the growing season, and yields within a cultivar were similar with and without a thiamethoxam seed treatment across all herbicide treatments. Based on these results, thiamethoxam can partially protect rice from injury caused by ALS-inhibiting herbicides as well as increase groundcover and canopy height, although the injury never negatively affected yield.
Each year there are multiple reports of drift occurrences, and the majority of drift complaints in rice are from imazethapyr or glyphosate. In 2014 and 2015, multiple field experiments were conducted near Stuttgart, AR, and near Lonoke, AR, to evaluate whether insecticide seed treatments would reduce injury from glyphosate or imazethapyr drift or decrease the recovery time following exposure to a low rate of these herbicides. Study I was referred to as the “seed treatment study,” and Study II was the “drift timing study.” In the seed treatment study the conventional rice cultivar ‘Roy J’ was planted, and herbicide treatments included imazethapyr at 10.5 g ai ha–1, glyphosate at 126 g ae ha–1, or no herbicide. Each plot had either a seed treatment of thiamethoxam, clothianidin, chlorantraniliprole, or no insecticide seed treatment. The herbicides were applied at the two- to three-leaf growth stage. Crop injury was assessed 1, 3, and 5 wk after application. Averaged over site-years, thiamethoxam-treated rice had less injury than rice with no insecticide seed treatment at each rating, along with an increased yield. Clothianidin-treated rice had an increased yield over no insecticide seed treatment, but the reduction in injury for both herbicides was less pronounced than in the thiamethoxam-treated plots. Overall, chlorantraniliprole was generally the least effective of the three insecticides in reducing injury from either herbicide and in protecting rice yield potential. A second experiment conducted at Stuttgart, AR, was meant to determine whether damage to rice from glyphosate and imazethapyr was influenced by the timing (15, 30, and 45 d after planting) of exposure to herbicides for thiamethoxam-treated and nontreated rice. There was an overall reduction in injury with the use of thiamethoxam, but the reduction in injury was not dependent on the timing of the drift event. 
The reduction in damage from physical drift of glyphosate and imazethapyr, along with the increased yields relative to no insecticide seed treatment, appears to be an added benefit of these seed treatments.
The Lothagam harpoon site in north-west Kenya's Lake Turkana Basin provides a stratified Holocene sequence capturing changes in African fisher-hunter-gatherer strategies through a series of subtle and dramatic climate shifts (Figure 1). The site rose to archaeological prominence following Robbins's 1965–1966 excavations, which yielded sizeable lithic and ceramic assemblages and one of the largest collections of Early Holocene human remains from Eastern Africa (Robbins 1974; Angel et al. 1980).
Over the past few decades, farmers have increasingly integrated cover crops into their cropping systems. Cover-crop benefits can help a farmer to achieve sustainability or reduce negative environmental externalities, such as soil erosion or chemical runoff. However, the impact on farm economics will likely be the strongest incentive to adopt cover crops. These impacts can include farm profits, cash crop yields or both. This paper provides a review of cover-crop adoption, production, risk and policy considerations from an economic perspective. These dimensions are examined through a review of cover-crop literature. This review was written to provide an overview of cover crops and their impacts on the farm business and the environment, especially with regard to economic considerations. Through increasing knowledge about cover crops, the intent here is to inform producers contemplating adoption and policy makers seeking to encourage adoption.
This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.