Online grocery shopping could improve access to healthy food, but it may not be equally accessible to all populations – especially those at higher risk for food insecurity. The current study aimed to compare the socio-demographic characteristics of families who ordered groceries online v. those who only shopped in-store.
We analysed enrolment survey data and 44 weeks of individually linked grocery transaction data. We used univariate χ2 and t-tests and logistic regression to assess differences in socio-demographic characteristics between households that only shopped in-store and those that shopped online with curbside pickup (online only or online and in-store).
Two Maine supermarkets.
863 parents or caregivers of children under 18 years old enrolled in two fruit and vegetable incentive trials.
Participants had a total of 32 757 transactions. In univariate assessments, online shoppers had higher incomes (P < 0·0001), were less likely to participate in the Special Supplemental Nutrition Program for Women, Infants, and Children or the Supplemental Nutrition Assistance Program (SNAP; P < 0·0001) and were more likely to be female (P = 0·04). Most online shoppers were 30–39 years old, and few were 50 years or older (P = 0·003). After controlling for age, gender, race/ethnicity, number of children, number of adults, income and SNAP participation, female primary shoppers (OR = 2·75, P = 0·003), number of children (OR = 1·27, P = 0·04) and income (OR = 3·91 for 186–300 % federal poverty line (FPL) and OR = 6·92 for >300 % FPL, P < 0·0001) were significantly associated with likelihood of shopping online.
In the current study of Maine families, low-income shoppers were significantly less likely to utilise online grocery ordering with curbside pickup. Future studies could focus on elucidating barriers and developing strategies to improve access.
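The analysis above models the odds of shopping online as a function of household covariates and reports the results as odds ratios. A minimal sketch of that modelling step, on synthetic data (the covariate coding and effect sizes here are hypothetical, not the study's):

```python
# Illustrative logistic-regression sketch on simulated households; the
# covariates, coding, and effects are invented for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 863  # sample size from the abstract; everything else is simulated
income_high = rng.integers(0, 2, n)  # 1 = income > 300 % FPL (hypothetical coding)
female = rng.integers(0, 2, n)       # 1 = female primary shopper
n_children = rng.integers(1, 4, n)

# Simulate the outcome so that higher income and female primary shoppers
# are more likely to order online, loosely mirroring the reported direction.
logit = -2.0 + 1.9 * income_high + 1.0 * female + 0.24 * n_children
online = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([income_high, female, n_children])
model = LogisticRegression().fit(X, online)
odds_ratios = np.exp(model.coef_[0])  # exponentiated coefficients = odds ratios
print(dict(zip(["income>300%FPL", "female", "n_children"], odds_ratios.round(2))))
```

Exponentiating each fitted coefficient gives the odds ratio for a one-unit change in that covariate, which is how figures such as OR = 6·92 for >300 % FPL are obtained.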
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: The epidemic NAP1/027 Clostridioides difficile strain (MLST1, ST1) that emerged in the mid-2000s is on the decline. The current distribution of C. difficile strain types and their transmission dynamics are poorly defined. We performed whole-genome sequencing (WGS) of C. difficile isolates in 2 regions to identify the predominant multilocus sequence types (MLSTs) in community- and healthcare-associated cases and potential transmission between cases using whole-genome single-nucleotide polymorphism (SNP) analysis. Methods: Isolates were collected through the CDC Emerging Infections Program population-based surveillance for C. difficile infections (CDI) for 3 months between 2016 and 2017 in 5 Minnesota counties and 1 New York county. Isolates were limited to incident cases (CDI in a county resident with no positive C. difficile test in the preceding 8 weeks). Cases were classified as healthcare associated (HA-CDI) or community associated (CA-CDI) based on healthcare exposures as previously described. WGS was performed on an Illumina MiSeq. The CFSAN (FDA) pipeline was used to compute whole-genome SNPs, SPAdes was used for assembly, and MLST was assigned according to www.pubmlst.org. Results: Of 431 isolates, 269 originated from New York and 162 from Minnesota; 203 cases were classified as CA-CDI and 221 as HA-CDI. The proportion of CA-CDI cases was higher in Minnesota than in New York: 62% vs 38%. The predominant MLSTs across both sites were ST42 (9%), ST8 (8%), and ST2 (8%). MLSTs more frequently encountered in HA-CDI than CA-CDI included ST1 (note that this ST includes PCR Ribotype 027; 76% HA-CDI), ST53 (84% HA-CDI), and ST43 (80% HA-CDI). In contrast, ST110 (63% CA-CDI) and ST3 (67% CA-CDI) were more commonly isolated from CA-CDI cases. ST1 accounted for 7.6% of circulating strains, was more common in New York than Minnesota (10% vs 3%), and was concentrated among New York HA-CDI cases.
Also, 412 isolates (1 per patient) were included in the final whole-genome SNP analysis. Of these, only 12 pairs were separated by 0–3 SNPs, indicating potential transmission, and most involved HA-CDI cases. ST1, ST17, and ST46 accounted for 8 of 12 pairs, with ST17 and ST46 potentially forming small clusters. Conclusions: This analysis provides a snapshot of the current genomic epidemiology of C. difficile across 2 geographically and epidemiologically distinct regions of the United States and supports other studies suggesting that the role of direct transmission in the spread of CDI may be limited.
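The transmission screen described above reduces to computing pairwise SNP distances between isolates and flagging pairs within the 0–3 SNP cutoff. A minimal sketch of that step, using toy aligned SNP profiles and hypothetical isolate IDs (not the study's data):

```python
# Toy transmission screen: pairwise SNP (Hamming) distances over aligned
# core-genome sites; isolate IDs and sequences below are invented.
def snp_distance(a: str, b: str) -> int:
    """Hamming distance between two aligned SNP profiles."""
    assert len(a) == len(b)
    return sum(1 for x, y in zip(a, b) if x != y)

isolates = {  # hypothetical isolates: two near-identical, one distant
    "NY-001": "ACGTACGTAC",
    "NY-002": "ACGTACGTAT",   # 1 SNP from NY-001
    "MN-101": "TTGTTCGAAT",   # several SNPs from both
}

THRESHOLD = 3  # SNP cutoff used in the analysis above
ids = sorted(isolates)
pairs = [
    (i, j, snp_distance(isolates[i], isolates[j]))
    for k, i in enumerate(ids) for j in ids[k + 1:]
]
linked = [(i, j, d) for i, j, d in pairs if d <= THRESHOLD]
print(linked)  # -> [('NY-001', 'NY-002', 1)]
```

In the real pipeline the distances come from the CFSAN whole-genome SNP matrix rather than raw string comparison, but the thresholding logic is the same.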
The popular approach of assuming a control policy and then finding the largest region of attraction (ROA) (e.g., sum-of-squares optimization) may lead to conservative estimates of the ROA, especially for highly nonlinear systems. We present a sampling-based approach that starts by assuming an ROA and then finds the necessary control policy by performing trajectory optimization on sampled initial conditions. Our method works with black-box models, produces a relatively large ROA, and ensures exponential convergence of the initial conditions to the periodic motion. We demonstrate the approach on a model of hopping and include extensive verification and robustness checks.
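The core idea, sampling initial conditions from a candidate region and verifying that the (black-box) dynamics bring them all back, can be sketched in a much-simplified form. Here the black box is a damped pendulum and the target is its equilibrium rather than a periodic motion, and all numbers are illustrative, not from the paper:

```python
# Simplified sampling-based ROA check: propose a candidate region, simulate
# black-box dynamics from sampled initial conditions, and measure how many
# converge. The pendulum model and all constants are illustrative assumptions.
import numpy as np

def step(state, dt=0.01):
    """One explicit-Euler step of a damped pendulum: x'' = -sin(x) - 0.5 x'."""
    x, v = state
    return np.array([x + dt * v, v + dt * (-np.sin(x) - 0.5 * v)])

def converges(state, t_final=30.0, tol=1e-2):
    """Simulate forward and test whether the state reaches the equilibrium."""
    for _ in range(int(t_final / 0.01)):
        state = step(state)
    return float(np.linalg.norm(state)) < tol

rng = np.random.default_rng(1)
radius = 1.0  # candidate ROA: box of this half-width around the equilibrium
samples = rng.uniform(-radius, radius, size=(50, 2))
fraction = float(np.mean([converges(s) for s in samples]))
print(f"{fraction:.0%} of sampled initial conditions converged")
```

In the paper's setting, each sampled initial condition additionally gets a trajectory optimization that finds a control input driving it to the periodic motion; the candidate region is kept (or grown) only while every sample succeeds.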
Procedural sedation and analgesia (PSA) is a core competency for emergency physicians (EP) that is commonly practiced.1–4 PSA entails suppressing a patient’s level of consciousness with sedative or dissociative agents to alleviate pain, anxiety, and suffering to enhance medical procedure performance and patient experience (Table 22.1).1,5
Radio observations allow us to identify a wide range of active galactic nuclei (AGN), which play a significant role in the evolution of galaxies. Amongst AGN at low radio luminosities is the ‘radio-quiet’ quasar (RQQ) population, but how these quasars contribute to the total radio emission is under debate, with previous studies arguing that the contribution is predominantly through star formation. In this talk, SVW summarised the results of recent papers on RQQs, including the use of far-infrared data to disentangle the radio emission from the AGN and that from star formation. This provides evidence that black-hole accretion, instead, dominates the radio emission in RQQs. In addition, we find that this accretion-related emission is correlated with the optical luminosity of the quasar, whilst a weaker luminosity-dependence is evident for the radio emission connected with star formation. What remains unclear is the process by which this accretion-related emission is produced. Understanding this for RQQs will then allow us to investigate how this type of AGN influences its surroundings. Such studies have important implications for modelling AGN feedback, and for determining the accretion and star-formation histories of the Universe.
The primary objective of this study was to examine the impact of an electronic medical record (EMR)–driven intensive care unit (ICU) antimicrobial stewardship (AMS) service on clinician compliance with face-to-face AMS recommendations. AMS recommendations were defined by an internally developed “5 Moments of Antimicrobial Prescribing” metric: (1) escalation, (2) de-escalation, (3) discontinuation, (4) switch, and (5) optimization. The secondary objectives included measuring the impact of this service on (1) antibiotic appropriateness, and (2) use of high-priority target antimicrobials.
A prospective review was undertaken of the implementation and compliance with a new ICU-AMS service that utilized EMR data coupled with face-to-face recommendations. Additional patient data were collected when an AMS recommendation was made. The impact of the ICU-AMS round on antimicrobial appropriateness was evaluated using point-prevalence survey data.
For the 202 patients, 412 recommendations were made in accordance with the “5 Moments” metric. The most common recommendation made by the ICU-AMS team was moment 3 (discontinuation), which comprised 173 of 412 recommendations (42.0%), with an acceptance rate of 83.8% (145 of 173). Data collected for point-prevalence surveys showed an increase in prescribing appropriateness from 21 of 45 (46.7%) preintervention (October 2016) to 30 of 39 (76.9%) during the study period (September 2017).
The integration of EMR with an ICU-AMS program allowed us to implement a new AMS service, which was associated with high clinician compliance with recommendations and improved antibiotic appropriateness. Our “5 Moments of Antimicrobial Prescribing” metric provides a framework for measuring AMS recommendation compliance.
The increased use of insecticide seed treatments in rice has raised many questions about the potential benefits of these products. In 2014 and 2015, a field experiment was conducted near Stuttgart and Lonoke, AR, to evaluate whether an insecticide seed treatment could lessen injury from acetolactate synthase (ALS)–inhibiting herbicides in imidazolinone-resistant (IR) rice. Two IR cultivars were tested (a hybrid, ‘CLXL745’, and an inbred, ‘CL152’), with and without an insecticide seed treatment (thiamethoxam). Four different herbicide combinations were evaluated: a nontreated control, two applications of bispyribac-sodium (hereafter bispyribac), two applications of imazethapyr, and two applications of imazethapyr plus bispyribac. The first herbicide application was to two- to three-leaf rice, and the second immediately prior to flooding (one- to two-tiller). At both 2 and 4 wk after final treatment (WAFT), the sequential applications of imazethapyr or bispyribac plus imazethapyr were more injurious to CLXL745 than CL152. This increased injury led to decreased groundcover 3 WAFT. Rice treated with thiamethoxam was less injured than nontreated rice and had improved groundcover and greater canopy heights. Even with up to 32% injury, the rice plants recovered by the end of the growing season, and yields within a cultivar were similar with and without a thiamethoxam seed treatment across all herbicide treatments. Based on these results, thiamethoxam can partially protect rice from injury caused by ALS-inhibiting herbicides and can increase groundcover and canopy height; importantly, the injury never negatively affected yield.
Each year there are multiple reports of drift occurrences, and the majority of drift complaints in rice are from imazethapyr or glyphosate. In 2014 and 2015, multiple field experiments were conducted near Stuttgart, AR, and near Lonoke, AR, to evaluate whether insecticide seed treatments would reduce injury from glyphosate or imazethapyr drift or decrease the recovery time following exposure to a low rate of these herbicides. Study I was referred to as the “seed treatment study,” and Study II was the “drift timing study.” In the seed treatment study the conventional rice cultivar ‘Roy J’ was planted, and herbicide treatments included imazethapyr at 10.5 g ai ha–1, glyphosate at 126 g ae ha–1, or no herbicide. Each plot had either a seed treatment of thiamethoxam, clothianidin, chlorantraniliprole, or no insecticide seed treatment. The herbicides were applied at the two- to three-leaf growth stage. Crop injury was assessed 1, 3, and 5 wk after application. Averaged over site-years, thiamethoxam-treated rice had less injury than rice with no insecticide seed treatment at each rating, along with an increased yield. Clothianidin-treated rice had an increased yield over no insecticide seed treatment, but the reduction in injury for both herbicides was less pronounced than in the thiamethoxam-treated plots. Overall, chlorantraniliprole was generally the least effective of the three insecticides in reducing injury from either herbicide and in protecting rice yield potential. A second experiment conducted at Stuttgart, AR, was meant to determine whether damage to rice from glyphosate and imazethapyr was influenced by the timing (15, 30, and 45 d after planting) of exposure to herbicides for thiamethoxam-treated and nontreated rice. There was an overall reduction in injury with the use of thiamethoxam, but the reduction in injury was not dependent on the timing of the drift event. 
Reduction in damage from physical drift of glyphosate and imazethapyr as well as increased yields over the absence of an insecticide seed treatment appear to be an added benefit.
The Lothagam harpoon site in north-west Kenya's Lake Turkana Basin provides a stratified Holocene sequence capturing changes in African fisher-hunter-gatherer strategies through a series of subtle and dramatic climate shifts (Figure 1). The site rose to archaeological prominence following Robbins's 1965–1966 excavations, which yielded sizeable lithic and ceramic assemblages and one of the largest collections of Early Holocene human remains from Eastern Africa (Robbins 1974; Angel et al. 1980).
Over the past few decades, farmers have increasingly integrated cover crops into their cropping systems. Cover-crop benefits can help a farmer to achieve sustainability or reduce negative environmental externalities, such as soil erosion or chemical runoff. However, the impact on farm economics will likely be the strongest incentive to adopt cover crops. These impacts can include farm profits, cash crop yields or both. This paper provides a review of cover-crop adoption, production, risk and policy considerations from an economic perspective. These dimensions are examined through a review of cover-crop literature. This review was written to provide an overview of cover crops and their impacts on the farm business and the environment, especially with regard to economic considerations. Through increasing knowledge about cover crops, the intent here is to inform producers contemplating adoption and policy makers seeking to encourage adoption.
This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.
Multi-wavelength flares have routinely been observed from the supermassive black hole, Sagittarius A⋆ (Sgr A⋆), at our Galactic center. The nature of these flares remains largely unclear, despite many theoretical models. We study the statistical properties of the Sgr A⋆ X-ray flares and find that they are consistent with the theoretical prediction of a self-organized criticality system with spatial dimension S = 3. We suggest that the X-ray flares represent plasmoid ejections driven by magnetic reconnection (similar to solar flares) in the accretion flow onto the black hole. Motivated by the statistical results, we further develop a time-dependent magnetohydrodynamic (MHD) model for the multi-band flares from Sgr A⋆ by analogy with models of solar flares/coronal mass ejections (CMEs). We calculate the X-ray and infrared flare light curves and spectra, and find that our model can explain the main features of the flares.
Objectives: Several studies have found impaired response inhibition, measured by a stop-signal task (SST), in individuals who are currently symptomatic for obsessive-compulsive disorder (OCD). The aim of this study was to assess stop-signal reaction time (SSRT) performance in individuals with a lifetime diagnosis of OCD, in comparison to a healthy control group. This is the first study to examine participants along a continuum of OCD severity, approximately half of whom had sub-syndromal symptoms at the time of assessment. Methods: OCD participants were recruited primarily from within the OCD clinic at a psychiatric hospital, as well as from the community. Healthy controls were recruited from the community. We used the stop-signal task to examine the difference between 21 OCD participants (mean age, 42.95 years) and 40 healthy controls (mean age, 35.13 years). We also investigated the relationship between SST performance and measures of OCD, depression, and anxiety severity. Results: OCD participants were significantly slower than healthy controls with regard to mean SSRT. Contrary to our prediction, there was no correlation between SSRT and current levels of OCD, anxiety, and depression severity. Conclusions: Results support prior studies showing impaired response inhibition in OCD, and extend the findings to a sample of patients with lifetime OCD who were not all currently above threshold for diagnosis. These findings indicate that response inhibition deficits may be a biomarker of OCD, regardless of current severity levels. (JINS, 2016, 22, 785–789)
Cardiomyopathy is a rare disorder of the heart muscle, affecting 1.13 per 100,000 children from birth to 18 years of age. Cardiomyopathy is the leading cause of heart transplantation in children over the age of 1. The Pediatric Cardiomyopathy Registry, funded in 1994 by the National Heart, Lung, and Blood Institute, was established to examine the epidemiology of the disease in children below 18 years of age. More than 3500 children across the United States and Canada have been enrolled in the Pediatric Cardiomyopathy Registry, which has followed these patients until death, heart transplantation, or loss to follow-up. The Pediatric Cardiomyopathy Registry has provided the most in-depth illustration of this disease regarding its aetiology, clinical course, associated risk factors, and patient outcomes. Data from the registry have helped in guiding the clinical management of cardiomyopathy in children under 18 years of age; however, questions still remain regarding the most clinically effective diagnostic and treatment approaches for these patients. Future directions of the registry include the use of next-generation whole-exome sequencing and cardiac biomarkers to identify aetiology-specific treatments and improve diagnostic strategies. This article provides a brief synopsis of the work carried out by the Pediatric Cardiomyopathy Registry since its inception, including the current knowledge on the aetiologies, outcomes, and treatments of cardiomyopathy in children.
To determine the optimal number of specimens for virus detection in a respiratory outbreak, laboratory results from 2 Canadian public health laboratories were reviewed. The evidence suggests that 3 specimens are sufficient for detection of a virus in >95% of outbreaks, thereby reducing laboratory costs.
Infect. Control Hosp. Epidemiol. 2015;36(11): 1344–1347
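The ">95% with 3 specimens" finding above is empirical, but a simple independence model makes the arithmetic intuitive: if each specimen detects the outbreak virus with per-specimen sensitivity p, the chance that at least one of n specimens is positive is 1 − (1 − p)^n. The sensitivities below are hypothetical values chosen for illustration, not the study's estimates:

```python
# Back-of-envelope model (assumes independent specimens; p values are
# hypothetical, not taken from the study).
def detection_probability(p: float, n: int) -> float:
    """P(at least one of n specimens is positive) given sensitivity p each."""
    return 1 - (1 - p) ** n

for p in (0.5, 0.65, 0.8):
    row = ", ".join(f"n={n}: {detection_probability(p, n):.2f}" for n in (1, 2, 3))
    print(f"p={p}: {row}")
```

Under this toy model, a per-specimen sensitivity around 0.65 is already enough for three specimens to exceed 0.95, which is consistent with the direction of the study's empirical result.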