Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter at harvest: larger plants retained seed longer. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
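The growing-degree-day covariate evaluated above is conventionally accumulated from daily temperature extremes. A minimal sketch of the standard averaging method; the base temperature and the week of temperatures below are placeholders, since the abstract does not state the study's actual values:

```python
def growing_degree_days(tmax, tmin, t_base=10.0):
    """Accumulate growing degree days (GDD) from daily max/min
    temperatures (deg C) with the averaging method:
    GDD_day = max(0, (Tmax + Tmin) / 2 - Tbase)."""
    return sum(max(0.0, (hi + lo) / 2.0 - t_base) for hi, lo in zip(tmax, tmin))

# Hypothetical week of daily temperatures (deg C)
tmax = [28, 30, 27, 25, 31, 29, 26]
tmin = [16, 18, 15, 14, 19, 17, 15]
gdd = growing_degree_days(tmax, tmin)  # 85.0 GDD for this week
```

Days whose mean temperature falls below the base contribute zero, so GDD only ever accumulates.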
Abstract: This chapter focuses on private firms’ compliance with norms concerning transnational bribery. It begins with an overview of the regulatory context and obstacles to effective enforcement of norms against transnational bribery. It then reviews how compliance is defined, how it ought to be defined, and obstacles to the achievement of optimal compliance. Finally, it ends by focusing on the next steps forward in this space: (1) greater information sharing from private firms to outsiders in order to better analyze and evaluate the current efficacy of compliance programs targeting anti-bribery, and (2) increased coordination between enforcement agencies at the national and international levels to better tackle transnational bribery.
Early in the coronavirus disease 2019 (COVID-19) pandemic, the CDC recommended collection of a lower respiratory tract (LRT) specimen for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing in addition to the routinely recommended upper respiratory tract (URT) testing in mechanically ventilated patients. Significant operational challenges were noted at our institution using this approach. In this report, we describe our experience with routine collection of paired URT and LRT samples for testing. Our results revealed high concordance between the 2 sources, and all children tested for SARS-CoV-2 were appropriately diagnosed with URT testing alone; LRT testing added no benefit. Our institutional approach was therefore adjusted to sample the URT alone for most patients, with LRT sampling reserved for patients with ongoing clinical suspicion for SARS-CoV-2 after a negative URT test.
Objective: To determine patient-specific risk factors and clinical outcomes associated with contaminated blood cultures.
Design: A single-center, retrospective case-control analysis of risk factors and clinical outcomes, performed on inpatients with blood cultures collected in the emergency department, 2014–2018. Patients with contaminated blood cultures (cases) were compared to patients with negative blood cultures (controls).
Setting: A 509-bed tertiary-care university hospital.
Methods: Risk factors independently associated with blood-culture contamination were determined using multivariable logistic regression. The impacts of contamination on clinical outcomes were assessed using linear regression, logistic regression, and a generalized linear model with a gamma distribution and log link.
Results: Of 13,782 blood cultures, 1,504 (10.9%) true positives were excluded, leaving 1,012 (7.3%) cases and 11,266 (81.7%) controls. The following factors were independently associated with blood-culture contamination: increasing age (adjusted odds ratio [aOR], 1.01; 95% confidence interval [CI], 1.01–1.01), black race (aOR, 1.32; 95% CI, 1.15–1.51), increased body mass index (BMI; aOR, 1.01; 95% CI, 1.00–1.02), chronic obstructive pulmonary disease (aOR, 1.16; 95% CI, 1.02–1.33), paralysis (aOR, 1.64; 95% CI, 1.26–2.14), and sepsis plus shock (aOR, 1.26; 95% CI, 1.07–1.49). After controlling for age, race, BMI, and sepsis, blood-culture contamination increased length of stay (LOS; β = 1.24 ± 0.24; P < .0001), length of antibiotic treatment (LOT; β = 1.01 ± 0.20; P < .001), hospital charges (β = 0.22 ± 0.03; P < .0001), acute kidney injury (AKI; aOR, 1.60; 95% CI, 1.40–1.83), echocardiogram orders (aOR, 1.51; 95% CI, 1.30–1.75), and in-hospital mortality (aOR, 1.69; 95% CI, 1.31–2.16).
Conclusions: These risk factors identify individuals at high risk of blood-culture contamination. After controlling for confounders, contamination significantly increased LOS, LOT, hospital charges, AKI, echocardiogram orders, and in-hospital mortality.
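The adjusted odds ratios and confidence intervals reported above come from multivariable logistic regression; on that scale, an aOR and its 95% CI are recovered from a fitted coefficient and its standard error by exponentiation. A minimal sketch; the coefficient and standard error below are hypothetical values chosen only to roughly reproduce one reported aOR, not the study's fitted model:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error into an odds ratio with a Wald 95% confidence interval:
    OR = exp(beta), CI = exp(beta +/- z * se)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical inputs approximating the reported aOR for black race
# (aOR, 1.32; 95% CI, 1.15-1.51): beta = ln(1.32) ~ 0.278, se ~ 0.070
or_, lo, hi = odds_ratio_ci(0.278, 0.070)
```

A CI that excludes 1.0 on this scale corresponds to a statistically significant association at the 5% level.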
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Background: Carbapenemase-producing Enterobacterales (CPE) have rapidly become a global health concern and are associated with substantial morbidity and mortality due to limited treatment options. Travel to endemic areas, especially healthcare exposure in these areas, is an important risk factor for acquisition. We describe the evolving epidemiology, molecular features, and outcomes of CPE in Canada through surveillance by the Canadian Nosocomial Infection Surveillance Program (CNISP). Methods: CNISP has conducted surveillance for CPE among inpatients and outpatients of all ages since 2010. Participating acute-care facilities submit eligible specimens to the National Microbiology Laboratory for detection of carbapenemase production, and epidemiological data are collected. Incidence rates per 10,000 patient days are calculated based on inpatient data. Results: In total, 59 CNISP hospitals in 10 Canadian provinces, representing 21,789 beds and 6,785,013 patient days, participated in this surveillance. From 2010 to 2018, 118 (26%) CPE-infected and 547 (74%) CPE-colonized patients were identified. Few pediatric cases were identified (n = 18). Infection incidence rates remain low and stable (0.02 per 10,000 patient days in 2010 to 0.03 per 10,000 patient days in 2018), whereas colonization incidence rates have increased by 89% over the surveillance period. Overall, 92% of cases were acquired in a healthcare facility: 61% (n = 278) in a Canadian healthcare facility and 31% (n = 142) in a healthcare facility outside Canada. Of the 8% of cases not acquired in a healthcare facility, 50% (16 of 32) reported travel outside of Canada in the 12 months prior to the positive culture. The distribution of carbapenemases varied by region; New Delhi metallo-β-lactamase (NDM) was dominant (59%) in western Canada, whereas Klebsiella pneumoniae carbapenemase (KPC) was dominant (66%) in central Canada.
NDM and the class D carbapenemase OXA-48 were more commonly identified among those who had traveled outside of Canada, whereas KPC was more commonly identified among patients without travel. In addition, 30-day all-cause mortality was 14% (25 of 181) among CPE-infected patients and 32% (14 of 44) among those with bacteremia. Conclusions: CPE rates remain low in Canada; however, national surveillance data suggest that the increase in CPE in Canada is now being driven by local nosocomial transmission as well as travel and healthcare within endemic areas. Changes in screening practices may have contributed to the increase in colonizations; however, these data are currently lacking and will be collected moving forward. These data highlight the need to intensify surveillance and coordinate infection control measures to prevent further spread of CPE in Canadian acute-care hospitals.
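The CNISP incidence rates quoted above are computed per 10,000 patient days from inpatient data. A minimal sketch of that calculation; the case count and denominator below are illustrative placeholders, not CNISP figures:

```python
def incidence_per_10k(cases, patient_days):
    """Incidence rate per 10,000 patient days: cases normalized
    by the total inpatient-day denominator, scaled to 10,000."""
    return cases / patient_days * 10_000

# Illustrative: 2 infections observed over 1,000,000 inpatient days
rate = incidence_per_10k(2, 1_000_000)  # 0.02 per 10,000 patient days
```

Normalizing by patient days rather than admissions lets rates be compared across hospitals and years with different census sizes.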
Susy Hota reports contracted research for Finch Therapeutics. Allison McGeer reports funds to her institution for projects for which she is the principal investigator from Pfizer and Merck, as well as consulting fees from the following companies: Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
During the COVID-19 pandemic, the antimicrobial stewardship module in our electronic medical record was reconfigured for the management of COVID-19 patients. This change allowed our subspecialist providers to review charts quickly to optimize potential therapy and management during the patient surge.
Worldwide, cardiovascular disease (CVD) is the number 1 cause of mortality and is associated with insulin resistance (IR). Emerging biomarkers such as FGF21 and adiponectin are associated with cardiometabolic risk. Low-carbohydrate, high-fat (LCHF) diets have been reported to reduce cardiometabolic risk markers; however, few studies have compared a LCHF diet vs. a high-carbohydrate (HC), lower-fat diet under ad libitum conditions on adiponectin and FGF21. The purpose of this study was to investigate the effects of an ad libitum LCHF vs. HC diet on IR, FGF21 and adiponectin in 16 healthy adults. Ethical approval: Liverpool John Moores University Research Ethics Committee (16/ELS/029); registered with ClinicalTrials.gov (Ref. NCT03257085). Participants were randomly assigned to a HC diet (n = 8; UK Eatwell guidelines, ≥ 50% of energy from carbohydrates) or a LCHF diet (n = 8; < 50 g/day of carbohydrates). All provided plasma samples at 0, 4 and 8 weeks. FGF21 (R&D Systems) was analysed via ELISA, and adiponectin, insulin and glucose were analysed via immunoassay technology (Randox Evidence Investigator™ Metabolic Syndrome Arrays I & II). Mann–Whitney, Friedman and Wilcoxon tests and a 2×3 ANOVA (IBM SPSS 25®) were undertaken to investigate differences between and within groups. The homeostatic model assessment (HOMA) was used to calculate IR. FGF21 significantly (P = 0.04) decreased (Mdn, IQR: 148.16, 78.51–282.02 to 99.4, 39.87–132.29 pg/ml) after 4 weeks and significantly (P = 0.02) increased (Mdn, IQR: 167.38, 80.82–232.89 pg/ml) by 8 weeks vs. baseline with LCHF. No significant differences (P > 0.05) were observed between groups. Adiponectin was significantly (P = 0.03) different between groups at week 4 only. Adiponectin increased after 4 weeks (Mdn, IQR: 13.44, 9.12–25.47 to 16.64, 11.96–21.51 ng/ml) but was only significantly (P = 0.03) different by 8 weeks vs. baseline in the HC group (Mdn, IQR: 16, 10.8–27.43 ng/ml).
Adiponectin remained unchanged (P = 0.96) in the LCHF group. HOMA significantly decreased with both diets after 8 weeks only (mean ± SD, LCHF: 2.9 ± 1.3 to 1.8 ± 0.8; HC: 2.5 ± 0.6 to 1.9 ± 0.6; P = 0.008) but was not significantly (P = 0.60) different between groups. These preliminary data reveal that while both diets improved insulin sensitivity, they may do so by different mechanisms. Future studies are warranted to investigate further how a LCHF vs. HC diet affects FGF21 and adiponectin, and the subsequent regulation of IR. Studies extending these findings to peripheral metabolism are also warranted, to identify potential nutrition-mediated mechanisms of metabolic adaptation.
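The HOMA index used above has a standard closed form (Matthews et al., 1985). A minimal sketch; the fasting values below are hypothetical, chosen only to land near the LCHF baseline reported above, and are not study data:

```python
def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """Homeostatic model assessment of insulin resistance (HOMA-IR):
    (fasting glucose [mmol/L] x fasting insulin [uU/mL]) / 22.5
    (Matthews et al., 1985). Higher values indicate greater IR."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

# Hypothetical fasting values: 5.0 mmol/L glucose, 13 uU/mL insulin
value = homa_ir(5.0, 13.0)  # ~2.9, comparable to the LCHF baseline above
```

Note the units: if glucose is reported in mg/dl, it must be converted to mmol/L (divide by 18) before applying the formula.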
Apolipoproteins (apo) regulate lipoprotein characteristics and lipid metabolism. ApoC-III is a regulator of triglyceride-rich lipoprotein (TRL) metabolism, and apolipoproteins are important biomarkers for cardiovascular disease (CVD) risk prediction. A low-carbohydrate, high-fat (LCHF) diet improves cardiometabolic risk, especially via reduction of TRL. However, few studies have compared a LCHF vs. a high-carbohydrate (HC), lower-fat diet under ad libitum conditions on apoC-III levels. The objective of this investigation was to measure the effect of a LCHF vs. a HC diet on apoC-III, apoA1, apoB and apoB/apoA1 in 16 healthy Caucasian adults aged 19–64. Ethical approval: Liverpool John Moores University Research Ethics Committee (16/ELS/029); registered with ClinicalTrials.gov (Ref. NCT03257085). Participants randomly assigned to a HC diet (UK Eatwell guidelines; ≥ 50% of energy from carbohydrates) (n = 8) or a LCHF diet (< 50 g/day of carbohydrates) (n = 8) provided plasma samples at 0, 4 and 8 weeks. ApoA1 and apoB were analysed by an automated chemistry analyser (Daytona, Randox Laboratories Ltd, UK). ApoC-III was analysed via ELISA (Thermo Fisher Ltd, USA). A factorial 2×3 ANOVA and ANCOVA (IBM SPSS 25®) were undertaken to investigate differences and to control for variables influenced by baseline measures and visceral adipose tissue (VAT). Results (at 0, 4 and 8 weeks, respectively): apoC-III (LCHF: 19.12 ± 9.14, 16.05 ± 7.95, 15.11 ± 3.17 mg/dl; HC: 22.13 ± 8.38, 28.22 ± 13.85, 22.22 ± 7.7 mg/dl) showed no significant (P = 0.319) change. Similarly, apoB (LCHF: 107.25 ± 20.35, 111.38 ± 24.81, 111.43 ± 19.93 mg/dl; HC: 94.38 ± 20.79, 105.00 ± 20.13, 99.00 ± 29.09 mg/dl) showed no significant (P = 0.23) change, and apoA1 (LCHF: 158.71 ± 14.27, 166.50 ± 23.09, 173.00 ± 29.42 mg/dl; HC: 164.71 ± 30.25, 172.50 ± 29.44, 174.00 ± 32.83 mg/dl) showed no significant (P = 0.76) change.
This resulted in a relatively unchanged apoB/apoA1 ratio throughout the study in both diets (P = 0.30). No significant (P > 0.05) differences were found after 4 weeks or between groups. ANCOVA revealed a trend (P = 0.06) towards a between-group difference in apoC-III (LCHF: Δ −6.6 mg/dl vs. HC: Δ +1.2 mg/dl) after 8 weeks, but no significant (P > 0.05) changes in other apolipoproteins were detected. These preliminary data reveal that a LCHF diet does not improve the apolipoprotein profile; however, when accounting for other metabolic risk factors (i.e. VAT), there was a trend towards lowering apoC-III levels (P = 0.06). Modulation of apoC-III may lead to improved lipid metabolism, but higher-powered studies are warranted before any improvement in CVD risk can be inferred.
Background: Complex challenges may arise when patients present to emergency services with an advance decision to refuse life-saving treatment following suicidal behaviour.
Aims: To investigate the use of advance decisions to refuse treatment in the context of suicidal behaviour from the perspective of clinicians and people with lived experience of self-harm and/or psychiatric services.
Method: Forty-one participants aged 18 or over from hospital services (emergency departments, liaison psychiatry and ambulance services) and groups of individuals with experience of psychiatric services and/or self-harm were recruited to six focus groups in a multisite study in England. Data were collected in 2016 using a structured topic guide and included a fictional vignette. They were analysed using thematic framework analysis.
Results: Advance decisions to refuse treatment for suicidal behaviour were contentious across groups. Three main themes emerged from the data: (a) they may enhance patient autonomy and aid clarity in acute emergencies, but also create legal and ethical uncertainty over treatment following self-harm; (b) they are anxiety provoking for clinicians; and (c) in practice, there are challenges in validation (for example, validating the patient’s mental capacity at the time of writing), time constraints and significant legal/ethical complexities.
Conclusions: The potential for patients to refuse life-saving treatment following suicidal behaviour in a legal document was challenging and anxiety provoking for participants. Clinicians should act with caution given the potential for recovery and fluctuations in suicidal ideation. Currently, advance decisions to refuse treatment have questionable use in the context of suicidal behaviour given the challenges in validation. Discussion and further patient research are needed in this area.
Declaration of interest
D.G., K.H. and N.K. are members of the Department of Health's (England) National Suicide Prevention Advisory Group. N.K. chaired the National Institute for Health and Care Excellence (NICE) guideline development group for the longer-term management of self-harm and the NICE Topic Expert Group (which developed the quality standards for self-harm services). He is currently chair of the updated NICE guideline for Depression. K.H. and D.G. are NIHR Senior Investigators. K.H. is also supported by the Oxford Health NHS Foundation Trust and N.K. by the Greater Manchester Mental Health NHS Foundation Trust.
Palaeoecology has been prominent in studies of environmental change during the Holocene epoch in Scotland. These studies have been dominated by palynology (pollen, spore and related bio- and litho-stratigraphic analyses) as a key approach to multi- and inter-disciplinary investigations of topics such as vegetation, climate and landscape change. This paper highlights some key dimensions of the pollen- and vegetation-based archive, with a focus upon woodland dynamics, blanket peat, human impacts, biodiversity and conservation. Following a brief discussion of chronological, climatic, faunal and landscape contexts, the migration, survival and nature of the woodland cover through time is assessed, emphasising its time-transgressiveness and altitudinal variation. While agriculture led to the demise of woodland in lowland areas of the south and east, the spread of blanket peat was especially a phenomenon of the north and west, including the Western and Northern Isles. Almost a quarter of Scotland is covered by blanket peat and the cause(s) of its spread continue(s) to evoke recourse to climatic, topographic, pedogenic, hydrological, biotic or anthropogenic influences, while we remain insufficiently knowledgeable about the timing of the formation processes. Humans have been implicated in vegetational change throughout the Holocene, with prehistoric woodland removal, woodland management, agricultural impacts arising from arable and pastoral activities, potential heathland development and afforestation. The viability of many current vegetation communities remains a concern, in that Scottish data show reductions in plant diversity over the last 400 years, which recent conservation efforts have yet to reverse. Palaeoecological evidence can be used to test whether conservation baselines and restoration targets are appropriate to longer-term ecosystem variability and can help identify when modern conditions have no past analogues.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 yrs, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 106 galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 106 galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using a purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline. 
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
A field study was conducted in 2014 and 2015 in Arkansas, Illinois, Indiana, Ohio, Tennessee, Wisconsin, and Missouri to determine the effects of tillage system and herbicide program on season-long emergence of Amaranthus species in glufosinate-resistant soybean. The tillage systems evaluated were deep tillage (fall moldboard plow followed by (fb) one pass with a field cultivator in the spring), conventional tillage (fall chisel plow fb one pass with a field cultivator in the spring), minimum tillage (one pass of a vertical tillage tool in the spring), and no-tillage (PRE application of paraquat). Each tillage system also received one of two herbicide programs: a PRE application of flumioxazin (0.09 kg ai ha−1) fb a POST application of glufosinate (0.59 kg ai ha−1) plus S-metolachlor (1.39 kg ai ha−1), or POST-only applications of glufosinate (0.59 kg ai ha−1). The deep tillage system resulted in a 62%, 67%, and 73% reduction in Amaranthus emergence compared to the conventional, minimum, and no-tillage systems, respectively. The residual herbicide program also resulted in an 87% reduction in Amaranthus species emergence compared to the POST-only program. The deep tillage system combined with the residual program resulted in a 97% reduction in Amaranthus species emergence compared to the minimum tillage system combined with the POST-only program, which had the highest Amaranthus emergence. Soil cores taken prior to planting and herbicide application revealed that only 28% of the Amaranthus seed in the deep tillage system was within the top 5 cm of the soil profile, compared to 79%, 81%, and 77% in the conventional, minimum, and no-tillage systems, respectively. Overall, the use of deep tillage with a residual herbicide program provided the greatest reduction in Amaranthus species emergence, making it a useful tool for managing herbicide-resistant Amaranthus species where appropriate.
Pigweeds are among the most abundant and troublesome weed species across Midwest and mid-South soybean production systems because of their prolific growth characteristics and ability to rapidly evolve resistance to several herbicide sites of action. This has renewed interest in diversifying weed management strategies by implementing integrated weed management (IWM) programs to efficiently manage weeds, increase soybean light interception, and increase grain yield. Field studies were conducted across 16 site-years to determine the effectiveness of soybean row width, seeding rate, and herbicide strategy as components of IWM in glufosinate-resistant soybean. Sites were grouped according to optimum adaptation zones for soybean maturity groups (MGs). Across all MG regions, pigweed density and height at the POST herbicide timing, and end-of-season pigweed density, height, and fecundity were reduced in IWM programs using a PRE followed by (fb) POST herbicide strategy. Furthermore, a PRE fb POST herbicide strategy treatment increased soybean cumulative intercepted photosynthetically active radiation (CIPAR) and subsequently, soybean grain yield across all MG regions. Soybean row width and seeding rate manipulation effects were highly variable. Narrow row width (≤ 38 cm) and a high seeding rate (470,000 seeds ha−1) reduced end-of-season height and fecundity variably across MG regions compared with wide row width (≥ 76 cm) and moderate to low (322,000 to 173,000 seeds ha−1) seeding rates. However, narrow row widths and high seeding rates did not reduce pigweed density at the POST herbicide application timing or at soybean harvest. Across all MG regions, soybean CIPAR increased as soybean row width decreased and seeding rate increased; however, row width and seeding rate had variable effects on soybean yield. Furthermore, soybean CIPAR was not associated with end-of-season pigweed growth and fecundity. 
A PRE fb POST herbicide strategy was a necessary component for an IWM program as it simultaneously managed pigweeds, increased soybean CIPAR, and increased grain yield.
Two of the most problematic Amaranthus species in soybean production today are tall waterhemp and Palmer amaranth. This study determined the percentage of tall waterhemp and Palmer amaranth seed retained by the weed at soybean maturity to assess the feasibility of at-harvest weed seed control tactics for soil seedbank management. Palmer amaranth plants were collected from fields in Arkansas, Tennessee, Illinois, Missouri, and Nebraska, and tall waterhemp plants were collected from fields in Nebraska, Missouri, Wisconsin, and Illinois. Collected plants were assessed for at-harvest weed seed retention in 2013 and 2014. Within 1 wk of soybean maturity, Amaranthus plants were harvested, and the loose soil and debris beneath the plants were swept into a pan with a hand broom to collect any shattered seed. Percent seed retention ranged from 95% to 100% for all states in both years, regardless of species. There was a strong correlation between weed biomass (g) and total seed production (no. plant−1): the larger the plant, the more seeds it produced. However, there was no correlation between percent seed retention and weed biomass, which indicates that regardless of plant size and likely time of emergence, seed retention is high at the time of crop maturity. Overall, this study demonstrated that there is great opportunity for Palmer amaranth and tall waterhemp seed capture or destruction at soybean harvest. It is likely that nearly all of the seeds produced by both Amaranthus species pass through the combine during harvest and are returned to the soil seedbank. Thus, there is a continued need for research focused on developing and testing harvest weed seed control tactics aimed at reducing the soil seedbank and lowering the risk of herbicide-resistance evolution.
Palmer amaranth and waterhemp have become increasingly troublesome weeds throughout the United States. Both species are highly adaptable and emerge continuously throughout the summer months, presenting the need for a residual PRE application in soybean. To improve season-long control of Amaranthus spp., 19 PRE treatments were evaluated on glyphosate-resistant Palmer amaranth in 2013 and 2014 at locations in Arkansas, Indiana, Nebraska, Illinois, and Tennessee; and on glyphosate-resistant waterhemp at locations in Illinois, Missouri, and Nebraska. The two Amaranthus species were analyzed separately; data for each species were pooled across site-years, and site-year was included as a random variable in the analyses. The dissipation of weed control throughout the course of the experiments was compared among treatments with the use of regression analysis, where percent weed control was described as a function of time (the number of weeks after treatment [WAT]). At the mean WAT (4.3 and 3.2 WAT for Palmer amaranth and waterhemp, respectively), isoxaflutole + S-metolachlor + metribuzin had the highest predicted control of Palmer amaranth (98%) and waterhemp (99%). Isoxaflutole + S-metolachlor + metribuzin, S-metolachlor + mesotrione, and flumioxazin + pyroxasulfone had a predicted control ≥ 97% and similar model parameter estimates, indicating control declined at similar rates for these treatments. Dicamba and 2,4-D provided some short-lived residual control of Amaranthus spp. When dicamba was added to metribuzin or S-metolachlor, control increased compared to dicamba alone. Flumioxazin + pyroxasulfone, a currently labeled PRE, performed similarly to treatments containing isoxaflutole or mesotrione. Additional sites of action will provide soybean growers with more opportunities to control these weeds and reduce the potential for herbicide resistance.
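The regression approach described above, modelling percent control as a function of WAT and comparing treatments at the mean WAT, can be sketched as follows. The observations and the straight-line form below are illustrative stand-ins, not the study's data or its fitted model:

```python
import numpy as np

# Hypothetical plot-level observations for one PRE treatment:
# weeks after treatment (WAT) and percent weed control
wat     = np.array([1, 2, 3, 4, 5, 6], dtype=float)
control = np.array([99, 98, 97, 95, 92, 88], dtype=float)

# Simple linear fit: control = b0 + b1 * WAT
# (np.polyfit returns highest-degree coefficient first)
b1, b0 = np.polyfit(wat, control, 1)

# Predicted control at the mean WAT for Palmer amaranth (4.3, as above),
# the point at which treatments are compared
pred_at_mean = b0 + b1 * 4.3
```

Comparing treatments at a common evaluation time (the mean WAT) puts all predicted-control values on the same footing, and similar slope estimates indicate similar rates of control decline.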