Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter, including weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest may escape both early-season management and HWSC.
Potential effectiveness of harvest weed seed control (HWSC) systems depends on seed shatter of the target weed species at crop maturity, which enables seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites across 14 states in the southern, northern, and mid-Atlantic United States. Weeds retained greater proportions of their seeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter in common ragweed (Ambrosia artemisiifolia L.) varied widely (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
Organic grain producers are interested in interseeding cover crops into corn (Zea mays L.) in regions that have a narrow growing season window for post-harvest establishment of cover crops. A field experiment was replicated across 2 years on three commercial organic farms in Pennsylvania to compare the effects of drill- and broadcast-interseeding to standard grower practices, which included post-harvest seeding cereal rye (Secale cereale L.) at the more southern location and winter fallow at the more northern locations. Drill- and broadcast-interseeding treatments occurred just after last cultivation and used a cover crop mixture of annual ryegrass [Lolium perenne L. ssp. multiflorum (Lam.) Husnot] + orchardgrass (Dactylis glomerata L.) + forage radish (Raphanus sativus L. ssp. longipinnatus). Higher mean fall cover crop biomass and forage radish abundance (% of total) were observed in drill-interseeding treatments than in broadcast-interseeding treatments. However, corn grain yield, weed suppression, and N retention in late fall and spring were similar among interseeding treatments, which suggests that broadcast-interseeding at last cultivation has the potential to produce similar production and conservation benefits at lower labor and equipment costs in organic systems. Post-harvest seeding cereal rye resulted in greater spring biomass production and N retention compared with interseeded cover crops at the southern location, whereas variable interseeding establishment success and dominance of winter-killed forage radish produced conditions that increased the likelihood of N loss at more northern locations. Additional research is needed to contrast conservation benefits and management tradeoffs between interseeding and post-harvest establishment methods.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean [Glycine max (L.) Merr.] sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.), and forage radish (Raphanus sativus L.) interseeded into grain corn was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance than the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
Proactive integrated weed management (IWM) is critically needed in no-till production to reduce the intensity of selection pressure for herbicide-resistant weeds. Reducing the density of emerged weed populations and the number of larger individuals within the population at the time of herbicide application are two practical management objectives when integrating cover crops as a complementary tactic in herbicide-based production systems. We examined the following demographic questions related to the effects of alternative cover-cropping tactics following small grain harvest on preplant, burndown management of horseweed (Erigeron canadensis L.) in no-till commodity-grain production: (1) Do cover crops differentially affect E. canadensis density and size inequality at the time of herbicide exposure? (2) Which cover crop response traits are drivers of E. canadensis suppression at the time of herbicide exposure? Interannual variation in growing conditions (study year) and intra-annual variation in soil fertility (low vs. high nitrogen) were the primary drivers of cover crop response traits and significantly affected E. canadensis density at the time of herbicide exposure. In comparison to the fallow control, cover crop treatments reduced E. canadensis density 52% to 86% at the time of a preplant, burndown application. Cereal rye (Secale cereale L.) alone or in combination with forage radish (Raphanus sativus L.) provided the most consistent E. canadensis suppression. Fall and spring cover crop biomass production was negatively correlated with E. canadensis density at the preplant burndown application timing. Our results also show that winter-hardy cover crops reduce the size inequality of E. canadensis populations at the time of herbicide exposure by reducing the number of large individuals within the population.
Finally, we advocate for advancement in our understanding of complementarity between cover crop– and herbicide-based management tactics in no-till systems to facilitate development of proactive, herbicide-resistant management strategies.
The triazines are one of the most widely used herbicide classes ever developed and are critical for managing weed populations that have developed herbicide resistance. These herbicides are traditionally valued for their residual weed control in more than 50 crops. Scientific literature suggests that atrazine, and perhaps other s-triazines, may no longer remain persistent in soils due to enhanced microbial degradation. Experiments examined the rate of degradation of atrazine and two other triazine herbicides, simazine and metribuzin, in both atrazine-adapted and non-history Corn Belt soils; similar soils from each state were paired to compare potential triazine degradation. In three soils with no history of atrazine use, the half-life (t1/2) of atrazine was at least four times greater than in three soils with a history of atrazine use. Simazine degradation in the same three sets of soils was 2.4 to 15 times more rapid in history soils than in non-history soils. Metribuzin in history soils degraded at 0.6, 0.9, and 1.9 times the rate seen in the same three non-history soils. These results indicate enhanced degradation of the symmetrical triazine simazine, but not of the asymmetrical triazine metribuzin.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. The facilitators and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uremic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
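The case–control analysis above hinges on odds ratios and 95% confidence intervals. As a rough illustration of the crude (unadjusted) calculation from a 2×2 table, the sketch below uses a Wald interval on the log-odds scale; the counts are hypothetical, not the study's data, and the study's adjusted OR would additionally require a regression model.

```python
import math

# Hypothetical 2x2 counts (NOT from the investigation):
# exposure = handwashing before lunch, outcome = illness.
a, b = 3, 24   # ill:  washed / did not wash
c, d = 60, 28  # well: washed / did not wash

# Crude odds ratio: odds of exposure among cases vs. controls.
odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds scale.
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

With these made-up counts the interval lies entirely below 1, which is how a protective exposure (such as the handwashing result reported above, adjusted OR 0.13) presents in an odds-ratio analysis.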
In the mid-Atlantic region, there is increasing interest in the use of intercropping strategies to establish cover crops in corn cropping systems. However, intercropping may be limited by potential injury to cover crops from residual herbicide programs. Field experiments were conducted from 2013 to 2015 at Pennsylvania, Maryland, and New York locations (n = 8) to evaluate the effect of common residual corn herbicides on interseeded red clover and annual ryegrass. Cover crop establishment and response to herbicide treatments varied across sites and years. S-metolachlor, pyroxasulfone, pendimethalin, and dimethenamid-P reduced annual ryegrass biomass relative to the nontreated check, whereas annual ryegrass biomass in acetochlor treatments was no different from the nontreated check. The rank order of observed annual ryegrass biomass reduction among chloroacetamide herbicides was S-metolachlor > pyroxasulfone > dimethenamid-P > acetochlor. Annual ryegrass biomass was not reduced by any of the broadleaf-control herbicides. Mesotrione reduced red clover biomass by 80% compared with the nontreated check. Red clover biomass in saflufenacil, rimsulfuron, and atrazine treatments did not differ from the nontreated check, and red clover was not reduced by any of the grass-control herbicides. This research suggests that annual ryegrass and red clover can be successfully interseeded in silt loam soils of Pennsylvania following use of several shorter-lived residual corn herbicides, but further research is needed in areas with soil types other than silt loam or outside the mid-Atlantic cropping region.
This study examined the response of corn to clomazone, chlorimuron, imazaquin, and imazethapyr the year following their application to soybeans. Herbicides were surface-applied at one-half to three times the labeled application rates. Soybeans were planted the year of application and crop tolerance was evaluated. Corn was planted in rotation the following season. Soybeans were tolerant of all four herbicides. The highest rates of clomazone, chlorimuron, and imazaquin injured the corn early in the season. Imazethapyr did not influence corn growth. Visual estimates of clomazone injury were as high as 39% chlorosis. Seedling dry weight reductions at the highest chlorimuron and imazaquin rates were 32% and 24%, respectively. Although corn was injured by higher rates of clomazone, imazaquin, and chlorimuron at the 3-leaf stage, none of the herbicides significantly reduced grain yield. This study suggests that these herbicides can carry over and injure corn, especially if labeled application rates are exceeded. However, low to moderate early-season injury to corn may not affect grain yield.
Studies were conducted to evaluate the effect of wild oat (Avena fatua L. # AVEFA) interference in lentils (Lens culinaris Medik). An infestation of 32 and 65 wild oats/m2 maintained up to 5 weeks in the field did not reduce lentil grain yield. However, 32 wild oats/m2 reduced yields 32% when allowed to remain for 7 weeks and 49% if they remained until harvest time (11 weeks). Sixty-five wild oats/m2 reduced grain yield 42 and 61% for the same time periods, respectively. In the growth chamber, 69 wild oats/m2 reduced lentil plant dry weight 29% if allowed to remain for 3 weeks, 61% for 5 weeks, and 72% for 7 weeks (harvest time). The field data suggest that wild oat control measures may be delayed for several weeks after lentil emergence without reducing crop yield.
Field experiments were conducted in 1992 and 1993 to evaluate wirestem muhly control in no-till corn with application of glyphosate, nicosulfuron, and primisulfuron. Glyphosate was applied preplant at 1.1 kg ai/ha. Nicosulfuron and primisulfuron were applied at 0.018, 0.036, and 0.072 kg ai/ha and 0.020, 0.040, and 0.080 kg ai/ha, respectively, at four postemergence timings that included a split application. Similar experiments were conducted with wirestem muhly grown from rhizomes and seed in the greenhouse. Glyphosate was the most effective herbicide in the greenhouse, providing at least 96% control. However, preplant application of glyphosate in the field was ineffective in controlling wirestem muhly. On average, nicosulfuron and primisulfuron never exceeded 72% control of wirestem muhly in the greenhouse or in the field. Nicosulfuron was generally more effective than primisulfuron. Control with split application timings was more uniform over a 12-wk period than with single applications, and late postemergence applications were often too slow-acting to affect wirestem muhly growth. Although neither nicosulfuron nor primisulfuron controls wirestem muhly, both can provide suppression of this weed where other alternatives do not exist.
A computer model that selects least-cost herbicide programs given a minimum desired level of weed control could provide growers with economical weed management options. Using an integer programming approach, a herbicide selection model was developed for corn production under Pennsylvania conditions. Models for three rotations (corn-soybean, corn-corn, and corn-alfalfa) under three tillage systems (conventional tillage, reduced tillage, and no-till) evaluated 21 soil-applied and 13 postemergence herbicide options for 24 weeds. Each model minimizes the cost of a herbicide program subject to a desired level of weed control. By selecting the weed species to be controlled and the level of control desired, users can generate customized herbicide programs. The models can also be used to evaluate the cost of changing the desired level of control for an individual weed species or set of weeds.
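The core optimization described above — minimize program cost subject to per-weed control thresholds — can be sketched in miniature. The example below is not the authors' model: the herbicide names, costs, and control fractions are invented, exhaustive subset enumeration stands in for a formal integer-programming solver, and combined control is assumed to follow independent action (one minus the product of uncontrolled fractions).

```python
from itertools import combinations

# Hypothetical example data (illustrative only, not from the study):
# cost in $/ha, control as the fraction of each weed controlled.
HERBICIDES = {
    "A": {"cost": 18.0, "control": {"foxtail": 0.90, "lambsquarters": 0.40}},
    "B": {"cost": 12.0, "control": {"foxtail": 0.30, "lambsquarters": 0.85}},
    "C": {"cost": 25.0, "control": {"foxtail": 0.95, "lambsquarters": 0.90}},
}

def combined_control(program, weed):
    """Control from a set of herbicides, assuming independent action:
    1 minus the product of the uncontrolled fractions."""
    uncontrolled = 1.0
    for h in program:
        uncontrolled *= 1.0 - HERBICIDES[h]["control"].get(weed, 0.0)
    return 1.0 - uncontrolled

def least_cost_program(targets):
    """Enumerate all herbicide subsets (feasible at this toy scale) and
    return the cheapest one meeting every desired control level."""
    best, best_cost = None, float("inf")
    names = list(HERBICIDES)
    for r in range(1, len(names) + 1):
        for program in combinations(names, r):
            if all(combined_control(program, w) >= lvl
                   for w, lvl in targets.items()):
                cost = sum(HERBICIDES[h]["cost"] for h in program)
                if cost < best_cost:
                    best, best_cost = program, cost
    return best, best_cost

program, cost = least_cost_program({"foxtail": 0.85, "lambsquarters": 0.80})
print(program, cost)
```

At the scale reported above (34 herbicide options, 24 weeds), brute force is impractical; a binary-variable formulation handed to a MILP solver serves the same minimize-cost-subject-to-control-constraints structure, which is presumably what the integer programming approach entails.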
Biological fitness and negative cross-resistance to other herbicides may be important factors in managing triazine-resistant common lambsquarters. Greenhouse experiments examined the sensitivity of a resistant and a susceptible biotype to foliar-applied bentazon, bromoxynil, dicamba, pyridate, and thifensulfuron. The noncompetitive vigor of triazine-resistant and -susceptible common lambsquarters was also compared by growing plants in individual containers and harvesting them periodically throughout their vegetative period and at reproductive maturity. In the herbicide susceptibility study, 11 kg ai ha−1 atrazine had no effect on the growth of the resistant biotype, while it reduced susceptible common lambsquarters biomass by up to 68%. Estimated I50 values indicated the resistant biotype exhibited between 36 and 79% greater susceptibility to bentazon, bromoxynil, dicamba, and pyridate than did the susceptible one, while both responded similarly to thifensulfuron. In growth studies, the susceptible biotype achieved greater height, leaf area, and plant dry weight than the resistant population for the majority of harvest dates; however, values equalized between biotypes as the plants reached maturity. These experiments suggest that alternative management programs that exploit reduced fitness and increased herbicide susceptibility in triazine-resistant common lambsquarters could be developed. However, further studies are needed to determine whether these results have application for the management of triazine-resistant weeds in the field.
Influences of a hairy vetch cover crop and residual herbicides were examined in field corn in 1991 and 1992. Hairy vetch was seeded in mid-August and killed the following May with tillage, mowing, or glyphosate plus 2,4-D (no-till). These cover crop management systems were compared with a no-cover treatment. Residual herbicides, including atrazine plus metolachlor applied PRE at three rates and nicosulfuron plus thifensulfuron applied POST at a single rate, were compared within cover crop management systems. All cover crop management systems effectively controlled hairy vetch except mowing in 1992. The corn population was reduced in mow treatments containing uncontrolled vetch. Hairy vetch mulch suppressed some weeds in the no-till treatments in 1991, but more annual grass was noted late in the season with no-till into hairy vetch than with the no-cover treatments in 1992. Residual herbicide performance was similar across cover crop management systems, except for fall panicum control, which decreased in some no-till systems. Unlike soil-applied herbicides, performance of POST herbicides was unaffected by cover crop management systems.
A postemergence (POST) timing study was conducted on established populations of burcucumber (Sicyos angulatus) in corn (Zea mays), and a second study examined the residual activity of several herbicides for burcucumber control under greenhouse conditions. In the field study, flumiclorac, halosulfuron, primisulfuron, CGA 152005, and CGA 152005 + primisulfuron (45, 71, 40, 40, and 20 + 20 g ai/ha, respectively) were applied at two POST timings. CGA 152005, primisulfuron, and the combination provided greater than 85% control of burcucumber 14 wk after planting (WAP). Flumiclorac and halosulfuron provided 60% control or less by 8 WAP. Timing of the POST applications did not influence burcucumber control by 11 WAP with any herbicide. In the greenhouse, germinated burcucumber seeds were placed in soil treated with atrazine, chlorimuron, primisulfuron, or CGA 152005 at normal field use rates. All treatments provided similar residual control early; however, by 4 wk after treatment (WAT), control from atrazine was less than 10% compared with 69% for chlorimuron and about 50% for primisulfuron and CGA 152005. This research suggests that CGA 152005 and primisulfuron can both be effective for managing burcucumber in corn, whereas flumiclorac and halosulfuron proved ineffective.