Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of total score correlations and the diagnostic accuracy of the PHQ-8 and PHQ-9 for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
A total of 16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
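The cutoff analysis described above can be sketched in a few lines: for a given cutoff, sensitivity and specificity follow directly from counts of true/false positives and negatives, and the "optimal" cutoff is the one maximizing their sum (Youden's index). The scores and diagnoses below are invented for illustration, not study data.

```python
# Sketch: sensitivity/specificity of "score >= cutoff" against a reference
# diagnosis, and the cutoff maximizing sensitivity + specificity.
# All data here are hypothetical.

def sens_spec(scores, cases, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff'."""
    tp = sum(1 for s, c in zip(scores, cases) if c and s >= cutoff)
    fn = sum(1 for s, c in zip(scores, cases) if c and s < cutoff)
    tn = sum(1 for s, c in zip(scores, cases) if not c and s < cutoff)
    fp = sum(1 for s, c in zip(scores, cases) if not c and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def best_cutoff(scores, cases, cutoffs=range(5, 16)):
    """Cutoff maximizing sensitivity + specificity (Youden's J + 1)."""
    return max(cutoffs, key=lambda k: sum(sens_spec(scores, cases, k)))

# Illustrative total scores (0-27) with reference-standard diagnoses.
scores = [2, 4, 5, 8, 9, 11, 12, 14, 17, 20, 3, 7, 10, 13, 19]
cases  = [0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1]
sens, spec = sens_spec(scores, cases, 10)
```

In the meta-analysis itself these quantities were pooled across studies with bivariate random-effects models rather than computed on a single sample as here.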
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study identified 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly greater proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain respond to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
Young people with 22q11.2 deletion syndrome (22q11.2DS) are at high risk for neurodevelopmental disorders. Sleep problems may play a role in this risk but their prevalence, nature and links to psychopathology and cognitive function remain undescribed in this population.
Sleep problems, psychopathology, developmental coordination and cognitive function were assessed in 140 young people with 22q11.2DS (mean age = 10.1, s.d. = 2.46) and 65 unaffected sibling controls (mean age = 10.8, s.d. = 2.26). Primary carers completed questionnaires screening for the children's developmental coordination and autism spectrum disorder.
Sleep problems were identified in 60% of young people with 22q11.2DS compared to 23% of sibling controls (OR 5.00, p < 0.001). Two patterns best described sleep problems in 22q11.2DS: restless sleep and insomnia. Restless sleep was linked to increased ADHD symptoms (OR 1.16, p < 0.001) and impaired executive function (OR 0.975, p = 0.013). Both patterns were associated with elevated symptoms of anxiety disorder (restless sleep: OR 1.10, p = 0.006 and insomnia: OR 1.07, p = 0.045) and developmental coordination disorder (OR 0.968, p = 0.0023, and OR 0.955, p = 0.009). The insomnia pattern was also linked to elevated conduct disorder symptoms (OR 1.53, p = 0.020).
Clinicians and carers should be aware that sleep problems are common in 22q11.2DS and index psychiatric risk, cognitive deficits and motor coordination problems. Future studies should explore the physiology of sleep and its links with neurodevelopment in these young people.
Gut cell losses contribute to overall feed efficiency due to the energy requirement for cell replenishment. Intestinal epithelial cells are sloughed into the intestinal lumen as digesta passes through the gastrointestinal tract, where cells are degraded by endonucleases. This leads to fragmented DNA being present in faeces, which may be an indicator of gut cell loss. Therefore, measuring host faecal DNA content could have potential as a non-invasive marker of gut cell loss and result in a novel technique for the assessment of how different feed ingredients impact upon gut health. Faecal calprotectin (CALP) is a marker of intestinal inflammation. This was a pilot study designed to test a methodology for extracting and quantifying DNA from pig faeces, and to assess whether any differences in host faecal DNA and CALP could be detected. An additional aim was to determine whether any differences in the above measures were related to the pig performance response to dietary yeast-enriched protein concentrate (YPC). Newly weaned (∼26.5 days of age) Large White × Landrace × Pietrain piglets (8.37 ± 1.10 kg, n = 180) were assigned to one of four treatment groups (nine replicates of five pigs), differing in dietary YPC content: 0% (control), 2.5%, 5% and 7.5% (w/w). Pooled faecal samples were collected on days 14 and 28 of the 36-day trial. Deoxyribonucleic acid was extracted and quantitative PCR was used to assess DNA composition. Pig genomic DNA was detected using primers specific for the pig cytochrome b (CYTB) gene, and bacterial DNA was detected using universal 16S primers. A pig CALP ELISA was used to assess gut inflammation. Dietary YPC significantly reduced feed conversion ratio (FCR) from weaning to day 14 (P<0.001), but not from day 14 to day 28 (P = 0.220). Pig faecal CYTB DNA content was significantly (P = 0.008) reduced in YPC-treated pigs, with no effect of time, whereas total faecal bacterial DNA content was unaffected by diet or time (P>0.05).
Faecal CALP levels were significantly higher at day 14 compared with day 28, but there was no effect of YPC inclusion and no relationship with FCR. In conclusion, YPC reduced faecal CYTB DNA content and this correlated positively with FCR, but was unrelated to gut inflammation, suggesting that it could be a non-invasive marker of gut cell loss. However, further validation experiments by an independent method are required to verify the origin of pig faecal CYTB DNA as being from sloughed intestinal epithelial cells.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting HOB either alone (22%) or in addition to CLABSI (35%), and 34% favored CLABSI alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
There is increasing evidence for shared genetic susceptibility between schizophrenia and bipolar disorder. Although genetic variants only convey subtle increases in risk individually, their combination into a polygenic risk score constitutes a strong disease predictor.
To investigate whether schizophrenia and bipolar disorder polygenic risk scores can distinguish people with broadly defined psychosis and their unaffected relatives from controls.
Using the latest Psychiatric Genomics Consortium data, we calculated schizophrenia and bipolar disorder polygenic risk scores for 1168 people with psychosis, 552 unaffected relatives and 1472 controls.
Patients with broadly defined psychosis had dramatic increases in schizophrenia and bipolar polygenic risk scores, as did their relatives, albeit to a lesser degree. However, the accuracy of predictive models was modest.
Although polygenic risk scores are not ready for clinical use, it is hoped that as they are refined they could help towards risk reduction advice and early interventions for psychosis.
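The polygenic risk scores used here are, at their core, weighted allele counts: each scored variant contributes its risk-allele dosage multiplied by an effect size from a discovery GWAS. A minimal sketch with made-up SNP identifiers and weights (the study's actual scores were derived from Psychiatric Genomics Consortium summary statistics):

```python
# Sketch: a polygenic risk score as a weighted sum of risk-allele dosages.
# SNP names and weights below are hypothetical, for illustration only.

def polygenic_risk_score(genotype, weights):
    """Sum of (risk-allele dosage) x (per-allele log odds ratio)."""
    return sum(weights[snp] * dosage
               for snp, dosage in genotype.items()
               if snp in weights)

# Hypothetical per-SNP log odds ratios from a discovery GWAS.
weights = {"rs0001": 0.05, "rs0002": -0.02, "rs0003": 0.08}
# One person's risk-allele dosages (0, 1 or 2 copies of each risk allele).
genotype = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(genotype, weights)  # 0.05*2 - 0.02*1 = 0.08
```

In practice such scores are computed over hundreds of thousands of variants after quality control and linkage-disequilibrium pruning, typically with dedicated tools rather than hand-rolled code.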
Declaration of interest
R.M.M. has received honoraria for lectures from Janssen, Lundbeck, Lilly, Otsuka and Sunovion.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of Astra Zeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections, to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
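The "automated deterministic matching" applied by each jurisdiction links records only when a fixed set of identifying fields agrees exactly (after normalisation), in contrast to probabilistic linkage. A minimal sketch with hypothetical field names and records:

```python
# Sketch of deterministic record linkage between two surveillance registries.
# Field names and records are invented; real systems use jurisdiction-specific
# identifiers and normalisation rules.

KEY_FIELDS = ("last_name", "first_name", "dob")

def match_key(record):
    """Normalised tuple of identifying fields used for exact matching."""
    return tuple(str(record[f]).strip().lower() for f in KEY_FIELDS)

def link(hiv_records, hepatitis_records):
    """Return hepatitis records whose key exactly matches a reported HIV case."""
    hiv_keys = {match_key(r) for r in hiv_records}
    return [r for r in hepatitis_records if match_key(r) in hiv_keys]

hiv = [{"last_name": "Doe", "first_name": "Jan", "dob": "1980-01-02"}]
hep = [
    {"last_name": "DOE", "first_name": "Jan", "dob": "1980-01-02", "virus": "HCV"},
    {"last_name": "Smith", "first_name": "Al", "dob": "1975-05-06", "virus": "HBV"},
]
coinfected = link(hiv, hep)  # matches the first hepatitis record despite casing
```

Deterministic matching of this kind trades recall for simplicity: records with typos or name changes in the key fields will not link, which is one reason matched coinfection counts are best read as minimums.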
A range of endophenotypes characterise psychosis; however, there has been limited work on whether and how they are interrelated.
This multi-centre study includes 8754 participants: 2212 people with a psychotic disorder, 1487 unaffected relatives of probands, and 5055 healthy controls. We investigated cognition [digit span (N = 3127), block design (N = 5491), and the Rey Auditory Verbal Learning Test (N = 3543)], electrophysiology [P300 amplitude and latency (N = 1102)], and neuroanatomy [lateral ventricular volume (N = 1721)]. We used linear regression to assess the interrelationships between endophenotypes.
The P300 amplitude and latency were not associated (regression coef. −0.06, 95% CI −0.12 to 0.01, p = 0.060), and P300 amplitude was positively associated with block design (coef. 0.19, 95% CI 0.10–0.28, p < 0.001). There was no evidence of associations between lateral ventricular volume and the other measures (all p > 0.38). All the cognitive endophenotypes were associated with each other in the expected directions (all p < 0.001). Lastly, the relationships between pairs of endophenotypes were consistent in all three participant groups, differing for some of the cognitive pairings only in the strengths of the relationships.
The P300 amplitude and latency are independent endophenotypes: the former indexes spatial visualisation and working memory, whereas the latter is hypothesised to index basic processing speed. Individuals with psychotic illnesses, their unaffected relatives, and healthy controls all show similar patterns of associations between endophenotypes, endorsing the theory of a continuum of psychosis liability across the population.
Schizophrenia is a highly heritable disorder, linked to several structural abnormalities of the brain. More specifically, previous findings have suggested that increased gyrification in frontal and temporal regions is implicated in the pathogenesis of schizophrenia.
The current study included participants at high familial risk of schizophrenia who remained well (n = 31), who developed sub-diagnostic symptoms (n = 28) and who developed schizophrenia (n = 9) as well as healthy controls (HC) (n = 16). We first tested whether individuals at high familial risk of schizophrenia carried an increased burden of trait-associated alleles using polygenic risk score analysis. We then assessed the extent to which polygenic risk was associated with gyral folding in the frontal and temporal lobes.
We found that individuals at high familial risk of schizophrenia who developed schizophrenia carried a significantly greater burden of risk-conferring variants for the disorder compared with those at high risk (HR) who developed sub-diagnostic symptoms or remained well, and with HC. Furthermore, within the HR cohort, there was a significant and positive association between schizophrenia polygenic risk score and bilateral frontal gyrification.
These results suggest that polygenic risk for schizophrenia impacts upon early neurodevelopment to confer greater gyral folding in adulthood and an increased risk of developing the disorder.
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, performance and reliability must be improved and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
Although contamination of food can occur at any point from farm to table, restaurant food workers are a common source of foodborne illness. We describe the characteristics of restaurant-associated foodborne disease outbreaks and explore the role of food workers by analysing outbreaks associated with restaurants from 1998 to 2013 reported to the Centers for Disease Control and Prevention's Foodborne Disease Outbreak Surveillance System. We identified 9788 restaurant-associated outbreaks. The median annual number of outbreaks was 620 (interquartile range 618–629). In 3072 outbreaks with a single confirmed aetiology reported, norovirus caused the largest number of outbreaks (1425, 46%). Of outbreaks with a single food reported and a confirmed aetiology, fish (254 outbreaks, 34%) was most commonly implicated, and these outbreaks were commonly caused by scombroid toxin (219 outbreaks, 86% of fish outbreaks). Most outbreaks (79%) occurred at sit-down establishments. The most commonly reported contributing factors were those related to food handling and preparation practices in the restaurant (2955 outbreaks, 61%). Food workers contributed to 2415 (25%) outbreaks. Knowledge of the foods, aetiologies, and contributing factors that result in foodborne disease restaurant outbreaks can help guide efforts to prevent foodborne illness.
Ernietta plateauensis Pflug, 1966 is the type species of the Erniettomorpha, an extinct clade of Ediacaran life. It was likely a gregarious, partially infaunal organism. Despite its ecological and taxonomic significance, there has been no in-depth systematic description in the literature since the original description, which has since fallen out of use. A newly discovered field site on Farm Aar in southern Namibia has yielded dozens of specimens buried in original life position. Mudstone and sandstone features associated with the fossils indicate that the organisms were buried while still exposed to the water column rather than deposited in a flow event. Ernietta plateauensis was a sac-shaped erniettomorph with a body wall constructed from a double layer of tubes. It possessed an equatorial seam lying perpendicular to the tubes. The body is asymmetrical on either side of this seam. The tubes change direction along the body length and appear to be constricted together in the dorsal part of the organism.
Previous neuroimaging studies indicate abnormalities in cortico-limbic circuitry in mood disorder. Here we employ prospective longitudinal voxel-based morphometry to examine the trajectory of these abnormalities during early stages of illness development.
Unaffected individuals (16–25 years) at high and low familial risk of mood disorder underwent structural brain imaging on two occasions 2 years apart. Further clinical assessment was conducted 2 years after the second scan (time 3). Clinical outcome data at time 3 were used to categorize individuals: (i) healthy controls (‘low risk’, n = 48); (ii) high-risk individuals who remained well (HR well, n = 53); and (iii) high-risk individuals who developed a major depressive disorder (HR MDD, n = 30). Groups were compared using longitudinal voxel-based morphometry. We also examined whether progress to illness was associated with changes in other potential risk markers (personality traits, symptom scores and baseline measures of childhood trauma), and whether any changes in brain structure could be indexed using these measures.
Significant decreases in right amygdala grey matter were found in HR MDD v. controls (p = 0.001) and v. HR well (p = 0.005). This structural change was not related to measures of childhood trauma, symptom severity or measures of sub-diagnostic anxiety, neuroticism or extraversion, although cross-sectionally these measures significantly differentiated the groups at baseline.
These longitudinal findings implicate structural amygdala changes in the neurobiology of mood disorder. They also provide a potential biomarker for risk stratification capturing additional information beyond clinically ascertained measures.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension. Current established protocols (RUSH, ACES, etc.) were developed from expert user opinion rather than objective, prospective data. We wished to use reported disease incidence to develop an informed approach to PoCUS in hypotension using a “4 F’s” approach: Fluid; Form; Function; Filling. Methods: We summarized the incidence of PoCUS findings from an international multicentre RCT, and using a modified Delphi approach incorporating these data we obtained the input of 24 international experts associated with five professional organizations led by the International Federation of Emergency Medicine. The modified Delphi tool was developed to reach an international consensus on how to integrate PoCUS for hypotensive emergency department patients. Results: Rates of abnormal PoCUS findings from 151 patients with undifferentiated hypotension included left ventricular dynamic changes (43%), IVC abnormalities (27%), pericardial effusion (16%), and pleural fluid (8%). Abdominal pathology was rare (fluid 5%, AAA 2%). After two rounds of the survey, using majority consensus, agreement was reached on a SHoC-hypotension protocol comprising: A. Core: 1. Cardiac views (sub-xiphoid and parasternal windows for pericardial fluid, cardiac form and ventricular function); 2. Lung views for pleural fluid and B-lines for filling status; and 3. IVC views for filling status; B. Supplementary: Additional cardiac views; and C. Additional views (when indicated) including peritoneal fluid, aorta, pelvic for IUP, and proximal leg veins for DVT. Conclusion: An international consensus process based on prospectively collected disease incidence has led to a proposed SHoC-hypotension PoCUS protocol comprising a stepwise, clinical-indication-based approach of Core, Supplementary and Additional PoCUS views.
Introduction: Point of care ultrasound (PoCUS) provides invaluable information during resuscitation efforts in cardiac arrest by determining presence/absence of cardiac activity and identifying reversible causes such as pericardial tamponade. There is no agreed guideline on how to safely and effectively incorporate PoCUS into the advanced cardiac life support (ACLS) algorithm. We consider that a consensus-based priority checklist using a “4 F’s” approach (Fluid; Form; Function; Filling) would provide a better algorithm during ACLS. Methods: The ultrasound subcommittee of the Australasian College for Emergency Medicine (ACEM) drafted a checklist incorporating PoCUS into the ACLS algorithm. This was further developed using the input of 24 international experts associated with five professional organizations led by the International Federation of Emergency Medicine. A modified Delphi tool was developed to reach an international consensus on how to integrate ultrasound into cardiac arrest algorithms for emergency department patients. Results: Consensus was reached following three rounds. The agreed protocol focuses on the timing of PoCUS as well as the specific clinical questions. Core cardiac windows performed during the rhythm check pause in chest compressions are the sub-xiphoid and parasternal cardiac views. Either view should be used to detect pericardial fluid, as well as examining ventricular form (e.g. right heart strain) and function (e.g. asystole versus organized cardiac activity). Supplementary views include lung views (for absent lung sliding in pneumothorax and for pleural fluid), and IVC views for filling. Additional ultrasound applications are for endotracheal tube confirmation, proximal leg veins for DVT, or for sources of blood loss (AAA, peritoneal/pelvic fluid). Conclusion: The authors hope that this process will lead to a consensus-based SHoC-cardiac arrest guideline on incorporating PoCUS into the ACLS algorithm.
Our knowledge of the universe comes from recording the photon and particle fluxes incident on the Earth from space. We thus require sensitive measurement across the entire energy spectrum, using large telescopes with efficient instrumentation located on superb sites. Technological advances and engineering constraints are nearing the point where we are recording as many photons arriving at a site as is possible. Major advances in the future will come from improving the quality of the site. The ultimate site is, of course, beyond the Earth’s atmosphere, such as on the Moon, but economic limitations prevent our exploiting this avenue to the degree that the scientific community desires. Here we describe an alternative, which offers many of the advantages of space for a fraction of the cost: the Antarctic Plateau.
In western Canada, more money is spent on herbicides to control wild oat
than to control any other weed, and wild oat resistance to herbicides is the
most widespread resistance issue.
widespread resistance issue. A direct-seeded field experiment was conducted
from 2010 to 2014 at eight Canadian sites to determine crop life cycle, crop
species, crop seeding rate, crop usage, and herbicide rate combination
effects on wild oat management and canola yield. Combining 2× seeding rates
of early-cut barley silage with 2× seeding rates of winter cereals and
excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to
similar wild oat density, aboveground wild oat biomass, wild oat seed
density in the soil, and canola yield as a repeated canola–wheat rotation
under a full wild oat herbicide rate regime. Wild oat was similarly well
managed after 3 yr of perennial alfalfa without wild oat herbicides.
Forgoing wild oat herbicides in only 2 of 5 yr from exclusively summer
annual crop rotations resulted in higher wild oat density, biomass, and seed
banks. Management systems that effectively combine diverse and optimal
cultural practices against weeds, and limit herbicide use, reduce selection
pressure for weed resistance to herbicides and prolong the utility of
threatened herbicide tools.