Background: With the emergence of antibiotic-resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data that describe antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition for multidrug resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp., and Enterococcus spp. represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible. Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp., the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
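The pooled-percentage computation described in the Methods can be illustrated with a short sketch. The Python snippet below is illustrative only (the actual analysis was performed in SAS v9.4); the record layout and example rows are hypothetical, and for simplicity the ≥20-isolate threshold is applied per organism/class group rather than per organism as in the abstract.

```python
from collections import defaultdict

# Hypothetical isolate-level records: (organism, antibiotic class, result),
# where result is "S" (susceptible), "NS" (resistant or intermediate),
# or "NT" (not tested).
records = [
    ("E. coli", "fluoroquinolones", "NS"),
    ("E. coli", "fluoroquinolones", "S"),
    ("P. mirabilis", "fluoroquinolones", "NS"),
]

def pooled_percentages(records, min_isolates=20):
    """Pooled % tested and % non-susceptible for each organism/class pair,
    keeping only groups with at least `min_isolates` isolates."""
    counts = defaultdict(lambda: {"S": 0, "NS": 0, "NT": 0})
    for organism, agent_class, result in records:
        counts[(organism, agent_class)][result] += 1
    summary = {}
    for key, c in counts.items():
        total = c["S"] + c["NS"] + c["NT"]
        tested = c["S"] + c["NS"]
        if total < min_isolates:
            continue
        summary[key] = {
            "pct_tested": 100.0 * tested / total,
            "pct_non_susceptible": (100.0 * c["NS"] / tested) if tested else None,
        }
    return summary
```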
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and during the 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) adaptation of Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years (IQR, 19), 63% were female, and 35% were admitted for postacute care. There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus, to 27% for Loeb, to 33% for the UTI SBAR tool, to 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatments met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents with possible ACMS and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. The efficacy and safety of pregabalin for the treatment of adults and elderly patients with GAD have been demonstrated in 6 of 7 short-term clinical trials of 4 to 8 weeks' duration.
To characterise the long-term efficacy and safety of pregabalin in subjects with GAD.
Subjects were randomised to double-blind treatment with either high-dose pregabalin (450-600 mg/d), low-dose pregabalin (150-300 mg/d), or lorazepam (3-4 mg/d) for 3 months. Treatment was then extended for a further 3 months with active drug or blinded placebo.
At 3 months, mean change from baseline in Hamilton Anxiety Rating Scale (HAM-A) scores for high- and low-dose pregabalin and for lorazepam ranged from -16.0 to -17.4. Mean change from baseline in Clinical Global Impression-Severity (CGI-S) scores ranged from -2.1 to -2.3, and mean CGI-Improvement (CGI-I) scores were 1.9 for each active treatment group. At 6 months, improvement was retained in all 3 active drug groups, even when switched to placebo. Changes from baseline in HAM-A and CGI-S scores ranged from -14.9 to -19.0 and from -2.0 to -2.5, respectively. Mean CGI-I scores ranged from 1.5 to 2.3. The most frequently reported adverse events were insomnia, fatigue, dizziness, headache, and somnolence.
Efficacy was observed at 3 months, with maintained improvement in anxiety symptoms over 6 months of treatment. These results are consistent with previously reported efficacy and safety trials of shorter duration with pregabalin and lorazepam in subjects with GAD.
Psychiatry in the UK has longstanding recruitment problems (1). Evidence suggests the positive effects of clinical attachments on attitudes towards psychiatry are often transient (2). We therefore created the Psychiatry Early Experience Programme (PEEP) where year 1 medical students are paired with psychiatry trainees and shadow them at work. Students will ideally remain in PEEP throughout medical school, providing consistent exposure to psychiatry and a broad experience of its subspecialties.
1. To present PEEP
2. To assess:
a. Students’ baseline attitudes to psychiatry
b. PEEP’s impact on students’ attitudes to psychiatry
A prospective survey-based cohort study of King’s College London medical students.
PEEP started in 2013. In this cohort, all students who signed up were accepted.
Students’ attitudes towards psychiatry were assessed on recruitment using the ATP-30 questionnaire (3), and will be re-assessed annually.
127 students were recruited. Attitudes were positive overall. 73% listed psychiatry in their top three specialities. 95.3% agreed or strongly agreed that ‘psychiatric illness deserves at least as much attention as physical illness.’ 84.3% disagreed or strongly disagreed that ‘at times it is hard to think of psychiatrists as equal to other doctors.’
Baseline attitudes to psychiatry were positive. By March 2015 we aim to collect and analyse data on students’ attitudes after one year in PEEP. Through ongoing analysis of this and future cohorts, we aim to assess the impact of PEEP on improving attitudes to psychiatry and whether this will ultimately improve recruitment.
Pregabalin is indicated for the treatment of generalised anxiety disorder (GAD) in adults in Europe. When pregabalin is discontinued, a 1-week (minimum) taper is recommended to prevent potential discontinuation symptoms.
To evaluate whether a 1-week pregabalin taper, after 3 or 6 months of treatment, is associated with the development of discontinuation symptoms (including rebound anxiety) in subjects with GAD.
Subjects were randomised to double-blind treatment with low- (150-300 mg/d) or high-dose pregabalin (450-600 mg/d) or lorazepam (3-4 mg/d) for 3 months. After 3 months ~25% of subjects in each group (per the original randomisation) underwent a double-blind, 1-week taper, with substitution of placebo. The remaining subjects continued on active treatment for another 3 months and underwent the 1-week taper at 6 months.
Discontinuation after 3 months was associated with low mean changes in Physician Withdrawal Checklist (PWC) scores (range: +1.4 to +2.3) and Hamilton Anxiety Rating Scale (HAM-A) scores (range: +0.9 to +2.3) for each pregabalin dose and lorazepam. Discontinuation after 6 months was associated with low mean changes in PWC scores (range: -1.0 to +3.0) and HAM-A scores (range: -0.8 to +3.0) for all active drugs and placebo. Incidence of rebound anxiety during pregabalin taper was low and did not appear related to treatment dose or duration.
A 1-week taper following 3 or 6 months of pregabalin treatment was not associated with clinically meaningful discontinuation symptoms as evaluated by changes in the PWC and HAM-A rating scales.
A unique initiative at Guy’s, King’s and St Thomas’ School of Medicine is the Psychiatry Early Experience Programme (PEEP), which allows students to shadow psychiatry trainees at work several times a year. The students’ attitudes towards psychiatry and the scheme are regularly assessed, and initial results are already available.
Quaternary processes and environmental changes are often difficult to assess on remote subantarctic islands due to high surface erosion rates and overprinting of sedimentary products in locations that can be a challenge to access. We present a set of high-resolution, multichannel seismic lines and complementary multibeam bathymetry collected off the eastern (leeward) side of the subantarctic Auckland Islands, about 465 km south of New Zealand's South Island. These data constrain the erosive and depositional history of the island group, and they reveal an extensive system of sediment-filled valleys that extend offshore to depths that exceed glacial low-stand sea level. Although shallow, marine, U-shaped valleys and moraines are imaged, the rugged offshore geomorphology of the paleovalley floors and the stratigraphy of infill sediments suggest that the valley floors were shaped by subaerial fluvial erosion and subsequently filled by lacustrine, fjord, and fluvial sedimentary processes.
Evidence has been accumulating regarding alterations in components of the endocannabinoid system in patients with psychosis. Of all the putative risk factors associated with psychosis, being at clinical high-risk for psychosis (CHR) has the strongest association with the onset of psychosis, and exposure to childhood trauma has been linked to an increased risk of development of psychotic disorder. We aimed to investigate whether being at-risk for psychosis and exposure to childhood trauma were associated with altered endocannabinoid levels.
We compared 33 CHR participants with 58 healthy controls (HC) and collected information about previous exposure to childhood trauma as well as plasma samples to analyse endocannabinoid levels.
Individuals with both CHR and experience of childhood trauma had higher N-palmitoylethanolamine (p < 0.001) and anandamide (p < 0.001) levels in peripheral blood compared to HC and those with no childhood trauma. There was also a significant correlation between N-palmitoylethanolamine levels and symptoms as well as childhood trauma.
Our results suggest an association between CHR and/or childhood maltreatment and elevated endocannabinoid levels in peripheral blood, with a greater alteration in those with both CHR status and a history of childhood maltreatment than in those with either risk alone. Furthermore, endocannabinoid levels increased linearly with the number of risk factors, and elevated endocannabinoid levels correlated with the severity of CHR symptoms and the extent of childhood maltreatment. Further studies in larger cohorts, employing longitudinal designs, are needed to confirm these findings and delineate the precise role of endocannabinoid alterations in the pathophysiology of psychosis.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain postoperatively in CSM. Methods: This ambispective study included 402 patients identified through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P<0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P<0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly higher proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P<0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P<0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
The ALMA twenty-six arcmin² survey of GOODS-S at one millimeter (ASAGAO) is a deep (1σ ∼ 61 μJy beam⁻¹), wide-area (26 arcmin²) survey of a contiguous field at 1.2 mm. By combining with archival data, we obtained a deeper map of the same region (1σ ∼ 30 μJy beam⁻¹, synthesized beam size 0.59″ × 0.53″), providing the largest sample of sources (25 sources at 5σ, 45 sources at 4.5σ) among ALMA blank-field surveys. The median redshift of the 4.5σ sources is 2.4. The number counts show that 52% of the extragalactic background light at 1.2 mm is resolved into discrete sources. We create IR luminosity functions (LFs) at z = 1–3 and constrain the faintest luminosity of the LF at 2 < z < 3. The LFs are consistent with previous results based on other ALMA and SCUBA-2 observations, suggesting a positive luminosity evolution and negative density evolution.
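As a rough illustration of how the resolved fraction of the extragalactic background light (EBL) follows from a source catalogue, the sketch below divides the summed flux density of detected sources by an assumed total EBL surface brightness at 1.2 mm. All numbers are placeholders rather than ASAGAO measurements, and a real analysis would also correct for completeness, flux boosting, and the varying sensitivity across the map.

```python
import numpy as np

# Placeholder source flux densities (mJy) and an assumed total EBL level at
# 1.2 mm (Jy per deg^2); neither value is taken from the ASAGAO catalogue.
fluxes_mjy = np.array([0.21, 0.28, 0.35, 0.42, 0.58, 0.73, 0.91, 1.10])
survey_area_deg2 = 26.0 / 3600.0       # 26 arcmin^2 expressed in deg^2
ebl_jy_per_deg2 = 20.0                 # assumed background surface brightness

source_brightness = fluxes_mjy.sum() / 1000.0 / survey_area_deg2  # Jy per deg^2
print(f"resolved fraction of EBL: {source_brightness / ebl_jy_per_deg2:.1%}")
```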
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pre-test probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined risk levels as low (≤0.5%, no further testing), moderate (0.6%-5%, further testing required) and high (>5%, computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns. D-dimer can be used to reduce the probability of AAS in the moderate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pre-test probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
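The risk tiers in the algorithm map naturally onto a simple decision function. A minimal sketch, assuming the cut-offs quoted above (≤0.5%, 0.6%-5%, >5%); the function is illustrative only and is not the published tool:

```python
def aas_testing_recommendation(pretest_probability):
    """Map an estimated pre-test probability of acute aortic syndrome
    (as a fraction, e.g. 0.03 for 3%) to the testing tier in the abstract."""
    if pretest_probability <= 0.005:
        return "Low risk: no further testing"
    if pretest_probability <= 0.05:
        return "Moderate risk: further testing required (D-dimer may help)"
    return "High risk: CT, MRI, or transesophageal echocardiography"

print(aas_testing_recommendation(0.03))  # -> "Moderate risk: ..."
```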
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or within 3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
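For a 1:1 matched case-control design like this one, the crude matched odds ratio depends only on the discordant pairs, which a short sketch can show. The pair counts below are hypothetical and chosen only to demonstrate the arithmetic; the study's reported estimate (6.25; 95% CI 2.18–17.96) came from multivariable conditional logistic regression, not from this crude calculation.

```python
import math

def matched_or(b, c, z=1.96):
    """Crude matched odds ratio for 1:1 matched pairs.
    b: pairs with case exposed / control unexposed
    c: pairs with case unexposed / control exposed
    Returns (OR, 95% CI lower, upper) using a log-scale Wald interval."""
    or_hat = b / c
    se = math.sqrt(1.0 / b + 1.0 / c)
    return (or_hat,
            math.exp(math.log(or_hat) - z * se),
            math.exp(math.log(or_hat) + z * se))

# Hypothetical discordant-pair counts for antibiotic exposure.
print(matched_or(b=25, c=4))  # -> (6.25, ~2.18, ~17.96)
```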
Many patients with advanced serious illness or at the end of life experience delirium, a potentially reversible form of acute brain dysfunction, which may impair their ability to participate in medical decision-making and to engage with their loved ones. Screening for delirium provides an opportunity to address modifiable causes. Unfortunately, delirium remains underrecognized. The main objective of this pilot study was to validate the brief Confusion Assessment Method (bCAM), a two-minute delirium-screening tool, in a veteran palliative care sample.
This was a pilot prospective, observational study that included hospitalized patients evaluated by the palliative care service at a single Veterans’ Administration Medical Center. The bCAM was compared against the reference standard, the Diagnostic and Statistical Manual of Mental Disorders, fifth edition. Both assessments were blinded and conducted within 30 minutes of each other.
We enrolled 36 patients who were a median of 67 years (interquartile range 63–73). The primary reasons for admission to the hospital were sepsis or severe infection (33%), severe cardiac disease (including heart failure, cardiogenic shock, and myocardial infarction) (17%), or gastrointestinal/liver disease (17%). The bCAM performed well against the Diagnostic and Statistical Manual of Mental Disorders, fifth edition, for detecting delirium, with a sensitivity (95% confidence interval) of 0.80 (0.4, 0.96) and specificity of 0.87 (0.67, 0.96).
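Sensitivity and specificity here are simple functions of the 2×2 table against the DSM-5 reference standard. A minimal sketch with hypothetical cell counts (the abstract does not report the actual table; the counts below merely round to the reported point estimates):

```python
def diagnostic_accuracy(tp, fn, tn, fp):
    """Sensitivity and specificity of an index test (here, the bCAM)
    against a reference standard (here, the DSM-5 assessment)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts, not the study's actual 2x2 table.
sens, spec = diagnostic_accuracy(tp=8, fn=2, tn=20, fp=3)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.80, 0.87
```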
Significance of Results
Delirium was present in 27% of patients enrolled and was never recognized by the palliative care service in routine clinical care. The bCAM provided good sensitivity and specificity in this pilot study of palliative care patients, offering a method for nonpsychiatrically trained personnel to detect delirium.
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcomes of IGTS in patients at 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and β-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, during radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all patients, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in fewer than 5% of patients with GCT, most commonly after initiation of chemotherapy. IGTS was more common in patients with IT only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
The Molonglo Observatory Synthesis Telescope (MOST) is an 18,000 m² radio telescope located 40 km from Canberra, Australia. Its operating band (820–851 MHz) is partly allocated to telecommunications, making radio astronomy challenging. We describe how the deployment of new digital receivers, Field Programmable Gate Array-based filterbanks, and server-class computers equipped with 43 Graphics Processing Units has transformed the telescope into a versatile new instrument (UTMOST) for studying the radio sky on millisecond timescales. UTMOST has 10 times the bandwidth and double the field of view of the MOST, and its voltage record and playback capability has facilitated rapid implementation of many new observing modes, most of which operate commensally. UTMOST can simultaneously excise interference, make maps, coherently dedisperse pulsars, and perform real-time searches of coherent fan-beams for dispersed single pulses. UTMOST operates as a robotic facility, deciding how to efficiently target pulsars and how long to stay on source via real-time pulsar folding, while searching for single-pulse events. Regular timing of over 300 pulsars has yielded seven pulsar glitches and three Fast Radio Bursts during commissioning. UTMOST demonstrates that if sufficient signal processing is applied to voltage streams, innovative science remains possible even in hostile radio-frequency environments.
The class of radio transients called Fast Radio Bursts (FRBs) encompasses enigmatic single pulses, each unique in its own way, hindering a consensus on their origin. The key to demystifying FRBs lies in discovering many of them in order to identify commonalities, and in real time, in order to find potential counterparts at other wavelengths. The recently upgraded UTMOST in Australia is undergoing a backend transformation to rise as a fast-transient detection machine. The first interferometric detections of FRBs with UTMOST place their origin beyond the near-field region of the telescope, ruling out local sources of interference as a possible origin. We have localised these bursts far more precisely than those discovered at the Parkes radio telescope, and we plan further upgrades to UTMOST for still better localisation.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m⁻². Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
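The Weibull fit to cumulative emergence can be sketched in a few lines. The example below is a simplified fixed-effects fit for a single hypothetical site-year (the analysis described above used nonlinear mixed-effects models across site-years, with random effects this sketch omits); the data values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(gdd, asymptote, scale, shape):
    """Cumulative emergence (%) as a Weibull function of growing degree days."""
    return asymptote * (1.0 - np.exp(-(gdd / scale) ** shape))

# Invented weekly observations for one site-year: GDD, cumulative % emergence.
gdd = np.array([50.0, 100, 150, 200, 300, 450, 600, 800])
emerged = np.array([2.0, 12, 35, 60, 85, 95, 99, 100])

(asymptote, scale, shape), _ = curve_fit(weibull_cdf, gdd, emerged,
                                         p0=[100.0, 200.0, 2.0])

# Invert the fitted curve for the GDD needed to reach 10% emergence.
gdd10 = scale * (-np.log(1.0 - 10.0 / asymptote)) ** (1.0 / shape)
print(f"GDD for 10% cumulative emergence: {gdd10:.0f}")
```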