OBJECTIVES/GOALS: Many older sepsis survivors develop chronic critical illness (CCI) with poor outcomes. Sepsis involves a dysregulated immune response that can be tracked with biomarkers reflecting the persistent inflammation, immunosuppression and catabolism syndrome (PICS). The purpose was to compare serial PICS biomarkers in a) older (versus young) adults and b) older CCI (versus older rapid recovery, RAP) patients to gain insight into the underlying pathobiology of CCI. METHODS/STUDY POPULATION: Prospective longitudinal study of young (≤ 45 years) and older (≥ 65 years) septic adults who were characterized by a) baseline predisposition, b) hospital outcomes, c) serial SOFA organ dysfunction scores over 14 days, d) Zubrod performance status at 3-, 6- and 12-month follow-up and e) mortality over 12 months. Serial blood samples over 14 days were analyzed for selected biomarkers reflecting PICS. RESULTS/ANTICIPATED RESULTS: Compared to the young, more older adults developed CCI (20% vs 42%) and had markedly worse serial SOFA scores, performance status and mortality over 12 months. Additionally, older (versus young) and older CCI (versus older RAP) patients had more persistent aberrations in biomarkers reflecting inflammation, immunosuppression, stress metabolism, lack of anabolism and anti-angiogenesis over 14 days after sepsis. DISCUSSION/SIGNIFICANCE: Older (versus young) and older CCI (versus older RAP) patient subgroups demonstrate early biomarker evidence of the underlying pathobiology of PICS. The population of older sepsis survivors is in need of interventions to lower systemic inflammation and stimulate anabolism to prevent skeletal muscle wasting and disability.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables might help clinicians better predict which patients will respond to lithium treatment.
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
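A minimal sketch of such a leave-site-out setup is shown below, assuming scikit-learn conventions; the data file, feature columns and hyperparameters are hypothetical stand-ins for illustration, not the ConLi+Gen pipeline.

```python
# Hypothetical sketch of leave-site-out cross-validation for lithium-response
# prediction; file and column names are illustrative, not the ConLi+Gen data.
import pandas as pd
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge, ElasticNet
from sklearn.ensemble import RandomForestRegressor

# Assumed layout: one row per patient, a continuous lithium-response score,
# clinical covariates, two polygenic risk scores, and a study-site label.
df = pd.read_csv("bipolar_training_set.csv")          # hypothetical file
clinical = ["age_at_onset", "episodes", "psychosis"]  # illustrative columns
prs = ["prs_scz", "prs_mdd"]
X = df[clinical + prs].to_numpy()
y = df["lithium_response"].to_numpy()
sites = df["site"].to_numpy()

models = {
    "ridge": make_pipeline(StandardScaler(), Ridge(alpha=1.0)),
    "elastic_net": make_pipeline(StandardScaler(), ElasticNet(alpha=0.1)),
    "random_forest": RandomForestRegressor(n_estimators=500, random_state=0),
}

logo = LeaveOneGroupOut()  # each fold holds out one study site
for name, model in models.items():
    r2 = cross_val_score(model, X, y, groups=sites, cv=logo, scoring="r2")
    print(f"{name}: mean leave-site-out R^2 = {r2.mean():.3f}")
```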
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
Glyphosate’s efficacy is influenced by the amount absorbed and translocated throughout the plant to inhibit 5-enolpyruvyl shikimate-3-phosphate synthase (EPSPS). Glyphosate resistance can be due to target-site (TS) or non–target site (NTS) resistance mechanisms. TS resistance includes an altered target site and gene overexpression, while NTS resistance includes reduced absorption, reduced translocation, enhanced metabolism, and exclusion/sequestration. The goal of this research was to elucidate the mechanism(s) of glyphosate resistance in common ragweed (Ambrosia artemisiifolia L.) from Ontario, Canada. The resistance factor for this glyphosate-resistant (GR) A. artemisiifolia biotype is 5.1. No amino acid substitutions were found at positions 102 or 106 of the EPSPS enzyme in this A. artemisiifolia biotype. Based on [14C]glyphosate studies, there was no difference in glyphosate absorption or translocation between glyphosate-susceptible (GS) and GR A. artemisiifolia biotypes. Radio-labeled glyphosate metabolites were similar for GS and GR A. artemisiifolia 96 h after application. Glyphosate resistance in this A. artemisiifolia biotype is not due to an altered target site due to amino acid substitutions at positions 102 and 106 in the EPSPS and is not due to the NTS mechanisms of reduced absorption, reduced translocation, or enhanced metabolism.
Identifying the most effective ways to support career development of early stage investigators in clinical and translational science should yield benefits for the biomedical research community. Institutions with Clinical and Translational Science Awards (CTSA) offer KL2 programs to facilitate career development; however, the sustained impact has not been widely assessed.
A survey comprising quantitative and qualitative questions was sent to 2144 individuals who had previously received support through CTSA KL2 mechanisms. The 547 responses were analyzed with identifying information redacted.
Respondents held MD (47%), PhD (36%), and MD/PhD (13%) degrees. After KL2 support was completed, physicians’ time was divided 50% to research and 30% to patient care, whereas PhD respondents devoted 70% time to research. Funded research effort averaged 60% for the cohort. Respondents were satisfied with their career progression. More than 95% thought their current job was meaningful. Two-thirds felt confident or very confident in their ability to sustain a career in clinical and translational research. Factors cited as contributing to career success included protected time, mentoring, and collaborations.
This first large systematic survey of KL2 alumni provides valuable insight into the group’s perceptions of the program, along with outcome information. Former scholars are largely satisfied with their career choice and direction, national recognition of their expertise, and the impact of their work. Importantly, they identified training activities that contributed to success. Our results and future analysis of the survey data should inform the framework for developing platforms that launch and sustain the careers of translational scientists.
This chapter synthesises insights from the Deep Decarbonisation Pathways Project (DDPP), which provided detailed analysis of how 16 countries representing three-quarters of global emissions can transition to very low-carbon economies. The four ‘pillars’ of decarbonisation are identified as: achieving low- or zero-carbon electricity supply; electrification and fuel switching in transport, industry and housing; ambitious energy efficiency improvements; and reducing non-energy emissions. The chapter focuses on decarbonisation scenarios for Australia. It shows that electricity supply can be readily decarbonised and greatly expanded to cater for electrification of transport, industry and buildings. Remaining emissions, principally from industry and agriculture, could be fully compensated through land-based carbon sequestration. The analysis shows that such decarbonisation would be consistent with continued growth in GDP and trade, and would require very little change in the structure of Australia’s economy. Australia is rich in renewable energy potential, which could enable new industries such as energy-intensive manufacturing for export.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Scimitar syndrome is a rare CHD composed of partial anomalous pulmonary venous connection from the right lung, via a scimitar vein, to the inferior vena cava rather than the left atrium. Genetic conditions associated with scimitar syndrome have not been well investigated to date.
Our study included patients with scimitar syndrome diagnosed at Texas Children’s Hospital from January 1987 to July 2020. Medical records were evaluated to determine if genetic testing was performed, including chromosomal microarray analysis or whole-exome sequencing. Copy number variants identified as pathogenic/likely pathogenic and variants of unknown significance were collected. Analyses of cardiac and extracardiac findings were performed via chart review.
Ninety-eight patients were identified with scimitar syndrome, 89 of whom met inclusion criteria. A chromosome analysis or chromosomal microarray analysis was performed in 18 patients (20%). Whole-exome sequencing was performed in six patients following negative chromosomal microarray analysis testing. A molecular genetic diagnosis was made in 7 of 18 cases (39% of those tested). Ninety-six per cent of the cohort had some type of extracardiac finding, with 43% having asthma and 20% having a gastrointestinal pathology. Of the seven patients with positive genetic testing, all had extracardiac anomalies, with all but one having gastrointestinal findings and 30% having congenital diaphragmatic hernia.
Genetic testing revealed an underlying diagnosis in roughly 40% of those tested. Given the relatively high prevalence of pathogenic variants, we recommend chromosomal microarray analysis and whole-exome sequencing for patients with scimitar syndrome and extracardiac defects.
Antibiotics are frequently prescribed inappropriately for acute respiratory infections in the outpatient setting. We report the implementation of a multifaceted outpatient antimicrobial stewardship initiative resulting in a 12.3% absolute reduction of antibiotic prescribing for acute bronchitis in primary care clinics receiving active interventions.
The prevalence of serious mental illness (SMI) among people detained in jail is nearly three times that found in community samples. Once an individual with SMI becomes involved in the criminal justice system, they are more likely than the general population to stay in the system and to face repeated incarcerations, and they return to prison more quickly than their counterparts without mental illness.
Rock debris covers ~30% of glacier ablation areas in the Central Himalaya and modifies the impact of atmospheric conditions on mass balance. The thermal properties of supraglacial debris are diurnally variable but remain poorly constrained for monsoon-influenced glaciers over the timescale of the ablation season. We measured vertical debris profile temperatures at 12 sites on four glaciers in the Everest region with debris thickness ranging from 0.08 to 2.8 m. Typically, the length of the ice ablation season beneath supraglacial debris was 160 days (15 May to 22 October)—a month longer than the monsoon season. Debris temperature gradients were approximately linear (r² > 0.83), measured as −40°C m⁻¹ where debris was up to 0.1 m thick, −20°C m⁻¹ for debris 0.1–0.5 m thick, and −4°C m⁻¹ for debris greater than 0.5 m thick. Our results demonstrate that the influence of supraglacial debris on the temperature of the underlying ice surface, and therefore melt, is stable at a seasonal timescale and can be estimated from near-surface temperature. These results have the potential to greatly improve the representation of ablation in calculations of debris-covered glacier mass balance and projections of their response to climate change.
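A minimal worked example of the gradient calculation is given below, assuming a simple least-squares fit to a vertical thermistor profile; the depths and temperatures are invented for illustration, not measurements from these sites.

```python
# Hypothetical example: estimate a debris temperature gradient (°C per metre)
# from a vertical profile of sensor depths and temperatures.
import numpy as np

depth_m = np.array([0.0, 0.1, 0.2, 0.3, 0.4])   # sensor depths (m), illustrative
temp_c = np.array([8.0, 6.1, 4.0, 2.2, 0.1])    # temperatures (°C), illustrative

# Least-squares linear fit: temperature = gradient * depth + surface intercept
gradient, intercept = np.polyfit(depth_m, temp_c, 1)
r_squared = np.corrcoef(depth_m, temp_c)[0, 1] ** 2

print(f"gradient = {gradient:.1f} °C/m, intercept = {intercept:.1f} °C, r^2 = {r_squared:.3f}")
# A steep negative gradient (about -20 °C/m for these invented numbers) implies
# a warmer ice-debris interface, and hence more melt, than a gentle gradient.
```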
The movement of healthcare professionals (HCPs) induces an indirect contact network: touching a patient or the environment in one area, then again elsewhere, can spread healthcare-associated pathogens from 1 patient to another. Thus, understanding HCP movement is vital to calibrating mathematical models of healthcare-associated infections. Because long-term care facilities (LTCFs) are an important locus of transmission and have been understudied relative to hospitals, we developed a system for measuring contact patterns specifically within an LTCF. Methods: To measure HCP movement patterns, we used badges (credit-card–sized, programmable, battery-powered devices with wireless proximity sensors) worn by HCPs and placed in 30 locations for 3 days. Each badge broadcasts a brief message every 8 seconds. When received by other badges within range, the recipients recorded the time, source badge identifier, and signal strength. By fusing the data collected by all badges with a facility map, we estimated when and for how long each HCP was in any of the locations where instruments had been installed. Results: Combining the messages captured by all of our devices, we calculated the dwell time for each job type (eg, nurses, nursing assistants, physical therapists) in different locations (eg, resident rooms, dining areas, nurses’ stations, hallways, etc). Although dwell times over all job and area types averaged ∼100 seconds, the standard deviation was large (115 seconds), with a mean of the per-job-type maximums of ∼450 seconds. For example, nursing assistants spent substantially more time in resident rooms and transitioned across rooms at a much higher rate. Overall, each distribution exhibits a power-law–like characteristic. By aggregating the data from devices with location data extracted from the floor plan, we were able to produce an explicit trace for each individual (identified only by job type) for each day and to compute cross-table transition probabilities by area for each job type. Conclusions: We developed a portable system for measuring contact patterns in long-term care settings. Our results confirm that frequent interactions between HCPs and LTCF residents occur, but they are not uniform across job types or resident locations. The data produced by our system can be used to better calibrate mathematical models of pathogen spread in LTCFs. Moreover, our system can be easily and quickly deployed to other healthcare settings to similarly inform outbreak investigations.
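A rough sketch of how per-location dwell times might be derived from such badge messages is shown below; the record format, signal-strength cutoff, and gap rule are assumptions for illustration, not the deployed system's actual processing.

```python
# Hypothetical sketch of turning badge proximity messages into dwell times.
# Record format, RSSI threshold, and gap limit are assumptions; only the
# 8-second beacon interval comes from the description above.
from collections import defaultdict

BEACON_INTERVAL_S = 8      # each badge broadcasts every 8 seconds
RSSI_THRESHOLD = -65       # assumed cutoff for "in the same location"
GAP_LIMIT_S = 30           # assumed gap that ends a visit

# (timestamp_s, hcp_badge_id, location_badge_id, rssi) -- illustrative records
messages = [
    (0, "nurse_01", "room_12", -60),
    (8, "nurse_01", "room_12", -62),
    (16, "nurse_01", "room_12", -58),
    (120, "nurse_01", "hallway_A", -55),
]

def dwell_times(msgs):
    """Sum contiguous runs of above-threshold messages into per-location dwell time."""
    visits = defaultdict(float)
    last_seen = {}  # (hcp, location) -> timestamp of previous message in the run
    for t, hcp, loc, rssi in sorted(msgs):
        if rssi < RSSI_THRESHOLD:
            continue
        key = (hcp, loc)
        if key in last_seen and t - last_seen[key] <= GAP_LIMIT_S:
            visits[key] += t - last_seen[key]
        else:
            visits[key] += BEACON_INTERVAL_S  # start a new visit with one interval
        last_seen[key] = t
    return dict(visits)

print(dwell_times(messages))
# {('nurse_01', 'room_12'): 24.0, ('nurse_01', 'hallway_A'): 8.0}
```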
Disclosures: Scott Fridkin reports that his spouse receives a consulting fee from the vaccine industry.
Background: Certain nursing home (NH) resident care tasks have a higher risk for multidrug-resistant organism (MDRO) transfer to healthcare personnel (HCP), which can result in transmission to residents if HCPs fail to perform recommended infection prevention practices. However, data on HCP–resident interactions are limited and do not account for intrafacility practice variation. Understanding differences in interactions, by HCP role and unit, is important for informing MDRO prevention strategies in NHs. Methods: In 2019, we conducted serial intercept interviews; each HCP was interviewed 6–7 times for the duration of a unit’s dayshift at 20 NHs in 7 states. The next day, staff on a second unit within the facility were interviewed during the dayshift. HCP on 38 units were interviewed to identify HCP–resident care patterns. All unit staff were eligible for interviews, including certified nursing assistants (CNAs), nurses, physical or occupational therapists, physicians, midlevel practitioners, and respiratory therapists. HCP were asked to list which residents they had cared for (within resident rooms or common areas) since the prior interview. Respondents selected from 14 care tasks. We classified units into 1 of 4 types: long-term, mixed, short stay or rehabilitation, or ventilator or skilled nursing. Interactions were classified based on the risk of HCP contamination after task performance. We compared proportions of interactions associated with each HCP role and performed clustered linear regression to determine the effect of unit type and HCP role on the number of unique task types performed per interaction. Results: Intercept interviews described 7,050 interactions and 13,843 care tasks. Except in ventilator or skilled nursing units, CNAs have the greatest proportion of care interactions (interfacility range, 50%–60%) (Fig. 1). In ventilator and skilled nursing units, interactions are evenly shared between CNAs and nurses (43% and 47%, respectively). On average, CNAs in ventilator and skilled nursing units perform the most unique task types (2.5 task types per interaction, Fig. 2) compared to other unit types (P < .05). Compared to CNAs, most other HCP types had significantly fewer task types (0.6–1.4 task types per interaction, P < .001). Across all facilities, 45.6% of interactions included tasks that were higher-risk for HCP contamination (eg, transferring, wound and device care, Fig. 3). Conclusions: Focusing infection prevention education efforts on CNAs may be most efficient for preventing MDRO transmission within NHs because CNAs have the most HCP–resident interactions and complete more tasks per visit. Studies of HCP–resident interactions are critical to improving understanding of transmission mechanisms as well as to targeting MDRO prevention interventions.
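The clustered linear regression mentioned in the Methods could be sketched roughly as below, assuming statsmodels conventions; the input file, column names, and clustering on facility are hypothetical, not the study's analysis code.

```python
# Rough sketch of a clustered linear regression of unique task types per
# interaction on HCP role and unit type, with standard errors clustered by
# facility. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per HCP-resident interaction.
df = pd.read_csv("interactions.csv")  # hypothetical: task_types, role, unit_type, facility

model = smf.ols("task_types ~ C(role) + C(unit_type)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["facility"]}
)
print(model.summary())  # coefficients estimate task types per interaction by role/unit
```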
Funding: Centers for Disease Control and Prevention (grant no. U01CK000555-01-00)
Disclosures: Scott Fridkin, consulting fee, vaccine industry (spouse)
The CDC recommends that consultant pharmacists support antimicrobial stewardship programs (ASPs) in long-term care facilities (LTCFs). We studied implementation of CDC-recommended ASP core elements and antibiotic use in LTCFs before and after training consultant pharmacists. Methods: Between August 2017 and October 2017, consultant pharmacists from a regional long-term care pharmacy attended 5 didactic sessions preparing them to assist LTCFs in implementation of CDC-recommended ASP core elements. Training also included creating a process for evaluating appropriateness of all systemic antibiotics and providing prescriber feedback during their monthly mandatory drug-regimen reviews. Once-monthly “meet-the-expert” sessions were held with consultant pharmacists throughout the project (November 2017 to December 2018). LTCF enrollment began in November 2017 and >90% of facilities joined by January 2018. After enrollment, consultant pharmacists initiated ASP interventions including antibiotic reviews and feedback using standard templates. They also held regular meetings with infection preventionists to discuss core elements implementation and provided various ASP resources to LTCFs (eg, antibiotic policy template, guidance documents, and standard assessment and communication tools). Data collection included ASP core elements, antibiotic starts, days of therapy (DOT), and resident days (RD). The McNemar test, the Wilcoxon signed-rank test, a generalized estimating equation model, and the classic repeated-measures approach were used to compare the presence of all 7 core elements and antibiotic use during the baseline (2017) and intervention (2018) years. Results: In total, 9 trained consultant pharmacists assisted 32 LTCFs with ASP implementation. When evaluating 27 LTCFs that provided complete data, a significant increase in the presence of all 7 core elements after the intervention was noted compared to baseline (67% vs 0; median core elements, 7 vs 2; range, 6–7 vs 1–6; P < .001). Median monthly antibiotic starts per 1,000 RD and DOT per 1,000 RD decreased in 2018 compared to 2017: 8.93 versus 9.91 (P < .01) and 106.47 versus 141.59 (P < .001), respectively. However, variations in antibiotic use were detected among facilities (Table 1). When comparing trends, antibiotic starts and DOT were already trending downward during 2017 (Fig. 1A and 1B). On average, antibiotic starts decreased by 0.27 per 1,000 RD (P < .001) and DOT by 1.92 per 1,000 RD (P < .001) each month during 2017. Although antibiotic starts remained mostly stable in 2018, DOT continued to decline further (average monthly decline, 2.60 per 1,000 RD; P < .001). When analyzing aggregated mean antibiotic use across all sites per month by year, DOT were consistently lower throughout 2018 and antibiotic starts were lower for the first 9 months (Fig. 1C and 1D). Conclusions: Consultant pharmacists can play an important role in strengthening ASPs and in decreasing antibiotic use in LTCFs. Educational programs should be developed nationally to train long-term care consultant pharmacists in ASP implementation.
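As a worked illustration of the two usage metrics reported above (antibiotic starts and DOT per 1,000 resident days), the sketch below uses invented monthly counts rather than the study data.

```python
# Minimal worked example of the two antibiotic-use metrics reported above;
# the monthly counts are invented for illustration.
def per_1000_resident_days(events: float, resident_days: float) -> float:
    """Rate of antibiotic starts or days of therapy (DOT) per 1,000 resident days."""
    return 1000 * events / resident_days

monthly_starts = 27           # hypothetical antibiotic starts in one facility-month
monthly_dot = 320             # hypothetical days of therapy in the same month
monthly_resident_days = 3000  # hypothetical resident days in the same month

print(per_1000_resident_days(monthly_starts, monthly_resident_days))  # 9.0 starts/1,000 RD
print(per_1000_resident_days(monthly_dot, monthly_resident_days))     # ~106.7 DOT/1,000 RD
```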
Funding: Merck & Co., Inc, provided funding for this study.
Disclosures: Muhammad Salman Ashraf and Scott Bergman report receipt of a research grant from Merck.
Generalization of conditioned-fear, a core feature of post-traumatic stress disorder (PTSD), has been the focus of several recent neuroimaging studies. A striking outcome of these studies is the frequency with which neural correlates of generalization fall within hubs of well-established functional networks including salience (SN), central executive (CEN), and default networks (DN). Neural substrates of generalization found to date may thus reflect traces of large-scale brain networks that form more expansive neural representations of generalization. The present study includes the first network-based analysis of generalization and PTSD-related abnormalities therein.
fMRI responses in established intrinsic connectivity networks (ICNs) representing SN, CEN, and DN were assessed during a generalized conditioned-fear task in male combat veterans (N = 58) with wide-ranging PTSD symptom severity. The task included five rings of graded size. Extreme sizes served as conditioned danger-cues (CS+: paired with shock) and safety-cues (CS−), and the three intermediate sizes served as generalization stimuli (GSs) forming a continuum-of-size between CS+ and CS–. Generalization-gradients were assessed as behavioral and ICN response slopes from CS+, through GSs, to CS–. Increasing PTSD symptomatology was predicted to relate to less-steep slopes indicative of stronger generalization.
SN, CEN, and DN responses fell along generalization-gradients with levels of generalization within and between SN and CEN scaling with PTSD symptom severity.
Neural substrates of generalized conditioned-fear include large-scale networks that adhere to the functional organization of the brain. Current findings implicate levels of generalization in SN and CEN as promising neural markers of PTSD.
Amazon's Mechanical Turk is widely used for data collection; however, data quality may be declining due to the use of virtual private servers to fraudulently gain access to studies. Unfortunately, we know little about the scale and consequence of this fraud, and tools for social scientists to detect and prevent this fraud are underdeveloped. We first analyze 38 studies and show that this fraud is not new, but has increased recently. We then show that these fraudulent respondents provide particularly low-quality data and can weaken treatment effects. Finally, we provide two solutions: an easy-to-use application for identifying fraud in the existing datasets and a method for blocking fraudulent respondents in Qualtrics surveys.
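The authors' application is not reproduced here; as a generic illustration of one common approach to this kind of screening, the sketch below flags responses whose IP address falls within data-center or hosting-provider ranges often associated with VPS use. The CIDR blocks and respondent records are invented for the example.

```python
# Generic illustration (not the authors' tool) of flagging survey responses
# whose IP address falls inside hosting-provider ranges often used by VPS-based
# fraud. The CIDR blocks and respondents below are made up for the example.
import ipaddress

SUSPECT_BLOCKS = [ipaddress.ip_network(c) for c in ("203.0.113.0/24", "198.51.100.0/24")]

respondents = [
    {"id": "R_001", "ip": "203.0.113.47"},   # falls inside a flagged block
    {"id": "R_002", "ip": "192.0.2.10"},     # does not
]

def is_suspect(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in block for block in SUSPECT_BLOCKS)

for r in respondents:
    print(r["id"], "flagged" if is_suspect(r["ip"]) else "ok")
```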
People with severe mental illness (SMI) have numerous risk factors that may predispose them to food insecurity (FI); however, the prevalence of FI and its effects on health are under-researched in this population. The present study aimed to describe the prevalence of FI and its relationship to lifestyle factors in people with SMI. This cross-sectional study recruited people with SMI receiving long-acting injectable (LAI) antipsychotic medication from community services at three sites in Sydney, Australia. Assessments were completed on physical health and lifestyle factors. χ2 Tests, independent-samples t tests and binary logistic regression analyses were calculated to examine relationships between lifestyle factors and FI. In total, 233 people completed the assessments: 154 were males (66 %), mean age 44·8 (sd 12·7) years, and the majority (70 %) had a diagnosis of schizophrenia. FI was present in 104 participants (45 %). People with FI were less likely to consume fruits (OR 0·42, 95 % CI 0·24, 0·74, P = 0·003), vegetables (OR 0·39, 95 % CI 0·22, 0·69, P = 0·001) and protein-based foods (OR 0·45, 95 % CI 0·25, 0·83, P = 0·011) at least once daily, engaged in less moderate to vigorous physical activity (min) (OR 0·997, 95 % CI 0·993, 1·000, P = 0·044), and were more likely to smoke (OR 1·89, 95 % CI 1·08, 3·32, P = 0·026). FI is highly prevalent among people with SMI receiving LAI antipsychotic medications. Food-insecure people with SMI engage in less healthy lifestyle behaviours, increasing the risk of future non-communicable disease.
Posttraumatic stress disorder (PTSD) is often complicated by the after-effects of mild traumatic brain injury (mTBI). The mixture of brain conditions results in abnormal affective and cognitive functioning, as well as maladaptive behavior. To better understand how brain activity explains cognitive and emotional processes in these conditions, we used an emotional N-back task and functional magnetic resonance imaging (fMRI) to study neural responses in US military veterans after deployments to Iraq and Afghanistan. Additionally, we sought to examine whether hierarchical dimensional models of maladaptive personality could account for the relationship between combat-related brain conditions and fMRI responses under cognitive and affective challenge. FMRI data, measures of PTSD symptomatology (PTSS), blast-induced mTBI (bmTBI) severity, and maladaptive personality (MMPI-2-RF) were gathered from 93 veterans. Brain regions central to emotion regulation were selected for analysis, and consisted of bilateral amygdala, bilateral dorsolateral prefrontal (dlPFC), and ventromedial prefrontal/subgenual anterior cingulate (vmPFC-sgACC). Cognitive load increased activity in dlPFC and reduced activity in emotional responding brain regions. However, individuals with greater PTSS showed blunted deactivations in bilateral amygdala and vmPFC-sgACC, and weaker responses in right dlPFC. Additionally, we found that elevated emotional/internalizing dysfunction (EID), specifically low positive emotionality (RC2), accounted for PTSS-related changes in bilateral amygdala under increased cognitive load. Findings suggest that PTSS might result in amygdala and vmPFC-sgACC activity resistant to moderation by cognitive demands, reflecting emotion dysregulation despite a need to marshal cognitive resources. Anhedonia may be an important target for interventions that improve the affective and cognitive functioning of individuals with PTSD.
Presenteeism, or working while ill, by healthcare personnel (HCP) experiencing influenza-like illness (ILI) puts patients and coworkers at risk. However, hospital policies and practices may not consistently facilitate HCP staying home when ill.
Objective and methods: We conducted a mixed-methods survey in March 2018 of Emerging Infections Network infectious diseases physicians, describing institutional experiences with and policies for HCP working with ILI.
Of 715 physicians, 367 (51%) responded. Of 367, 135 (37%) were unaware of institutional policies. Of the remaining 232 respondents, 206 (89%) reported institutional policies regarding work restrictions for HCP with influenza or ILI, but only 145 (63%) said these were communicated at least annually. More than half of respondents (124, 53%) reported that adherence to work restrictions was not monitored or enforced. Work restrictions were most often not perceived to be enforced for physicians-in-training and attending physicians. Nearly all (223, 96%) reported that their facility tracked laboratory-confirmed influenza (LCI) in patients; 85 (37%) reported tracking ILI. For employees, 109 (47%) reported tracking of LCI and 53 (23%) reported tracking ILI. For independent physicians, not employed by the facility, 30 (13%) reported tracking LCI and 11 (5%) ILI.
More than one-third of respondents were unaware of whether their institutions had policies to prevent HCP with ILI from working; among those with knowledge of institutional policies, dissemination, monitoring, and enforcement of these policies were highly variable. Improving communication about work-restriction policies, as well as monitoring and enforcement, may help prevent the spread of infections from HCP to patients.
Downy brome, feral rye, and jointed goatgrass are problematic winter annual grasses in central Great Plains winter wheat production. Integrated control strategies are needed to manage winter annual grasses and reduce selection pressure exerted on these weed populations by the limited herbicide options currently available. Harvest weed-seed control (HWSC) methods aim to remove or destroy weed seeds, thereby reducing seed-bank enrichment at crop harvest. An added advantage is the potential to reduce herbicide-resistant weed seeds that are more likely to be present at harvest, thereby providing a nonchemical resistance-management strategy. Our objective was to assess the potential for HWSC of winter annual grass weeds in winter wheat by measuring seed retention at harvest and destruction percentage in an impact mill. During 2015 and 2016, 40 wheat fields in eastern Colorado were sampled. Seed retention was quantified and compared per weed species by counting seed retained above the harvested fraction of the wheat upper canopy (15 cm and above), seed retained below 15 cm, and shattered seed on the soil surface at wheat harvest. A stand-mounted impact mill device was used to determine the percent seed destruction of grass weed species in processed wheat chaff. Averaged across both years, seed retention (±SE) was 75% ± 2.9%, 90% ± 1.7%, and 76% ± 4.3% for downy brome, feral rye, and jointed goatgrass, respectively. Seed retention was most variable for downy brome, because 59% of the samples had at least 75% seed retention, whereas the proportions for feral rye and jointed goatgrass samples with at least 75% seed retention were 93% and 70%, respectively. Weed seed destruction percentages were at least 98% for all three species. These results suggest HWSC could be implemented as an integrated strategy for winter annual grass management in central Great Plains winter wheat cropping systems.
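A worked example of the seed-retention calculation implied above is given below, using invented counts for the three collected fractions (seed retained above the 15-cm cut height, seed retained below it, and shattered seed on the soil surface).

```python
# Worked example of the seed-retention calculation; the counts are invented
# for illustration, not field data from the sampled wheat fields.
def seed_retention_pct(above_cut: int, below_cut: int, shattered: int) -> float:
    """Percent of total seed retained in the harvested fraction of the canopy."""
    total = above_cut + below_cut + shattered
    return 100 * above_cut / total

# Hypothetical downy brome sample: 300 seeds above 15 cm, 60 below, 40 shattered.
print(f"{seed_retention_pct(300, 60, 40):.0f}% retained")  # 75% retained
```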