Objective:
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, “slow and sustained” outbreaks may be more common in units with strong existing infection prevention practices, where a series of breaches must align to produce a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriers among staff. A repeated decolonization regimen allowed previously persistent carriers to safely continue work duties.
One in six nursing home residents and staff with positive SARS-CoV-2 tests ≥90 days after initial infection had specimen cycle threshold (Ct) values <30. Individuals with specimen Ct values <30 were more likely to report symptoms but did not differ from individuals with high-Ct specimens on other clinical and testing data.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
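The abstract above describes the modelling approach rather than its implementation. As a minimal sketch of the underlying idea, the Python example below estimates a first-order Markov transition probability matrix from daily ordinal state assessments; the state coding, column names, and toy data are illustrative assumptions, not taken from the study.

```python
import numpy as np
import pandas as pd

# Hypothetical daily state codes (not the study's actual scale):
# 0 = room air, 1 = supplemental oxygen, 2 = mechanical ventilation,
# 3 = discharged (absorbing), 4 = died (absorbing)
N_STATES = 5

def transition_matrix(daily_states: pd.DataFrame) -> np.ndarray:
    """Estimate a first-order Markov transition probability matrix from a
    long-format table with columns ['patient_id', 'day', 'state']."""
    counts = np.zeros((N_STATES, N_STATES))
    for _, grp in daily_states.sort_values("day").groupby("patient_id"):
        states = grp["state"].to_numpy()
        for a, b in zip(states[:-1], states[1:]):  # consecutive-day pairs
            counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

# Toy data: two patients observed over a few days
toy = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 2, 2],
    "day":        [0, 1, 2, 0, 1, 2, 3],
    "state":      [1, 2, 2, 0, 1, 1, 3],
})
print(transition_matrix(toy))
```

Each row of the resulting matrix gives the estimated probability of moving from one clinical state to another on the following day, which is the dynamic summary a multistate model provides beyond cross-sectional or time-to-event analyses.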
Introduced mammalian predators are responsible for the decline and extinction of many native species, with rats (genus Rattus) being among the most widespread and damaging invaders worldwide. In a naturally fragmented landscape, we demonstrate the multi-year effectiveness of snap traps in the removal of Rattus rattus and Rattus exulans from lava-surrounded forest fragments ranging in size from <0.1 to >10 ha. Relative to other studies, we observed low levels of fragment recolonization. Larger rats were the first to be trapped, with the average size of trapped rats decreasing over time. Rat removal led to distinct shifts in the foraging height and location of mongooses and mice, emphasizing the need to focus control efforts on multiple invasive species at once. Furthermore, because of a specially designed trap casing, we observed low non-target capture rates, suggesting that on Hawai‘i and similar islands lacking native rodents, the risk of killing non-target species with snap traps may be lower than with rodenticides, which have the potential to contaminate food webs. These efforts demonstrate that targeted snap-trapping is an effective removal method for invasive rats in fragmented habitats and that, where used, monitoring of recolonization should be included as part of a comprehensive biodiversity management strategy.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
Methods
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
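For readers unfamiliar with this type of analysis, the sketch below shows how the psychotropic medication models could be set up in Python with statsmodels. The file name, variable names, and reference category are hypothetical placeholders rather than the actual WIHS variables, and the study does not specify the software used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file and variable names (illustrative only).
df = pd.read_csv("fs_psychotropic.csv")  # assumed columns: psychotropic_use,
                                         # fs_category, age, race, income,
                                         # education, alcohol_use,
                                         # substance_use, cesd, gad7

# Model 1: food security category and any psychotropic medication use,
# adjusted for sociodemographics and substance use.
base = ("psychotropic_use ~ C(fs_category, Treatment(reference='high'))"
        " + age + C(race) + C(income) + C(education)"
        " + alcohol_use + substance_use")
m1 = smf.logit(base, data=df).fit()

# Model 2: additionally adjusted for depression (CESD) and anxiety (GAD-7).
m2 = smf.logit(base + " + cesd + gad7", data=df).fit()

# Odds ratios with 95% confidence intervals.
for label, res in [("unadjusted for symptoms", m1), ("symptom-adjusted", m2)]:
    ors = np.exp(res.params)
    ci = np.exp(res.conf_int())
    print(label, "\n", pd.concat([ors, ci], axis=1), "\n")
```

Comparing the exponentiated coefficients for the food security categories across the two models mirrors the abstract's approach of reporting odds ratios before and after adjusting for depression and anxiety symptoms.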
Results
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Conclusions
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Objective:
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time series and generalized linear mixed-effects multivariable analyses. Separate models assessed days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated.
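As an illustration of the interrupted time series component (not the authors' actual code), a segmented regression on monthly summary data can be sketched as follows; the file name, column names, and intervention month are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly summary data; file name, column names, and the
# intervention month are assumptions, not the study's actual data.
# 'pct_clisa23' = percentage of assessed lines with a CLISA score of 2 or 3.
its = pd.read_csv("clisa_monthly.csv")   # columns: month_index, pct_clisa23
INTERVENTION_MONTH = 10                  # assumed first intervention month

its["post"] = (its["month_index"] >= INTERVENTION_MONTH).astype(int)
its["months_since"] = (its["month_index"] - INTERVENTION_MONTH).clip(lower=0)

# Segmented (interrupted time series) regression: baseline trend,
# immediate level change at the intervention, and change in trend afterward.
model = smf.ols("pct_clisa23 ~ month_index + post + months_since",
                data=its).fit()
print(model.summary())
```

The coefficient on `post` estimates the immediate level change at the intervention, which corresponds to the "significant immediate decrease" reported in the results.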
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the separate multivariable regression, time to removal of lines with a CLISA score of 2 or 3 was 3.19 days shorter after the intervention (P < .001). Also, line dwell time decreased by 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased by 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
Purple nutsedge (Cyperus rotundus L.) and yellow nutsedge (C. esculentus L.) appeared to be equally acceptable for oviposition by caged Bactra verutana Zeller, but purple nutsedge was significantly more suitable as a host: 90% of the larvae survived to maturity on purple nutsedge compared with 65% on yellow nutsedge. Responses of the plant species to both larval feeding injury and plant density were similar but purple nutsedge tended to be injured more than yellow nutsedge. At a high shoot density (nine shoots per pot), production of tubers by purple nutsedge was more adversely affected by feeding of five larvae per shoot than was production by yellow nutsedge: tuber dry weights were reduced 93 and 80% and numbers of tubers per pot were reduced 77 and 62%, respectively. Production of inflorescences was greatly reduced in both species. The effect of B. verutana on inflorescences may be more important for yellow nutsedge, which is generally considered to reproduce freely by seeds. Both species of nutsedge probably would be about equally affected by augmentation of B. verutana populations as a method of biological control.
Several herbicide-based weed management programs for glyphosate-tolerant cotton were compared in eight field studies across Alabama during 1996 and 1997. Weed management programs ranged from traditional, soil-applied residual herbicide programs to more recently developed total postemergence (POST) herbicide programs. Pitted morningglory and sicklepod control was best achieved with fluometuron applied preemergence (PRE) followed by (fb) a single POST over-the-top (POT) application of glyphosate fb a POST-directed application of glyphosate. Annual grass control was better with the preplant incorporated (PPI) programs at two of three locations in both years. Treatments that included at least one POT application of glyphosate gave better grass control than treatments with no glyphosate or with pyrithiobac POT. Velvetleaf control was improved with the addition of glyphosate POT. A herbicide program using no POST herbicides yielded significantly less seed cotton than any program using POST herbicides at one location. At another location, PRE- and POST-only weed management programs produced more seed cotton and gave greater net returns than PPI programs; net returns at that location were equivalent for PRE- and POST-only programs and lower for PPI programs. POST-only programs yielded the highest amounts of seed cotton and netted the greatest returns.
A telephone survey was conducted with growers in Iowa, Illinois, Indiana, Nebraska, Mississippi, and North Carolina to discern the utilization of the glyphosate-resistant (GR) trait in crop rotations, weed pressure, tillage practices, herbicide use, and perception of GR weeds. This paper focuses on survey results regarding herbicide decisions made during the 2005 cropping season. Less than 20% of the respondents made fall herbicide applications. The most frequently used herbicides for fall applications were 2,4-D and glyphosate, and these herbicides were also the most frequently used for preplant burndown weed control in the spring. Atrazine and acetochlor were frequently used in rotations containing GR corn. As expected, crop rotations using a GR crop had a high percentage of respondents that made one to three POST applications of glyphosate per year. GR corn, GR cotton, and non-GR crops had the highest percentage of growers applying non-glyphosate herbicides during the 2005 growing season. A crop rotation containing GR soybean had the greatest negative impact on non-glyphosate use. Overall, glyphosate use has continued to increase, with concomitant decreases in utilization of other herbicides.
Corn and soybean growers in Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina, as well as cotton growers in Mississippi and North Carolina, were surveyed about their views on changes in problematic weeds and weed pressure in cropping systems based on a glyphosate-resistant (GR) crop. No growers using a GR cropping system for more than 5 yr reported heavy weed pressure. Over all cropping systems investigated (continuous GR soybean, continuous GR cotton, GR corn/GR soybean, GR soybean/non-GR crop, and GR corn/non-GR crop), 0 to 7% of survey respondents reported greater weed pressure after implementing rotations using GR crops, whereas 31 to 57% felt weed pressure was similar and 36 to 70% indicated that weed pressure was less. Pigweed, morningglory, johnsongrass, ragweed, foxtail, and velvetleaf were mentioned as their most problematic weeds, depending on the state and cropping system. Systems using GR crops improved weed management compared with the technologies used before the adoption of GR crops. However, the long-term success of managing problematic weeds in GR cropping systems will require the development of multifaceted integrated weed management programs that include glyphosate as well as other weed management tactics.
A phone survey was administered to 1,195 growers in six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina). The survey measured producers' crop history, perception of glyphosate-resistant (GR) weeds, past and present weed pressure, tillage practices, and herbicide use as affected by the adoption of GR crops. This article describes the changes in tillage practice reported in the survey. The adoption of a GR cropping system resulted in a large increase in the percentage of growers using no-till and reduced-till systems. Tillage intensity declined more in continuous GR cotton and GR soybean (45 and 23%, respectively) than in rotations that included GR corn or non-GR crops. Tillage intensity declined more in the states of Mississippi and North Carolina than in the other states, with 33% of the growers in these states shifting to more conservative tillage practices after the adoption of a GR crop. This was primarily due to the lower amount of conservation tillage adoption in these states before GR crop availability. Adoption rates of no-till and reduced-till systems increased as farm size decreased. Overall, producers in a crop rotation that included a GR crop shifted from a relatively more tillage-intense system to reduced-till or no-till systems after implementing a GR crop into their production system.
Over 175 growers in each of six states (Illinois, Indiana, Iowa, Mississippi, Nebraska, and North Carolina) were surveyed by telephone to assess their perceptions of the benefits of utilizing the glyphosate-resistant (GR) crop trait in corn, cotton, and soybean. The survey was also used to determine the weed management challenges growers were facing after using this trait for a minimum of 4 yr. This survey allowed the development of baseline information on how weed management and crop production practices have changed since the introduction of the trait. It provided useful information on common weed management issues that should be addressed through applied research and extension efforts. The survey also allowed an assessment of the perceived levels of concern among growers about glyphosate resistance in weeds and whether they believed they had experienced glyphosate resistance on their farms. Across the six states surveyed, producers reported 38, 97, and 96% of their corn, cotton, and soybean hectarage planted in a GR cultivar. The most widely adopted GR cropping system was a GR soybean/non-GR crop rotation system; second most common was a GR soybean/GR corn crop rotation system. The non-GR crop component varied widely, with the most common crops being non-GR corn or rice. A large range in farm size for the respondents was observed, with North Carolina having the smallest farms in all three crops. A large majority of corn and soybean growers reported using some type of crop rotation system, whereas very few cotton growers rotated out of cotton. Overall, rotations were much more common in Midwestern states than in Southern states. This is important information as weed scientists assist growers in developing and using best management practices to minimize the development of glyphosate resistance.
A survey of farmers from six U.S. states (Indiana, Illinois, Iowa, Nebraska, Mississippi, and North Carolina) was conducted to assess the farmers' views on glyphosate-resistant (GR) weeds and tactics used to prevent or manage GR weed populations in genetically engineered (GE) GR crops. Only 30% of farmers thought GR weeds were a serious issue. Few farmers thought field tillage and/or using a non-GR crop in rotation with GR crops would be an effective strategy. Most farmers did not recognize the role that the recurrent use of an herbicide plays in evolution of resistance. A substantial number of farmers underestimated the potential for GR weed populations to evolve in an agroecosystem dominated by glyphosate as the weed control tactic. These results indicate there are major challenges that the agriculture and weed science communities must face to implement long-term sustainable GE GR-based cropping systems within the agroecosystem.
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin, respectively. The finished surveys cover declinations between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
Vegetation affects feedbacks in Earth's hydrologic system, but is constrained by physiological adaptations. In extant ecosystems, the mechanisms controlling plant water use can be measured experimentally; for extinct plants in the recent geological past, water use can be inferred from nearest living relatives, assuming minimal evolutionary change. In deep time, where no close living relatives exist, fossil material provides the only information for inferring plant water use. However, mechanistic models for extinct plant water use must be built on first principles and tested on extant plants. Plants serve as a conduit for water movement from the soil to the atmosphere, constrained by tissue-level construction and gross architecture. No single feature, such as stomata or veins, encompasses enough of the complexity underpinning water-use physiology to serve as the basis of a model of functional water use in all (or perhaps any) extinct plants. Rather, a “functional whole plant” model must be used. To understand the interplay between plant and atmosphere, water use in relation to environmental conditions is investigated in an extinct plant, the seed fern Medullosa (Division Pteridospermatophyta), by reviewing methods for reconstructing physiological variables such as leaf and stem hydraulic capacity, photosynthetic rate, transpiration rate, stomatal conductance, and albedo. Medullosans had the potential for extremely high photosynthetic rates, water transport capacity, stomatal conductance, and transpiration, with rates comparable to those of later angiosperms. When these high growth and gas exchange rates of medullosans are combined with the unique atmospheric composition of the late Paleozoic, complex vegetation-environment feedbacks are expected despite their basal phylogenetic position relative to post-Paleozoic seed plants.
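The abstract reviews these reconstruction methods conceptually. As a minimal, hedged illustration of one building block of such a "functional whole plant" approach, the sketch below approximates leaf-level transpiration as stomatal conductance times the vapor pressure deficit scaled by atmospheric pressure; the function and the numbers are illustrative assumptions, not values reconstructed for Medullosa.

```python
# Minimal leaf-level sketch of one relationship used when reconstructing
# water use: transpiration approximated as stomatal conductance times the
# vapor pressure deficit scaled by atmospheric pressure. Values are
# illustrative only, not reconstructions for Medullosa.

def transpiration(gs_mol_m2_s: float, vpd_kpa: float,
                  p_atm_kpa: float = 101.3) -> float:
    """Leaf transpiration (mol H2O m^-2 s^-1) from stomatal conductance to
    water vapor (mol m^-2 s^-1) and vapor pressure deficit (kPa)."""
    return gs_mol_m2_s * (vpd_kpa / p_atm_kpa)

# Example: a high conductance of 0.4 mol m^-2 s^-1 under a 1.5 kPa deficit
print(f"{transpiration(0.4, 1.5):.4f} mol H2O m^-2 s^-1")  # ~0.0059
```

A whole-plant model would couple many such relationships (stem hydraulic capacity, photosynthetic rate, albedo) rather than relying on any single feature, which is the point the abstract makes about stomata or veins alone being insufficient.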