Flax yield can be severely reduced by weeds. The combination of limited herbicide options and the spread of herbicide-resistant weeds across the prairies has created a need for more weed control options for flax producers. The objective of this research was to evaluate the tolerance of flax to topramezone, pyroxasulfone, flumioxazin, and fluthiacet-methyl applied alone as well as in mixtures with currently registered herbicides. These herbicides were applied alone and in mixtures at the 1X and 2X rates and compared with three industry standards and one nontreated control. The experiment was conducted at Carman, MB, and Saskatoon, SK, as a randomized complete block design with four replications. Data were collected for crop population, crop height, yield, and thousand-seed weight. Ratings for crop damage (phytotoxicity) were also taken at three separate intervals: 7 to 14, 21 to 28, and 56+ d after treatment. Crop tolerance to these herbicides varied between site-years, largely attributable to differences in spring moisture conditions and in soil characteristics between sites. Herbicide injury was transient; hence, no herbicide or herbicide combination consistently reduced crop yield. Flumioxazin was the least promising herbicide evaluated, as it caused severe crop damage (>90%) when conditions were conducive to injury. Overall, flax had excellent tolerance to fluthiacet-methyl, pyroxasulfone, and topramezone, and excellent crop safety with the combination of pyroxasulfone + sulfentrazone. However, mixing fluthiacet-methyl and topramezone with MCPA and bromoxynil, respectively, increased crop damage and would not be recommended.
Despite the significant health benefits of breastfeeding for the mother and the infant, economic class and race disparities in breastfeeding rates persist. Support for breastfeeding from the father of the infant is associated with higher rates of breastfeeding initiation. However, little is known about the factors that may promote or deter father support of breastfeeding, especially among fathers exposed to contextual adversity such as poverty and violence. Using a mixed methods approach, the primary aims of the current work were to (1) elicit, using qualitative methodology, the worries, barriers, and promotive factors for breastfeeding that expectant mothers and fathers identify as they prepare to parent a new infant, and (2) examine, using quantitative methodology, factors that influence the breastfeeding intentions of both mothers and fathers. A sample (N=95) of expectant mothers and fathers in their third trimester, living in a low-income urban environment in the Midwestern USA, was interviewed from October 2013 to February 2015 about their infant feeding intentions. Compared with fathers, mothers more often identified the benefits of breastfeeding for the infant’s health and the economic advantage of breastfeeding. Mothers also identified more personal and community breastfeeding support resources. Fathers viewed their own support of breastfeeding as important but expressed a lack of knowledge about the breastfeeding process and often excluded themselves from discussions about infant feeding. The results point to important targets for interventions that aim to increase breastfeeding initiation rates in vulnerable populations in the US by increasing father support for breastfeeding.
To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients.
A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods.
A 694-bed teaching hospital.
We administered a multispecies probiotic comprising Lactobacillus acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients within 12 hours of initial antibiotic receipt through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric, and oncology wards; (2) all individuals receiving perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis, leukopenia, or posttransplant status. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction. Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression.
The study included 251 CDI episodes among 360,016 patient days during the baseline and intervention periods, and the incidence rate was 7.0 per 10,000 patient days. The incidence rate was similar during baseline and intervention periods (6.9 vs 7.0 per 10,000 patient days; P=.95). However, compared to the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4–0.9; P=.009). Testing intensity remained stable between the baseline and intervention periods: 19% versus 20% of stools tested were C. difficile positive by PCR, respectively. From medical record reviews, only 26% of eligible patients received a probiotic per the protocol.
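The incidence arithmetic reported above can be sketched as follows. This is a hedged illustration only: the rate-ratio helper uses a standard Wald interval on the log scale, not the study's segmented-regression model, and every count other than the overall totals (251 episodes, 360,016 patient days) is invented.

```python
import math

def incidence_rate(events, patient_days, per=10_000):
    """Incidence rate per `per` patient days."""
    return events / patient_days * per

def rate_ratio_ci(e1, pd1, e2, pd2, z=1.96):
    """Incidence rate ratio (group 1 vs group 2) with a Wald 95% CI
    on the log scale: SE(log IRR) = sqrt(1/e1 + 1/e2)."""
    irr = (e1 / pd1) / (e2 / pd2)
    se = math.sqrt(1 / e1 + 1 / e2)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Overall figures from the abstract: 251 episodes over 360,016 patient days
print(round(incidence_rate(251, 360_016), 1))  # -> 7.0

# Invented counts, only to show the IRR helper in use
irr, lo, hi = rate_ratio_ci(50, 10_000, 100, 10_000)
```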
Despite poor adherence to the protocol, CDI incidence declined during the intervention; the reduction emerged approximately 6 months after the probiotic was introduced for primary prevention.
Hospitalized patients with suspected tuberculosis (TB) are placed in airborne isolation until 3 sputum smear samples are negative for acid-fast bacilli (AFB). The Xpert MTB/RIF assay (“Xpert”) nucleic acid amplification test (NAAT) to identify Mycobacterium tuberculosis DNA and resistance to rifampicin is superior to AFB sputum smear microscopy for the diagnosis of TB.
To compare the performance of a single Xpert to AFB smear microscopy for time to airborne infection isolation (AII) discontinuation.
Consecutive patients over 17 years of age in AII for suspected pulmonary TB between October 1, 2014, and March 31, 2016, with leftover respiratory AFB samples were enrolled in this study. A single Xpert was performed on the first available sample. Demographic, clinical, and microbiological data were recorded for each patient. We compared the duration of AII using a single Xpert to AFB smear microscopy under multiple theoretical scenarios using Kaplan-Meier cumulative incidence curves and the log-rank test.
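The isolation-duration comparison above rests on Kaplan-Meier curves. As a minimal pure-Python sketch of the Kaplan-Meier estimator (the toy data below are invented, not the study's records, and the log-rank test is omitted):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor estimate.
    times:  duration (e.g., hours in isolation) for each patient
    events: 1 if the endpoint occurred at that time, 0 if censored
    Returns a list of (t, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = c = 0  # events / censorings at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            s *= 1 - d / at_risk  # product-limit step
            curve.append((t, s))
        at_risk -= d + c
    return curve

# Invented toy data: 4 patients, one censored at t=2
curve = kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1])
```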
In total, 131 samples were included in our performance analysis of the Xpert, and 114 samples were included in our AII analysis. Overall, 81 patients (65%) were immunosuppressed, of whom 46 (37%) were positive for human immunodeficiency virus (HIV). The sensitivity and specificity of Xpert for diagnosis of M. tuberculosis infection were 67% and 100%, respectively. Xpert was negative in all cases of nontuberculous mycobacteria. Use of a single Xpert reduced AII duration from a median of 67 hours per patient to 42 hours with usual reporting, to 26 hours with direct communication, and to 12 hours with immediate testing.
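The reported sensitivity and specificity follow the usual confusion-matrix definitions. A small sketch; the cell counts below are invented to be consistent with the abstract's 67% and 100%, since the actual 2×2 table is not given here:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts consistent with 67% sensitivity, 100% specificity
sens, spec = sens_spec(tp=8, fn=4, tn=119, fp=0)
```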
A single negative Xpert result can reduce AII duration compared to the AFB smear microscopy technique under multiple theoretical scenarios.
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% among Latinos vs. 42% among Whites), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79, 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially of Puerto Rican (vs. Mexican) origin/descent had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
Six radio telescopes were operated as the first southern hemisphere VLBI array in April and May 1982. Observations were made at 2.3 and 8.4 GHz. This array produced VLBI images of 28 southern hemisphere radio sources, high accuracy VLBI geodesy between southern hemisphere sites, and subarcsecond radio astrometry of celestial sources south of declination −45 degrees. This paper discusses only the astrophysical aspects of the experiment.
Objectives: The present study examined differences in neurocognitive outcomes among non-Hispanic Black and White stroke survivors using the NIH Toolbox-Cognition Battery (NIHTB-CB), and investigated the roles of healthcare variables in explaining racial differences in neurocognitive outcomes post-stroke. Methods: One hundred seventy adults (91 Black; 79 White) who participated in a multisite study were included (age: M=56.4; SD=12.6; education: M=13.7; SD=2.5; 50% male; years post-stroke: 1–18; stroke type: 72% ischemic, 28% hemorrhagic). Neurocognitive function was assessed with the NIHTB-CB, using demographically corrected norms. Participants completed measures of socio-demographic characteristics, health literacy, and healthcare use and access. Stroke severity was assessed with the Modified Rankin Scale. Results: An independent samples t test indicated Blacks showed more neurocognitive impairment (NIHTB-CB Fluid Composite T-score: M=37.63; SD=11.67) than Whites (Fluid T-score: M=42.59, SD=11.54; p=.006). This difference remained significant after adjusting for reading level (NIHTB-CB Oral Reading), and when stratified by stroke severity. Blacks also scored lower on health literacy, reported differences in insurance type, and reported decreased confidence in the doctors treating them. Multivariable models adjusting for reading level and injury severity showed that health literacy and insurance type were statistically significant predictors of the Fluid cognitive composite (p<.001 and p=.02, respectively) and significantly mediated racial differences on neurocognitive impairment. Conclusions: We replicated prior work showing that Blacks are at increased risk for poorer neurocognitive outcomes post-stroke than Whites. Health literacy and insurance type might be important modifiable factors influencing these differences. (JINS, 2017, 23, 640–652)
This chapter charts the growth of research interest in the relationship between mental health and terrorism. Johnson and colleagues have begun to identify the seminal research in the field as the frequency of publications increased, allowing dominant and coherent trends of study to emerge. Both broad theoretical advances and focused conceptual refinements are identified and discussed. The authors also seek to identify the broader lacunae in the field and suggest future directions for research. Their results reveal that the dramatic increase in research focusing on terrorism and mental health reached its high points five to ten years after the September 11, 2001, attacks on the World Trade Center in New York and the Pentagon in Washington, DC, and the hijacked plane that crashed in Pennsylvania. The first high point occurred after five years with respect to health and terrorism specifically, while research on terrorism in general continued to rise in the social science literature for another five years before its first major descent was recorded in 2012. The reaction to those attacks themselves dominated much of the research. In addition to ongoing attention to mental health topics and risk behaviors occurring as a result of those attacks, a developing trend toward positive outcomes such as post-traumatic growth (PTG) and especially resilience has been noted.
While the frequency of published research generally has started to wane in the second decade following the September 11 attacks, many of the questions raised by the research in the first decade remain unanswered. Will new tragedies in understudied parts of the world demand a resurgence in research focusing on the association between mental health and terrorism, or will it take another crisis among Western nations (e.g., the current wave of refugees into Europe from non-European Union nations)? With waves of terrorism washing over large regions of the globe, what factors determine who and what topics are drawing the attention of mental health researchers? How would you define resilience in the face of terrorism and what examples of it can you provide? What work remains to be done on resilience to broaden its application to sociology?
Efficient natural dispersal of herbicide-resistance alleles via seed and pollen can markedly accelerate the incidence of herbicide-resistant weed populations across an agroecoregion. Studies were conducted in western Canada in 2014 and 2015 to investigate pollen- and seed-mediated gene flow in kochia. Pollen-mediated gene flow (PMGF) from glyphosate-resistant (GR) to non-GR kochia was quantified in a field trial (hub and spoke design) at Saskatoon, Saskatchewan. Seed-mediated gene flow of acetolactate synthase (ALS) inhibitor-resistant kochia as a function of tumbleweed speed and distance was estimated in cereal stubble fields at Lethbridge, Alberta, and Scott, Saskatchewan. Regression analysis indicated that outcrossing from GR to adjacent non-GR kochia ranged from 5.3 to 7.5%, declining exponentially to 0.1 to 0.4% at 96 m distance. However, PMGF was significantly influenced by prevailing wind direction during pollination (maximum of 11 to 17% outcrossing down-wind). Seed dropped by tumbleweeds varied with distance and plant speed, approaching 90% or more (ca. 100,000 seeds or more) at distances of up to 1,000 m and plant speeds of up to 300 cm s−1. This study highlights the efficient proximal (pollen) and distal (seed) gene movement of this important GR weed.
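The exponential decline in outcrossing with distance can be sketched as f(d) = f0 · exp(−k·d). The parameters below are back-solved from the abstract's approximate endpoints (about 6% adjacent to the pollen source, about 0.25% at 96 m) and are illustrative only, not the study's fitted regression:

```python
import math

def outcrossing(d, f0=6.0, k=math.log(6.0 / 0.25) / 96):
    """Outcrossing (%) at distance d (m) under an exponential-decline
    model f(d) = f0 * exp(-k * d). f0 and k are back-solved from the
    abstract's approximate endpoints and are illustrative only."""
    return f0 * math.exp(-k * d)

print(round(outcrossing(0), 2))   # -> 6.0  (adjacent to source)
print(round(outcrossing(96), 2))  # -> 0.25 (at 96 m)
```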
We say a graph is (Qn,Qm)-saturated if it is a maximal Qm-free subgraph of the n-dimensional hypercube Qn. A graph is said to be (Qn,Qm)-semi-saturated if it is a subgraph of Qn and adding any edge forms a new copy of Qm. The minimum number of edges a (Qn,Qm)-saturated graph (respectively (Qn,Qm)-semi-saturated graph) can have is denoted by sat(Qn,Qm) (respectively s-sat(Qn,Qm)). We prove that
for fixed m, disproving a conjecture of Santolupo that, when m=2, this limit is 1/4. Further, we show by a different method that sat(Qn, Q2) = O(2^n), and that s-sat(Qn, Qm) = O(2^n), for fixed m. We also prove the lower bound
Giant ragweed has been increasing as a major weed of row crops in the last 30 yr, but quantitative data regarding its pattern and mechanisms of spread in crop fields are lacking. To address this gap, we conducted a Web-based survey of certified crop advisors in the U.S. Corn Belt and Ontario, Canada. Participants were asked questions regarding giant ragweed and crop production practices for the county of their choice. Responses were mapped and correlation analyses were conducted among the responses to determine factors associated with giant ragweed populations. Respondents rated giant ragweed as the most or one of the most difficult weeds to manage in 45% of 421 U.S. counties responding, and 57% of responding counties reported giant ragweed populations with herbicide resistance to acetolactate synthase inhibitors, glyphosate, or both herbicides. Results suggest that giant ragweed is increasing in crop fields outward from the east-central U.S. Corn Belt in most directions. Crop production practices associated with giant ragweed populations included minimum tillage, continuous soybean, and multiple-application herbicide programs; ecological factors included giant ragweed presence in noncrop edge habitats, early and prolonged emergence, and presence of the seed-burying common earthworm in crop fields. Managing giant ragweed in noncrop areas could reduce giant ragweed migration from noncrop habitats into crop fields and slow its spread. Where giant ragweed is already established in crop fields, including a more diverse combination of crop species, tillage practices, and herbicide sites of action will be critical to reduce populations, disrupt emergence patterns, and select against herbicide-resistant giant ragweed genotypes. Incorporation of a cereal grain into the crop rotation may help suppress early giant ragweed emergence and provide chemical or mechanical control options for late-emerging giant ragweed.
The Full-sky Astrometric Mapping Explorer (FAME) is designed to perform an all-sky astrometric survey with unprecedented accuracy. It will create a rigid astrometric catalog of 4 × 10^7 stars with 5 < m_V < 15. For bright stars, 5 < m_V < 9, FAME will determine positions and parallaxes accurate to < 50 μas, with proper motion errors < 50 μas/yr. For fainter stars, 9 < m_V < 15, FAME will determine positions and parallaxes accurate to < 500 μas, with proper motion errors < 500 μas/yr. It will also collect photometric data on these 4 × 10^7 stars in four Sloan Digital Sky Survey colors. NASA selected FAME as one of five MIDEX missions funded for a concept study and, in October 1999, selected it for launch in 2004 as the MIDEX-4 mission in its Explorer program.
To determine the effect of graft choice (allograft, bone-patellar tendon-bone autograft, or hamstring autograft) on deep tissue infections following anterior cruciate ligament (ACL) reconstructions.
Retrospective cohort study.
SETTING AND POPULATION
Patients from 6 US health plans who underwent ACL reconstruction from January 1, 2000, through December 31, 2008.
We identified ACL reconstructions and potential postoperative infections using claims data. A hierarchical stratified sampling strategy was used to identify patients for medical record review to confirm ACL reconstructions and to determine allograft vs autograft tissue implanted, clinical characteristics, and infection status. We estimated infection rates overall and by graft type. We used logistic regression to assess the association between infections and patients’ demographic characteristics, comorbidities, and choice of graft.
On review of 1,452 medical records, we found 55 deep wound infections. With correction for sampling weights, infection rates varied by graft type: 0.5% (95% CI, 0.3%–0.8%) with allografts, 0.6% (0.1%–1.5%) with bone-patellar tendon-bone autografts, and 2.5% (1.9%–3.1%) with hamstring autografts. After adjusting for potential confounders, we found an increased infection risk with hamstring autografts compared with allografts (odds ratio, 5.9; 95% CI, 2.8–12.8). However, there was no difference in infection risk between bone-patellar tendon-bone autografts and allografts (odds ratio, 1.2; 95% CI, 0.3–4.8).
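For reference, the unadjusted form of an odds-ratio calculation looks like the sketch below. The study's estimates were adjusted via logistic regression; this crude 2×2 version, with invented counts, only illustrates the arithmetic and the Wald confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table:
    a = exposed with infection,   b = exposed without infection,
    c = unexposed with infection, d = unexposed without infection.
    Wald CI on the log scale: SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts, purely to show the helper in use
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```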
The overall risk of deep wound infection following ACL reconstruction is low but varies by graft type. Infection risk was highest among hamstring autograft recipients compared with recipients of allografts or bone-patellar tendon-bone autografts.
In western Canada, more money is spent on herbicides to control wild oat than to control any other weed, and wild oat resistance to herbicides is the most widespread resistance issue. A direct-seeded field experiment was conducted from 2010 to 2014 at eight Canadian sites to determine crop life cycle, crop species, crop seeding rate, crop usage, and herbicide rate combination effects on wild oat management and canola yield. Combining 2× seeding rates of early-cut barley silage with 2× seeding rates of winter cereals and excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to similar wild oat density, aboveground wild oat biomass, wild oat seed density in the soil, and canola yield as a repeated canola–wheat rotation under a full wild oat herbicide rate regime. Wild oat was similarly well managed after 3 yr of perennial alfalfa without wild oat herbicides. Forgoing wild oat herbicides in only 2 of 5 yr from exclusively summer annual crop rotations resulted in higher wild oat density, biomass, and seed banks. Management systems that effectively combine diverse and optimal cultural practices against weeds, and limit herbicide use, reduce selection pressure for weed resistance to herbicides and prolong the utility of threatened herbicide tools.
The objective of this research was to describe proportional differences across time and region in management practices among southern cotton farmers who experienced glyphosate-resistant (GR) weeds on their farms earlier than those who experienced them later and among farmers who were closest to one of four historical outbreak epicenters: Lauderdale County, TN; Macon County, GA; Edgecombe County, NC; and Terry County, TX. A mail survey was conducted with cotton farmers in 2012 from 13 southern, cotton-producing states. Survey responses on practices used by farmers were classified into three broad categories of labor, mechanical/tillage/chemical (MTC), and cultural. Proportions of respondents using practices from each category were identified by time and region, across which proportional-difference tests were conducted. Results indicated respondents encountering GR weeds earlier were more likely than farmers who experienced them later to use the three broad-category practices (labor, 98 vs. 92%; MTC, 95 vs. 89%; and cultural, 86 vs. 76%) and specific practices, including hooded sprayers (76 vs. 58%), in-season herbicide change (83 vs. 60%), and field-border management (60 vs. 35%). Also, respondents closest to Lauderdale County were more likely than farmers closest to Edgecombe County to use broad-labor practices (99 vs. 91%) and specific practices, including hand hoeing (96 vs. 84%), hand spraying (49 vs. 31%), spot spraying (76 vs. 59%), wick applicator (13 vs. 11%), and field-border management (58 vs. 39%). Education programs on weed management can be developed and tailored according to the time and regional differences to provide effective information and communication channels to farmers.
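A proportional-difference test of the kind described is typically a two-sample z-test on proportions with a pooled standard error. A minimal sketch; the sample sizes below are invented, since the abstract reports only percentages:

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """Two-sample z-test for proportions using the pooled standard error.
    Returns (z statistic, p1, p2)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se, p1, p2

# Invented sample sizes of 100 per group (abstract gives only %)
z, p1, p2 = two_prop_z(98, 100, 92, 100)
```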
Estimating population sizes in the heavily traded grey parrots of West and Central Africa would provide insights into conservation status and sustainability of harvests. Ideally, density estimates would be derived from a standardized method such as distance sampling, but survey efforts are hampered by the extensive ranges, patchy distribution, variable abundance, cryptic habits and high mobility of the parrots as well as by logistical difficulties and limited resources. We carried out line transect distance sampling alongside a simpler encounter rate method at 10 sites across five West and Central African countries. Density estimates were variable across sites, from 0–0.5 individuals km−2 in Côte d'Ivoire and central Democratic Republic of the Congo to c. 30 km−2 in Cameroon and > 70 km−2 on the island of Príncipe. Most significantly, we identified the relationship between densities estimated from distance sampling and simple encounter rates, which has important applications in monitoring grey parrots: (1) to convert records of parrot groups encountered in a day's activities by anti-poaching patrols within protected areas into indicative density estimates, (2) to confirm low density in areas where parrots are so rare that distance sampling is not feasible, and (3) to provide a link between anecdotal records and local density estimates. Encounter rates of less than one parrot group per day of walking are a reality in most forests within the species’ ranges. Densities in these areas are expected to be one individual km−2 or lower, and local harvest should be disallowed on this basis.
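The encounter-rate-to-density relationship described above could be represented as a log-log regression calibrated against distance-sampling estimates. A sketch with placeholder calibration pairs; the paper's actual fitted relationship is not reproduced in the abstract, so the numbers here are invented:

```python
import math

def fit_loglog(rates, densities):
    """Ordinary least-squares fit of log(density) = a + b * log(rate).
    The calibration pairs passed in are placeholders, not the paper's data."""
    xs = [math.log(r) for r in rates]
    ys = [math.log(d) for d in densities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def density_from_rate(rate, a, b):
    """Convert an encounter rate (groups per day) to a density estimate."""
    return math.exp(a + b * math.log(rate))

# Invented calibration: density exactly twice the encounter rate
a, b = fit_loglog([1, 2, 4], [2, 4, 8])
```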
Infants in the neonatal intensive care unit (NICU) are at increased risk for methicillin-resistant Staphylococcus aureus (MRSA) acquisition. Outbreaks may be difficult to identify due in part to limitations in current molecular genotyping available in clinical practice. Comparison of genome-wide single nucleotide polymorphisms (SNPs) may identify epidemiologically distinct isolates among a population sample that appears homogenous when evaluated using conventional typing methods.
To investigate a putative MRSA outbreak in a NICU utilizing whole-genome sequencing and phylogenetic analysis to identify recent transmission events.
Clinical and surveillance specimens collected during clinical care and outbreak investigation.
A total of 17 neonates hospitalized in a 43-bed level III NICU in northeastern Florida from December 2010 to October 2011 were included in this study.
We assessed epidemiological data in conjunction with 4 typing methods: antibiograms, pulsed-field gel electrophoresis (PFGE), spa typing, and phylogenetic analysis of genome-wide SNPs.
Among the 17 USA300 isolates, 4 different spa types were identified. Phylogenetic analysis identified 5 infants as belonging to 2 clusters of epidemiologically linked cases and excluded 10 unlinked cases from putative transmission events. The availability of these results during the initial investigation would have improved infection control interventions.
Whole-genome sequencing and phylogenetic analysis are invaluable tools for epidemic investigation; they identify transmission events and exclude cases mistakenly implicated by traditional typing methods. When routinely applied to surveillance and investigation in the clinical setting, this approach may provide actionable intelligence for measured, appropriate, and effective interventions.
Infect. Control Hosp. Epidemiol. 2015;36(7):777–785