In recent years, soybean acreage has increased significantly in western Canada. One of the challenges associated with growing soybean in western Canada is the control of volunteer glyphosate-resistant (GR) canola, as the majority of soybean cultivars are also glyphosate resistant. The objective of this research was to determine the impact of soybean seeding rate and planting date on competition with volunteer canola. We also sought to determine how high the seeding rate could be raised while remaining economically feasible for producers. Soybean was seeded at five seeding rates (targeted 10, 20, 40, 80, and 160 plants m–2) and three planting dates (targeted mid-May, late May, and early June) at four sites across western Canada in 2014 and 2015. Soybean yield consistently increased with higher seeding rates, while volunteer canola biomass decreased. Planting date produced variable results across site-years. An economic analysis determined that the optimal seeding rate was 40 to 60 plants m–2, depending on market price, while the optimal planting date range was May 20 to June 1.
Increasing fluorination of organosilyl nitrile solvents improves the ionic conductivities of lithium salt electrolytes as a result of greater salt dissociation. Ionic conductivities at 298 K range from 1.5 to 3.2 mS/cm for LiPF6 concentrations of 0.6 or 0.7 M. The authors also report on solvent blend electrolytes in which the fluoroorganosilyl (FOS) nitrile solvent is mixed with ethylene carbonate and diethyl carbonate. Ionic conductivities of the FOS/carbonate blend electrolytes are higher, reaching 5.5–6.3 mS/cm at 298 K, with salt dissociation values ranging from 0.42 to 0.45. Salt dissociation generally decreases with increasing temperature.
The authors report on 7Li, 19F, and 1H pulsed field gradient NMR measurements of 26 organosilyl nitrile solvent-based electrolytes of either lithium bis(trifluoromethanesulfonyl)imide (LiTFSI) or lithium hexafluorophosphate. Lithium transport numbers (as high as 0.50) were measured and are highest in the LiTFSI electrolytes. The authors also report on solvent blend electrolytes of fluoroorganosilyl (FOS) nitrile solvent mixed with ethylene carbonate (EC) and diethyl carbonate. Solvent diffusion measurements on an electrolyte with 6% FOS suggest that both the FOS and the EC solvate the lithium cation. By comparing lithium transport and transference numbers, the authors find less ion pairing in FOS nitrile/carbonate blend electrolytes and difluoroorganosilyl nitrile electrolytes.
Flax yield can be severely reduced by weeds. The combination of limited herbicide options and the spread of herbicide-resistant weeds across the prairies has created a need for more weed control options for flax producers. The objective of this research was to evaluate the tolerance of flax to topramezone, pyroxasulfone, flumioxazin, and fluthiacet-methyl applied alone as well as in mixtures with currently registered herbicides. These herbicides were applied alone and in mixtures at the 1X and 2X rates and compared with three industry standards and one nontreated control. The experiment was conducted at Carman, MB, and Saskatoon, SK, as a randomized complete block with four replications. Data were collected for crop population, crop height, yield, and thousand-seed weight. Ratings for crop damage (phytotoxicity) were also taken at three time intervals: 7 to 14, 21 to 28, and 56+ d after treatment. Crop tolerance to these herbicides varied between site-years, largely because of differences in spring moisture conditions and soil characteristics between sites. Herbicide injury was transient, and no herbicide or herbicide combination consistently reduced crop yield. Flumioxazin was the least promising herbicide evaluated, as it caused severe crop damage (>90%) when environmental conditions were conducive to injury. Overall, flax had excellent tolerance to fluthiacet-methyl, pyroxasulfone, and topramezone, as well as excellent crop safety with the combination of pyroxasulfone + sulfentrazone. However, mixing fluthiacet-methyl with MCPA and topramezone with bromoxynil increased crop damage, and these mixtures would not be recommended.
Despite the significant health benefits of breastfeeding for the mother and the infant, economic class and race disparities in breastfeeding rates persist. Support for breastfeeding from the father of the infant is associated with higher rates of breastfeeding initiation. However, little is known about the factors that may promote or deter father support of breastfeeding, especially among fathers exposed to contextual adversity such as poverty and violence. Using a mixed methods approach, the primary aims of the current work were (1) to elicit, using qualitative methodology, the worries, barriers, and promotive factors for breastfeeding that expectant mothers and fathers identify as they prepare to parent a new infant, and (2) to examine, using quantitative methodology, factors that influence the breastfeeding intentions of both mothers and fathers. A sample (N=95) of expectant, third-trimester mothers and fathers living in a low-income, urban environment in the Midwestern USA was interviewed from October 2013 to February 2015 about their infant feeding intentions. Compared with fathers, mothers more often identified the benefits of breastfeeding for the infant’s health and the economic advantage of breastfeeding. Mothers also identified more personal and community breastfeeding support resources. Fathers viewed their own support of breastfeeding as important but expressed a lack of knowledge about the breastfeeding process and often excluded themselves from discussions about infant feeding. The results point to important targets for interventions that aim to increase breastfeeding initiation rates in vulnerable US populations by increasing father support for breastfeeding.
To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients.
A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods.
A 694-bed teaching hospital.
We administered a multispecies probiotic comprising Lactobacillus acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients within 12 hours of initial antibiotic receipt and continuing through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric, and oncology wards; (2) all recipients of perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis, leukopenia, or posttransplant status. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction. Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression.
The study included 251 CDI episodes among 360,016 patient days during the baseline and intervention periods, and the incidence rate was 7.0 per 10,000 patient days. The incidence rate was similar during baseline and intervention periods (6.9 vs 7.0 per 10,000 patient days; P=.95). However, compared to the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4–0.9; P=.009). Testing intensity remained stable between the baseline and intervention periods: 19% versus 20% of stools tested were C. difficile positive by PCR, respectively. From medical record reviews, only 26% of eligible patients received a probiotic per the protocol.
Despite poor adherence to the protocol, CDI incidence declined during the intervention, although the decline was delayed approximately 6 months after the probiotic was introduced for primary prevention.
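The headline incidence rate follows directly from the counts reported above; a quick back-of-the-envelope check, using only the figures given in the abstract, confirms the 7.0 per 10,000 patient-day estimate:

```python
# Overall CDI incidence reported in the study: 251 episodes over 360,016
# patient days across the combined baseline and intervention periods.
episodes = 251
patient_days = 360_016

# Incidence rate expressed per 10,000 patient days.
rate_per_10k = episodes / patient_days * 10_000
print(round(rate_per_10k, 1))  # → 7.0
```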
Hospitalized patients with suspected tuberculosis (TB) are placed in airborne isolation until 3 sputum smear samples are negative for acid-fast bacilli (AFB). The Xpert MTB/RIF assay (“Xpert”) nucleic acid amplification test (NAAT) to identify Mycobacterium tuberculosis DNA and resistance to rifampicin is superior to AFB sputum smear microscopy for the diagnosis of TB.
To compare the performance of a single Xpert to AFB smear microscopy for time to airborne infection isolation (AII) discontinuation.
Consecutive patients over 17 years of age in AII for suspected pulmonary TB between October 1, 2014, and March 31, 2016, with leftover respiratory AFB samples were enrolled in this study. A single Xpert was performed on the first available sample. Demographic, clinical, and microbiological data were recorded for each patient. We compared the duration of AII using a single Xpert to AFB smear microscopy under multiple theoretical scenarios using Kaplan-Meier cumulative incidence curves and the log-rank test.
In total, 131 samples were included in our performance analysis of the Xpert, and 114 samples were included in our AII analysis. Overall, 81 patients (65%) were immunosuppressed, of whom 46 (37%) were positive for human immunodeficiency virus (HIV). The sensitivity and specificity of Xpert for diagnosis of M. tuberculosis infection were 67% and 100%, respectively. Xpert was negative in all cases of nontuberculous mycobacteria. Use of a single Xpert reduced AII duration from a median of 67 hours per patient to 42 hours with usual reporting, to 26 hours with direct communication, and to 12 hours with immediate testing.
A single negative Xpert result can reduce AII duration compared to the AFB smear microscopy technique under multiple theoretical scenarios.
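Sensitivity and specificity as reported above are simple ratios over confusion-matrix cells. The counts below are a hypothetical split of the 131 samples, chosen only to be consistent with the reported 67% sensitivity and 100% specificity; they are not the study's actual data:

```python
# Hypothetical confusion-matrix counts for a single Xpert test versus the
# reference standard, consistent with the reported 67% sensitivity and
# 100% specificity on 131 samples (illustrative only -- not study data).
tp, fn = 18, 9     # true TB cases: Xpert positive / Xpert negative
tn, fp = 104, 0    # non-TB cases: Xpert negative / Xpert positive

sensitivity = tp / (tp + fn)   # proportion of true cases detected
specificity = tn / (tn + fp)   # proportion of non-cases correctly negative
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```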
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group, age: M=42.65 years, SD=8.93; 86% male; education: M=13.17 years, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education, and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%), and domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially those of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
Six radio telescopes were operated as the first southern hemisphere VLBI array in April and May 1982. Observations were made at 2.3 and 8.4 GHz. This array produced VLBI images of 28 southern hemisphere radio sources, high-accuracy VLBI geodesy between southern hemisphere sites, and subarcsecond radio astrometry of celestial sources south of declination −45 degrees. This paper discusses only the astrophysical aspects of the experiment.
Objectives: The present study examined differences in neurocognitive outcomes among non-Hispanic Black and White stroke survivors using the NIH Toolbox-Cognition Battery (NIHTB-CB), and investigated the roles of healthcare variables in explaining racial differences in neurocognitive outcomes post-stroke. Methods: One hundred seventy adults (91 Black; 79 White) who participated in a multisite study were included (age: M=56.4, SD=12.6; education: M=13.7 years, SD=2.5; 50% male; years post-stroke: 1–18; stroke type: 72% ischemic, 28% hemorrhagic). Neurocognitive function was assessed with the NIHTB-CB, using demographically corrected norms. Participants completed measures of socio-demographic characteristics, health literacy, and healthcare use and access. Stroke severity was assessed with the Modified Rankin Scale. Results: An independent samples t test indicated that Blacks showed more neurocognitive impairment (NIHTB-CB Fluid Composite T-score: M=37.63, SD=11.67) than Whites (Fluid T-score: M=42.59, SD=11.54; p=.006). This difference remained significant after adjusting for reading level (NIHTB-CB Oral Reading), and when stratified by stroke severity. Blacks also scored lower on health literacy, reported differences in insurance type, and reported decreased confidence in the doctors treating them. Multivariable models adjusting for reading level and injury severity showed that health literacy and insurance type were statistically significant predictors of the Fluid cognitive composite (p<.001 and p=.02, respectively) and significantly mediated racial differences in neurocognitive impairment. Conclusions: We replicated prior work showing that Blacks are at increased risk for poorer neurocognitive outcomes post-stroke than Whites. Health literacy and insurance type might be important modifiable factors influencing these differences. (JINS, 2017, 23, 640–652)
This chapter investigates the growth of research interest in the relationship between mental health and terrorism. In addition, Johnson and colleagues have begun to identify the seminal research in the field as the frequency of publications increased, allowing dominant and coherent trends of study to emerge. Both broad theoretical advances and focused conceptual refinements have been identified and discussed. The authors have also sought to identify the broader lacunae in the field and suggest future directions for research. Their results reveal that the dramatic increase in research focusing on the topics of terrorism and mental health reached its high points five to ten years after the September 11, 2001, attacks on the World Trade Center in New York and the Pentagon in Washington, DC, and the hijacked plane that crashed in Pennsylvania. The first high point occurred after five years with respect to health and terrorism specifically, while research on terrorism in general continued to rise in the social science literature for another five years before its first major descent was recorded in 2012. The reaction to those attacks themselves dominated much of the research. In addition to ongoing attention to mental health topics and risk behaviors occurring as a result of those attacks, a developing trend toward positive outcomes such as post-traumatic growth (PTG) and especially resilience has been noted.
While the frequency of published research generally has started to wane in the second decade following the September 11 attacks, many of the questions raised by the research in the first decade remain unanswered. Will new tragedies in understudied parts of the world demand a resurgence in research focusing on the association between mental health and terrorism, or will it take another crisis among Western nations (e.g., the current wave of refugees into Europe from non-European Union nations)? With waves of terrorism washing over large regions of the globe, what factors determine who and what topics are drawing the attention of mental health researchers? How would you define resilience in the face of terrorism, and what examples of it can you provide? What work remains to be done on resilience to broaden its application to sociology?
Efficient natural dispersal of herbicide-resistance alleles via seed and pollen can markedly accelerate the spread of herbicide-resistant weed populations across an agroecoregion. Studies were conducted in western Canada in 2014 and 2015 to investigate pollen- and seed-mediated gene flow in kochia. Pollen-mediated gene flow (PMGF) from glyphosate-resistant (GR) to non-GR kochia was quantified in a field trial (hub and spoke design) at Saskatoon, Saskatchewan. Seed-mediated gene flow of acetolactate synthase (ALS) inhibitor-resistant kochia as a function of tumbleweed speed and distance was estimated in cereal stubble fields at Lethbridge, Alberta, and Scott, Saskatchewan. Regression analysis indicated that outcrossing from GR to adjacent non-GR kochia ranged from 5.3 to 7.5%, declining exponentially to 0.1 to 0.4% at 96 m distance. However, PMGF was significantly influenced by prevailing wind direction during pollination (maximum of 11 to 17% outcrossing downwind). Seed dropped by tumbleweeds varied with distance and plant speed, approaching 90% or more (ca. 100,000 seeds or more) at distances of up to 1,000 m and plant speeds of up to 300 cm s–1. This study highlights the efficient proximal (pollen) and distal (seed) gene movement of this important GR weed.
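The exponential decline in outcrossing with distance can be illustrated with a simple two-point fit of the model y = a·e^(−kd). The adjacent-plant and 96 m values below are hypothetical points chosen from within the ranges reported above, not the study's raw regression data:

```python
import math

# Hypothetical outcrossing observations, chosen from within the ranges
# reported in the abstract (5.3-7.5% adjacent; 0.1-0.4% at 96 m).
d0, y0 = 0.0, 6.4    # distance (m), outcrossing (%) next to the GR source
d1, y1 = 96.0, 0.25  # outcrossing (%) at 96 m

# For y = a * exp(-k * d), two points determine both parameters.
k = math.log(y0 / y1) / (d1 - d0)  # decay constant, per metre
a = y0 * math.exp(k * d0)          # fitted outcrossing at distance zero

# Predicted outcrossing at an intermediate distance, e.g. 50 m.
y50 = a * math.exp(-k * 50)
print(f"k = {k:.4f} per m, predicted outcrossing at 50 m: {y50:.2f}%")
```

With these illustrative endpoints the decay constant is about 0.034 per metre, so outcrossing roughly halves every 20 m.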
We say a graph is (Qn,Qm)-saturated if it is a maximal Qm-free subgraph of the n-dimensional hypercube Qn. A graph is said to be (Qn,Qm)-semi-saturated if it is a subgraph of Qn and adding any edge forms a new copy of Qm. The minimum number of edges a (Qn,Qm)-saturated graph (respectively (Qn,Qm)-semi-saturated graph) can have is denoted by sat(Qn,Qm) (respectively s-sat(Qn,Qm)). We prove that

lim_{n→∞} sat(Qn,Qm)/e(Qn) = 0

for fixed m, where e(Qn) = n·2^{n−1} is the number of edges of Qn, disproving a conjecture of Santolupo that, when m=2, this limit is 1/4. Further, we show by a different method that sat(Qn,Q2)=O(2^n), and that s-sat(Qn,Qm)=O(2^n), for fixed m. We also prove the lower bound
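The definitions above can be made concrete in the smallest nontrivial case. The sketch below brute-forces sat(Q3,Q2) by enumerating edge subsets of the 3-cube and testing the saturation condition directly; it is an illustration of the definitions (copies of Q2 in Q3 are exactly the six 2-dimensional faces), not part of the paper's proof:

```python
from itertools import combinations

# Vertices of Q3 are the integers 0..7 read as 3-bit strings;
# edges join vertices differing in exactly one bit.
verts = range(8)
edges = sorted({tuple(sorted((v, v ^ (1 << b)))) for v in verts for b in range(3)})

# Copies of Q2 inside Q3 are exactly the six 2-dimensional faces
# (fix one coordinate, vary the other two).
faces = []
for fixed_bit in range(3):
    for val in (0, 1):
        fv = [v for v in verts if (v >> fixed_bit) & 1 == val]
        faces.append([e for e in edges if e[0] in fv and e[1] in fv])

def is_saturated(edge_set):
    es = set(edge_set)
    # Q2-free: no face has all four of its edges present.
    if any(all(e in es for e in f) for f in faces):
        return False
    # Maximal: adding any missing edge must complete some face.
    for e in edges:
        if e not in es and not any(
            e in f and all(x in es for x in f if x != e) for f in faces
        ):
            return False
    return True

# The smallest k admitting a (Q3,Q2)-saturated subgraph is sat(Q3,Q2).
sat_q3_q2 = next(
    k for k in range(len(edges) + 1)
    if any(is_saturated(c) for c in combinations(edges, k))
)
print(sat_q3_q2)
```

A short counting argument agrees with the brute force: each missing edge needs a "private" face with its other three edges present, and every face must miss at least one edge, which forces at least 8 of the 12 edges to be kept.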
Giant ragweed has been increasing as a major weed of row crops in the last 30 yr, but quantitative data regarding its pattern and mechanisms of spread in crop fields are lacking. To address this gap, we conducted a Web-based survey of certified crop advisors in the U.S. Corn Belt and Ontario, Canada. Participants were asked questions regarding giant ragweed and crop production practices for the county of their choice. Responses were mapped and correlation analyses were conducted among the responses to determine factors associated with giant ragweed populations. Respondents rated giant ragweed as the most or one of the most difficult weeds to manage in 45% of 421 U.S. counties responding, and 57% of responding counties reported giant ragweed populations with herbicide resistance to acetolactate synthase inhibitors, glyphosate, or both herbicides. Results suggest that giant ragweed is increasing in crop fields outward from the east-central U.S. Corn Belt in most directions. Crop production practices associated with giant ragweed populations included minimum tillage, continuous soybean, and multiple-application herbicide programs; ecological factors included giant ragweed presence in noncrop edge habitats, early and prolonged emergence, and presence of the seed-burying common earthworm in crop fields. Managing giant ragweed in noncrop areas could reduce giant ragweed migration from noncrop habitats into crop fields and slow its spread. Where giant ragweed is already established in crop fields, including a more diverse combination of crop species, tillage practices, and herbicide sites of action will be critical to reduce populations, disrupt emergence patterns, and select against herbicide-resistant giant ragweed genotypes. Incorporation of a cereal grain into the crop rotation may help suppress early giant ragweed emergence and provide chemical or mechanical control options for late-emerging giant ragweed.
The Full-sky Astrometric Mapping Explorer (FAME) is designed to perform an all-sky astrometric survey with unprecedented accuracy. It will create a rigid astrometric catalog of 4 × 10^7 stars with 5 < mV < 15. For bright stars, 5 < mV < 9, FAME will determine positions and parallaxes accurate to < 50 μas, with proper motion errors < 50 μas/yr. For fainter stars, 9 < mV < 15, FAME will determine positions and parallaxes accurate to < 500 μas, with proper motion errors < 500 μas/yr. It will also collect photometric data on these 4 × 10^7 stars in four Sloan Digital Sky Survey colors. NASA selected FAME as one of five MIDEX missions funded for a concept study. In October 1999, NASA selected FAME for launch in 2004 as the MIDEX-4 mission in its Explorer program.
To determine the effect of graft choice (allograft, bone-patellar tendon-bone autograft, or hamstring autograft) on deep tissue infections following anterior cruciate ligament (ACL) reconstructions.
Retrospective cohort study.
Patients from 6 US health plans who underwent ACL reconstruction from January 1, 2000, through December 31, 2008.
We identified ACL reconstructions and potential postoperative infections using claims data. A hierarchical stratified sampling strategy was used to identify patients for medical record review to confirm ACL reconstructions and to determine allograft vs autograft tissue implanted, clinical characteristics, and infection status. We estimated infection rates overall and by graft type. We used logistic regression to assess the association between infections and patients’ demographic characteristics, comorbidities, and choice of graft.
On review of 1,452 medical records, we found 55 deep wound infections. With correction for sampling weights, infection rates varied by graft type: 0.5% (95% CI, 0.3%–0.8%) with allografts, 0.6% (0.1%–1.5%) with bone-patellar tendon-bone autografts, and 2.5% (1.9%–3.1%) with hamstring autografts. After adjusting for potential confounders, we found an increased infection risk with hamstring autografts compared with allografts (odds ratio, 5.9; 95% CI, 2.8–12.8). However, there was no difference in infection risk between bone-patellar tendon-bone autografts and allografts (odds ratio, 1.2; 95% CI, 0.3–4.8).
The overall risk for deep wound infections following ACL reconstruction is low but varies by graft type. Infection risk was higher in hamstring autograft recipients than in allograft or bone-patellar tendon-bone autograft recipients.
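The adjusted odds ratios above come from logistic regression, but the unadjusted quantity is simple to compute from a 2×2 table. The counts below are hypothetical, chosen only to illustrate the calculation; they are not the study's data:

```python
# Hypothetical 2x2 table: infections vs. no infections by graft type.
# (Illustrative counts only -- not the study's actual data.)
hamstring_infected, hamstring_clean = 25, 975
allograft_infected, allograft_clean = 5, 995

# Unadjusted odds ratio: odds of infection with a hamstring autograft
# relative to the odds with an allograft.
odds_hamstring = hamstring_infected / hamstring_clean
odds_allograft = allograft_infected / allograft_clean
odds_ratio = odds_hamstring / odds_allograft
print(round(odds_ratio, 2))  # → 5.1
```

An adjusted model would shift this estimate by accounting for confounders such as age and comorbidities, which is why the study reports regression-based odds ratios rather than raw table ratios.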