In 2013, the national surveillance case definition for West Nile virus (WNV) disease was revised to remove fever as a criterion for neuroinvasive disease and to require only subjective fever for non-neuroinvasive disease. The aims of this project were to determine how often afebrile WNV disease occurs and to assess differences between patients with and without fever. We included cases with laboratory evidence of WNV disease reported from four states in 2014. We compared demographics, clinical symptoms and laboratory evidence for patients with and without fever and stratified the analysis by neuroinvasive and non-neuroinvasive presentations. Among 956 included patients, 39 (4%) had no fever; this proportion was similar among patients with and without neuroinvasive disease symptoms. For neuroinvasive and non-neuroinvasive patients, there were no differences in age, sex, or laboratory evidence between febrile and afebrile patients, but hospitalisations were more common among patients with fever (P < 0.01). The only significant difference in symptoms was for ataxia, which was more common in neuroinvasive patients without fever (P = 0.04). Only 5% of non-neuroinvasive patients did not meet the WNV case definition due to lack of fever. The evidence presented here supports the changes made to the national case definition in 2013.
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency and establishment of the CBRNE medical operations science support expert role inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Introduction: Intravenous insertion (IVI) is identified by children as extremely painful and the resultant distress can have lasting negative consequences. There is an urgent need to effectively manage such procedures. Our primary objective was to compare the pain and distress of IVI with the addition of humanoid robot-based distraction to standard care, versus standard care alone. Methods: This two-armed randomized controlled trial (RCT) was conducted from April 2017 to May 2018 at the Stollery Children's Hospital emergency department (ED). Children aged 6 to 11 years who required IVI were included. Exclusion criteria included hearing or visual impairments, neurocognitive delays, sensory impairment to pain, previous enrolment, and discretion of the ED clinical staff. Primary outcomes were measured using the Observational Scale of Behavioural Distress-Revised (OSBD-R) (distress) and the Faces Pain Scale-Revised (FPS-R) (pain). A total of 426 pediatric patients were screened and 340 were excluded. Results: We recruited 86 children, of whom 55% (47/86) were male; 9% (7/82) were premature at birth; 82% (67/82) had a previous ED visit; 30% (25/82) required previous hospitalization; 78% (64/82) had previous IV placement and 96% (78/81) received topical anesthesia. The mean total OSBD-R score was 1.49 ± 2.36 (standard care) compared to 0.78 ± 1.32 (robot group) (p = 0.047). The median FPS-R during the IV procedure was 4 (IQR 2,6) in the standard care group alone, compared to 2 (IQR 0,4) with the addition of humanoid robot-based distraction (p = 0.10). Change in parental state anxiety pre-procedure versus post-procedure was not significantly different between groups (p = 0.49). Parental satisfaction with the IV start was 93% (39/42) in the robot arm compared to 74% (29/39) in the standard care arm (p = 0.03).
Parents were also more satisfied with management of their child's pain in the robot group (95% very satisfied) compared with standard care (72% very satisfied) (p = 0.002). Conclusion: A statistically significant reduction in distress was observed with the addition of robot-based distraction to standard care. Humanoid robot-based distraction therapy reduces distress and, to a lesser extent, pain in children undergoing IVI in the ED. Further trials are required to confirm utility in other age groups and settings.
Background: Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist and a recommended first-line treatment for opioid use disorder (OUD). Emergency departments (EDs) are a key point of contact with the healthcare system for patients living with OUD. Aim Statement: We implemented a multi-disciplinary quality improvement project to screen patients for OUD, initiate bup/nal for eligible individuals, and provide rapid next business day walk-in referrals to addiction clinics in the community. Measures & Design: From May to September 2018, our team worked with three ED sites and three addiction clinics to pilot the program. Implementation involved alignment with regulatory requirements, physician education, coordination with pharmacy to ensure in-ED medication access, and nurse education. The project is supported by a full-time project manager, data analyst, operations leaders, physician champions, provincial pharmacy, and the Emergency Strategic Clinical Network leadership team. For our pilot, our evaluation objective was to determine the degree to which our initiation and referral pathway was being utilized. We used administrative data to track the number of patients given bup/nal in ED, their demographics and whether they continued to fill bup/nal prescriptions 30 days after their ED visit. Addiction clinics reported both the number of patients referred to them and the number of patients attending their referral. Evaluation/Results: Administrative data show 568 opioid-related visits to ED pilot sites during the pilot phase. Bup/nal was given to 60 unique patients in the ED during 66 unique visits. There were 32 (53%) male patients and 28 (47%) female patients. Median patient age was 34 (range: 21 to 79). ED visits where bup/nal was given had a median length of stay of 6 hours 57 minutes (IQR: 6 hours 20 minutes) and Canadian Triage and Acuity Scale scores as follows: Level 1 – 1 (2%), Level 2 – 21 (32%), Level 3 – 32 (48%), Level 4 – 11 (17%), Level 5 – 1 (2%).
Of these visits, 51 (77%) led to discharge. Of the discharged patients given bup/nal in the ED, 24 (47%) continued to fill bup/nal prescriptions 30 days after their index ED visit. EDs also referred 37 patients with OUD to the 3 community clinics, and 16 of those individuals (43%) attended their first follow-up appointment. Discussion/Impact: Our pilot project demonstrates that with dedicated resources and broad institutional support, ED patients with OUD can be appropriately initiated on bup/nal and referred to community care.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: A total of 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased likelihood of SA. SA participants reported greater independence in everyday functioning, higher rates of employment, and better health-related quality of life than non-SA participants. Conclusions: Despite combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH.
SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
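The three-way SA/CN/CI grouping described above can be sketched as a simple decision rule. The function below is a toy illustration only: the T-score scale, the cutoffs, and the two-step comparison (first against 25-year-old norms, then against own-age norms) are hypothetical stand-ins, not the study's actual scoring procedure.

```python
def classify_neurocognitive_status(t_score_vs_age25, t_score_vs_own_age,
                                   impairment_cutoff=40.0):
    """Toy decision rule mirroring the abstract's SA/CN/CI grouping.

    t_score_vs_age25: demographically corrected global T-score normed
        against 25-year-olds (hypothetical scale: mean 50, SD 10).
    t_score_vs_own_age: the same score normed against the person's
        actual age.
    """
    if t_score_vs_age25 >= 50.0:
        # Performs within normal range for a 25-year-old -> SuperAger
        return "SA"
    if t_score_vs_own_age >= impairment_cutoff:
        # Normal for actual age, but not youthful -> cognitively normal
        return "CN"
    # Below the impairment cutoff even for actual age -> impaired
    return "CI"

# Hypothetical participants
print(classify_neurocognitive_status(53.0, 55.0))  # SA
print(classify_neurocognitive_status(45.0, 48.0))  # CN
print(classify_neurocognitive_status(38.0, 36.0))  # CI
```

The key design point the rule captures is that SA is defined against a fixed youthful reference, while CN versus CI is judged against age-expected norms.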
The increased use of insecticide seed treatments in rice has raised many questions about the potential benefits of these products. In 2014 and 2015, a field experiment was conducted near Stuttgart and Lonoke, AR, to evaluate whether an insecticide seed treatment could lessen injury from acetolactate synthase (ALS)–inhibiting herbicides in imidazolinone-resistant (IR) rice. Two IR cultivars were tested (a hybrid, ‘CLXL745’, and an inbred, ‘CL152’), with and without an insecticide seed treatment (thiamethoxam). Four different herbicide combinations were evaluated: a nontreated control, two applications of bispyribac-sodium (hereafter bispyribac), two applications of imazethapyr, and two applications of imazethapyr plus bispyribac. The first herbicide application was to two- to three-leaf rice, and the second immediately prior to flooding (one- to two-tiller). At both 2 and 4 wk after final treatment (WAFT), the sequential applications of imazethapyr or bispyribac plus imazethapyr were more injurious to CLXL745 than CL152. This increased injury led to decreased groundcover 3 WAFT. Rice treated with thiamethoxam was less injured than nontreated rice and had improved groundcover and greater canopy heights. Even with up to 32% injury, the rice plants recovered by the end of the growing season, and yields within a cultivar were similar with and without a thiamethoxam seed treatment across all herbicide treatments. Based on these results, thiamethoxam can partially protect rice from injury caused by ALS-inhibiting herbicides as well as increase groundcover and canopy height; however, the injury to rice never negatively affected yield.
With the recent discovery of a dozen dusty star-forming galaxies and around 30 quasars at z > 5 that are hyper-luminous in the infrared (μL_IR > 10¹³ L⊙, where μ is a lensing magnification factor), the possibility has opened up for SPICA, the proposed ESA M5 mid-/far-infrared mission, to extend its spectroscopic studies toward the epoch of reionisation and beyond. In this paper, we examine the feasibility and scientific potential of such observations with SPICA’s far-infrared spectrometer SAFARI, which will probe a spectral range (35–230 μm) that will be unexplored by ALMA and JWST. Our simulations show that SAFARI is capable of delivering good-quality spectra for hyper-luminous infrared galaxies at z = 5–10, allowing us to sample spectral features in the rest-frame mid-infrared and to investigate a host of key scientific issues, such as the relative importance of star formation versus AGN, the hardness of the radiation field, the level of chemical enrichment, and the properties of the molecular gas. From a broader perspective, SAFARI offers the potential to open up a new frontier in the study of the early Universe, providing access to uniquely powerful spectral features for probing first-generation objects, such as the key cooling lines of low-metallicity or metal-free forming galaxies (fine-structure and H2 lines) and emission features of solid compounds freshly synthesised by Population III supernovae. Ultimately, SAFARI’s ability to explore the high-redshift Universe will be determined by the availability of sufficiently bright targets (whether intrinsically luminous or gravitationally lensed). With its launch expected around 2030, SPICA is ideally positioned to take full advantage of upcoming wide-field surveys such as LSST, SKA, Euclid, and WFIRST, which are likely to provide extraordinary targets for SAFARI.
The north-west European population of Bewick’s Swan Cygnus columbianus bewickii declined by 38% between 1995 and 2010 and is listed as ‘Endangered’ on the European Red List of birds. Here, we combined information on food resources within the landscape with long-term data on swan numbers, habitat use, behaviour and two complementary measures of body condition, to examine whether changes in food type and availability have influenced the Bewick’s Swan’s use of their main wintering site in the UK, the Ouse Washes and surrounding fens. The maximum number of Bewick’s Swans rose from 620 in winter 1958/59 to a high of 7,491 in winter 2004/05, before falling to 1,073 birds in winter 2013/14. Between winters 1958/59 and 2014/15, the Ouse Washes supported between 0.5% and 37.9% of the total population wintering in north-west Europe (mean ± 95% CI = 18.1 ± 2.4%). Swans fed on agricultural crops, shifting from post-harvest remains of root crops (e.g. sugar beet and potatoes) in November and December to winter-sown cereals (e.g. wheat) in January and February. Inter-annual variation in the area cultivated for these crops did not result in changes in the peak numbers of swans occurring on the Ouse Washes. Behavioural and body condition data indicated that food supplies on the Ouse Washes and surrounding fens remain adequate to allow the birds to gain and maintain good body condition throughout winter with no increase in foraging effort. Our findings suggest that the recent decline in numbers of Bewick’s Swans at this internationally important site was not linked to inadequate food resources.
In 2017, we surveyed 101 SHEA Research Network hospitals regarding Legionnaires’ disease (LD). Of 29 respondents, 94% have or are developing a water management plan with varying characteristics and personnel engaged. Most LD diagnostic testing is limited to urine antigen testing. Many opportunities to improve LD prevention and diagnosis exist.
Each year there are multiple reports of drift occurrences, and the majority of drift complaints in rice are from imazethapyr or glyphosate. In 2014 and 2015, multiple field experiments were conducted near Stuttgart, AR, and near Lonoke, AR, to evaluate whether insecticide seed treatments would reduce injury from glyphosate or imazethapyr drift or decrease the recovery time following exposure to a low rate of these herbicides. Study I was referred to as the “seed treatment study,” and Study II was the “drift timing study.” In the seed treatment study the conventional rice cultivar ‘Roy J’ was planted, and herbicide treatments included imazethapyr at 10.5 g ai ha–1, glyphosate at 126 g ae ha–1, or no herbicide. Each plot had either a seed treatment of thiamethoxam, clothianidin, chlorantraniliprole, or no insecticide seed treatment. The herbicides were applied at the two- to three-leaf growth stage. Crop injury was assessed 1, 3, and 5 wk after application. Averaged over site-years, thiamethoxam-treated rice had less injury than rice with no insecticide seed treatment at each rating, along with an increased yield. Clothianidin-treated rice had an increased yield over no insecticide seed treatment, but the reduction in injury for both herbicides was less pronounced than in the thiamethoxam-treated plots. Overall, chlorantraniliprole was generally the least effective of the three insecticides in reducing injury from either herbicide and in protecting rice yield potential. A second experiment conducted at Stuttgart, AR, was meant to determine whether damage to rice from glyphosate and imazethapyr was influenced by the timing (15, 30, and 45 d after planting) of exposure to herbicides for thiamethoxam-treated and nontreated rice. There was an overall reduction in injury with the use of thiamethoxam, but the reduction in injury was not dependent on the timing of the drift event. 
Reduced damage from physical drift of glyphosate and imazethapyr, as well as increased yields relative to rice grown without an insecticide seed treatment, appears to be an added benefit of these seed treatments.
Three-dimensional (3D) printing technology is a promising method for bone tissue engineering applications. For enhanced bone regeneration, it is important to have printable ink materials with appealing properties such as construct interconnectivity, mechanical strength, controlled degradation rates, and the presence of bioactive materials. In this respect, we develop a composite ink composed of polycaprolactone (PCL), poly(D,L-lactide-co-glycolide) (PLGA), and hydroxyapatite particles (HAps) and 3D print it into porous constructs. An in vitro study revealed that the composite constructs had higher mechanical strength, greater surface roughness, a quicker degradation profile, and enhanced cellular behaviors compared with PCL-only counterparts. Furthermore, in vivo results showed that 3D-printed composite constructs had a positive influence on bone regeneration due to the presence of newly formed mineralized bone tissue and blood vessel formation. Therefore, 3D printable ink made of PCL/PLGA/HAp can be a highly useful material for 3D printing of bone tissue constructs.
Introduction: TREKK is a national knowledge mobilization network of clinicians, researchers and parents aimed at improving emergency care for children by increasing collaborations between general and pediatric emergency departments (ED). This study aimed to determine patterns of knowledge sharing within the network and identify connections, barriers and opportunities to obtaining pediatric information and training. Methods: Social network analysis (SNA) uses network theory to understand patterns of interaction. Two SNAs were conducted in 2014 and 2015 using an online network survey distributed to 37 general EDs. Data were analyzed using UCINET and NetDraw to identify connections, knowledge sharing and knowledge brokers within the network. Building on these results, we then conducted 22 semi-structured follow-up interviews (2016) with healthcare professionals (HCPs) at General EDs across Canada, purposefully sampled to include individuals from connected and disconnected sites, as identified in the SNA. Interviews were analyzed by 2 reviewers using content and thematic analysis. Results: SNA data were analyzed for 135 participants across the network. Results from 2014 showed that the network was divided along provincial lines, with most individuals connecting with colleagues within their own institution. Results from 2015 showed more inter-site interconnectivity and a reduction in isolated sites over time from 17 to 3. Interview participants included physicians (59%) and nurses (41%) from 18 general EDs in urban (68%) and rural/remote (32%) Canada. HCPs sought information both formally and informally, by using guidelines, talking to colleagues, and attending pediatric related training sessions. Network structure and processes were felt to increase connections, support practice change, and promote standards of care.
Participants identified personal, organizational and system-level barriers to information and skill acquisition, including resources and personal costs, geography, dissemination, and time. Providing easy access to information at the point of care was promoted through enhancing content visibility and by embedding resources into local systems. There remains a need to share successful methods of local dissemination and implementation across the network, and to leverage local professional champions such as clinical nurse liaisons. Conclusion: This study highlights the power of a network to increase connections between HCPs working in general and pediatric EDs. Findings reinforce the critical role of ongoing network evaluation to improve the design and delivery of knowledge mobilization initiatives.
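The abstract's network analysis was done in UCINET and NetDraw; as a rough illustration of the underlying idea, the stdlib-only sketch below tallies knowledge-sharing ties per site and flags isolated sites from an edge list. The site names and ties are invented for illustration and do not come from the study's data.

```python
from collections import defaultdict

def degree_and_isolates(sites, ties):
    """Count knowledge-sharing ties per site and list sites with none."""
    degree = defaultdict(int)
    for a, b in ties:
        # Treat each reported tie as undirected: both ends gain a connection
        degree[a] += 1
        degree[b] += 1
    # A site never appearing in any tie is an isolate (degree 0)
    isolates = [s for s in sites if degree[s] == 0]
    return dict(degree), isolates

# Hypothetical network: four general EDs, three reported ties
sites = ["ED_A", "ED_B", "ED_C", "ED_D"]
ties = [("ED_A", "ED_B"), ("ED_A", "ED_C"), ("ED_B", "ED_C")]
degree, isolates = degree_and_isolates(sites, ties)
print(degree)    # per-site tie counts
print(isolates)  # ED_D has no ties
```

The same tally, run on the 2014 and 2015 surveys separately, is what lets an analysis report a drop in isolated sites over time, as the abstract describes (17 down to 3).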
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12 month mental disorders in 138 801 participants aged 18–100 derived from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity levels. Satisfaction with conventional care was also compared with CAM contact satisfaction.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, which was two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable for different disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% for severe mood disorders, 16.2% for severe anxiety disorders and 22.5% for severe behavioural disorders. Satisfaction with care was comparable with respect to CAM contacts (78.3%) and conventional care (75.6%) in persons that received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary but are in contrast to suggestions that this concerns persons with only mild, transient complaints. There was no indication that persons were less satisfied by CAM visits than by receiving conventional care. We encourage health care professionals in conventional settings to openly discuss the care patients are receiving, whether conventional or not, and their reasons for doing so.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
Previous research has shown that some insecticide seed treatments provide safening effects in rice following exposure to low rates of the herbicides glyphosate and imazethapyr. However, no research has been conducted to determine whether a similar effect may be seen in soybean or grain sorghum, two important rotational crops across the Midsouth. To evaluate the potential safening effects of insecticide seed treatments in these two crops, field trials were conducted in Marianna, AR, in 2015 and 2016, and near Colt, AR, in 2016. In soybean, glyphosate, glufosinate, 2,4-D, dicamba, halosulfuron, mesotrione, tembotrione, and propanil were applied at low rates to simulate drift events, in combination with the insecticide seed treatments thiamethoxam and clothianidin at labeled rates. In grain sorghum, glyphosate, imazethapyr, and quizalofop were applied at low rates in combination with the insecticide seed treatments thiamethoxam, clothianidin, and imidacloprid at labeled rates. Injury reduction was seen at 1 site-year for glyphosate, glufosinate, 2,4-D, dicamba, mesotrione, and tembotrione, and at 2 of 3 site-years for halosulfuron. At 1 site-year, the safening in halosulfuron resulted in increases in both crop height and yield. In grain sorghum, reducing injury via seed treatments was generally more successful. All three herbicides applied in sorghum displayed instances of injury reduction when seed treatments were used at 1 or more site-years, including reducing injury upward of 40% in the case of quizalofop+clothianidin at Marianna in 2016. For 2 site-years, injury reduction through the use of insecticides resulted in increases in crop height and grain yield in grain sorghum compared with no insecticide use. Although the degree of safening seen varied depending on site-year in both crops, growers who use insecticide seed treatments on an annual basis may expect to see a safening effect from drift events of most herbicides evaluated in both soybean and grain sorghum.
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest, but nonetheless stronger, association of education than of income with treatment raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
Florpyrauxifen-benzyl is a new herbicide under development in rice that will provide an alternative mode of action to control barnyardgrass. Multiple greenhouse experiments were conducted to evaluate florpyrauxifen-benzyl efficacy on barnyardgrass accessions collected from rice fields across Arkansas and on herbicide-resistant biotypes. In one experiment, florpyrauxifen-benzyl was applied at the labeled rate of 30 g ai ha−1 to 152 barnyardgrass accessions collected from 21 Arkansas counties. Florpyrauxifen-benzyl at 30 g ai ha−1 effectively controlled barnyardgrass and subsequently reduced plant height and aboveground biomass. In a dose-response experiment, susceptible-, acetolactate synthase (ALS)-, propanil-, and quinclorac-resistant barnyardgrass biotypes were subjected to nine rates of florpyrauxifen-benzyl ranging from 0 to 120 g ai ha−1. The effective dose required to provide 90% control, plant height reduction, and biomass reduction of the susceptible and resistant biotypes fell below the anticipated labeled rate of 30 g ai ha−1. Based on these results, quinclorac-resistant barnyardgrass as well as other resistant biotypes can be controlled with florpyrauxifen-benzyl at 30 g ai ha−1. Overall, results from these studies indicate that florpyrauxifen-benzyl can be an effective tool for controlling susceptible and currently existing herbicide-resistant barnyardgrass biotypes in rice. Additionally, the unique auxin chemistry of florpyrauxifen-benzyl will introduce an alternative mechanism of action in rice weed control, thus serving as a herbicide-resistance management tool.
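Effective doses in dose-response experiments like the one above are commonly estimated by fitting a log-logistic curve and inverting it. The sketch below shows that inversion for a three-parameter log-logistic model y(x) = upper / (1 + (ED50/x)^slope); the parameter values used are hypothetical illustrations, not values fitted to the study's data.

```python
import math

def ed_from_log_logistic(target, upper, ed50, slope):
    """Dose giving `target` response for y(x) = upper / (1 + (ed50/x)**slope).

    Solving upper / (1 + (ed50/x)**slope) = target for x gives
    x = ed50 / (upper/target - 1)**(1/slope).
    """
    if not 0.0 < target < upper:
        raise ValueError("target must lie strictly between 0 and the upper limit")
    ratio = upper / target - 1.0  # equals (ed50/x)**slope at the target dose
    return ed50 / ratio ** (1.0 / slope)

# Hypothetical fit: upper asymptote 100% control, ED50 = 5 g ai/ha, slope 2
ed90 = ed_from_log_logistic(90.0, 100.0, 5.0, 2.0)
print(round(ed90, 1))  # 15.0 g ai/ha, below a 30 g ai/ha labeled rate
```

With these illustrative parameters the 90% effective dose lands at half the labeled rate, which is the kind of comparison the abstract reports (ED90 below 30 g ai ha−1 for all biotypes).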
The mid-infrared range contains many spectral features associated with large molecules and dust grains such as polycyclic aromatic hydrocarbons and silicates. These are usually very strong compared to fine-structure gas lines, and thus valuable in studying the spectral properties of faint distant galaxies. In this paper, we evaluate the capability of low-resolution mid-infrared spectroscopic surveys of galaxies that could be performed by SPICA. The surveys are designed to address the question of how star formation and black hole accretion activities evolved over cosmic time through spectral diagnostics of the physical conditions of the interstellar/circumnuclear media in galaxies. On the basis of results obtained with Herschel far-infrared photometric surveys of distant galaxies and Spitzer and AKARI near- to mid-infrared spectroscopic observations of nearby galaxies, we estimate the number of galaxies at redshift z > 0.5 expected to be detected in the polycyclic aromatic hydrocarbon features or dust continuum by a wide (10 deg²) or deep (1 deg²) blind survey, both for a given observation time of 600 h. As by-products of the wide blind survey, we also expect to detect debris disks, through the mid-infrared excess above the photospheric emission of nearby main-sequence stars, and we estimate their number. We demonstrate that the SPICA mid-infrared surveys will efficiently provide us with unprecedentedly large spectral samples, which can be studied further in the far-infrared with SPICA.
IR spectroscopy in the range 12–230 μm with the SPace IR telescope for Cosmology and Astrophysics (SPICA) will reveal the physical processes governing the formation and evolution of galaxies and black holes through cosmic time, bridging the gap between the James Webb Space Telescope and the upcoming Extremely Large Telescopes at shorter wavelengths and the Atacama Large Millimeter Array at longer wavelengths. SPICA, with its 2.5-m telescope actively cooled to below 8 K, will obtain the first spectroscopic determination, in the mid-IR rest-frame, of both the star-formation rate and black hole accretion rate histories of galaxies, reaching lookback times of 12 Gyr, for large statistically significant samples. Densities, temperatures, radiation fields, and gas-phase metallicities will be measured in dust-obscured galaxies and active galactic nuclei, sampling a large range in mass and luminosity, from faint local dwarf galaxies to luminous quasars in the distant Universe. Active galactic nuclei and starburst feedback and feeding mechanisms in distant galaxies will be uncovered through detailed measurements of molecular and atomic line profiles. SPICA's large-area deep spectrophotometric surveys will provide mid-IR spectra and continuum fluxes for unbiased samples of tens of thousands of galaxies, out to redshifts of z ~ 6.
Objectives: Human immunodeficiency virus (HIV) disproportionately affects Hispanics/Latinos in the United States, yet little is known about neurocognitive impairment (NCI) in this group. We compared the rates of NCI in large, well-characterized samples of HIV-infected (HIV+) Latinos and (non-Latino) Whites, and examined HIV-associated NCI among subgroups of Latinos. Methods: Participants included English-speaking HIV+ adults assessed at six U.S. medical centers (194 Latinos, 600 Whites). For the overall group: age M=42.65 years, SD=8.93; 86% male; education M=13.17 years, SD=2.73; 54% had acquired immunodeficiency syndrome. NCI was assessed with a comprehensive test battery with normative corrections for age, education, and gender. Covariates examined included HIV-disease characteristics, comorbidities, and genetic ancestry. Results: Compared with Whites, Latinos had higher rates of global NCI (54% vs. 42%), and higher rates of domain NCI in executive function, learning, recall, working memory, and processing speed. Latinos also fared worse than Whites on current and historical HIV-disease characteristics, and nadir CD4 partially mediated ethnic differences in NCI. Yet, Latinos continued to have more global NCI [odds ratio (OR)=1.59; 95% confidence interval (CI)=1.13–2.23; p<.01] after adjusting for significant covariates. Higher rates of global NCI were observed with Puerto Rican (n=60; 71%) versus Mexican (n=79; 44%) origin/descent; this disparity persisted in models adjusting for significant covariates (OR=2.40; CI=1.11–5.29; p=.03). Conclusions: HIV+ Latinos, especially those of Puerto Rican (vs. Mexican) origin/descent, had increased rates of NCI compared with Whites. Differences in rates of NCI were not completely explained by worse HIV-disease characteristics, neurocognitive comorbidities, or genetic ancestry.
Future studies should explore culturally relevant psychosocial, biomedical, and genetic factors that might explain these disparities and inform the development of targeted interventions. (JINS, 2018, 24, 163–175)
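The odds ratios reported above come from covariate-adjusted models, but the basic OR-with-CI computation can be illustrated from group counts alone. The sketch below reconstructs approximate 2×2 cell counts from the reported sample sizes (194 Latinos, 600 Whites) and global NCI rates (54% vs. 42%); these are rounded approximations, not the study's actual cells, so the unadjusted estimate only roughly tracks the published adjusted OR of 1.59.

```python
# Hedged illustration: unadjusted odds ratio and Wald 95% CI for global NCI
# from approximate cell counts (reconstructed from reported rates, not the
# study's actual data; the published OR=1.59 is covariate-adjusted).
import math

latino_n, latino_rate = 194, 0.54
white_n, white_rate = 600, 0.42

a = round(latino_n * latino_rate)   # Latinos with NCI
b = latino_n - a                    # Latinos without NCI
c = round(white_n * white_rate)     # Whites with NCI
d = white_n - c                     # Whites without NCI

odds_ratio = (a * d) / (b * c)
# Standard error of log(OR) for a 2x2 table
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

The unadjusted estimate lands near the reported adjusted interval, consistent with the abstract's finding that covariate adjustment attenuated but did not eliminate the ethnic difference.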