We describe the design and deployment of GREENBURST, a commensal Fast Radio Burst (FRB) search system at the Green Bank Telescope. GREENBURST uses the dedicated L-band receiver tap to search over the 960–1920 MHz frequency range for pulses across a wide range of dispersion measures. Due to its unique design, GREENBURST can conduct searches for FRBs when the L-band receiver is not being used for scheduled observing. This makes it a sensitive single-pixel detector capable of probing deeper into the radio sky. While single pulses from Galactic pulsars and rotating radio transients will be detectable in our observations and will form part of the database we archive, the primary goal is to detect and study FRBs. Based on recent determinations of the all-sky rate, we predict that the system will detect approximately one FRB for every 2–3 months of continuous operation. The high sensitivity of GREENBURST also means it will be able to probe the slope of the FRB fluence distribution, which is currently uncertain in this observing band.
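The quoted rate of roughly one FRB per 2–3 months implies simple Poisson detection statistics. The sketch below shows how that rate translates into a probability of at least one detection over a given span; the 2.5-month midpoint is an illustrative assumption, not a figure from the paper.

```python
import math

def detection_probability(months_observed, months_per_frb=2.5):
    # Assumes FRB detections follow a Poisson process at the quoted
    # rate of roughly one per 2-3 months; 2.5 months is an illustrative
    # midpoint chosen here, not a value taken from the paper.
    expected = months_observed / months_per_frb
    return 1.0 - math.exp(-expected)

# Chance of at least one detection in a year of continuous operation:
p_year = detection_probability(12.0)
```

At the midpoint rate, a full year of continuous operation gives a detection probability above 99%, which is why sustained commensal operation matters more than any single observing block.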
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
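The direct estimator mentioned above is, at its core, the averaged triple product of visibilities measured on baselines that close a triangle. The sketch below illustrates that general technique under simplifying assumptions; it is not the MWA pipeline itself, and the function name and interface are ours.

```python
import numpy as np

def direct_bispectrum(v1, v2, v3):
    # Illustrative "direct" bispectrum estimate: average the triple
    # product of complex visibilities from three baselines that close
    # a triangle (u1 + u2 + u3 = 0). A sketch of the general method,
    # not the MWA estimator as implemented.
    v1, v2, v3 = (np.asarray(v, dtype=complex) for v in (v1, v2, v3))
    return np.mean(v1 * v2 * v3)
```

For an unresolved source of flux S at the phase centre every visibility equals S, so the estimate reduces to S**3 and is purely real; foregrounds and thermal noise perturb this ideal value, which is the bias and variance behaviour the paper derives analytically.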
Objective: To sustainably improve cleaning of high-touch surfaces (HTSs) in acute-care hospitals using a multimodal approach to education, reduction of barriers to cleaning, and culture change for environmental services workers.
Setting: The study was conducted in 2 academic acute-care hospitals, 2 community hospitals, and an academic pediatric and women’s hospital.
Participants: Frontline environmental services workers.
Intervention: A 5-module educational program, based on principles of adult learning theory, was developed and presented to environmental services workers. An audience response system (ARS), videos, demonstrations, role playing, and graphics were used to illustrate the concepts of, and rationale for, infection prevention strategies. Topics included hand hygiene, isolation precautions, personal protective equipment (PPE), cleaning protocols, and strategies to overcome barriers. Program evaluation included ARS questions, written evaluations, and objective assessments of cleaning in occupied patient rooms. Changes in hospital-onset C. difficile infection (CDI) and methicillin-resistant S. aureus (MRSA) bacteremia were evaluated.
Results: On average, 357 environmental services workers participated in each module. Most rated the presentations as ‘excellent’ or ‘very good’ (93%) and agreed that they were useful (95%); after the program, respondents reported being more comfortable donning/doffing PPE (91%) and performing hand hygiene (96%), and better understood the importance of disinfecting HTSs (96%). The frequency of cleaning individual HTSs in occupied rooms increased from 26% to 62% (P < .001) following the intervention, and the improvement was sustained 1 year after the intervention (P < .001). A significant decrease in CDI was associated with the program.
Conclusions: A novel program that addressed environmental services workers’ knowledge gaps, challenges, and barriers was well received and appeared to result in learning, behavior change, and sustained improvements in cleaning.
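The reported rise in HTS cleaning frequency from 26% to 62% can be checked for significance with a standard two-proportion z-test. The abstract does not give the number of surfaces audited, so the sample sizes below are hypothetical; this is a consistency sketch, not the study's analysis.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    # Standard two-proportion z-test using the pooled standard error.
    # The proportions match the abstract (26% pre, 62% post); the
    # denominators of 500 audited surfaces per period used below are
    # hypothetical, since the abstract does not report them.
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

z = two_proportion_z(0.26, 500, 0.62, 500)  # hypothetical n = 500 per period
```

With a few hundred audited surfaces per period, the z statistic is far above the 3.29 threshold for two-sided P < .001, consistent with the reported significance.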
Though theory suggests that individual differences in neuroticism (a tendency to experience negative emotions) would be associated with altered functioning of the amygdala (which has been linked with emotionality and emotion dysregulation in childhood, adolescence, and adulthood), results of functional neuroimaging studies have been contradictory and inconclusive. We aimed to clarify the relationship between neuroticism and three hypothesized neural markers derived from functional magnetic resonance imaging during negative emotion face processing: amygdala activation, amygdala habituation, and amygdala–prefrontal connectivity, each of which plays an important role in the experience and regulation of emotions. We used general linear models to examine the relationship between trait neuroticism and the hypothesized neural markers in a large sample of over 500 young adults. Although neuroticism was not significantly associated with the magnitude of amygdala activation or with amygdala habituation, it was associated with amygdala–ventromedial prefrontal cortex connectivity, which has been implicated in emotion regulation. These findings suggest that trait neuroticism, which has been associated with increased rates of transdiagnostic psychopathology, may reflect a failure in top-down control and regulation of emotional reactions, rather than overactive emotion generation processes per se.
Exceptional sub-micrometer details of shell microstructure are preserved in phosphatic micro-steinkerns representing several phyla from shell beds of the Upper Ordovician of the Cincinnati Arch region, USA. These fossils provide the most detailed record of Ordovician mollusk shell microstructures, as well as exceptional details on the earliest cases of undisputed nacre. The trend towards nacre in the Mollusca is one aspect of the surge in escalation between mollusks and their predators during the Great Ordovician Biodiversification Event.
We describe the motivation and design details of the ‘Phase II’ upgrade of the Murchison Widefield Array radio telescope. The expansion doubles to 256 the number of antenna tiles deployed in the array. The new antenna tiles enhance the capabilities of the Murchison Widefield Array in several key science areas. Seventy-two of the new tiles are deployed in a regular configuration near the existing array core. These new tiles enhance the surface brightness sensitivity of the array and will improve the ability of the Murchison Widefield Array to estimate the slope of the Epoch of Reionisation power spectrum by a factor of ∼3.5. The remaining 56 tiles are deployed on long baselines, doubling the maximum baseline of the array and improving the array u, v coverage. The improved imaging capabilities will provide an order of magnitude improvement in the noise floor of Murchison Widefield Array continuum images. The upgrade retains all of the features that have underpinned the Murchison Widefield Array’s success (large field of view, snapshot image quality, and pointing agility) and boosts the scientific potential with enhanced imaging capabilities and by enabling new calibration strategies.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Posttraumatic stress disorder (PTSD) and stress/trauma exposure are cross-sectionally associated with advanced DNA methylation age relative to chronological age. However, longitudinal inquiry and examination of associations between advanced DNA methylation age and a broader range of psychiatric disorders are lacking. The aim of this study was to examine whether PTSD, depression, generalized anxiety, and alcohol-use disorders predicted acceleration of DNA methylation age over time (i.e. an increasing pace, or rate of advancement, of the epigenetic clock).
Genome-wide DNA methylation and a comprehensive set of psychiatric symptoms and diagnoses were assessed in 179 Iraq/Afghanistan war veterans who completed two assessments over the course of approximately 2 years. Two DNA methylation age indices (Horvath and Hannum), each a weighted index of an array of genome-wide DNA methylation probes, were quantified. The pace of the epigenetic clock was operationalized as change in DNA methylation age as a function of time between assessments.
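The pace measure described above amounts to a simple ratio of change in DNA methylation age to elapsed time. A minimal sketch of that operationalisation (the function name and interface are ours, not the study's):

```python
def epigenetic_pace(dnam_age_t1, dnam_age_t2, years_between):
    # Pace of the epigenetic clock as operationalised in the study:
    # change in DNA methylation age (e.g. the Horvath or Hannum index)
    # divided by the time elapsed between the two assessments. Values
    # above 1 mean DNAm age advanced faster than chronological time.
    return (dnam_age_t2 - dnam_age_t1) / years_between
```

For example, a veteran whose Horvath-index age rises from 40 to 43 over a 2-year interval has a pace of 1.5, i.e. the epigenetic clock ran half again as fast as chronological time.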
Analyses revealed that alcohol-use disorders (p = 0.001) and PTSD avoidance and numbing symptoms (p = 0.02) at Time 1 were associated with an increasing pace of the epigenetic clock over time, per the Horvath (but not the Hannum) index of cellular aging.
This is the first study to suggest that posttraumatic psychopathology is longitudinally associated with a quickened pace of the epigenetic clock. Results raise the possibility that accelerated cellular aging is a common biological consequence of stress-related psychopathology, which carries implications for identifying mechanisms of stress-related cellular aging and developing interventions to slow its pace.
Chickens have been selected for millennia for their ability to select food in complex and variable environments. Artificial selection for juvenile body weight using a single balanced food might have modified the ability of chickens to adapt to a choice-feeding situation (Siegel and Dunnington, 1990). However, diet selection for protein has been demonstrated in many recently published experiments (for review, see Forbes and Shariatmadari, 1994).
Broiler chickens have been selected for increased growth rate and adapted to consumer demands. A range of commercial products is being developed, from slower-growing country-type meat chickens (named ‘Label’ in France) to faster-growing broilers with a high yield of breast meat. Nutritionists therefore need feeding programmes adapted to the demands of these various end-products. Selection for growth rate has been shown to change the food intake behaviour of chickens (e.g. Barbato et al., 1980). Emmans (1991) demonstrated that chickens are able to adjust their food choices if the paradigm permits an adequate choice. In a series of experiments (see Picard et al., 1994 for review), the authors concluded that ‘high producing animals are not necessarily those that will react most clearly to an amino acid deficiency by altering their food intake and/or their feeding behavior’.
Objective: To assess antimicrobial prescriber knowledge, attitudes, and practices (KAP) regarding antimicrobial stewardship (AS) and associated barriers to optimal prescribing.
Participants: A convenience sample of 2,900 US antimicrobial prescribers at 5 acute-care hospitals within a hospital network.
Methods: The following characteristics were assessed with an anonymous online survey in February 2015: attitudes and practices related to antimicrobial resistance, AS programs, and institutional AS resources; antimicrobial prescribing and AS knowledge; and practices and confidence related to antimicrobial prescribing.
Results: In total, 402 respondents completed the survey. Knowledge gaps were identified through case-based questions: 29% of respondents sometimes selected overly broad therapy for the susceptibilities given, and 32% ‘usually’ or ‘always’ preferred the most broad-spectrum empiric antimicrobials possible. Nearly all (99%) reported reviewing antimicrobial appropriateness at 48–72 hours, but only 55% reported ‘always’ doing so. Furthermore, 45% of respondents felt that they had not received adequate training in antimicrobial prescribing. Some respondents lacked confidence in selecting empiric therapy using antibiograms (30%), interpreting susceptibility results (24%), de-escalating therapy (18%), and determining duration of therapy (31%). Postprescription review and feedback (PPRF) was the most commonly cited AS intervention with potential to improve patient care (79%).
Conclusions: Barriers to appropriate antimicrobial selection and de-escalation of antimicrobial therapy were identified among frontline prescribers in acute-care hospitals. Prescribers desired more AS-related education and identified PPRF as the most helpful AS intervention to improve patient care. Educational interventions should be preceded by, and tailored to, local assessment of educational needs.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138,801 participants aged 18–100, drawn from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders, or behavioural disorders, and further divided by severity level. Satisfaction with conventional care was also compared with satisfaction with CAM contacts.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact; the rate was two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable across disorder types but particularly high in persons receiving conventional care (8.6–17.8%), and they increased with mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% of those with severe mood disorders, 16.2% with severe anxiety disorders, and 22.5% with severe behavioural disorders. Satisfaction with care was comparable for CAM contacts (78.3%) and conventional care (75.6%) in persons who received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary and contrast with suggestions that CAM use concerns only persons with mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss with patients the care they are receiving, whether conventional or not, and their reasons for seeking it.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities worldwide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6400 K to 2100 K over a 7-day period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12, with its 2.7 Gyr coalescence time, could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
This study aims to investigate the climate–malaria associations in nine cities selected from malaria high-risk areas in China. Daily reports of malaria cases in Anhui, Henan, and Yunnan Provinces for 2005–2012 were obtained from the Chinese Center for Disease Control and Prevention. Generalized estimating equation models were used to quantify the city-specific climate–malaria associations, and multivariate random-effects meta-regression analyses were used to pool the city-specific effects. An inverted-U-shaped relationship was observed between temperature, average relative humidity, and malaria. A 1 °C increase in maximum temperature (Tmax) resulted in a 6.7% (95% CI 4.6–8.8%) to 15.8% (95% CI 14.1–17.4%) increase in malaria, with corresponding lags ranging from 7 to 45 days. For minimum temperature (Tmin), the effect estimates peaked at lags of 0 to 40 days, ranging from 5.3% (95% CI 4.4–6.2%) to 17.9% (95% CI 15.6–20.1%). Malaria is more sensitive to Tmin in cool climates and to Tmax in warm climates. The duration of the lag effect in a cool climate zone is longer than that in a warm climate zone. Lagged effects did not vanish after an epidemic season but waned gradually over the following 2–3 warm seasons. A warming climate may potentially increase the risk of malaria resurgence in China.
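Effect sizes like the reported 6.7% increase per 1 °C come from exponentiating a coefficient of a log-link count model such as a Poisson GEE. The sketch below shows that standard conversion; the coefficient passed in is hypothetical, not a value taken from the study.

```python
import math

def percent_change_per_unit(beta):
    # Converts a coefficient beta from a log-link count model (such as
    # a Poisson GEE) into the percent change in case counts per one-unit
    # (here, 1 degC) increase in the covariate. The beta values used in
    # any example call are hypothetical, not estimates from the study.
    return (math.exp(beta) - 1.0) * 100.0
```

The mapping is invertible: a reported 6.7% increase corresponds to a model coefficient of ln(1.067) ≈ 0.065, which is how such percentages are typically derived from the fitted model.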
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal, and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
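The linear-regression comparison above reduces, in its simplest form, to regressing years of education on a zygosity indicator; with an intercept and a binary regressor, the fitted slope equals the MZ–DZ difference in group means. The sketch below illustrates this with synthetic inputs; it is not the pooled 42-cohort analysis, which also handled covariates and cohort structure.

```python
import numpy as np

def zygosity_education_gap(years, is_mz):
    # Sketch of the paper's regression comparison: regress years of
    # education on an MZ indicator (1 = monozygotic, 0 = dizygotic)
    # via ordinary least squares and return the fitted MZ-DZ difference
    # in years. Inputs are synthetic; the real analysis pooled 42
    # cohorts and adjusted for sex and birth cohort.
    y = np.asarray(years, dtype=float)
    x = np.column_stack([np.ones(len(y)), np.asarray(is_mz, dtype=float)])
    coef, *_ = np.linalg.lstsq(x, y, rcond=None)
    return coef[1]  # slope on the MZ indicator
```

Because the regressor is binary, the returned slope is exactly the difference of mean education years between MZ and DZ twins, matching the 0.26- and 0.17-year contrasts reported in the abstract in form (though not, of course, in value).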