We have detected 27 new supernova remnants (SNRs) using a new data release of the GLEAM survey from the Murchison Widefield Array telescope, including the lowest surface brightness SNR ever detected, G 0.1−9.7. Our method uses spectral fitting to the radio continuum to derive spectral indices for 26 of the 27 candidates, and our low-frequency observations probe a steeper-spectrum population than previously discovered. None of the candidates have coincident WISE mid-IR emission, further showing that the emission is non-thermal. Using pulsar associations, we derive physical properties for six candidate SNRs, finding that G 0.1−9.7 may be younger than 10 kyr. Sixty per cent of the candidates subtend areas larger than 0.2 deg² on the sky, compared to < 25% of previously detected SNRs. We also make the first detection of two SNRs in the Galactic longitude range 220°–240°.
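The spectral indices above come from fitting a power law, S ∝ ν^α, to the radio continuum. As a minimal sketch of the derivation (the frequencies and flux densities below are synthetic, not the survey's measurements or pipeline), the index α is the least-squares slope in log–log space:

```python
import math

def spectral_index(freqs_mhz, fluxes_jy):
    """Least-squares slope of log(flux) vs log(frequency), i.e. alpha in S ~ nu**alpha."""
    xs = [math.log(f) for f in freqs_mhz]
    ys = [math.log(s) for s in fluxes_jy]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# Hypothetical SNR-like spectrum generated with alpha = -0.7
freqs = [76.0, 107.0, 150.0, 200.0, 231.0]
fluxes = [10.0 * (nu / 200.0) ** -0.7 for nu in freqs]
print(spectral_index(freqs, fluxes))  # recovers -0.7
```

On real data the fit would be weighted by per-band flux uncertainties; the exact recovery here works only because the synthetic spectrum is noiseless.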
This work makes available a further portion of the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering half of the accessible galactic plane, across 20 frequency bands sampling 72–231 MHz. Unlike previous GLEAM data releases, we used multi-scale CLEAN to better deconvolve large-scale galactic structure. For the galactic longitude ranges $345^\circ < l < 67^\circ$ and $180^\circ < l < 240^\circ$, we provide a compact source catalogue of 22 037 components selected from a 60-MHz bandwidth image centred at 200 MHz, with position accuracy better than 2 arcsec. The catalogue has a 50% completeness limit and a reliability of 99.86%. It covers a band of galactic latitudes whose extent differs towards the galactic centre and for other regions, and is available from VizieR; images covering all longitudes are made available on the GLEAM Virtual Observatory (VO) server and SkyView.
We examined the latest data release from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering 345° < l < 60° and 180° < l < 240°, using these data and those of the Wide-field Infrared Survey Explorer to follow up candidate Supernova Remnants (SNRs) proposed by other sources. Of the 101 candidates proposed in the region, we definitively confirm ten as SNRs, tentatively confirm two as SNRs, and reclassify five as H ii regions. A further two are detectable in our images but difficult to classify; the remaining 82 are undetectable in these data. We also investigated the 18 unclassified Multi-Array Galactic Plane Imaging Survey (MAGPIS) candidate SNRs, newly confirming three as SNRs, reclassifying two as H ii regions, and exploring the unusual spectra and morphology of two others.
We examined Clostridioides difficile infection (CDI) prevention practices and their relationship with hospital-onset healthcare facility-associated CDI rates (CDI rates) in Veterans Affairs (VA) acute-care facilities.
From January 2017 to February 2017, we conducted an electronic survey of CDI prevention practices and hospital characteristics in the VA. We linked survey data with CDI rate data for the period January 2015 to December 2016. We stratified facilities according to whether their overall CDI rate per 10,000 bed days of care was above or below the national VA mean CDI rate. We examined whether specific CDI prevention practices were associated with an increased risk of a CDI rate above the national VA mean CDI rate.
All 126 facilities responded (100% response rate). Since implementing CDI prevention practices in July 2012, 60 of 123 facilities (49%) reported a decrease in CDI rates; 22 of 123 facilities (18%) reported an increase, and 41 of 123 (33%) reported no change. Facilities reporting an increase in the CDI rate (vs those reporting a decrease) after implementing prevention practices were 2.54 times more likely to have CDI rates that were above the national mean CDI rate. Whether a facility’s CDI rates were above or below the national mean CDI rate was not associated with self-reported cleaning practices, duration of contact precautions, availability of private rooms, or certification of infection preventionists in infection prevention.
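The 2.54-fold figure is a relative risk. As an illustration only (the numerators below are hypothetical; the survey's actual cross-tabulation is not reproduced here), such a risk ratio over a 2 × 2 table of the 22 increase-reporting and 60 decrease-reporting facilities would be computed as:

```python
def risk_ratio(exposed_events, exposed_total, unexposed_events, unexposed_total):
    """Ratio of event risk in the exposed group to risk in the unexposed group."""
    return (exposed_events / exposed_total) / (unexposed_events / unexposed_total)

# Hypothetical counts of facilities above the national mean CDI rate,
# among the 22 reporting an increase and the 60 reporting a decrease.
rr = risk_ratio(14, 22, 21, 60)
print(round(rr, 2))  # 1.82 for these illustrative counts
```

With the study's real counts the same formula yields the reported 2.54.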
We found considerable variation in CDI rates. We were unable to identify which particular CDI prevention practices (i.e., bundle components) were associated with lower CDI rates.
The spread of the Zika virus (ZIKV) in the Americas led to large outbreaks across the region and most of the Southern hemisphere. Of greatest concern were complications following acute infection during pregnancy. At the beginning of the outbreak, the risk to unborn babies and their clinical presentation were unclear. This report describes the methods and results of the UK surveillance response to assess the risk of ZIKV to children born to returning travellers. Established surveillance systems operating within the UK – the paediatric and obstetric surveillance units for rare diseases, and national laboratory monitoring – enabled rapid assessment of this emerging public health threat. A combined total of 11 women experiencing adverse pregnancy outcomes after possible ZIKV exposure were reported by the three surveillance systems: five miscarriages, two intrauterine deaths and four children with clinical presentations potentially associated with ZIKV infection. Sixteen women were diagnosed with ZIKV during pregnancy in the UK. Amongst the offspring of these women, there was unequivocal laboratory evidence of infection in only one child. In the UK, the number and risk of congenital ZIKV infection for travellers returning from ZIKV-affected countries are very small.
This study is aimed at developing a Rural Primary Health Care (PHC) Model for delivering comprehensive PHC for dementia in rural settings and addressing the gap in knowledge about disseminating and implementing evidence-based dementia care in a rural PHC context.
Limited access to specialists and services in rural areas leads to increased responsibility for dementia diagnosis and management in PHC, yet a gap exists in evidence-based best practices for rural dementia care.
Elements of the Rural PHC Model for Dementia were based on seven principles of effective PHC for dementia identified from published research and organized into three domains: team-based care, decision support, and specialist-to-provider support. Since 2013 the researchers have collaborated with a rural PHC team in a community of 1000 people in the Canadian province of Saskatchewan to operationalize these elements in ways that were feasible in the local context. The five-step approach included: building relationships; conducting a problem analysis/needs assessment; identifying core and adaptable elements of a decision support tool embedded in the model and resolving applicability issues; implementing and adapting the intervention with local stakeholders; and sustaining the model while incrementally scaling up.
Developing and sustaining relationships at regional and PHC team levels was critical. A comprehensive needs assessment identified challenges related to all domains of the Rural PHC Model. An existing decision support tool for dementia diagnosis and management was adapted and embedded in the team’s electronic medical record. Strategies for operationalizing other model elements included integrating team-based care co-ordination into the decision support tool and family-centered case conferences. Research team specialists provided educational sessions on topics identified by the PHC team. This paper provides an example of a community-based process for adapting evidence-based practice principles to a real-world setting.
This study examined the interplay between a polygenic composite and cortisol activity as moderators of the mediational pathway among family adversity, youth negative emotional reactivity to family conflict, and youth psychological problems. The longitudinal design contained three annual measurement occasions with 279 adolescents (Mean age = 13.0 years) and their parents. Latent difference score analyses indicated that observational ratings of adversity in interparental and parent–child interactions at Wave 1 predicted increases in a multimethod, multi-informant assessment of youth negative emotional reactivity to family conflict from Waves 1 to 2. Changes in youth negative emotional reactivity, in turn, predicted increases in a multi-informant (i.e., parents, adolescent, and teacher) assessment of psychological problems from Waves 1 to 3. Consistent with differential susceptibility theory, the association between family adversity and negative emotional reactivity was stronger for adolescents who carried more sensitivity alleles in a polygenic composite consisting of 5-HTTLPR, DRD4 VNTR, and BDNF polymorphisms. Analyses of adolescent cortisol in the period surrounding a family disagreement task at Wave 1 revealed that overall cortisol output, rather than cortisol reactivity, served as an endophenotype of the polygenic composite. Overall cortisol output was specifically associated with polygenic plasticity and moderated the association between family adversity and youth negative emotional reactivity in the same ‘for better or for worse’ manner as the genetic composite. Finally, moderator-mediated-moderation analyses indicated that the moderating role of the polygenic plasticity composite was mediated by the moderating role of adolescent cortisol output in the association between family adversity and adolescents’ emotional reactivity.
Essential tremor (ET) is associated with psychological difficulties, including anxiety and depression. Demoralization (feelings of helplessness, hopelessness, inability to cope), another manifestation of psychological distress, has yet to be investigated in ET. Our objectives are to (1) estimate the prevalence of demoralization in ET, (2) assess its clinical correlates, and (3) determine whether demoralization correlates with tremor severity.
We administered the Kissane Demoralization Scale (KDS-II) and several psychosocial evaluations (ie, scales assessing subjective incompetence, resilience, and depression [eg, Geriatric Depression Scale]) to 60 ET subjects. Tremor was assessed with a disability score and total tremor score. KDS-II >8 indicated demoralization.
Among 60 ET subjects (mean age = 70.2 ± 6.8 years), the prevalence of demoralization was 13.3%, 95% confidence interval = 6.9–24.2%. Although there was overlap between demoralization and depression (10% of the sample meeting criteria for both), 54% of depressed subjects were not demoralized, and 25% of demoralized subjects were not depressed. Demoralization correlated with psychological factors, but demoralized subjects did not have significantly higher total tremor scores, tremor disability scores, or years with tremor.
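The 13.3% prevalence corresponds to 8 of the 60 subjects, and the quoted 95% CI of 6.9–24.2% matches a Wilson score interval for that proportion. A quick check (a sketch assuming the Wilson method; the authors' exact CI method is not stated):

```python
import math

def wilson_ci(k, n, z=1.959964):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - half) / denom, (centre + half) / denom

lo, hi = wilson_ci(8, 60)  # 8 of 60 subjects scored KDS-II > 8
print(f"{lo:.1%} - {hi:.1%}")  # 6.9% - 24.2%
```

Unlike the simpler Wald interval, the Wilson interval is asymmetric about 13.3%, which is why the upper arm of the quoted CI is so much wider than the lower arm.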
Demoralization has a prevalence of 13.3% in ET, similar to that in other chronic or terminal illnesses (eg, cancer 13–18%, Parkinson’s disease 18.1%, coronary heart disease 20%). Demoralization was not a function of increased tremor severity, suggesting that it is a separable construct, which could dictate how a patient copes with his/her disease. These data further our understanding of the psychological and psychosocial correlates of ET.
Background: A Will, Power of Attorney, and Advanced Healthcare Directive are critical to guide decision-making in patients with dementia. We identified characteristics that are associated with the existence of these documents in patients who presented to a rural and remote memory clinic (RRMC). Methods: Ninety-five consecutive patients were included in this study. Patients and caregivers completed questionnaires on initial presentation to the RRMC and patients were asked if they had legal documents. Patients also completed neuropsychological testing. Statistical analysis (t-test and χ2 test) was performed to identify significant variables. Results: Seventy (73.7%) patients had a Will, 62 (65.3%) had a Power of Attorney, and 21 (22.1%) had an Advanced Healthcare Directive. Having a Will was associated with good quality of life (p = 0.001), living alone or with a spouse or partner only (p = 0.034), poor verbal fluency (p = 0.055), and European ethnicity (p = 0.028). Factors associated with having a Power of Attorney included good quality of life (p = 0.031), living alone or with a spouse or partner only (p = 0.053), and poor verbal fluency (p = 0.015). Old age (p = 0.015), poor verbal fluency (p = 0.023), and greater severity of cognitive and functional impairment (p = 0.023) were associated with having an Advanced Healthcare Directive. Conclusions: Our results indicate that poor quality of life, good performance on verbal fluency, Indigenous ethnicity, and living with others are associated with a lower likelihood of legal documents in patients with dementia. These factors can help physicians identify patients at risk of leaving their legal affairs unattended to. Physicians should discuss the creation of legal documents early on in patients with signs of dementia.
Eight latest Eocene to earliest Miocene stratigraphic surfaces have been identified in petroleum well data from the Taranaki Basin, New Zealand. These surfaces define seven regional sedimentary packages, of variable thickness and lithofacies, forming a mixed siliciclastic–carbonate system. The evolving tectonic setting, particularly the initial development of the Australian–Pacific convergent margin, controlled geographic, stratigraphic and facies variability. This tectonic signal overprinted a regional transgressive trend that culminated in latest Oligocene times. The earliest influence of active compressional tectonics is reflected in the preservation of latest Eocene – Early Oligocene deepwater sediments in the northern Taranaki Basin. Thickness patterns for all mid Oligocene units onwards show a shift in sedimentation to the eastern Taranaki Basin, controlled by reverse movement on the Taranaki Fault System. This resulted in the deposition of a thick sedimentary wedge, initially of coarse clastic sediments, later carbonate dominated, in the foredeep close to the fault. In contrast, Oligocene active normal faulting in a small sub-basin in the south may represent the most northerly evidence for rifting in southern Zealandia, related to Emerald Basin formation. The Early Miocene period saw a return to clastic-dominated deposition, the onset of regional regression and the southward propagation of compressional tectonics.
We describe the parameters of a low-frequency all-sky survey of compact radio sources using Interplanetary Scintillation, undertaken with the Murchison Widefield Array. While a survey of this kind provides important information complementary to low-resolution surveys, yielding the sub-arcsecond structure of every source, none has been attempted in the era of low-frequency imaging arrays such as the Murchison Widefield Array and the LOw Frequency ARray. Here we set out the capabilities of such a survey, describing the limitations imposed by the heliocentric observing geometry and by the instrument itself. We demonstrate the potential for Interplanetary Scintillation measurements at any point on the celestial sphere and we show that at 160 MHz, reasonable results can be obtained within 30° of the ecliptic (2π sr: half the sky). We also suggest some observational strategies and describe the first such survey, the Murchison Widefield Array Phase I Interplanetary Scintillation survey. Finally, we analyse the capabilities of the recently upgraded Murchison Widefield Array and discuss the potential of the Square Kilometre Array-Low to use Interplanetary Scintillation to probe sub-mJy flux density levels at sub-arcsecond angular resolution.
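The "2π sr: half the sky" figure follows from the solid angle of a band around a great circle: for ecliptic latitudes within ±β₀, the band subtends Ω = 4π sin β₀, which at β₀ = 30° gives exactly 2π steradians. A quick numerical check (hypothetical helper, not survey code):

```python
import math

def band_solid_angle(beta_deg):
    """Solid angle in steradians of the band within +/-beta_deg of a great circle."""
    return 4 * math.pi * math.sin(math.radians(beta_deg))

omega = band_solid_angle(30.0)
print(omega / (4 * math.pi))  # fraction of the full sky covered (≈ 0.5)
```

The same formula shows why the usable region shrinks quickly away from the ecliptic: the first 30° of latitude already contains half of the total 4π sr.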
The value of the nosological distinction between non-affective and affective psychosis has frequently been challenged. We aimed to investigate the transdiagnostic dimensional structure and associated characteristics of psychopathology at First Episode Psychosis (FEP). Regardless of diagnostic categories, we expected that positive symptoms would occur more frequently in ethnic minority groups and in more densely populated environments, and that negative symptoms would be associated with indices of neurodevelopmental impairment.
This study included 2182 FEP individuals recruited across six countries, as part of the EUropean network of national schizophrenia networks studying Gene–Environment Interactions (EU-GEI) study. Symptom ratings were analysed using multidimensional item response modelling in Mplus to estimate five theory-based models of psychosis. We used multiple regression models to examine demographic and context factors associated with symptom dimensions.
A bifactor model, composed of one general factor and five specific dimensions of positive, negative, disorganization, manic and depressive symptoms, best represented the associations among ratings of psychotic symptoms. Positive symptoms were more common in ethnic minority groups. Urbanicity was associated with a higher score on the general factor. Men presented with more negative and fewer depressive symptoms than women. Early age-at-first-contact with psychiatric services was associated with higher scores on negative, disorganized, and manic symptom dimensions.
Our results suggest that the bifactor model of psychopathology holds across diagnostic categories of non-affective and affective psychosis at FEP, and demographic and context determinants map onto general and specific symptom dimensions. These findings have implications for tailoring symptom-specific treatments and inform research into the mood-psychosis spectrum.
The Appalachian region of the United States is home to the largest temperate deciduous forest in the world, though surface mining has caused significant forest loss. Many former coal mines are now dominated by invasive plants, which often inhibit establishment of desirable species, especially slower-growing native trees. Autumn-olive (Elaeagnus umbellata Thunb.) is a nonnative, nitrogen-fixing shrub that was historically planted on former coalfields, but now impedes reclamation. To better understand the influence of E. umbellata management practices on hardwood establishment, we evaluated two common management practices: cutting and cut stump herbicide treatment. Planted native tree species, including black cherry (Prunus serotina Ehrh.), pin oak (Quercus palustris Münchh.), and red maple (Acer rubrum L.), were monitored for survival and performance over two growing seasons following E. umbellata removal. In each plot, we also measured plant-available nitrate (NO₃⁻) and ammonium (NH₄⁺) in soils using ionic exchange membranes. At the end of the first growing season, native tree survival was high, and the presence or absence of E. umbellata had little effect on tree survival or growth, despite the higher plant-available nitrate where E. umbellata was present. By the end of the second growing season, native tree survival dropped to 20% to 60% and varied among E. umbellata treatments. Survival was highest when E. umbellata was cut and treated with herbicide, though tree growth was similar across all treatments without E. umbellata. When establishing native trees to replace E. umbellata, cutting and herbicide application treatment of the invader resulted in the highest overall efficacy (100% control), though the most cost-effective method may be to simply cut mature stands despite regrowth, as this resulted in equivalent native tree growth over 2 yr. While this allowed E. umbellata regeneration, it provided sufficient invader control to allow initial tree establishment.
Cutting and herbicide application treatment resulted in less E. umbellata regeneration and appears to provide greater assurance that established trees will persist over the long term.
Antibodies at gastrointestinal mucosal membranes play a vital role in immunological protection against a range of pathogens, including helminths. Gastrointestinal health is central to efficient livestock production, and such infections cause significant losses. Fecal samples were taken from 114 cattle, across three beef farms, with matched blood samples taken from 22 of those animals. To achieve fecal antibody detection, a novel fecal supernatant was extracted. Fecal supernatant and serum samples were then analysed, using adapted enzyme-linked immunosorbent assay protocols, for levels of total immunoglobulin (Ig)A, IgG, IgM, and Teladorsagia circumcincta-specific IgA, IgG, IgM and IgE (in the absence of reagents for cattle-specific nematode species). Fecal nematode egg counts were conducted on all fecal samples. Assays performed successfully and showed that IgA was the predominant antibody in fecal samples, whereas IgG was predominant in serum. Total IgA in feces and serum correlated within individuals (0.581, P = 0.005), but other Ig types did not. Results support the hypothesis that the tested protocols are an effective method for the non-invasive assessment of cattle immunology. The method could be used as part of animal health assessments, although further work is required to interpret the relationship between results and levels of infection and immunity.
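The fecal–serum IgA correlation (0.581, P = 0.005) is consistent with a standard correlation test over the 22 animals with matched samples: the usual t transform gives t ≈ 3.19 on n − 2 = 20 degrees of freedom, for which the two-tailed P is close to 0.005. A sketch (assuming a simple Pearson-type test; the authors' exact method is not stated):

```python
import math

def corr_t_stat(r, n):
    """t statistic for a correlation coefficient r computed from n paired observations."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

t = corr_t_stat(0.581, 22)
print(round(t, 2))  # 3.19, on n - 2 = 20 degrees of freedom
```

This transform is what links the modest-looking r to a small P-value: with only 20 degrees of freedom, |t| > 3.15 is already past the two-tailed 0.005 threshold.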
Background: Underweight eating disorders (EDs) are notoriously difficult to treat, although a growing evidence base suggests that outpatient cognitive behaviour therapy for EDs (CBT-ED) can be effective for a large proportion of individuals. Aims: To investigate the effectiveness of CBT-ED for underweight EDs in ‘real-world’ settings. Method: Sixty-three adults with underweight EDs (anorexia nervosa or atypical anorexia nervosa) began outpatient CBT-ED in a National Health Service setting. Results: Fifty-four per cent completed treatment, for whom significant changes were observed on measures of ED symptoms, psychological distress and psychosocial impairment. There was also a large effect on body weight at end-of-treatment. Conclusions: The results suggest that good outcomes can be achieved by the majority of those who complete treatment, although treatment non-completion remains a significant barrier to recovery. Future studies should focus on improving treatment retention, as evidence suggests that CBT-ED in ‘real-world’ settings is effective.
Ecosystem services related to biodiversity, including cultural services, are essential for agricultural production such as viticulture. In agricultural landscapes, pesticides and mechanization threaten biodiversity, lead to landscape simplification and may reduce ecosystem services. On the other hand, consumers are increasingly aware of environmental issues in food production. We investigated whether landscape complexity, including soil management practices, was (i) appreciated by visitors and (ii) presented by winegrowers and tourism professionals in the French vineyards with the designation of geographical origin (DGO) ‘Coteaux du Layon’. Our goal was to determine whether landscape complexity provides cultural ecosystem services, such as aesthetics, beneficial for the wine trade and the DGO region's attractiveness. We analyzed the iconographic content and the composition of landscape photographs on 50 websites to investigate whether local winegrowers and tourism professionals associate biodiversity in the landscape and soil management practices with wine promotion. A questionnaire was administered to 192 visitors of the region to study their perception of local landscapes. The benefits of landscape complexity and soil management practices favoring biodiversity in viticulture were known and appreciated by many visitors, even if photographs of wine and traditional practices appeared to encourage wine purchasing. Local winegrowers’ representation of the DGO region only partially served these preferences; instead, they mainly presented the wine-growing region through photographs focusing on wine bottles and vineyards. Consumers’ preferences showed that complex landscapes could provide cultural ecosystem services of which winegrowers are still less aware. Therefore, complexity-targeted landscape planning including vegetation cover in soil management should be included in policy recommendations as agroecological measures for sustainable DGO production.