In the past decade, network analysis (NA) has been applied to psychopathology to quantify complex symptom relationships. This statistical technique has demonstrated much promise, as it enables researchers to identify relationships across many symptoms in one model and to identify central symptoms that may predict important clinical outcomes. However, network models are highly influenced by node selection, which could limit the generalizability of findings. The current study (N = 6850) tests a comprehensive, cognitive–behavioral model of eating-disorder symptoms using items from two widely used measures (Eating Disorder Examination Questionnaire and Eating Pathology Symptoms Inventory).
We used NA to identify central symptoms and compared networks across the duration of illness (DOI), as chronicity is one of the only known predictors of poor outcome in eating disorders (EDs).
Our results suggest that eating when not hungry and feeling fat were the most central symptoms across groups. There were no significant differences in network structure across DOI, meaning the connections between symptoms remained relatively consistent. However, differences emerged in central symptoms: cognitive symptoms related to overvaluation of weight/shape were central in individuals with shorter DOI, whereas behavioral symptoms were more central in individuals with medium and long DOI.
Our results have important implications for the treatment of individuals with enduring EDs, as they may have a different set of core maintaining symptoms. Additionally, our findings highlight the importance of using comprehensive, theoretically or empirically derived models for NA.
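Strength centrality, the statistic most commonly used in symptom networks to flag "central" symptoms, is simply the sum of a node's absolute edge weights. A minimal sketch in Python, with invented edge weights (the symptom names echo the abstract, but the values are illustrative, not the study's estimates):

```python
# Toy symptom network: weighted edges stand in for the partial
# correlations a network analysis would estimate. Weights are invented.
edges = [
    ("feeling_fat", "overvaluation_shape", 0.45),
    ("feeling_fat", "eating_when_not_hungry", 0.30),
    ("eating_when_not_hungry", "binge_eating", 0.40),
    ("overvaluation_shape", "restraint", 0.25),
]

# Strength centrality: sum of absolute edge weights incident to each node.
strength = {}
for a, b, w in edges:
    strength[a] = strength.get(a, 0.0) + abs(w)
    strength[b] = strength.get(b, 0.0) + abs(w)

most_central = max(strength, key=strength.get)  # "feeling_fat" (0.75)
```

In practice, ED network studies estimate these edges with regularized partial-correlation models rather than raw correlations, but the centrality computation itself is this simple.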
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer, with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc/cm³. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event (VOEvent) standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well-suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system has the capability to trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and using the Voltage Capture System (VCS, 0.1 ms integration) of the MWA and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–Dec 2018), and the VCS and buffered mode triggers will become available for observing in a future semester.
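The quoted dispersion delays follow from the standard cold-plasma dispersion relation; a quick sanity check in Python, using the conventional dispersion constant of 4.149 ms (for DM in pc/cm³ and frequency in GHz):

```python
K_DM_MS = 4.149  # dispersion constant (ms) for DM in pc/cm^3, freq in GHz

def dispersion_delay_s(dm, f_low_ghz, f_high_ghz):
    """Arrival-time delay (s) of the low-frequency signal relative to
    the high-frequency one, for dispersion measure `dm` (pc/cm^3)."""
    return K_DM_MS * dm * (f_low_ghz**-2 - f_high_ghz**-2) / 1e3

# Delay between a 1 GHz and a 100 MHz pulse:
print(dispersion_delay_s(100, 0.1, 1.0) / 60)   # ~0.7 min at DM = 100
print(dispersion_delay_s(2000, 0.1, 1.0) / 60)  # ~13.7 min at DM = 2000
```

This recovers the order-of-1–10-min window the text cites, which is what makes an ~8 s slew fast enough to catch prompt low-frequency emission from a triggered event.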
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
TwinsUK is the largest cohort of community-dwelling adult twins in the UK. The registry comprises over 14,000 volunteer twins (14,838 including mixed, single and triplets); it is predominantly female (82%) and middle-aged (mean age 59). In addition, over 1800 parents and siblings of twins are registered volunteers. During the last 27 years, TwinsUK has collected numerous questionnaire responses, physical/cognitive measures and biological measures on over 8500 subjects. Data were collected alongside four comprehensive phenotyping clinical visits to the Department of Twin Research and Genetic Epidemiology, King’s College London. Such collection methods have resulted in very detailed longitudinal clinical, biochemical, behavioral, dietary and socioeconomic cohort characterization; it provides a multidisciplinary platform for the study of complex disease during the adult life course, including the process of healthy aging. The major strength of TwinsUK is the availability of several ‘omic’ technologies for a range of sample types from participants, which includes genomewide scans of single-nucleotide variants, next-generation sequencing, metabolomic profiles, microbiomics, exome sequencing, epigenetic markers, gene expression arrays, RNA sequencing and telomere length measures. TwinsUK facilitates and actively encourages sharing the ‘TwinsUK’ resource with the scientific community — interested researchers may request data via the TwinsUK website (http://twinsuk.ac.uk/resources-for-researchers/access-our-data/) for their own use or future collaboration with the study team. In addition, further cohort data collection is planned via the Wellcome Open Research gateway (https://wellcomeopenresearch.org/gateways). The current article presents an up-to-date report on the application of technological advances, new study procedures in the cohort and future direction of TwinsUK.
Objectives: Maintaining two active languages may increase cognitive and brain reserve among bilingual individuals. We explored whether such a neuroprotective effect was manifested in the performance of memory tests for participants with amnestic mild cognitive impairment (aMCI). Methods: We compared 42 bilinguals to 25 monolinguals on verbal and nonverbal memory tests. We used: (a) the Loewenstein-Acevedo Scales for Semantic Interference and Learning (LASSI-L), a sensitive test that taps into proactive, retroactive, and recovery from proactive semantic interference (verbal memory), and (b) the Benson Figure delayed recall (nonverbal memory). A subsample had volumetric MRI scans. Results: The bilingual group significantly outperformed the monolingual group on two LASSI-L cued recall measures (Cued A2 and Cued B2). A measure of maximum learning (Cued A2) showed a correlation with the volume of the left hippocampus in the bilingual group only. Cued B2 recall (sensitive to recovery from proactive semantic interference) was correlated with the volume of the hippocampus and the entorhinal cortex of both cerebral hemispheres in the bilingual group, as well as with the left and right hippocampus in the monolingual group. The memory advantage in bilinguals on these measures was associated with higher inhibitory control as measured by the Stroop Color-Word test. Conclusions: Our results demonstrated a superior performance of aMCI bilinguals over aMCI monolinguals on selected verbal memory tasks. This advantage was not observed in nonverbal memory. Superior memory performance of bilinguals over monolinguals suggests that bilinguals develop a different and perhaps more efficient semantic association system that influences verbal recall. (JINS, 2019, 25, 15–28)
Palaeochannels of lowland rivers provide a means of investigating the sensitivity of river response to climate-driven hydrologic change. About 80 palaeochannels of the lower Macquarie River of southeastern Australia record the evolution of this distributive fluvial system. Six Macquarie palaeochannels were dated by single-grain optically stimulated luminescence. The largest of the palaeochannels (Quombothoo, median age 54 ka) was on average 284 m wide, 12 times wider than the modern river (24 m) and with 21 times greater meander wavelength. Palaeo-discharge then declined, resulting in a younger, narrower, group of palaeochannels, Bibbijibbery (125 m wide, 34 ka), Billybingbone (92 m, 20 ka), Milmiland (112 m, 22 ka), and Mundadoo (86 m, 5.6 ka). Yet these channels were still much larger than the modern river and were continuous downstream to the confluence with the Barwon-Darling River. At 5.5 ka, a further decrease in river discharge led to the formation of the narrow modern river, the ecologically important Macquarie Marshes, and Marra Creek palaeochannel (31 m, 2.1 ka) and diminished sediment delivery to the Barwon-Darling River as palaeo-discharge fell further. The hydrologic changes suggest precipitation was a driving force on catchment discharge, in addition to a temperature-driven runoff response.
The triazines are one of the most widely used herbicide classes ever developed and are critical for managing weed populations that have developed herbicide resistance. These herbicides are traditionally valued for their residual weed control in more than 50 crops. Scientific literature suggests that atrazine, and perhaps other s-triazines, may no longer remain persistent in soils due to enhanced microbial degradation. Experiments examined the rate of degradation of atrazine and two other triazine herbicides, simazine and metribuzin, in both atrazine-adapted and non-history Corn Belt soils, with similar soils being used from each state as a comparison of potential triazine degradation. In three soils with no history of atrazine use, the t1/2 of atrazine was at least four times greater than in three soils with a history of atrazine use. Simazine degradation in the same three sets of soils was 2.4 to 15 times more rapid in history soils than non-history soils. Metribuzin in history soils degraded at 0.6, 0.9, and 1.9 times the rate seen in the same three non-history soils. These results indicate enhanced degradation of the symmetrical triazine simazine, but not of the asymmetrical triazine metribuzin.
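Under the first-order dissipation kinetics usually assumed for herbicide degradation in soil, the half-life fixes the rate constant via k = ln(2)/t1/2, so a fourfold change in t1/2 translates directly into residue levels. A sketch with illustrative half-lives (not the study's measured values):

```python
import math

def fraction_remaining(t_days, t_half_days):
    """First-order decay: C(t)/C0 = exp(-k*t), with k = ln(2)/t_half."""
    k = math.log(2) / t_half_days
    return math.exp(-k * t_days)

# If microbial adaptation shortened atrazine's half-life fourfold,
# say from 60 d to 15 d (hypothetical numbers):
print(fraction_remaining(30, 60))  # ~0.71 left after 30 d (non-history soil)
print(fraction_remaining(30, 15))  # 0.25 left after 30 d (history soil)
```

The same arithmetic explains why enhanced degradation undermines residual weed control: most of the applied dose is gone well before the intended control window closes.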
The enormous economic burden of dementia in the United States of America falls disproportionately on families coping with this devastating disease. Black Americans, who are at greater risk of developing dementia than white Americans, hold on average less than one-eighth of the wealth of white Americans. This study explores whether dementia exacerbates this wealth disparity by examining dementia's effect on wealth trajectories of black versus non-black Americans over an eight-year period preceding death, using five waves of data (beginning in 2002 or 2004) on decedents in the 2012 and 2014 waves of the Health and Retirement Study (N = 2,429). Dementia is associated with a loss of 97 per cent of wealth among black Americans, compared with 42 per cent among non-black Americans, while wealth loss among black and non-black Americans without dementia did not differ substantially (15% versus 19%). Dementia appears to increase the probability of wealth exhaustion among both black and non-black Americans, although the estimate is no longer significant after adjusting for all covariates (for blacks, odds ratio (OR) = 2.04, 95% confidence interval (CI) = 0.83, 5.00; for non-blacks, OR = 1.47, 95% CI = 0.95, 2.27). Dementia has a negative association with home-ownership, and the loss or sale of a home may play a mediating role in the exhaustion of wealth among black Americans with dementia.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
OBJECTIVES/SPECIFIC AIMS: Describe the process used to develop leveled competencies and associated examples. Discuss the final leveled competencies and their potential use in clinical research professional workforce initiatives. METHODS/STUDY POPULATION: The revised JTFCTC Framework 2.0 has 51 competency statements, representing 8 domains. Each competency statement has now been refined to delineate fundamental, skilled, or advanced levels of knowledge and capability. Typically, the fundamental level describes the competency for a professional who requires some coaching and oversight but is able to understand and identify basic concepts. The skilled level reflects the professional’s solid understanding of the competency and use of the information to take action independently in most situations. The advanced level embodies high-level thinking, problem solving, and the ability to guide others in the competency. The process for developing both the three levels and the examples involved 5 workgroups, each chaired by a content expert and comprising national/international clinical research experts, including representatives from research sites, professional associations, government, and industry and academic sponsors. RESULTS/ANTICIPATED RESULTS: The committee developed 51 specific competencies arrayed across 3 levels, with examples of each to demonstrate an appropriate application of the competency. The competencies, examples, and potential utilization will be described. DISCUSSION/SIGNIFICANCE OF IMPACT: The use of competencies in the context of workforce development and training initiatives is helping to create standards for the clinical research profession. These leveled competencies allow for an important refinement to the standards that can be used to enhance the quality and safety of the clinical research enterprise and guide workforce development.
Objectives: As the number of adolescents and young adults (AYAs) surviving congenital heart disease (CHD) grows, studies of long-term outcomes are needed. CHD research documents poor executive function (EF) and cerebellum (CB) abnormalities in children. We examined whether AYAs with CHD exhibit reduced EF and CB volumes. We hypothesized a double dissociation such that the posterior CB is related to EF while the anterior CB is related to motor function. We also investigated whether the CB contributes to EF above and beyond processing speed. Methods: Twenty-two AYAs with CHD and 22 matched healthy controls underwent magnetic resonance imaging and assessment of EF, processing speed, and motor function. Volumetric data were calculated using a cerebellar atlas (SUIT) developed for SPM. Group differences were compared with t tests, relationships were tested with Pearson’s correlations and Fisher’s r to z transformation, and hierarchical regression was used to test the CB’s unique contributions to EF. Results: CHD patients had reduced CB total, lobular, and white matter volume (d=.52–.99) and poorer EF (d=.79–1.01) compared to controls. Significant correlations between the posterior CB and EF (r=.29–.48) were identified, but there were no relationships between the anterior CB and either motor function or EF. The posterior CB predicted EF above and beyond processing speed (ps<.001). Conclusions: This study identified a relationship between the posterior CB and EF, which appears to be particularly important for inhibitory processes and abstract reasoning. The unique CB contribution to EF above and beyond processing speed alone warrants further study. (JINS, 2018, 24, 939–948)
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uremic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
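The odds ratios and 95% confidence intervals reported in case–control analyses like this one are computed from a 2×2 exposure table; a minimal sketch with the standard Woolf (log-normal) interval and invented counts (not the study's data, and unadjusted, unlike the study's logistic-regression OR):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Woolf 95% CI for a 2x2 table:
    a/b = exposed/unexposed cases, c/d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 5 of 27 cases vs 40 of 88 controls washed hands.
or_, lo, hi = odds_ratio_ci(5, 22, 40, 48)  # OR ~0.27 (protective)
```

An OR below 1 for an exposure among cases, as here, indicates a protective association, matching the direction of the handwashing finding in the abstract.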
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest but nonetheless stronger association of treatment with education than with income raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
To determine the impact of recurrent Clostridium difficile infection (RCDI) on patient behaviors following illness.
Using a computer algorithm, we searched the electronic medical records of 7 Chicago-area hospitals to identify patients with RCDI (2 episodes of CDI within 15 to 56 days of each other). RCDI was validated by medical record review. Patients were asked to complete a telephone survey. The survey included questions regarding general health, social isolation, symptom severity, emotional distress, and prevention behaviors.
In total, 119 patients completed the survey (32%). On average, respondents were 57.4 years old (standard deviation, 16.8); 57% were white, and ~50% reported hospitalization for CDI. At the time of their most recent illness, patients rated their diarrhea as high severity (58.5%) and their exhaustion as extreme (30.7%). Respondents indicated that they were very worried about getting sick again (41.5%) and about infecting others (31%). Almost 50% said that they have washed their hands more frequently (47%) and have increased their use of soap and water (45%) since their illness. Some of these patients (22%–32%) reported eating out less, avoiding certain medications and public areas, and increasing probiotic use. Most behavioral changes were unrelated to disease severity.
Having had RCDI appears to increase prevention-related behaviors in some patients. While some behaviors are appropriate (eg, handwashing), others are not supported by evidence of decreased risk and may negatively impact patient quality of life. Providers should discuss appropriate prevention behaviors with their patients and should clarify that other behaviors (eg, eating out less) will not affect their risk of future illness.
The Square Kilometre Array will be a transformative instrument for pulsar astronomy. While the full SKA will be sensitive enough to detect all pulsars in the Galaxy visible from Earth, already with SKA1, pulsar searches will discover enough pulsars to increase the currently known population by a factor of four, no doubt including a range of remarkable unknown sources. Real-time processing is needed to deal with the 60 PB of pulsar search data collected per day, using a signal processing pipeline required to perform more than 10 POps. Here we present the suggested design of the pulsar search engine for the SKA and discuss challenges and solutions to the pulsar search venture.
Poorer patient views of mental health inpatient treatment predict both further admissions and, for those admitted involuntarily, longer admissions. As advocated in the UK Francis report, we investigated the hypothesis that improving staff training improves patients’ views of ward care.
Cluster randomised trial with stepped wedge design in 16 acute mental health wards randomised (using the ralloc procedure in Stata) by an independent statistician in three waves to staff training. A psychologist trained ward staff on evidence-based group interventions and then supported their introduction to each ward. The main outcome was blind self-report of perceptions of care (VOICE) before or up to 2 years after staff training between November 2008 and January 2013.
In total, 1108 inpatients took part (616 admitted involuntarily under the English Mental Health Act). On average, 51.6 staff training sessions were provided per ward. Involuntary patients' perceptions of, and satisfaction with, mental health wards improved after staff training (n = 582, standardised effect −0.35, 95% CI −0.57 to −0.12, p = 0.002; interaction p value 0.006), but there was no benefit to those admitted voluntarily (n = 469, −0.01, 95% CI −0.23 to 0.22, p = 0.955) and no strong evidence of an overall effect (n = 1058, standardised effect −0.18 s.d., 95% CI −0.38 to 0.01, p = 0.062). The training cost around £10 per patient per week. Resource allocation changed towards patient-perceived meaningful contacts by an average of £12 (95% CI −£76 to £98, p = 0.774).
Staff training improved the perceptions of the therapeutic environment in those least likely to want an inpatient admission, those formally detained. This change might enhance future engagement with all mental health services and prevent the more costly admissions.