There are a variety of causes of acute heart failure in children, including myocarditis, genetic/metabolic conditions, and congenital heart defects. In cases with a structurally normal heart and a negative personal and family history, myocarditis is often presumed to be the cause, but we hypothesise that genetic disorders contribute to a significant portion of these cases. We reviewed our cases of children who presented with acute heart failure and underwent genetic testing from 2008 to 2017. Eighty-seven percent of these individuals were found to have either a genetic syndrome or a pathogenic or likely pathogenic variant in a cardiac-related gene. None of these individuals had a personal or family history of cardiomyopathy suggestive of a genetic aetiology prior to presentation. All of these individuals either passed away or were listed for cardiac transplantation, indicating that genetic testing may provide important information regarding prognosis in addition to information critical to the assessment of family members.
Bio-Detection Dogs (BDDs) are used in some high-income countries as a diagnostic intervention, yet little is known about their potential in low/middle-income countries with limited diagnostic resources. This exploratory study investigated the opportunities and implications of deploying BDDs as a mobile diagnostic intervention to identify people with asymptomatic malaria, particularly at ports of entry, as an important step to malaria elimination in a population. A qualitative study design consisting of participant observation, five focus group discussions and informal conversations was employed in The Gambia in April–May 2017. A disciplined German Shepherd companion dog (not trained as a BDD) was introduced to research participants and their perceptions recorded. Field-notes and discussions were transcribed, translated and analysed thematically. Most research participants viewed positively the possibility of using BDDs to detect malaria, with the major advantage of being non-invasive. Some concerns, however, were raised regarding safety and efficacy, as well as cultural issues around the place of dogs within human society. The Gambia is a rabies-endemic country, and unfamiliar dogs are not usually approached, with implications for how research participants perceived BDDs. Understanding such concerns and working with local people to address such issues must be part of any successful strategy to deploy BDDs in new settings. Bio-Detection Dogs represent a potentially non-invasive diagnostic tool for the detection of asymptomatic or chronic malaria infections, particularly in areas with very low parasite rates. However, it is important to understand local concerns and work closely with communities to address those concerns. Wider deployment of BDDs will also require careful planning and sustained financial support.
Later life is a period of increased risk of disability, but there is little quantitative evidence regarding the exclusion of older people (through discrimination and avoidance) due to their health conditions. This study aims to (1) measure the prevalence of disability exclusion in later life, (2) examine how experiences of exclusion differ by disability type, and (3) investigate the association of exposure to exclusion with psychological distress.
Using data from the 2015 ABS Survey of Disability, Ageing and Carers, we calculated the prevalence of people aged 55 years and over with a disability experiencing discrimination and engaging in avoidance behaviors, disaggregated by 18 detailed disability types. Modified Log-Poisson models were fitted to estimate Prevalence Ratios to measure the association between exclusion and psychological distress, stratified by disability type.
In 2015, about 5% of Australians aged 55 years and over with a disability reported experiencing an instance of disability discrimination, and one in four reported avoiding a situation or context due to their disability. Accounting for psychosocial comorbidities and with extensive demographic controls, exposure to disability avoidance (PR = 1.9, 95% CI 1.7, 2.1) or discrimination (PR = 1.7, 95% CI 1.4, 2.1) almost doubled the probability of experiencing psychological distress. Effects were heightened for individuals reporting specific disabilities, including sensory and speech disabilities and physical disabilities, as well as for those reporting a head injury, stroke, or acquired brain injury.
Despite protections against disability discrimination in legislation, discrimination and avoidance due to disability is prevalent and is associated with poor mental health outcomes.
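The modified log-Poisson approach described above (Poisson regression with a robust "sandwich" variance, used to estimate prevalence ratios for a binary outcome) can be sketched in pure NumPy. This is a minimal illustration, not the study's analysis: the variable names, sample size, and effect size below are synthetic assumptions chosen only to mimic the reported near-doubling of distress prevalence.

```python
import numpy as np

def poisson_pr(x, y, iters=25):
    """Fit a Poisson regression by Newton-Raphson and return the
    coefficients with robust (sandwich) standard errors, as in the
    modified log-Poisson method for prevalence ratios."""
    X = np.column_stack([np.ones(len(y)), x])     # add intercept
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)                     # score
        hess = X.T @ (X * mu[:, None])            # observed information
        beta += np.linalg.solve(hess, grad)
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T @ (X * mu[:, None]))
    meat = X.T @ (X * ((y - mu) ** 2)[:, None])
    se = np.sqrt(np.diag(bread @ meat @ bread))   # robust SEs
    return beta, se

# Synthetic illustration (NOT survey data): distress prevalence is
# simulated to roughly double with exposure to exclusion.
rng = np.random.default_rng(0)
n = 20000
exposed = rng.integers(0, 2, n)
y = rng.binomial(1, 0.15 * (1 + exposed)).astype(float)

beta, se = poisson_pr(exposed[:, None], y)
pr = np.exp(beta[1])                              # prevalence ratio
lo = np.exp(beta[1] - 1.96 * se[1])
hi = np.exp(beta[1] + 1.96 * se[1])
print(f"PR = {pr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The robust variance step is what makes the Poisson model valid for a binary outcome: the naive Poisson standard errors would be too wide, whereas the sandwich estimator yields correct Wald-type confidence intervals for the prevalence ratio.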
Early life exposures affect health and disease across the life course and potentially across multiple generations. The Clinical and Translational Research Institutes (CTSIs) offer an opportunity to utilize and link existing databases to conduct lifespan research.
A survey with Lifespan Domain Taskforce expert input was created and distributed to lead lifespan researchers at each of the 64 CTSIs. The survey requested information regarding institutional databases related to early life exposure, child-maternal health, or lifespan research.
Of the 64 CTSIs, 88% provided information on a total of 130 databases. Approximately 59% (n=76/130) had an associated biorepository. Longitudinal data were available for 72% (n=93/130) of the reported databases. Many of the biorepositories (n=44/76; 68%) have standard operating procedures that can be shared with other researchers.
The majority of CTSI databases and biorepositories focusing on child-maternal health could be leveraged for lifespan research, increasing generalizability and enhancing multi-institutional research in the United States.
Conventional wisdom assumes that increased censorship will strictly decrease access to information. We delineate circumstances when increases in censorship expand access to information for a substantial subset of the population. When governments suddenly impose censorship on previously uncensored information, citizens accustomed to acquiring this information will be incentivized to learn methods of censorship evasion. These evasion tools provide continued access to the newly blocked information—and also extend users’ ability to access information that has long been censored. We illustrate this phenomenon using millions of individual-level actions of social media users in China before and after the block of Instagram. We show that the block inspired millions of Chinese users to acquire virtual private networks, and that these users subsequently joined censored websites like Twitter and Facebook. Despite initially being apolitical, these new users began browsing blocked political pages on Wikipedia, following Chinese political activists on Twitter, and discussing highly politicized topics such as opposition protests in Hong Kong.
While our fascination with understanding the past is sufficient to warrant an increased focus on synthesis, solutions to important problems facing modern society require understandings based on data that only archaeology can provide. Yet, even as we use public monies to collect ever-greater amounts of data, modes of research that can stimulate emergent understandings of human behavior have lagged behind. Consequently, a substantial amount of archaeological inference remains at the level of the individual project. We can more effectively leverage these data and advance our understandings of the past in ways that contribute to solutions to contemporary problems if we adapt the model pioneered by the National Center for Ecological Analysis and Synthesis to foster synthetic collaborative research in archaeology. We propose the creation of the Coalition for Archaeological Synthesis coordinated through a U.S.-based National Center for Archaeological Synthesis. The coalition will be composed of established public and private organizations that provide essential scholarly, cultural heritage, computational, educational, and public engagement infrastructure. The center would seek and administer funding to support collaborative analysis and synthesis projects executed through coalition partners. This innovative structure will enable the discipline to address key challenges facing society through evidentially based, collaborative synthetic research.
In 2012, the Government invited local councils in England to participate in a pilot programme to test direct payments in residential care. While the programme was set up to allow for comprehensive summative evaluation, the uptake of direct payments in residential care was substantially lower than anticipated, with only 40 people in receipt of one at the end of the programme. Drawing on qualitative data collected for the evaluation, this paper aims to better understand the barriers to implementing direct payments in residential care. Evidence from the use of direct payments in domiciliary care identified gatekeeping by council frontline staff as a major barrier for service users to access direct payments. Our findings suggest that, whilst selectivity of both service users and providers was an integral part of the programme design, gatekeeping does not fully explain the poor take-up. Other factors played a part, such as lack of clarity about the benefits of direct payments for care home residents, the limited range and scope of choice of services for residents, and concerns from care providers about the impact of direct payments on their financial sustainability.
The early and effective detection of neurocognitive disorders poses a key diagnostic challenge. We examined performance on common cognitive bedside tests according to differing delirium syndromal status and clinical (motor) subtypes in hospitalized elderly medical inpatients.
A battery of nine bedside cognitive tests was performed on elderly medical inpatients with DSM-IV delirium, subsyndromal delirium (SSD), and no delirium (ND). Patients with delirium were compared according to clinical (motor) subtypes.
A total of 198 patients (mean age 79.14 ± 8.26 years) were assessed: full syndromal delirium (FSD: n = 110), SSD (n = 45), and ND (n = 43). Delirium status was not associated with differences in gender distribution, age, or overall medication use. Dementia burden increased with greater delirium status. Overall, the ability to meaningfully engage with the tests varied from 59% for the Vigilance B test to 85% for the Spatial Span Forward test, and was lowest in patients with FSD, where engagement ranged from 32% for the Vigilance B test to 77% for the Spatial Span Forward test. The ND group was distinguished from the SSD group by the Months of the Year Backwards, Vigilance B, global VSP, Clock Drawing, and Interlocking Pentagons tests. The SSD group was distinguished from the FSD group by the Vigilance A, Spatial Span Forward, and Spatial Span Backwards tests. Regarding differences among motor subtypes in percentage engagement and performance, the no-subtype group had higher ratings across all tests. Delirious patients with no subtype had significantly lower scores on the DRS-R98 than the other three subtype categories.
Simple bedside tests of attention, vigilance, and visuospatial ability are useful in distinguishing neurocognitive disorders, including SSD from other presentations.
The majority of individuals with hepatitis C virus (HCV) infection in England are people who inject drugs, a vulnerable and disenfranchised cohort with poor engagement with secondary care. Our aim is to describe our experiences in setting up a successful nurse-led HCV service at a substance misuse service (SMS).
We justify the need for a community HCV service and review the different community-based models. Our experiences in engaging with stakeholders, obtaining funding, and setting up the service are discussed, along with the challenges faced and key recommendations. Finally, a summary of interim clinical outcomes is presented.
A successful community-based “one-stop” nurse-led HCV service was set up in December 2013 at a large SMS. It provides all aspects of care (blood-borne virus screening, non-invasive assessment of hepatic fibrosis, hepatology input, HCV treatment, peer mentoring, social and psychiatric support, and opioid substitution) at one site. Interim clinical data indicate high service uptake, with HCV treatment outcomes comparable to secondary care.
The advent of direct-acting antivirals provides a unique opportunity for HCV elimination in England by 2030. Our “one-stop” integrated, multidisciplinary community HCV model suggests that HCV care can be successfully delivered outside of a hospital setting and warrants national adoption.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
The relative role of the stellar radiation field, the stellar outflows and the interstellar radiation field (ISRF) in transforming the molecular ejecta into atomic gas was the subject of our ISO LWS and SWS spectroscopy study of 24 evolved stars which span the range from AGB stars to proto-planetary nebulae (PPNs) and PNs. The far-infrared (FIR) atomic fine-structure lines are powerful probes of the warm atomic gas in photodissociation regions (PDRs) and shocks. This paper summarizes and compares the ISO spectroscopy studies of carbon-rich (C-rich) and oxygen-rich (O-rich) evolved stars, published by Fong et al. (2001) and Castro-Carrizo et al. (2001), respectively. We find that photodissociation, not shocks, is responsible for the chemical change from molecular to atomic gas.
Gene × Environment interaction contributes to externalizing disorders in childhood and adolescence, but little is known about whether such effects are long lasting or present in adulthood. We examined gene–environment interplay in the concurrent and prospective associations between antisocial peer affiliation and externalizing disorders (antisocial behavior and substance use disorders) at ages 17, 20, 24, and 29. The sample included 1,382 same-sex twin pairs participating in the Minnesota Twin Family Study. We detected a Gene × Environment interaction at age 17, such that additive genetic influences on antisocial behavior and substance use disorders were greater in the context of greater antisocial peer affiliation. This Gene × Environment interaction was not present for antisocial behavior symptoms after age 17, but it was for substance use disorder symptoms through age 29 (though effect sizes were largest at age 17). The results suggest adolescence is a critical period for the development of externalizing disorders wherein exposure to greater environmental adversity is associated with a greater expression of genetic risk. This form of Gene × Environment interaction may persist through young adulthood for substance use disorders, but it appears to be limited to adolescence for antisocial behavior.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) to study the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
Dietary intake/status of the trace mineral Se may affect the risk of developing hypertensive conditions of pregnancy, i.e. pre-eclampsia and pregnancy-induced hypertension (PE/PIH). In the present study, we evaluated Se status in UK pregnant women to establish whether pre-pregnancy Se status or Se supplementation affected the risk of developing PE/PIH. The samples originated from the SPRINT (Selenium in PRegnancy INTervention) study, which randomised 230 UK primiparous women to treatment with Se (60 μg/d) or placebo from 12 weeks of gestation. Whole-blood Se concentration was measured at 12 and 35 weeks, toenail Se concentration at 16 weeks, plasma selenoprotein P (SEPP1) concentration at 35 weeks, and plasma glutathione peroxidase (GPx3) activity at 12, 20 and 35 weeks. Demographic data were collected at baseline. Participants completed a food-frequency questionnaire (FFQ). UK pregnant women had whole-blood Se concentration lower than the mid-range of other populations, toenail Se concentration considerably lower than US women, GPx3 activity considerably lower than US and Australian pregnant women, and low baseline SEPP1 concentration (median 3.00, range 0.90–5.80 mg/l). Maternal age, education and social class were positively associated with Se status. After adjustment, whole-blood Se concentration was higher in women consuming Brazil nuts (P = 0.040) and in those consuming more than two seafood portions per week (P = 0.054). A stepwise logistic regression model revealed that, among the Se-related risk factors, only toenail Se (OR 0.38, 95% CI 0.17, 0.87, P = 0.021) significantly affected the OR for PE/PIH. On excluding non-compliers with Se treatment, Se supplementation also significantly reduced the OR for PE/PIH (OR 0.30, 95% CI 0.09, 1.00, P = 0.049). In conclusion, UK women have low Se status that increases their risk of developing PE/PIH. Therefore, UK women of childbearing age need to improve their Se status.
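The logistic-regression odds ratios reported above can be illustrated with a minimal Newton-Raphson fit in NumPy. This sketch does not reproduce the SPRINT analysis: the sample size, variable names, and protective effect size below are simulated assumptions chosen only to show how an OR below 1 with a Wald 95% CI is obtained from a fitted slope.

```python
import numpy as np

def logistic_or(x, y, iters=25):
    """Fit a univariable logistic regression by Newton-Raphson and
    return the odds ratio for x with a Wald 95% confidence interval."""
    X = np.column_stack([np.ones(len(y)), x])       # add intercept
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        grad = X.T @ (y - p)                        # score
        hess = X.T @ (X * (p * (1 - p))[:, None])   # Fisher information
        beta += np.linalg.solve(hess, grad)
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    cov = np.linalg.inv(X.T @ (X * (p * (1 - p))[:, None]))
    se = np.sqrt(cov[1, 1])
    return (np.exp(beta[1]),
            np.exp(beta[1] - 1.96 * se),
            np.exp(beta[1] + 1.96 * se))

# Synthetic illustration (NOT SPRINT data): higher standardised toenail
# Se is simulated to lower the odds of PE/PIH (true OR = exp(-1) ~ 0.37).
rng = np.random.default_rng(1)
n = 2000
se_status = rng.normal(0.0, 1.0, n)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 - 1.0 * se_status)))
pe_pih = rng.binomial(1, p_true).astype(float)

odds_ratio, lo, hi = logistic_or(se_status, pe_pih)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

As in the abstract, a protective association appears as an OR whose entire confidence interval lies below 1; a stepwise procedure would simply repeat such fits while adding or dropping candidate Se-related predictors.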