Some children are more affected by specific family environments than others, as a function of differences in their genetic make-up. However, longitudinal studies of genetic moderation of parenting effects during early childhood have not been conducted. We examined developmental profiles of child behavior problems between 18 months and age 8 in a longitudinal parent–offspring sample of 361 adopted children. In toddlerhood (18 months), observed structured parenting indexed parental guidance in service of task goals. Biological parent psychopathology served as an index of genetic influences on children’s behavior problems. Four profiles of child behavior problems were identified: low stable (11%), average stable (50%), higher stable (29%), and high increasing (11%). A multinomial logistic regression analysis indicated a genetically moderated effect of structured parenting, such that for children whose biological mother had higher psychopathology, the odds of the child being in the low stable group increased as structured parenting increased. Conversely, for children whose biological mother had lower psychopathology, the odds of being in the low stable group were reduced when structured parenting increased. Results suggest that increasing structured parenting is an effective strategy for children at higher genetic risk for psychopathology, but may be detrimental for those at lower genetic risk.
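As an illustration of the analytic approach described above, the following is a minimal sketch of a multinomial logistic regression with a gene × environment interaction term. The data, variable names, and profile coding are invented placeholders, not the study's actual measures or pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: 'profile' codes the behavior-problem trajectory class
# (0 = low stable, 1 = average stable, 2 = higher stable, 3 = high increasing),
# 'bio_psych' indexes biological-mother psychopathology (a genetic-risk proxy),
# and 'structure' is observed structured parenting at 18 months.
rng = np.random.default_rng(0)
n = 361
df = pd.DataFrame({
    "bio_psych": rng.normal(size=n),
    "structure": rng.normal(size=n),
})
df["gxe"] = df["bio_psych"] * df["structure"]   # G x E interaction term
df["profile"] = rng.integers(0, 4, size=n)      # placeholder outcome

X = sm.add_constant(df[["bio_psych", "structure", "gxe"]])
fit = sm.MNLogit(df["profile"], X).fit(disp=False)
print(fit.summary())  # a significant 'gxe' coefficient signals genetic moderation
```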
No standardised multi-site approach to managing paediatric post-operative chylothorax exists, which leads to unnecessary practice variation. The Chylothorax Work Group utilised the Pediatric Critical Care Consortium infrastructure to address this gap.
Methods:
Over 60 multi-disciplinary providers representing 22 centres convened virtually as a quality initiative to develop an algorithm to manage paediatric post-operative chylothorax. Agreement with each recommendation in the algorithm was objectively quantified using an anonymous survey. “Consensus” was defined as ≥80% of responses to a recommendation being “agree” or “strongly agree”. To determine whether the algorithm recommendations would be correctly interpreted in the clinical environment, we developed ex vivo simulations and surveyed providers who developed the algorithm and providers who did not.
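For concreteness, here is a minimal sketch of the stated consensus rule (≥80% of responses “agree” or “strongly agree”); the vote data are invented.

```python
# Sketch of the consensus rule described above; the vote data are invented.
def reaches_consensus(responses, threshold=0.80):
    agree = sum(r in ("agree", "strongly agree") for r in responses)
    return agree / len(responses) >= threshold

votes = ["strongly agree"] * 40 + ["agree"] * 12 + ["neutral"] * 6 + ["disagree"] * 2
print(reaches_consensus(votes))  # True: 52 of 60 responses (~87%) agree
```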
Results:
The algorithm is intended for all children (<18 years of age) within 30 days of cardiac surgery. It contains rationale for 11 central chylothorax management recommendations: diagnostic criteria and evaluation, trial of fat-modified diet, stratification by volume of daily output, timing of first-line medical therapy for “low” and “high” volume patients, and timing and duration of fat-modified diet. All recommendations achieved “consensus” (agreement >80%) by the workgroup (range 81–100%). Ex vivo simulations demonstrated good understanding by developers (range 94–100%) and non-developers (range 73–100%).
Conclusions:
The quality improvement effort represents the first multi-site algorithm for the management of paediatric post-operative chylothorax. The algorithm includes transparent and objective measures of agreement and understanding. Agreement to the algorithm recommendations was >80%, and overall understanding was 94%.
Whole-genome sequencing (WGS) shotgun metagenomics (metagenomics) attempts to sequence the entire genetic content directly from a sample. Its diagnostic advantages lie in the ability to detect unsuspected, uncultivatable, or very slow-growing organisms.
Objective:
To evaluate the clinical and economic effects of using WGS and metagenomics for outbreak management in a large metropolitan hospital.
Design:
Cost-effectiveness study.
Setting:
Intensive care unit and burn unit of large metropolitan hospital.
Patients:
Simulated intensive care unit and burn unit patients.
Methods:
We built a complex simulation model to estimate pathogen transmission, associated hospital costs, and quality-adjusted life years (QALYs) during a 32-month outbreak of carbapenem-resistant Acinetobacter baumannii (CRAB). Model parameters were determined using microbiology surveillance data, genome sequencing results, hospital admission databases, and local clinical knowledge. The model was calibrated to the actual pathogen spread within the intensive care unit and burn unit (scenario 1) and compared with early use of WGS (scenario 2) and early use of WGS and metagenomics (scenario 3) to determine their respective cost-effectiveness. Sensitivity analyses were performed to address model uncertainty.
Results:
On average compared with scenario 1, scenario 2 resulted in 14 fewer patients with CRAB, 59 additional QALYs, and $75,099 cost savings. Scenario 3, compared with scenario 1, resulted in 18 fewer patients with CRAB, 74 additional QALYs, and $93,822 in hospital cost savings. The likelihoods that scenario 2 and scenario 3 were cost-effective were 57% and 60%, respectively.
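A probability of cost-effectiveness like the 57% and 60% reported above is commonly derived from paired simulation draws of incremental costs and QALYs via net monetary benefit. The following sketch assumes a willingness-to-pay threshold and invented distributions; the study's actual model is far more detailed.

```python
import numpy as np

# Sketch: probability of cost-effectiveness from simulated incremental
# costs and QALYs. Distributions and the willingness-to-pay threshold
# are assumptions for illustration, not the study's parameters.
rng = np.random.default_rng(1)
wtp = 50_000                                            # assumed $/QALY threshold
inc_qalys = rng.normal(59, 40, size=10_000)             # incremental QALYs (scenario 2 vs 1)
inc_costs = rng.normal(-75_099, 120_000, size=10_000)   # incremental costs (negative = savings)

nmb = wtp * inc_qalys - inc_costs                       # net monetary benefit per draw
print(f"P(cost-effective): {(nmb > 0).mean():.2f}")
```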
Conclusions:
The use of WGS and metagenomics in infection control processes was predicted to produce favorable economic and clinical outcomes.
We present the data and initial results from the first pilot survey of the Evolutionary Map of the Universe (EMU), observed at 944 MHz with the Australian Square Kilometre Array Pathfinder (ASKAP) telescope. The survey covers $270\,\mathrm{deg}^2$ of an area covered by the Dark Energy Survey, reaching a depth of 25–30 $\mu\mathrm{Jy\ beam}^{-1}$ rms at a spatial resolution of $\sim$11–18 arcsec, resulting in a catalogue of $\sim$220 000 sources, of which $\sim$180 000 are single-component sources. Here we present the catalogue of single-component sources, together with (where available) optical and infrared cross-identifications, classifications, and redshifts. This survey explores a new region of parameter space compared to previous surveys. Specifically, the EMU Pilot Survey has a high density of sources, and also a high sensitivity to low surface brightness emission. These properties result in the detection of types of sources that were rarely seen in or absent from previous surveys. We present some of these new results here.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
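To illustrate how an effect such as β = −0.34 years per standard deviation of PGS is estimated, here is a simplified regression sketch. The simulated data and minimal covariate set are placeholders; real analyses of this kind would typically also adjust for ancestry principal components and cohort effects.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch: regressing age at onset (AAO) on a standardized polygenic score.
# Data are simulated so that the true effect is -0.34 years per SD of PGS.
rng = np.random.default_rng(2)
n = 5_000
pgs = rng.normal(size=n)                        # PGS standardized to SD = 1
aao = 25 - 0.34 * pgs + rng.normal(0, 8, n)     # simulated onset ages
df = pd.DataFrame({"aao": aao, "pgs": pgs, "sex": rng.integers(0, 2, n)})

fit = smf.ols("aao ~ pgs + sex", data=df).fit()
print(f"beta = {fit.params['pgs']:.2f} years/SD, s.e. = {fit.bse['pgs']:.2f}")
```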
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
The feasibility of non-pharmacological public health interventions (NPIs) such as physical distancing or isolation at home to prevent severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission in low-resource countries is unknown. Household survey data from 54 African countries were used to investigate the feasibility of SARS-CoV-2 NPIs in low-resource settings. Across the 54 countries, approximately 718 million people lived in households with ≥6 individuals at home (median percentage of at-risk households 56% (95% confidence interval (CI), 51% to 60%)). Approximately 283 million people lived in households where ≥3 people slept in a single room (median percentage of at-risk households 15% (95% CI, 13% to 19%)). An estimated 890 million Africans lacked on-site water (71% (95% CI, 62% to 80%)), while 700 million people lacked in-home soap/washing facilities (56% (95% CI, 42% to 73%)). The median percentage of people without a refrigerator in the home was 79% (95% CI, 67% to 88%), while 45% (95% CI, 39% to 52%) shared toilet facilities with other households. Individuals in low-resource settings have substantial obstacles to implementing NPIs for mitigating SARS-CoV-2 transmission. These populations urgently need to be prioritised for coronavirus disease 2019 vaccination to prevent disease and to contain the global pandemic.
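The summary statistics above are medians across country-level percentages with 95% confidence intervals. One plausible way to attach a CI to such a median is a bootstrap over countries, sketched below with simulated placeholder values; whether the authors used this exact method is not stated.

```python
import numpy as np

# Sketch: median across 54 country-level percentages with a bootstrap 95% CI.
# The per-country values are simulated placeholders, not the survey data.
rng = np.random.default_rng(3)
pct_at_risk = rng.uniform(30, 80, size=54)

boot_medians = [np.median(rng.choice(pct_at_risk, size=54, replace=True))
                for _ in range(10_000)]
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(f"median {np.median(pct_at_risk):.0f}% (95% CI {lo:.0f}% to {hi:.0f}%)")
```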
In the First-HD pivotal trial, the maximum deutetrabenazine dose evaluated to treat chorea associated with Huntington’s disease (HD chorea) was 48 mg/d, which is the approved maximum dose for this population. In ARC-HD, an open-label extension study evaluating the long-term efficacy and safety of deutetrabenazine to treat HD chorea, dosage ranged from 6 mg/d to 72 mg/d, with doses ≥12 mg/d administered twice daily. Doses in ARC-HD were increased by 6 mg/d per week in a response-driven manner based on efficacy and tolerability until 48 mg/d (Week 8). At the investigator’s discretion, further increases were permitted by 12 mg/d per week to a maximum of 72 mg/d. This post-hoc analysis evaluates the safety and tolerability of deutetrabenazine >48 mg/d compared to ≤48 mg/d to treat HD chorea in ARC-HD.
Methods
Patient counts and safety assessments were attributed to patients when they received a dose of either ≤48 mg/d or >48 mg/d. For 9 selected adverse events (AEs), we compared AE rates adjusted for duration of drug exposure (as number of AEs/year) at ≤48 mg/d or >48 mg/d. The AE rates were determined after titration when participants were on stable doses of deutetrabenazine.
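A minimal sketch of the exposure adjustment described: event counts divided by patient-years within each dose stratum. The patient-years below are those reported in the Results; the event counts are invented.

```python
# Exposure-adjusted AE rate: events divided by patient-years in a dose stratum.
# Patient-years are taken from the Results; event counts are invented.
def exposure_adjusted_rate(n_events, patient_years):
    return n_events / patient_years  # AEs per patient-year

rate_low  = exposure_adjusted_rate(n_events=30, patient_years=177.1)  # <=48 mg/d
rate_high = exposure_adjusted_rate(n_events=12, patient_years=74.1)   # >48 mg/d
print(f"<=48 mg/d: {rate_low:.2f}/yr, >48 mg/d: {rate_high:.2f}/yr")
```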
Results
All 113 patients were exposed to doses ≤48 mg/d (177.1 patient-years) and 49 patients were ever exposed to doses >48 mg/d (74.1 patient-years). In patients taking deutetrabenazine >48 mg/d compared to ≤48 mg/d after the titration period, there were no apparent differences in exposure-adjusted AE rates.
Conclusions
Based on clinical experience, some patients with HD may benefit from doses higher than 48 mg/d to adequately control chorea. These doses were tolerated without apparent increase in the exposure-adjusted rates of selected AEs after titration. This analysis does not address the occurrence of other AEs or whether adequate efficacy was achieved at lower doses, factors that may have influenced dose increases.
Funding
Teva Pharmaceutical Industries Ltd., Petach Tikva, Israel
Chorea is a prominent motor dysfunction in Huntington’s disease (HD). Deutetrabenazine, a vesicular monoamine transporter 2 (VMAT2) inhibitor, is FDA-approved for the treatment of chorea in HD. In the pivotal, 12-week First-HD trial, deutetrabenazine treatment reduced the Unified Huntington’s Disease Rating Scale (UHDRS) total maximal chorea (TMC) score versus placebo. ARC-HD, an open-label extension study, evaluated long-term safety and efficacy of deutetrabenazine dosed in a response-driven manner for treatment of HD chorea.
Methods
Patients who completed First-HD (Rollover) and patients who converted overnight from a stable dose of tetrabenazine (Switch) were included. Safety was assessed over the entire treatment period; exposure-adjusted incidence rates (EAIRs; adverse events [AEs] per person-year) were calculated. A stable, post-titration time point of 8 weeks was chosen for efficacy analyses.
Results
Of 119 patients enrolled (Rollover, n=82; Switch, n=37), 100 (84%) completed ≥1 year of treatment (mean [SD] follow-up, 119 [48] weeks). End-of-study EAIRs for patients in the Rollover and Switch cohorts, respectively, were: any AE, 2.6 and 4.3; serious AEs, 0.13 and 0.14; AEs leading to dose suspension, 0.05 and 0.04. Overall, 68% and 73% of patients in Rollover and Switch, respectively, experienced a study drug–related AE. The most common AEs possibly related to study drug were somnolence (17% Rollover; 27% Switch), depression (23%; 19%), anxiety (9%; 11%), insomnia (10%; 8%), and akathisia (9%; 14%). Rates of AEs of interest included suicidality (9%; 3%) and parkinsonism (6%; 11%). In both cohorts, mean UHDRS TMC score and total motor score (TMS) decreased from baseline to Week 8; mean (SD) change in TMC score (units) was –4.4 (3.1) and –2.1 (3.3), and change in TMS was –7.1 (7.3) and –2.4 (8.7), in Rollover and Switch, respectively. While receiving stable dosing from Week 8 to 132 (or end of treatment), patients showed minimal change in TMC score (0.9 [5.0]), but TMS increased compared to Week 8 (9.0 [11.3]). Upon drug withdrawal, there were no remarkable AEs, and TMC scores increased 4.4 (3.7) units compared to end of treatment.
Conclusions
The type and severity of AEs observed in long-term deutetrabenazine exposure are consistent with the previous study. Efficacy in reducing chorea persisted over time. There was no unexpected worsening of HD or chorea associated with HD upon deutetrabenazine withdrawal.
Funding
Teva Pharmaceutical Industries Ltd., Petach Tikva, Israel
Adults who were born preterm are at increased risk of hypertension and cardiovascular disease in later life. Infants born late preterm account for the majority of preterm births; however, the effect of late preterm birth on the risk of cardiovascular disease is unclear. The objective of this study was to assess whether vascular health and cardiac autonomic control differ in a group of late preterm newborn infants compared to a group of term-born infants.
A total of 35 healthy late preterm newborn infants with normal growth (34–36 completed weeks’ gestation) and 139 term-born infants (37–42 weeks’ gestation) were compared in this study. Aortic wall thickening, assessed as aortic intima–media thickness (IMT) by high-resolution ultrasound, and cardiac autonomic control, assessed by heart rate variability, were measured during the first week of life. Postnatal age of full-term and late preterm infants at the time of the study was 5 days (standard deviation [SD] 5) and 4 days (SD 3), respectively.
Infants born late preterm show reduced aortic IMT (574 μm [SD 51] vs. 612 μm [SD 73]) and reduced heart rate variability (log total power 622.3 [606.5] ms² vs. 1180.6 [1114.3] ms²), compared to term infants. These associations remained even after adjustment for sex and birth weight.
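A sketch of the kind of covariate adjustment mentioned above: regressing aortic IMT on birth group while controlling for sex and birth weight. All data are simulated placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Sketch: adjusted group comparison of aortic IMT (late preterm vs. term),
# controlling for sex and birth weight. Data are simulated placeholders.
rng = np.random.default_rng(4)
n = 174  # 35 late preterm + 139 term
df = pd.DataFrame({
    "preterm": np.r_[np.ones(35), np.zeros(139)],
    "sex": rng.integers(0, 2, n),
    "birth_weight": rng.normal(3.2, 0.5, n),       # kg
})
df["imt_um"] = 612 - 38 * df["preterm"] + rng.normal(0, 60, n)

fit = smf.ols("imt_um ~ preterm + sex + birth_weight", data=df).fit()
print(f"adjusted difference: {fit.params['preterm']:.0f} um")
```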
Infants born late preterm show selective differences in markers of cardiovascular risk, with potentially beneficial differences in aortic wall thickness in contrast to potentially detrimental differences in autonomic control, when compared with term-born control infants. These findings provide pathophysiologic evidence to support an increased risk of hypertension and sudden cardiac death in individuals born late preterm.
Finely laminated (cm–μm scale) metalliferous precipitates are widespread in the surficial environment, especially around mineral deposits, and reflect biogeochemical processes that can pervade near-surface environments on a larger scale. Examples in this paper involve precipitates of the transition metals Fe, Cu and Mn with minor Co, Ni, V and Zn; the metalloids As and Sb; and authigenic Au. Mobility and re-precipitation are driven primarily by geochemical disequilibrium, especially with respect to pH and redox states, that arises from complex interactions between biological processes, geological processes, and variations in the surrounding environment. Different degrees of chemical disequilibrium arise on small spatial scales over time scales of days to millennia. Interactions between biota, waters and rocks in these small near-surface settings affect the biogeochemical environments. Sulfur- and iron-oxidising bacteria are common biogeochemical agents associated with sulfide-bearing lithologies, but localised reductive environments can also develop, leading to gradients in pH and redox state and differential metal mobility. There is commonly a spatial separation of Fe-rich precipitates from those with Cu and Mn, and other transition metals also follow Cu and Mn rather than Fe. Metalloids As and Sb have a strong affinity for Fe under oxidising conditions, but not under more reducing conditions. However, the complex biogeochemical parageneses of laminated metalliferous deposits preclude prediction of finer formation details. The textures, mineral species, and metal associations within these deposits are likely to be encountered in all facets of mineral deposit development: initial exploration activity of near-surface locations, mining of shallow portions of orebodies, especially supergene zones, and downstream environmental management with respect to discharging metalliferous waters.
Rapid spread of coronavirus disease 2019 (COVID-19) has affected people with intellectual disability disproportionately. Existing data do not provide enough information to understand the factors associated with increased deaths in those with intellectual disability. Establishing who is at high risk is important in developing prevention strategies, given that risk factors or comorbidities in people with intellectual disability may differ from those in the general population.
Aims
To identify comorbidities, demographic and clinical factors of those individuals with intellectual disability who have died from COVID-19.
Method
An observational descriptive case series looking at deaths because of COVID-19 in people with intellectual disability was conducted. Along with established risk factors observed in the general population, possible specific risk factors and comorbidities in people with intellectual disability for deaths related to COVID-19 were examined. Comparisons between mild and moderate-to-profound intellectual disability subcohorts were undertaken.
Results
Data on 66 deaths in individuals with intellectual disability were analysed. The mean age at death in this group (64 years) was younger than that for COVID-19 deaths in the general population. High rates of moderate-to-profound intellectual disability (n = 43), epilepsy (n = 29), mental illness (n = 29), dysphagia (n = 23), Down syndrome (n = 20) and dementia (n = 15) were observed.
Conclusions
This is the first study exploring associations between possible risk factors and comorbidities found in COVID-19 deaths in people with intellectual disability. Our data provide insight into possible factors contributing to deaths in people with intellectual disability. Some of the factors varied between the mild and moderate-to-profound intellectual disability groups. This highlights an urgent need for further systemic inquiry and study of the possible cumulative impact of these factors and comorbidities, given the possibility of a COVID-19 resurgence.
New York City's first case of coronavirus disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), was identified on 1 March 2020, prompting rapid restructuring of hospital-based services to accommodate the increasing numbers of medical admissions. Non-essential services were eliminated, but in-patient treatment of psychiatric illnesses was necessarily maintained.
Aims
To detail the response of the NYU Langone Health in-patient psychiatric services to the COVID-19 outbreak from 1 March to 1 May 2020.
Method
Process improvement/quality improvement study.
Results
Over this time period, our two in-patient psychiatric units (57 total beds) treated 238 patients, including COVID-19-positive and -negative individuals. Testing for COVID-19 was initially limited to symptomatic patients but expanded over the 62-day time frame. In total, 122 SARS-CoV-2 polymerase chain reaction (PCR) tests were performed in 98 patients. We observed an overall rate of COVID-19 infection of 15.6% among the patients who were tested, with an asymptomatic positive rate of 13.7%. Although the phased roll-out of testing impaired the ability to fully track on-unit transmission of COVID-19, 3% of cases were clearly identified as resulting from on-unit transmission.
Conclusions
Our experience indicates that, with appropriate precautions, patients in need of in-patient psychiatric admission who have COVID-19 can be safely managed. We provide suggested guidelines for COVID-19 management on in-patient psychiatric units which incorporate our own experiences as well as published recommendations.
Poverty and social exclusion are a gendered phenomenon. They are rooted deeply in the stereotypes, biases, prejudices, and discriminations against women, especially those suffering from poor living conditions. Unfortunately, gender inequality is manifested in most, if not all, major life domains. It is therefore important to understand the gender aspect of poverty and social exclusion through a psychological lens. We begin this chapter by introducing the concepts of multi-dimensional poverty and social exclusion with a sketch of the gender disparities displayed in these areas. We turn next to several mainstream psychological theories which have attempted to investigate and interpret the relationship between poverty and gender inequality from the dispositional, motivational, cognitive, and behavioural perspectives. Finally, we evaluate the reliability, objectivity, and generalisability of the reviewed theories and studies and offer suggestions for future research.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). Results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
Wild sheep and many primitive domesticated breeds have two coats: coarse hairs covering shorter, finer fibres. Both are shed annually. Exploitation of wool for apparel in the Bronze Age encouraged breeding for denser fleeces and continuously growing white fibres. The Merino is regarded as the culmination of this process. Archaeological discoveries, ancient images and parchment records portray this as an evolutionary progression, spanning millennia. However, examination of the fleeces from feral, two-coated and woolled sheep has revealed a ready facility of the follicle population to change from shedding to continuous growth and to revert from domesticated to primitive states. Modifications to coat structure, colour and composition have occurred in timeframes and to sheep population sizes that exclude the likelihood of variations arising from mutations and natural selection. The features are characteristic of the domestication phenotype: an assemblage of developmental, physiological, skeletal and hormonal modifications common to a wide variety of species under human control. The phenotypic similarities appeared to result from an accumulation of cryptic genetic changes early during vertebrate evolution. Because they did not affect fitness in the wild, the mutations were protected from adverse selection, becoming apparent only after exposure to a domestic environment. The neural crest, a transient embryonic cell population unique to vertebrates, has been implicated in the manifestations of the domesticated phenotype. This hypothesis is discussed with reference to the development of the wool follicle population and the particular roles of Notch pathway genes, culminating in the specific cell interactions that typify follicle initiation.
The C40 city-network claims a position of global leadership in the governance of climate change. This chapter provides a brief overview of the history of the network, its member cities, and their collective aims and objectives. The chapter introduces the empirical puzzle around which the book is organized, namely the ability of the C40 to achieve coordinated action from a diverse collection of cities despite relying on voluntary participation and engagement. The ability to do so sets the C40 apart from other similar city-networks and raises the question of how it has been able to achieve coordination and collective effort. The chapter asserts that such voluntary coordination is only possible through the formation of a collective identity and draws on ideas from the scholarship on social fields, social constructivism, and social movements to develop a theory of global urban governance fields that explains when, how, and why the C40 has managed to generate convergence around a set of governance norms and a shared governance identity.