In urban and peri-urban areas across the Global South, politicians, planners and developers are engaged in a voracious scramble to refashion land for global real estate investment and to transfer state power to private sector actors. Much of this development has taken place on the outskirts of the traditional metropoles, in the territorially flexible urban frontier. At the forefront of these processes in India is Gurgaon, a privately developed metropolis on the south-western hinterlands of New Delhi that has long been touted as India's flagship neoliberal city. Subaltern Frontiers tells the story of India's remarkable urban transformation by examining the politics of land and labour that have shaped Gurgaon. It shows how the country's flagship post-liberalisation urban project has been shaped and filtered through agrarian and subaltern histories, logics and subjects. In doing so, the book explores how the production of globalised property and labour in contemporary urban India is mediated by colonial instruments of land governance, living histories of uneven agrarian development, material geographies of labour migration, and the worldly aspirations of peasant-agriculturalists.
The crystal structure of baricitinib has been solved and refined using synchrotron X-ray powder diffraction data and optimized using density functional techniques. Baricitinib crystallizes in space group I2/a (#15) with a = 11.81128(11), b = 7.06724(6), c = 42.5293(3) Å, β = 91.9280(4)°, V = 3548.05(5) Å³, and Z = 8. The crystal structure is characterized by hydrogen-bonded double layers parallel to the ab-planes. The dimers form a graph set R₂²(8). The sulfone ends of the molecules reside in the interlayer regions. The powder pattern has been submitted to ICDD for inclusion in the Powder Diffraction File™ (PDF®).
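The reported cell volume can be cross-checked against the lattice parameters, since for a monoclinic cell V = a·b·c·sin β (α = γ = 90°). A quick sketch using the values quoted above:

```python
import math

# Lattice parameters for baricitinib as reported in the abstract
# (lengths in angstroms, angle in degrees; esd digits dropped).
a, b, c = 11.81128, 7.06724, 42.5293
beta_deg = 91.9280

# Monoclinic cell volume: V = a * b * c * sin(beta).
V = a * b * c * math.sin(math.radians(beta_deg))
print(f"V = {V:.2f} A^3")  # agrees with the reported 3548.05 A^3
```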
Patients with Duchenne muscular dystrophy have multiple risk factors for lower extremity oedema. This study sought to define the frequency and predictors of oedema. Patients aged 15 years and older were screened by patient questionnaire, and the presence of oedema was confirmed by subsequent physical exam. Twenty-four of 52 patients (46%) had oedema, 12 of whom had swelling extending above the foot and two with sores/skin breakdown. There was no significant difference in age, frequency, or duration of glucocorticoid use, non-invasive respiratory support use, forced vital capacity, cardiac medication use, or ejection fraction between patients with and without oedema (all p > 0.2). Those with oedema had a greater time since the loss of ambulation (8.4 years versus 3.5 years; p = 0.004), higher body mass index (28.3 versus 24.8; p = 0.014), and lower frequency of deflazacort use (67% versus 89%; p = 0.008). Multivariate analysis revealed a longer duration of loss of ambulation (p = 0.02) and higher body mass index (p = 0.009) as predictors of oedema. Lower extremity oedema is common in Duchenne muscular dystrophy but independent of cardiac function. Minimising increases in body mass index over time may be a useful therapeutic target.
Mounting evidence suggests that the first few months of life are critical for the development of obesity. The relationships between the timing of solid food introduction and the risk of childhood obesity have been examined previously; however, evidence for the association with the timing of infant formula introduction remains scarce. This study aimed to examine whether the timing of infant formula introduction is associated with growth z-scores and overweight at ages 1 and 3 years. This study included 5733 full-term (≥ 37 gestational weeks) and normal birth weight (≥ 2500 and < 4000 g) children in the Born in Guangzhou Cohort Study, a prospective cohort study with data collected at 6 weeks, 6, 12 and 36 months. Compared with infant formula introduction at 0–3 months, introduction at 4–6 months was associated with lower BMI, weight-for-age and weight-for-length z-scores at 1 and 3 years of age. Introduction at 4–6 months was also associated with lower odds of being at risk of overweight at age 1 (adjusted OR 0·72, 95 % CI 0·55, 0·94) and 3 years (adjusted OR 0·50, 95 % CI 0·30, 0·85), and with lower odds of overweight at age 1 year (adjusted OR 0·42, 95 % CI 0·21, 0·84) but not at age 3 years. Based on our findings, compared with introduction within the first 3 months, introduction at 4–6 months is associated with a lower risk of later high BMI and of being at risk of overweight. However, these results need to be replicated in other well-designed studies before firmer recommendations can be made.
In 2013, the Danish Health Authorities recommended a change in prophylactic iron supplementation to 40–50 mg/d from gestational week 10. Hence, the aims of the present study were (1) to estimate the prevalence of women who follow the Danish recommendation on iron supplementation during the last 3 weeks of the first trimester of pregnancy and (2) to identify potential sociodemographic, reproductive and health-related pre-pregnancy predictors for iron supplementation during the first trimester. We conducted a cross-sectional study with data from the hospital-based Copenhagen Pregnancy Cohort. Characteristics were analysed by descriptive statistics, and multivariable logistic regression analysis was performed to examine the associations between predictors and iron supplementation during the last 3 weeks of the first trimester. The study population consisted of 23 533 pregnant women attending antenatal care at Copenhagen University Hospital - Rigshospitalet from October 2013 to May 2019. The prevalence of iron supplementation according to recommendations was 49⋅1 %. Age ≥40 years, educational level below a higher degree, and a vegetarian or vegan diet were identified as pre-pregnancy predictors of iron supplementation during the first trimester of pregnancy. Approximately half of the women were supplemented with the recommended dose of iron during the first trimester of pregnancy. We identified pre-pregnancy predictors associated with iron supplementation. Interventions that target women of reproductive age are needed. An enhanced focus on iron supplementation during pregnancy should be incorporated in pre-pregnancy and interpregnancy counselling.
Case-only longitudinal studies are common in psychiatry, and it is generally assumed that psychiatric ratings and questionnaire results of healthy controls stay stable over foreseeable time ranges. For cognitive tests, improvements over time are expected, but data for more than two administrations are scarce.
Aims
We comprehensively investigated the longitudinal course for trends over time in cognitive and symptom measurements for severe mental disorders. Assessments included the Trail Making Tests, verbal Digit Span tests, Global Assessment of Functioning, Inventory of Depressive Symptomatology, the Positive and Negative Syndrome Scale, and the Young Mania Rating Scale, among others.
Method
Using the data of control individuals (n = 326) from the PsyCourse study who had up to four assessments over 18 months, we modelled the course using linear mixed models or logistic regression. The slopes or odds ratios were estimated and adjusted for age and gender. We also assessed the robustness of these results using a longitudinal non-parametric test in a sensitivity analysis.
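As a rough illustration of the fixed-effects part of such a model (not the actual PsyCourse analysis; the data, coefficients, and noise level below are simulated), a time slope adjusted for age and gender can be estimated with ordinary least squares, where a full analysis would additionally include per-subject random effects:

```python
import numpy as np

# Simulated sketch: 326 controls, up to 4 visits, with a small practice
# effect (positive time slope) plus age and gender effects on the score.
rng = np.random.default_rng(0)

n_subj, n_visits = 326, 4
time = np.tile(np.arange(n_visits), n_subj)              # visit index 0..3
age = np.repeat(rng.uniform(18.0, 65.0, n_subj), n_visits)
gender = np.repeat(rng.integers(0, 2, n_subj), n_visits)

true_slope = 0.5                                          # improvement per visit
score = (50.0 + true_slope * time - 0.1 * age + 1.0 * gender
         + rng.normal(0.0, 2.0, n_subj * n_visits))

# Design matrix: intercept, time, age, gender.
X = np.column_stack([np.ones_like(time, dtype=float), time, age, gender])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"estimated time slope: {beta[1]:.2f}")             # recovers ~0.5
```

The estimated time coefficient is what a "small effect indicating performance improvement over time" corresponds to in this framing.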
Results
Small effects were detected for most cognitive tests, indicating a performance improvement over time (P < 0.05). However, for most of the symptom rating scales and questionnaires, no effects were detected, in line with our initial hypothesis.
Conclusions
The slight but consistent improvement in performance on the cognitive tests suggests a test-unspecific positive trend, while psychiatric ratings and questionnaire results remain stable over the observed period. These detectable improvements need to be considered when interpreting longitudinal courses. We therefore recommend recruiting control participants if cognitive tests are administered.
Limited data exist on training of European paediatric and adult congenital cardiologists.
Methods:
A structured and approved questionnaire was circulated to national delegates of the Association for European Paediatric and Congenital Cardiology in 33 European countries.
Results:
Delegates from 30 countries (91%) responded. Paediatric cardiology was not recognised as a distinct speciality by the respective Ministry of Health in seven countries (23%). Twenty countries (67%) have formally accredited paediatric cardiology training programmes, seven (23%) have substantial informal (not accredited or certified) training, and three (10%) have very limited or no programme. Twenty-two countries have a curriculum. Twelve countries have a national training director. There was one paediatric cardiology centre per 2.66 million population (range 0.87–9.64 million), one cardiac surgical centre per 4.73 million population (range 1.63–10.72 million), and one training centre per 4.29 million population (range 1.63–10.72 million population). The median number of paediatric cardiology fellows per training programme was 4 (range 1–17), and duration of training was 3 years (range 2–5 years). An exit examination in paediatric cardiology was conducted in 16 countries (53%) and certification provided by 20 countries (67%). The number of paediatric cardiologists is associated with gross domestic product (R² = 0.41).
Conclusion:
Training varies markedly across European countries. Although formal fellowship programmes exist in many countries, several countries have informal training or no training. Only a minority of countries provide both an exit examination and certification. Harmonisation of training and standardisation of exit examination and certification could reduce variation in training, thereby promoting high-quality care by European congenital cardiologists.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
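The leave-site-out scheme can be sketched with closed-form ridge regression on simulated data. Everything below (feature counts, site structure, regularisation strength, signal-to-noise) is an illustrative assumption, not the ConLi+Gen setup:

```python
import numpy as np

# Simulated leave-site-out cross-validation: each fold holds out all
# patients from one study site, so performance reflects generalisation
# to an unseen site rather than to unseen patients from known sites.
rng = np.random.default_rng(42)

n, p, n_sites = 692, 10, 10
X = rng.normal(size=(n, p))                  # stand-in clinical + PRS predictors
w = rng.normal(size=p)                       # hypothetical true effects
y = X @ w + rng.normal(0.0, 1.0, n)          # stand-in lithium response score
site = rng.integers(0, n_sites, n)           # study site of each patient

lam = 1.0                                    # ridge penalty (assumed)
r2_per_site = []
for s in range(n_sites):
    train, test = site != s, site == s
    Xtr, ytr = X[train], y[train]
    # Closed-form ridge solution: (X'X + lam*I)^{-1} X'y
    beta = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(p), Xtr.T @ ytr)
    pred = X[test] @ beta
    ss_res = np.sum((y[test] - pred) ** 2)
    ss_tot = np.sum((y[test] - y[test].mean()) ** 2)
    r2_per_site.append(1.0 - ss_res / ss_tot)

print(f"mean leave-site-out R^2: {np.mean(r2_per_site):.2f}")
```

In practice, a library implementation such as scikit-learn's `LeaveOneGroupOut` splitter with site as the group label achieves the same fold structure.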
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
Beginning in the late 1970s, China’s economy produced the largest growth spurt in recorded history. This striking departure from the economic experience of the previous 200 years encourages onlookers to view recent economic success as a “miracle” that requires neither economic nor historical explanation. Such thinking ignores common elements that have shaped China’s long-term economic trajectory: forces propelling spurts of innovation and growth, restrictions that often impede these dynamic forces, and enduring features of China’s polity that generate tensions between centralized authoritarian power and economic growth. Neglect of these historical legacies invites misconceptions about the current boom’s origin and the economy’s likely future path. History and economics figure prominently in our analysis of both.
There is limited information on the volume of antibiotic prescribing that is influenza-associated, resulting from influenza infections and their complications (such as streptococcal pharyngitis and otitis media). Here, we estimated age/diagnosis-specific proportions of antibiotic prescriptions (fills) for the Kaiser Permanente Northern California population during 2010–2018 that were influenza-associated. The proportion of influenza-associated antibiotic prescribing among all antibiotic prescribing was higher in children aged 5–17 years compared to children aged under 5 years, ranging from 1.4% [95% CI (0.7–2.1)] in those aged <1 year to 2.7% (1.9–3.4) in those aged 15–17 years. For adults aged over 20 years, the proportion of influenza-associated antibiotic prescribing among all antibiotic prescribing was lower, ranging from 0.7% (0.5–1) for those aged 25–29 years to 1.6% (1.2–1.9) for those aged 60–64 years. Most of the influenza-associated antibiotic prescribing in children aged under 10 years was for ear infections, while for age groups over 25 years, 45–84% of influenza-associated antibiotic prescribing was for respiratory diagnoses without a bacterial indication. This suggests a modest benefit of increasing influenza vaccination coverage for reducing antibiotic prescribing, as well as the potential benefit of other measures to reduce unnecessary antibiotic prescribing for respiratory diagnoses with no bacterial indication in persons aged over 25 years, both of which may further contribute to the mitigation of antimicrobial resistance.
To date, besides genome-wide association studies, a variety of other genetic analyses (e.g. polygenic risk scores, whole-exome sequencing and whole-genome sequencing) have been conducted, and a large amount of data has been gathered for investigating the involvement of common, rare and very rare types of DNA sequence variants in bipolar disorder. Also, non-invasive neuroimaging methods can be used to quantify changes in brain structure and function in patients with bipolar disorder.
Aims
To provide a comprehensive assessment of genetic findings associated with bipolar disorder, based on the evaluation of different genomic approaches and neuroimaging studies.
Method
We conducted a PubMed search of all relevant literature from inception to the present, using related search strings.
Results
Across the studies investigated, ANK3, CACNA1C, SYNE1, ODZ4 and TRANK1 have been replicated as key candidate genes in bipolar disorder pathophysiology. The percentage of phenotypic variance explained by the identified variants is small (approximately 4.7%). Bipolar disorder polygenic risk scores are associated with other psychiatric phenotypes. The ENIGMA-BD studies show a replicable pattern of lower cortical thickness, altered white matter integrity and smaller subcortical volumes in bipolar disorder.
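A polygenic risk score of the kind referred to here is, at its core, a weighted sum of risk-allele dosages, with effect sizes from GWAS summary statistics as the weights. A minimal sketch with entirely made-up numbers:

```python
import numpy as np

# Toy PRS computation: dosages are 0/1/2 counts of the risk allele at
# each variant; effect sizes stand in for GWAS log-odds weights.
# All values here are simulated for illustration only.
rng = np.random.default_rng(1)

n_people, n_snps = 5, 1000
dosages = rng.integers(0, 3, size=(n_people, n_snps))   # 0/1/2 risk alleles
effect_sizes = rng.normal(0.0, 0.01, n_snps)            # per-SNP weights

prs = dosages @ effect_sizes                            # one score per person
# Scores are typically standardised before use in association models.
prs_z = (prs - prs.mean()) / prs.std()
print(np.round(prs_z, 2))
```

Real pipelines additionally handle quality control, linkage-disequilibrium clumping, and p-value thresholding before this weighted sum is taken.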
Conclusions
The small proportion of phenotypic variance explained highlights the need for further large-scale investigations, especially among non-European populations, to achieve a more complete understanding of the genetic architecture of bipolar disorder and the missing heritability. Combining neuroimaging data with genetic data in large-scale studies may give researchers a better understanding of the brain regions involved in bipolar disorder.
Fibricola and Neodiplostomum are diplostomid genera with very similar morphology that are currently separated based on their definitive hosts. Fibricola spp. are normally found in mammals, while Neodiplostomum spp. typically parasitize birds. Previously, no DNA sequence data were available for any member of Fibricola. We generated nuclear ribosomal and mtDNA sequences of Fibricola cratera (type-species), Fibricola lucidum and 6 species of Neodiplostomum. DNA sequences were used to examine phylogenetic interrelationships among Fibricola and Neodiplostomum and re-evaluate their systematics. Molecular phylogenies and morphological study suggest that Fibricola should be considered a junior synonym of Neodiplostomum. Therefore, we synonymize the two genera and transfer all members of Fibricola into Neodiplostomum. Specimens morphologically identified as Neodiplostomum cratera belonged to 3 distinct phylogenetic clades based on mitochondrial data. One of those clades also included sequences of specimens identified morphologically as Neodiplostomum lucidum. Further study is necessary to resolve the situation regarding the morphology of N. cratera. Our results demonstrated that some DNA sequences of N. americanum available in GenBank originate from misidentified Neodiplostomum banghami. Molecular phylogenetic data revealed at least 2 independent host-switching events between avian and mammalian hosts in the evolutionary history of Neodiplostomum; however, the directionality of these host-switching events remains unclear.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; mean age = 23.74 years, range = 18–30) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
The last 50 years have seen an increasing dependence on academic institutions to develop and commercialize new biomedical innovations, a responsibility for which many universities are ill-equipped. To address this need, we created LEAP, an asset development and gap fund program at Washington University in St. Louis (WUSTL). Beyond awarding funds to promising projects, this program aimed to promote a culture of academic entrepreneurship, and thus improve WUSTL technology transfer, by providing university inventors with individualized consulting and industry expert feedback. The purpose of this work is to document the structure of the LEAP program and evaluate its impact on the WUSTL entrepreneurial ecosystem. Our analysis utilizes program data, participant surveys, and WUSTL technology transfer office records to demonstrate that LEAP consistently attracted new investigators and that the training provided by the program was both impactful and highly valued by participants. We also show an increase in annual WUSTL start-up formation in the years after LEAP was established and implicate the program in this increase. Taken together, our results illustrate that programs like LEAP could serve as a model for other institutions that seek to support academic entrepreneurship initiatives.
As clinical trials were rapidly initiated in response to the COVID-19 pandemic, Data and Safety Monitoring Boards (DSMBs) faced unique challenges overseeing trials of therapies never tested in a disease not yet characterized. Traditionally, individual DSMBs do not interact or have the benefit of seeing data from other accruing trials for an aggregated analysis to meaningfully interpret safety signals of similar therapeutics. In response, we developed a compliant DSMB Coordination (DSMBc) framework to allow the DSMB from one study investigating the use of SARS-CoV-2 convalescent plasma to treat COVID-19 to review data from similar ongoing studies for the purpose of safety monitoring.
Methods:
The DSMBc process included engagement of DSMB chairs and board members, execution of contractual agreements, secure data acquisition, generation of harmonized reports utilizing statistical graphics, and secure report sharing with DSMB members. Detailed process maps, a secure portal for managing DSMB reports, and templates for data sharing and confidentiality agreements were developed.
Results:
Four trials participated. Data from one trial were successfully harmonized with those of an ongoing trial. Harmonized reports allowing for visualization and drill-down into the data were presented to the ongoing trial's DSMB. While DSMB deliberations are confidential, the Chair confirmed successful review of the harmonized report.
Conclusion:
It is feasible to coordinate DSMB reviews of multiple independent studies of a similar therapeutic in similar patient cohorts. The materials presented mitigate challenges to DSMBc and will help expand these initiatives so DSMBs may make more informed decisions with all available information.
Back pain is one of the largest drivers of workplace injury and lost productivity in industries around the world. Back injuries are among the leading causes of days away from work, accounting for 38.5% across all occupations and rising to 43% for manual laborers. While the cause of back pain can vary across occupations, for materiel movers it is often caused by repetitive poor lifting. To reduce these injuries, the Aerial Porter Exoskeleton (APEx) was created. The APEx uses a hip-mounted, powered exoskeleton attached to an adjustable vest. An onboard computer estimates the configuration of the user to determine when to activate. Lift form is assisted by a novel lumbar brace mounted on the sides of the hips. Properly worn, the APEx holds the user upright while providing additional hip torque through a lift. This was tested by having participants complete a lifting test with the exoskeleton worn in the ‘on’ configuration compared with the exoskeleton not worn. The APEx has been shown to deliver 30 Nm of torque in lab testing. The activity recognition algorithm has also been shown to be accurate in 95% of tested conditions. When worn by subjects, testing has shown average peak reductions of 14.9% in heart rate, 8% in VO₂ consumption, and an 8% reduction in perceived effort, favoring the APEx.
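The abstract does not describe the activity-recognition logic itself, but a posture-threshold rule of the following shape (entirely hypothetical; signal names and threshold values are invented for illustration) conveys the idea of activating assistance only when the user's configuration looks like the start of a lift:

```python
# Hypothetical sketch of lift detection for a hip exoskeleton. The real
# APEx algorithm is not published in this abstract; the sensor signals
# and thresholds below are assumptions made for illustration only.
def should_assist(hip_flexion_deg: float, trunk_pitch_deg: float) -> bool:
    """Activate hip-torque assistance when posture resembles the start
    of a lift: deep hip flexion combined with a forward-leaning trunk."""
    return hip_flexion_deg > 60.0 and trunk_pitch_deg > 30.0

# A squat-lift posture triggers assistance; upright standing does not.
print(should_assist(85.0, 45.0))   # lifting posture -> assist
print(should_assist(10.0, 5.0))    # standing upright -> no assist
```

A production controller would smooth the sensor streams and add hysteresis so assistance does not chatter on and off near the thresholds.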
The CLEAR Trial recently found that decolonization reduced infections and hospitalizations in MRSA carriers in the year following hospital discharge. In this secondary analysis, we explored whether decolonization had a similar benefit in the subgroup of trial participants who harbored USA300, using two different definitions for the USA300 strain-type.
Whole-genome sequencing (WGS) shotgun metagenomics (metagenomics) attempts to sequence the entire genetic content straight from the sample. Diagnostic advantages lie in the ability to detect unsuspected, uncultivatable, or very slow-growing organisms.
Objective:
To evaluate the clinical and economic effects of using WGS and metagenomics for outbreak management in a large metropolitan hospital.
Design:
Cost-effectiveness study.
Setting:
Intensive care unit and burn unit of large metropolitan hospital.
Patients:
Simulated intensive care unit and burn unit patients.
Methods:
We built a complex simulation model to estimate pathogen transmission, associated hospital costs, and quality-adjusted life years (QALYs) during a 32-month outbreak of carbapenem-resistant Acinetobacter baumannii (CRAB). Model parameters were determined using microbiology surveillance data, genome sequencing results, hospital admission databases, and local clinical knowledge. The model was calibrated to the actual pathogen spread within the intensive care unit and burn unit (scenario 1) and compared with early use of WGS (scenario 2) and early use of WGS and metagenomics (scenario 3) to determine their respective cost-effectiveness. Sensitivity analyses were performed to address model uncertainty.
Results:
On average compared with scenario 1, scenario 2 resulted in 14 fewer patients with CRAB, 59 additional QALYs, and $75,099 cost savings. Scenario 3, compared with scenario 1, resulted in 18 fewer patients with CRAB, 74 additional QALYs, and $93,822 in hospital cost savings. The likelihoods that scenario 2 and scenario 3 were cost-effective were 57% and 60%, respectively.
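These scenario comparisons can be summarised as a net monetary benefit (NMB = λ·ΔQALY − ΔCost). The willingness-to-pay threshold λ below is our assumption; the incremental QALYs and cost savings are the figures reported above:

```python
# Net-monetary-benefit check on the reported scenario comparisons.
# The $50,000-per-QALY willingness-to-pay threshold is an assumption;
# QALY gains and cost savings come from the results above. Savings
# enter as negative incremental cost, so a positive NMB means the
# scenario is cost-effective relative to scenario 1.
wtp = 50_000  # $ per QALY (assumed)

scenarios = {
    "scenario 2 (early WGS)":               {"dqaly": 59, "dcost": -75_099},
    "scenario 3 (early WGS + metagenomics)": {"dqaly": 74, "dcost": -93_822},
}

nmb = {name: wtp * s["dqaly"] - s["dcost"] for name, s in scenarios.items()}
for name, value in nmb.items():
    print(f"{name}: NMB = ${value:,.0f}")
```

Both scenarios gain QALYs while saving money, i.e. they dominate scenario 1, so the NMB is positive at any non-negative threshold.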
Conclusions:
The use of WGS and metagenomics in infection control processes were predicted to produce favorable economic and clinical outcomes.
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes of 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox's range), a smaller tally than for cats (343 species, including 297 within the fox's Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.