The Personalized Advantage Index (PAI) shows promise as a method for identifying the most effective treatment for individual patients. Previous studies have demonstrated its utility in retrospective evaluations across various settings. In this study, we explored the effect of different methodological choices in predictive modelling underlying the PAI.
Methods
Our approach involved a two-step procedure. First, we conducted a review of prior studies utilizing the PAI, evaluating each study using the Prediction model Risk Of Bias Assessment Tool (PROBAST). We specifically assessed whether the studies adhered to two standards of predictive modeling: refraining from using leave-one-out cross-validation (LOO CV) and preventing data leakage. Second, we examined the impact of deviating from these methodological standards in real data, employing both a traditional approach that violates these standards and an advanced approach that implements them in two large-scale datasets, PANIC-net (n = 261) and Protect-AD (n = 614).
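The leakage pitfall targeted by this review is easy to reproduce on synthetic data. The sketch below is illustrative only: pure-noise data, a toy nearest-centroid classifier, and 5-fold CV stand in for the study's actual pipeline. It shows how selecting predictors on the full dataset before cross-validation inflates apparent accuracy, whereas selecting them inside each training fold does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 60, 1000, 10          # samples, noise features, features kept
X = rng.normal(size=(n, p))     # pure noise: true accuracy is 50%
y = rng.integers(0, 2, size=n)

def top_k(Xtr, ytr):
    # pick the k features most correlated with the labels
    corr = np.abs(np.corrcoef(Xtr.T, ytr)[-1, :-1])
    return np.argsort(corr)[-k:]

def nearest_centroid_acc(Xtr, ytr, Xte, yte):
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    pred = (np.linalg.norm(Xte - c1, axis=1) <
            np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return (pred == yte).mean()

folds = np.array_split(rng.permutation(n), 5)
leaky, clean = [], []
sel_all = top_k(X, y)                     # leakage: selection sees all labels
for te in folds:
    tr = np.setdiff1d(np.arange(n), te)
    leaky.append(nearest_centroid_acc(X[np.ix_(tr, sel_all)], y[tr],
                                      X[np.ix_(te, sel_all)], y[te]))
    sel = top_k(X[tr], y[tr])             # correct: selection inside the fold
    clean.append(nearest_centroid_acc(X[np.ix_(tr, sel)], y[tr],
                                      X[np.ix_(te, sel)], y[te]))
print("leaky CV accuracy:", round(float(np.mean(leaky)), 2))  # well above chance
print("clean CV accuracy:", round(float(np.mean(clean)), 2))  # near chance
```

On pure noise both estimates should hover at 0.5; the leaky variant instead reports a sizeable "effect" because the test labels already influenced feature selection.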
Results
The PROBAST rating revealed a substantial risk of bias across studies, primarily due to inappropriate methodological choices. Most studies did not adhere to the examined prediction modeling standards, employing LOO CV and allowing data leakage. The comparison of the traditional and advanced approaches revealed that ignoring these standards can systematically overestimate the utility of the PAI.
Conclusion
Our study cautions that violating standards in predictive modeling may strongly influence the evaluation of the PAI's utility, possibly leading to false positive results. To support an unbiased evaluation, crucial for potential clinical application, we provide a low-bias, openly accessible, and meticulously annotated script implementing the PAI.
Summary: The aging of the population poses significant challenges in healthcare, necessitating innovative approaches. Advancements in brain imaging and artificial intelligence now allow for characterizing an individual’s state through their ‘brain age’, derived from observable brain features. Exploring an individual’s ‘biological age’ rather than chronological age is becoming crucial to identify relevant clinical indicators and refine risk models for age-related diseases. However, traditional brain age measurement has limitations, focusing solely on brain structure while neglecting functional efficiency.
Our study focuses on developing ‘neurocognitive ages’ specific to cognitive systems to enhance the precision of decline estimation. Leveraging international (NKI2, ADNI) and Canadian (CIMA-Q, COMPASS-ND) databases with neuroimaging and neuropsychological data from older adults [control subjects with no cognitive impairment (CON): n = 1811; people living with mild cognitive impairment (MCI): n = 1341; with Alzheimer’s disease (AD): n = 513], we predicted individual brain ages within groups. These estimations were enriched with neuropsychological data to generate specific neurocognitive ages. We used longitudinal statistical models to map evolutionary trajectories. Comparing the accuracy of neurocognitive ages to traditional brain ages involved statistical learning techniques and precision measures.
The results demonstrated that neurocognitive age enhances the prediction of individual brain and cognition change trajectories related to aging and dementia. This promising approach could strengthen diagnostic reliability, facilitate early detection of at-risk profiles, and contribute to the emergence of precision gerontology/geriatrics.
SCN2A encodes a voltage-gated sodium channel (designated NaV1.2) vital for generating neuronal action potentials. Pathogenic SCN2A variants are associated with a diverse array of neurodevelopmental disorders featuring neonatal or infantile onset epilepsy, developmental delay, autism, intellectual disability and movement disorders. SCN2A is a high confidence risk gene for autism spectrum disorder and a commonly discovered cause of neonatal onset epilepsy. This remarkable clinical heterogeneity is mirrored by extensive allelic heterogeneity and complex genotype-phenotype relationships partially explained by divergent functional consequences of pathogenic variants. Emerging therapeutic strategies targeted to specific patterns of NaV1.2 dysfunction offer hope of improving the lives of individuals affected by SCN2A-related disorders. This Element provides a review of the clinical features, genetic basis, pathophysiology, pharmacology and treatment of these genetic conditions authored by leading experts in the field and accompanied by perspectives shared by affected families. This title is also available as Open Access on Cambridge Core.
Chrono-medicine considers circadian biology in disease management, including combined lifestyle and medicine interventions. Exercise and nutritional interventions are well-known for their efficacy in managing type 2 diabetes, and metformin remains a widely used pharmacological agent. However, metformin may reduce exercise capacity and interfere with skeletal muscle adaptations, creating barriers to exercise adherence. Research into optimising the timing of exercise has shown promise, particularly for glycaemic management in people with type 2 diabetes. Aligning exercise timing with circadian rhythms and nutritional intake may maximise benefits. Nutritional timing also plays a crucial role in glycaemic control. Recent research suggests that not only what we eat but when we eat significantly impacts glycaemic control, with strategies like time-restricted feeding (TRF) showing promise in reducing caloric intake, improving glycaemic regulation and enhancing overall metabolic health. These findings suggest that meal timing could be an important adjunct to traditional dietary and exercise approaches in managing diabetes and related metabolic disorders. When taking a holistic view of diabetes management and the diurnal environment, one must also consider the circadian biology of medicines. Metformin has a circadian profile in plasma, and our recent study suggests that morning exercise combined with pre-breakfast metformin intake reduces glycaemia more effectively than post-breakfast intake. In this review, we aim to explore the integration of circadian biology into type 2 diabetes management by examining the timing of exercise, nutrition and medication. In conclusion, chrono-medicine offers a promising, cost-effective strategy for managing type 2 diabetes.
Integrating precision timing of exercise, nutrition and medication into treatment plans requires considering the entire diurnal environment, including lifestyle and occupational factors, to develop comprehensive, evidence-based healthcare strategies.
Psychiatric research applies statistical methods that can be divided into two frameworks: causal inference and prediction. Recent proposals suggest a down-prioritisation of causal inference and argue that prediction paves the road to ‘precision psychiatry’ (i.e., individualised treatment). In this perspective, we critically appraise these proposals.
Methods:
We outline strengths and weaknesses of causal inference and prediction frameworks and describe the link between clinical decision-making and counterfactual predictions (i.e., causality). We describe three key causal structures that, if not handled correctly, may cause erroneous interpretations, and three pitfalls in prediction research.
Results:
Prediction and causal inference are both needed in psychiatric research and their relative importance is context-dependent. When individualised treatment decisions are needed, causal inference is necessary.
Conclusion:
This perspective defends the importance of causal inference for precision psychiatry.
Patients with cystic fibrosis (CF) experience frequent episodes of acute decline in lung function called pulmonary exacerbations (PEx). An existing clinical and place-based precision medicine algorithm that accurately predicts PEx could include racial and ethnic biases in clinical and geospatial training data, leading to unintentional exacerbation of health inequities.
Methods:
We estimated receiver operating characteristic curves based on predictions from a nonstationary Gaussian stochastic process model for PEx within 3, 6, and 12 months among 26,392 individuals aged 6 years and above (2003–2017) from the US CF Foundation Patient Registry. We screened predictors to identify reasons for discriminatory model performance.
Results:
The precision medicine algorithm performed worse in predicting PEx among Black patients than among White patients or patients of another race across all three prediction horizons. There was little to no difference in prediction accuracy between Hispanic and non-Hispanic patients for the same horizons. Differences in F508del, smoking households, secondhand smoke exposure, primary and secondary road densities, distance and drive time to the CF center, and average number of clinical evaluations were key factors associated with race.
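Subgroup discrimination gaps like these are typically quantified by computing the AUC separately for each group. A minimal sketch, using synthetic risk scores and two hypothetical groups (not the registry data or the study's Gaussian-process model), with AUC computed via the rank-based Mann-Whitney formulation:

```python
import numpy as np

def auc(scores, labels):
    """Rank-based AUC: fraction of (positive, negative) pairs ranked correctly,
    counting ties as half. Equivalent to the Mann-Whitney U statistic scaled
    by the number of pairs."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
# hypothetical scores: the model separates events from non-events
# more strongly in group_A than in group_B
groups = {
    "group_A": (rng.normal(1.2, 1, 200), rng.normal(0, 1, 200)),
    "group_B": (rng.normal(0.4, 1, 200), rng.normal(0, 1, 200)),
}
for name, (pos, neg) in groups.items():
    scores = np.concatenate([pos, neg])
    labels = np.concatenate([np.ones(200), np.zeros(200)])
    print(name, "AUC =", round(float(auc(scores, labels)), 2))
```

Comparing per-group AUCs in this way is what reveals that a single pooled AUC can hide systematically worse discrimination for a subgroup.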
Conclusions:
Racial differences in prediction accuracies from our PEx precision medicine algorithm exist. Misclassification of future PEx was attributable to several underlying factors that correspond to race: CF mutation, location where the patient lives, and clinical awareness. Associations of our proxies with race for CF-related health outcomes can lead to systemic racism in data collection and in prediction accuracies from precision medicine algorithms constructed from it.
Precision medicine is an emergent medical paradigm that uses information technology to inform the use of targeted therapies and treatments. One of the first steps of precision medicine involves acquiring the patient’s informed consent to protect their rights to autonomous medical decision-making. In pediatrics, there exist mixed recommendations and guidelines for consent-related practices designed to safeguard pediatric patient interests while protecting their autonomy. Here, we provide a high-level, clinical primer of (1) ethical informed consent frameworks widely used in clinical practice and (2) promising modern adaptations to improve informed consent practices in pediatric precision medicine. Given the rapid scientific advances and adoption of precision medicine, we highlight the dual need to both consider the clinical implementation of consent in pediatric precision medicine workflows as well as build rapport with pediatric patients and their substitute decision-makers working alongside interdisciplinary health teams.
Less than a third of patients with depression achieve successful remission with standard first-step antidepressant monotherapy. The process for determining appropriate second-step care is often based on clinical intuition and involves a protracted course of trial and error, resulting in substantial patient burden and unnecessary delay in the provision of optimal treatment. To address this problem, we adopt an ensemble machine learning approach to improve prediction accuracy of remission in response to second-step treatments.
Method
Data were derived from the Level 2 stage of the STAR*D dataset, which included 1439 patients who were randomized into one of seven different second-step treatment strategies after failing to achieve remission during first-step antidepressant treatment. Ensemble machine learning models, comprising several individual algorithms, were evaluated using nested cross-validation on 155 predictor variables including clinical and demographic measures.
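Nested cross-validation keeps model selection (the inner loop) strictly separate from performance estimation (the outer loop). A minimal sketch of the structure, substituting a toy ridge-regression model and synthetic data for the study's ensemble and the STAR*D predictors:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 120, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = 1.0                              # 5 informative predictors
y = X @ beta + rng.normal(scale=2.0, size=n)

def ridge_fit(X, y, alpha):
    # closed-form ridge solution: (X'X + alpha*I)^-1 X'y
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

alphas = [0.01, 0.1, 1.0, 10.0, 100.0]
outer = np.array_split(rng.permutation(n), 5)
outer_mse = []
for te in outer:
    tr = np.setdiff1d(np.arange(n), te)
    # inner loop: choose alpha using only the outer-training data
    inner = np.array_split(rng.permutation(tr), 4)
    def inner_score(a):
        errs = []
        for ite in inner:
            itr = np.setdiff1d(tr, ite)
            errs.append(mse(X[ite], y[ite], ridge_fit(X[itr], y[itr], a)))
        return np.mean(errs)
    best = min(alphas, key=inner_score)
    w = ridge_fit(X[tr], y[tr], best)       # refit on full outer-training set
    outer_mse.append(mse(X[te], y[te], w))  # evaluate on untouched outer fold
print("nested-CV test MSE:", round(float(np.mean(outer_mse)), 2))
```

Because each outer test fold never influences the choice of hyperparameter, the averaged outer-fold error is an approximately unbiased estimate of generalization performance, which is the property the study relies on.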
Results
The ensemble machine learning algorithms exhibited differential classification performance in predicting remission status across the seven second-step treatments. For the full set of predictors, AUC values ranged from 0.51 to 0.82 depending on the second-step treatment type. Predicting remission was most successful for cognitive therapy (AUC = 0.82) and least successful for other medication and combined treatment options (AUCs = 0.51–0.66).
Conclusion
Ensemble machine learning has the potential to predict response to second-step treatment. In this study, predictive performance varied by type of treatment, with greater accuracy in predicting remission in response to behavioral treatments than to pharmacotherapy interventions. Future directions include considering more informative predictor modalities to enhance prediction of second-step treatment response.
The personalised oncology paradigm remains challenging to deliver despite technological advances in genomics-based identification of actionable variants combined with the increasing focus of drug development on these specific targets. To ensure we continue to build concerted momentum to improve outcomes across all cancer types, financial, technological and operational barriers need to be addressed. For example, complete integration and certification of the ‘molecular tumour board’ into ‘standard of care’ ensures a unified clinical decision pathway that both counteracts fragmentation and is the cornerstone of evidence-based delivery inside and outside of a research setting. Generally, integrated delivery has been restricted to specific (common) cancer types either within major cancer centres or small regional networks. Here, we focus on solutions in real-world integration of genomics, pathology, surgery, oncological treatments, data from clinical source systems and analysis of whole-body imaging as digital data that can facilitate cost-effectiveness analysis, clinical trial recruitment, and outcome assessment. This urgent imperative for cancer also extends across the early diagnosis and adjuvant treatment interventions, individualised cancer vaccines, immune cell therapies, personalised synthetic lethal therapeutics and cancer screening and prevention. Oncology care systems worldwide require proactive step-changes in solutions that include inter-operative digital working that can solve patient centred challenges to ensure inclusive, quality, sustainable, fair and cost-effective adoption and efficient delivery. Here we highlight workforce, technical, clinical, regulatory and economic challenges that prevent the implementation of precision oncology at scale, and offer a systematic roadmap of integrated solutions for standard of care based on minimal essential digital tools. 
These include unified decision support tools, quality control, data flows within an ethical and legal data framework, training and certification, monitoring and feedback. Bridging the technical, operational, regulatory and economic gaps demands the joint actions from public and industry stakeholders across national and global boundaries.
Precision medicine for cardiomyopathies holds great promise to improve patient outcomes and reduce costs by shifting the focus to patient-specific treatment decisions, maximising the use of therapies most likely to lead to benefit and minimising unnecessary intervention. Dilated cardiomyopathy (DCM), characterised by left ventricular dilatation and impairment, is a major cause of heart failure globally. Advances in genomic medicine have increased our understanding of the genetic architecture of DCM. Understanding the functional implications of genetic variation to reveal genotype-specific disease mechanisms is the subject of intense investigation, with advanced cardiac imaging and multiomics approaches playing important roles. This may lead to increasing use of novel, targeted therapy. Individualised treatment and risk stratification are, however, made more complex by the modifying effects of common genetic variation and acquired environmental factors that help explain the variable expressivity of rare genetic variants and gene elusive disease. The next frontier must be expanding work into early disease to understand the mechanisms that drive disease expression, so that the focus can be placed on disease prevention rather than management of later symptomatic disease. Overcoming these challenges holds the key to enabling a paradigm shift in care from the management of symptomatic heart failure to prevention of disease.
The purpose of this study is to evaluate the validity of the standard approach in expert judgment for evaluating precision medicines, in which experts are required to estimate outcomes as if they did not have access to diagnostic information, whereas in fact, they do.
Methods
Fourteen clinicians participated in an expert judgment task to estimate the cost and medical outcomes of the use of exome sequencing in pediatric patients with intractable epilepsy in Thailand. Experts were randomly assigned to either an “unblind” or “blind” group; the former was provided with the exome sequencing results for each patient case prior to the judgment task, whereas the latter was not provided with the exome sequencing results. Both groups were asked to estimate the outcomes for the counterfactual scenario, in which patients had not been tested by exome sequencing.
Results
Our study did not show significant results, possibly due to the small numbers of both participants and case studies.
Conclusions
A comparison of the unblind and blind approach did not show conclusive evidence that there is a difference in outcomes. However, until further evidence suggests otherwise, we recommend the blind approach as preferable when using expert judgment to evaluate precision medicines because this approach is more representative of the counterfactual scenario than the unblind approach.
Psychotropic medication efficacy and tolerability are critical treatment issues faced by individuals with psychiatric disorders and their healthcare providers. For some people, it can take months to years of a trial-and-error process to identify a medication with the ideal efficacy and tolerability profile. Current strategies (e.g. clinical practice guidelines, treatment algorithms) for addressing this issue can be useful at the population level, but often fall short at the individual level. This is, in part, attributed to interindividual variation in genes that are involved in pharmacokinetic (i.e. absorption, distribution, metabolism, elimination) and pharmacodynamic (e.g. receptors, signaling pathways) processes that in large part determine whether a medication will be efficacious or tolerable. A precision prescribing strategy known as pharmacogenomics (PGx) assesses these genomic variations and uses this information to inform the selection and dosing of certain psychotropic medications. In this review, we describe the path that led to the emergence of PGx in psychiatry, the current evidence base and implementation status of PGx in the psychiatric clinic, and finally, the future growth potential of precision psychiatry via the convergence of the PGx-guided strategy with emerging technologies and approaches (i.e. pharmacoepigenomics, pharmacomicrobiomics, pharmacotranscriptomics, pharmacoproteomics, pharmacometabolomics) to personalize treatment of psychiatric disorders.
Edited by
Xiuzhen Huang, Cedars-Sinai Medical Center, Los Angeles; Jason H. Moore, Cedars-Sinai Medical Center, Los Angeles; Yu Zhang, Trinity University, Texas
Pharmacogenomics is the study of genetic factors that influence drug response. Pharmacogenomics combines pharmacology and genomics to identify genetic predictors of variability in drug response that can be used to maximize drug efficacy while minimizing drug toxicity in order to tailor drug therapy for patients, thus improving patient care and reducing healthcare costs. In this chapter we review the field of pharmacogenomics in its current state and clinical practice. Recent research, methods, and resources for pharmacogenomics are reviewed in detail. We discuss the advantages and challenges in pharmacogenomic studies. We elaborate on the barriers to clinical translation of pharmacogenetic discoveries and the efforts of various institutions and consortia to mitigate these barriers. We also discuss applications and clinical translation of pharmacogenomic research moving forward, along with social, ethical, and economic issues that require attention. We conclude by previewing the use of big data, multi-omics data, advanced computing technology, and statistical methods by scientists across disciplinary boundaries along with the efforts of government organizations, clinicians, and patients that could lead to successful and clinically translatable pharmacogenomic discoveries, ushering in an era of precision medicine.
This article aims to explore the ethical issues arising from attempts to diversify genomic data and include individuals from underserved groups in studies exploring the relationship between genomics and health. We employed a qualitative synthesis design, combining data from three sources: 1) a rapid review of empirical articles published between 2000 and 2022 with a primary or secondary focus on diversifying genomic data, or the inclusion of underserved groups and ethical issues arising from this, 2) an expert workshop and 3) a narrative review. Using these three sources we found that ethical issues are interconnected across structural factors and research practices. Structural issues include failing to engage with the politics of knowledge production, existing inequities, and their effects on how harms and benefits of genomics are distributed. Issues related to research practices include a lack of reflexivity, exploitative dynamics and the failure to prioritise meaningful co-production. Ethical issues arise from both the structure and the practice of research, which can inhibit researcher and participant opportunities to diversify data in an ethical way. Diverse data are not ethical in and of themselves, and without being attentive to the social, historical and political contexts that shape the lives of potential participants, endeavours to diversify genomic data run the risk of worsening existing inequities. Efforts to construct more representative genomic datasets need to develop ethical approaches that are situated within wider attempts to make the enterprise of genomics more equitable.
Psychiatric disorders are associated with significant social and economic burdens, many of which are related to issues with current diagnosis and treatments. The coronavirus (COVID-19) pandemic is estimated to have increased the prevalence and burden of major depressive and anxiety disorders, indicating an urgent need to strengthen mental health systems globally. To date, current approaches adopted in drug discovery and development for psychiatric disorders have been relatively unsuccessful. Precision psychiatry aims to tailor healthcare more closely to the needs of individual patients and, when informed by neuroscience, can offer the opportunity to improve the accuracy of disease classification, treatment decisions, and prevention efforts. In this review, we highlight the growing global interest in precision psychiatry and the potential for the National Institute of Mental Health-devised Research Domain Criteria (RDoC) to facilitate the implementation of transdiagnostic and improved treatment approaches. Current psychiatric nosology must evolve with recent scientific advancements, and awareness of the value of this approach must grow among emerging investigators and clinicians. Finally, we examine current challenges and future opportunities of adopting the RDoC-associated translational and transdiagnostic approaches in clinical studies, acknowledging that the strength of the RDoC is that they form a dynamic framework of guiding principles that is intended to evolve continuously with scientific developments into the future. A collaborative approach that recruits expertise from multiple disciplines, while also considering the patient perspective, is needed to pave the way for precision psychiatry that can improve the prognosis and quality of life of psychiatric patients.
In the years following FDA approval of direct-to-consumer, genetic-health-risk/DTCGHR testing, millions of people in the US have sent their DNA to companies to receive personal genome health risk information without physician or other learned medical professional involvement. In Personal Genome Medicine, Michael J. Malinowski examines the ethical, legal, and social implications of this development. Drawing from the past and present of medicine in the US, Malinowski applies law, policy, public and private sector practices, and governing norms to analyze the commercial personal genome sequencing and testing sectors and to assess their impact on the future of US medicine. Written in relatable and accessible language, the book also proposes regulatory reforms for government and medical professionals that will enable technological advancements while maintaining personal and public health standards.
UK Biobank is an intensively characterised prospective cohort of 500,000 adults aged 40–69 years when recruited between 2006 and 2010. The study was established to enable researchers worldwide to undertake health-related research in the public interest. The existence of such a large, detailed prospective cohort with a high degree of participant engagement enabled its rapid repurposing for coronavirus disease-2019 (COVID-19) research. In response to the pandemic, the frequency of updates on hospitalisations and deaths among participants was immediately increased, and new data linkages were established to national severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing and primary care health records to facilitate research into the determinants of severe COVID-19. UK Biobank also instigated several sub-studies on COVID-19. In 2020, monthly blood samples were collected from approximately 20,000 individuals to investigate the distribution and determinants of SARS-CoV-2 infection, and to assess the persistence of antibodies following infection with another blood sample collected after 12 months. UK Biobank also performed repeat imaging of approximately 2,000 participants (half of whom had evidence of previous SARS-CoV-2 infection and half did not) to investigate the impact of the virus on changes in measures of internal organ structure and function. In addition, approximately 200,000 UK Biobank participants took part in a self-test SARS-CoV-2 antibody sub-study (between February and November 2021) to collect objective data on previous SARS-CoV-2 infection. These studies are enabling unique research into the genetic, lifestyle and environmental determinants of SARS-CoV-2 infection and severe COVID-19, as well as their long-term health effects. UK Biobank’s contribution to the national and international response to the pandemic represents a case study for its broader value, now and in the future, to precision medicine research.
Observational studies are notoriously susceptible to bias, and parallel-group randomized trials are important to identify the best overall treatment for eligible patients. Yet, such trials can be expected to be a misleading indicator of the best treatment for some subgroups or individual patients. In selected circumstances, patients can be treated in n-of-1 trials to address the inherent heterogeneity of treatment response in clinical populations. Such trials help to accomplish the ultimate goal of all biomedical research, to optimize the care of individual patients.
Humans operating in extreme environments often conduct their operations at the edges of the limits of human performance. Sometimes, they are required to push these limits to previously unattained levels. As a result, their margins for error in execution are much smaller than that found in the general public. These same small margins for error that impact execution may also impact risk, safety, health, and even survival. Thus, humans operating in extreme environments have a need for greater refinement in their preparation, training, fitness, and medical care. Precision medicine (PM) is uniquely suited to address the needs of those engaged in these extreme operations because of its depth of molecular analysis, derived precision countermeasures, and ability to match each individual (and his or her specific molecular phenotype) with any given operating context (environment). Herein, we present an overview of a systems approach to PM in extreme environments, which affords clinicians one method to contextualize the inputs, processes, and outputs that can form the basis of a formal practice. For the sake of brevity, this overview is focused on molecular dynamics, while providing only a brief introduction to the also important physiologic and behavioral phenotypes in PM. Moreover, rather than a full review, it highlights important concepts, while using only selected citations to illustrate those concepts. It further explores, by demonstration, the basic principles of using functionally characterized molecular networks to guide the practical application of PM in extreme environments. At its core, PM in extreme environments is about attention to incremental gains and losses in molecular network efficiency that can scale to produce notable changes in health and performance. 
This overview aims to provide a conceptual account of one approach to PM in extreme environments, coupled with a selected suite of practical considerations for molecular profiling and countermeasures.