
Linking Digital Traits from Facial Expression, Voice, and Head Motion to Montgomery–Åsberg Depression Rating Scale Subscales

Published online by Cambridge University Press:  27 August 2024

Z. Zhu¹, Y. Wu¹, J. Seidel², D. Roy¹, E. Salzmann²*

¹Boehringer Ingelheim Pharmaceuticals, Inc., CT, United States
²Boehringer Ingelheim International GmbH, Ingelheim am Rhein, Germany

*Corresponding author.

Abstract

Introduction

The 10-item Montgomery–Åsberg Depression Rating Scale (MADRS) measures different dimensions of depression symptomatology. Digital traits may deepen understanding of the MADRS subscales and provide insights into depression symptomatology.

Objectives

To identify digital traits that predict specific MADRS subscales and ascertain which digital traits are important for which MADRS subscales.

Methods

During a Phase II decentralised clinical trial in major depressive disorder (MDD), patients completed the MADRS and used AiCure (AiCure LLC, New York, NY, USA), a smartphone application, to complete image description tasks at baseline. Digital measurements identified from the literature as relevant to MDD symptomatology were derived from the audio and video data captured during the image description tasks. Digital measurements included speech (rate, sentiment and first-person singular pronouns), vocal acoustics (intensity, pause fraction and fundamental frequency), facial expressivity (regional facial movement) and head pose (Euclidean and angular head movement). Digital trait analysis involved data pre-processing followed by machine learning (ML) using Elastic Net, Decision Tree, and Random Forest models; model performance was evaluated using 5-fold cross-validation and mean absolute error (MAE). Trait importance was quantified as the percentage change in MAE after permuting a specific variable. Important digital traits for the MADRS Apparent Sadness subscale score were mapped to defined, interpretable domains.
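The modelling pipeline described above can be sketched as follows. This is a minimal illustration using scikit-learn, assuming a standard workflow: the feature matrix, trait names, and synthetic data below are placeholders, since the trial's actual digital-trait features and MADRS subscale scores are not public.

```python
# Illustrative sketch only: synthetic data stands in for the trial's
# digital traits (columns of X) and a MADRS subscale score (y).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import ElasticNet
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 9))            # 9 hypothetical digital traits
y = 2 * X[:, 0] + rng.normal(size=120)   # hypothetical subscale score

models = {
    "Elastic Net": ElasticNet(alpha=0.1),
    "Decision Tree": DecisionTreeRegressor(max_depth=3, random_state=0),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=0),
}

# 5-fold cross-validated MAE for each model, as in the abstract
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=cv,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name}: MAE = {mae:.3f}")

# Permutation importance: percentage change in MAE after shuffling
# one trait at a time (here computed on the fitted training data
# for brevity; held-out data would be used in practice).
model = models["Random Forest"].fit(X, y)
base_mae = mean_absolute_error(y, model.predict(X))
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    perm_mae = mean_absolute_error(y, model.predict(Xp))
    print(f"trait {j}: {100 * (perm_mae - base_mae) / base_mae:+.1f}% MAE change")
```

Because the synthetic outcome depends only on the first trait, permuting that column should inflate the MAE far more than permuting the others, mirroring how trait importance was ranked in the analysis.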

Results

The ML model predictions varied across the MADRS subscales (Table). Overall, Elastic Net and Random Forest models outperformed Decision Tree on all subscale scores other than suicidal thoughts. Half of the literature-based digital traits contributed to the prediction of ≥1 MADRS subscale score. The important digital traits for the Apparent Sadness subscale score could be mapped to 4 domains (Figure); this aligned with findings from the literature.


Conclusions

Digital traits collected from patients with MDD were able to predict certain MADRS subscales better than others.

Funding

Boehringer Ingelheim.

Disclosure of Interest

Z. Zhu: employee of Boehringer Ingelheim Pharmaceuticals, Inc.; Y. Wu: employee of Boehringer Ingelheim Pharmaceuticals, Inc.; J. Seidel: employee of Boehringer Ingelheim International GmbH; D. Roy: employee of Boehringer Ingelheim Pharmaceuticals, Inc.; E. Salzmann: employee of Boehringer Ingelheim International GmbH.

Type: Abstract
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of European Psychiatric Association