Machine learning prediction of anxiety symptoms in social anxiety disorder: utilizing multimodal data from virtual reality sessions
Cited by: 2
Authors:
Park, Jin-Hyun [1]
Shin, Yu-Bin [2]
Jung, Dooyoung [3]
Hur, Ji-Won [4]
Pack, Seung Pil [5]
Lee, Heon-Jeong [2]
Lee, Hwamin [1]
Cho, Chul-Hyun [1,2]
Affiliations:
[1] Korea Univ, Dept Biomed Informat, Coll Med, Seoul, South Korea
[2] Korea Univ, Dept Psychiat, Coll Med, Seoul, South Korea
[3] Ulsan Natl Inst Sci & Technol UNIST, Grad Sch Hlth Sci & Technol, Dept Biomed Engn, Ulsan, South Korea
[4] Korea Univ, Sch Psychiat, Seoul, South Korea
[5] Korea Univ, Dept Biotechnol & Bioinformat, Sejong, South Korea
Source:
FRONTIERS IN PSYCHIATRY | 2025 / Vol. 15
Funding:
National Research Foundation of Singapore;
Keywords:
machine learning;
multimodal data;
digital phenotyping;
digital psychiatry;
social anxiety disorder;
virtual reality intervention;
anxiety prediction;
HEART-RATE-VARIABILITY;
METAANALYSIS;
RESPONSES;
PHOBIA;
MODEL;
DOI:
10.3389/fpsyt.2024.1504190
Chinese Library Classification:
R749 [Psychiatry];
Discipline classification code:
100205;
Abstract:
Introduction: Machine learning (ML) is an effective tool for predicting mental states and is a key technology in digital psychiatry. This study aimed to develop ML algorithms that predict the upper tertile group of various anxiety symptoms from multimodal data collected during virtual reality (VR) therapy sessions for patients with social anxiety disorder (SAD), and to evaluate their predictive performance across each data type.

Methods: The study enrolled 32 individuals diagnosed with SAD and finalized a dataset of 132 samples from 25 participants. Multimodal (physiological and acoustic) data were collected during VR sessions that simulated social anxiety scenarios. Acoustic features were extracted with the extended Geneva minimalistic acoustic parameter set, and statistical attributes were extracted from the time-series-based physiological responses. ML models predicting the upper tertile group for various anxiety symptoms in SAD were developed using Random Forest, extreme gradient boosting (XGBoost), light gradient boosting machine (LightGBM), and categorical boosting (CatBoost). The best parameters were explored through grid search or random search, and the models were validated using stratified cross-validation and leave-one-out cross-validation.

Results: CatBoost, using multimodal features, exhibited high performance, particularly for the Social Phobia Scale, with an area under the receiver operating characteristic curve (AUROC) of 0.852. It also showed strong performance in predicting cognitive symptoms, with the highest AUROC of 0.866 for the Post-Event Rumination Scale. For generalized anxiety, LightGBM's prediction of the State-Trait Anxiety Inventory-trait reached an AUROC of 0.819. In the same analyses, models using only physiological features yielded AUROCs of 0.626, 0.744, and 0.671, whereas models using only acoustic features yielded AUROCs of 0.788, 0.823, and 0.754.

Conclusions: This study showed that an ML algorithm using integrated multimodal data can predict upper tertile anxiety symptoms in patients with SAD with higher performance than algorithms using only the acoustic or physiological data obtained during a VR session. These results can serve as evidence for personalized VR sessions and demonstrate the strength of multimodal data in clinical use.
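Two reusable pieces of the pipeline the abstract describes are (a) summarizing a time-series physiological signal with statistical attributes and (b) binarizing symptom scores into an upper-tertile target. The sketch below illustrates both in plain Python; the function names and the particular statistics chosen are assumptions for illustration, not the authors' code:

```python
import statistics

def physio_features(series):
    """Summarize a time-series physiological signal (e.g. heart rate samples
    from one VR session) with simple statistical attributes."""
    return {
        "mean": statistics.fmean(series),
        "std": statistics.pstdev(series),
        "min": min(series),
        "max": max(series),
        "range": max(series) - min(series),
    }

def upper_tertile_labels(scores):
    """Binary classification target: 1 if a symptom score falls in the
    upper tertile of the sample, else 0."""
    cutoff = sorted(scores)[len(scores) * 2 // 3]  # value at the 2/3 boundary
    return [1 if s >= cutoff else 0 for s in scores]
```

Feature dictionaries like these would then be concatenated with acoustic features and fed to a gradient-boosting classifier, with AUROC computed over the cross-validation folds.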