Anxiety Level Recognition for Virtual Reality Therapy System Using Physiological Signals

Cited by: 88
Authors
Salkevicius, Justas [1 ]
Damasevicius, Robertas [1 ,2 ]
Maskeliunas, Rytis [2 ,3 ]
Laukiene, Ilona [4 ]
Affiliations
[1] Kaunas Univ Technol, Dept Software Engn, LT-51368 Kaunas, Lithuania
[2] Silesian Tech Univ, Inst Math, PL-44100 Gliwice, Poland
[3] Kaunas Univ Technol, Dept Multimedia Engn, LT-51368 Kaunas, Lithuania
[4] Lithuanian Univ Hlth Sci, Med Acad, Clin Psychiat, LT-50103 Kaunas, Lithuania
Keywords
virtual reality therapy; anxiety recognition; physiological signals; GSR; BVP; stress detection; emotion recognition; exposure therapy; database; EEG
DOI
10.3390/electronics8091039
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Virtual reality exposure therapy (VRET) can have a significant impact on assessing and potentially treating various anxiety disorders. One of the main strengths of VRET systems is that they allow a psychologist to interact with virtual 3D environments and change therapy scenarios according to the individual patient's needs. However, to do this efficiently, the patient's anxiety level should be tracked throughout the VRET session. Therefore, in order to fully use all the advantages provided by the VRET system, a mental stress detection system is needed. The patient's physiological signals can be collected with wearable biofeedback sensors. Signals such as blood volume pulse (BVP), galvanic skin response (GSR), and skin temperature can be processed and used to train anxiety level classification models. In this paper, we combine VRET with mental stress detection and highlight potential uses of this kind of VRET system. We discuss and present a framework for anxiety level recognition, which is part of our developed cloud-based VRET system. Physiological signals of 30 participants were collected during VRET-based public speaking anxiety treatment sessions. The acquired data were used to train a four-level anxiety recognition model (where each of the levels 'low', 'mild', 'moderate', and 'high' refers to a level of anxiety rather than to a separate class of anxiety disorder). We achieved 80.1% cross-subject accuracy (using leave-one-subject-out cross-validation) and 86.3% accuracy (using 10 x 10-fold cross-validation) with the signal fusion-based support vector machine (SVM) classifier.
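The following is a minimal sketch (in Python with scikit-learn, not the authors' code) of the evaluation scheme described in the abstract: per-window features extracted from GSR, BVP, and skin temperature are concatenated (feature-level fusion), an SVM predicts one of four anxiety levels, and cross-subject accuracy is estimated with leave-one-subject-out cross-validation. The feature dimensions, the synthetic make_synthetic_features() helper, and the SVM hyperparameters are illustrative assumptions, not values taken from the paper.

# Sketch of signal-fusion SVM training with leave-one-subject-out (LOSO) evaluation.
# All feature contents below are placeholders standing in for real per-window features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

def make_synthetic_features(n_windows_per_subject=40, n_subjects=30):
    """Stand-in for per-window features computed from GSR, BVP, and skin temperature."""
    n = n_windows_per_subject * n_subjects
    gsr = rng.normal(size=(n, 5))    # e.g., skin conductance level/response statistics
    bvp = rng.normal(size=(n, 6))    # e.g., heart rate and variability statistics
    temp = rng.normal(size=(n, 3))   # e.g., mean, slope, standard deviation
    X = np.hstack([gsr, bvp, temp])  # feature-level (early) fusion by concatenation
    y = rng.integers(0, 4, size=n)   # four anxiety levels: low, mild, moderate, high
    groups = np.repeat(np.arange(n_subjects), n_windows_per_subject)  # subject IDs
    return X, y, groups

X, y, groups = make_synthetic_features()

# Scaling + RBF-kernel SVM; kernel and hyperparameters are assumptions, not the paper's.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Cross-subject evaluation: each fold holds out every window of one participant.
loso = LeaveOneGroupOut()
scores = cross_val_score(clf, X, y, groups=groups, cv=loso)
print(f"LOSO accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

One way to approximate the abstract's 10 x 10-fold estimate would be to swap the LOSO splitter for RepeatedStratifiedKFold(n_splits=10, n_repeats=10), which mixes windows from all subjects in every fold and therefore yields a within-subject-leaking (and typically higher) accuracy figure.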
Pages: 19