Anxiety Level Recognition for Virtual Reality Therapy System Using Physiological Signals

Cited by: 91
Authors
Salkevicius, Justas [1 ]
Damasevicius, Robertas [1 ,2 ]
Maskeliunas, Rytis [2 ,3 ]
Laukiene, Ilona [4 ]
Affiliations
[1] Kaunas Univ Technol, Dept Software Engn, LT-51368 Kaunas, Lithuania
[2] Silesian Tech Univ, Inst Math, PL-44100 Gliwice, Poland
[3] Kaunas Univ Technol, Dept Multimedia Engn, LT-51368 Kaunas, Lithuania
[4] Lithuanian Univ Hlth Sci, Med Acad, Clin Psychiat, LT-50103 Kaunas, Lithuania
Keywords
virtual reality therapy; anxiety recognition; physiological signals; GSR; BVP; stress detection; emotion recognition; exposure therapy; database; EEG
DOI
10.3390/electronics8091039
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Virtual reality exposure therapy (VRET) can have a significant impact on assessing and potentially treating various anxiety disorders. One of the main strengths of VRET systems is that they allow a psychologist to interact with virtual 3D environments and change therapy scenarios according to the individual patient's needs. However, to do this efficiently, the patient's anxiety level should be tracked throughout the VRET session. Therefore, in order to fully use the advantages provided by the VRET system, a mental stress detection system is needed. The patient's physiological signals can be collected with wearable biofeedback sensors. Signals such as blood volume pressure (BVP), galvanic skin response (GSR), and skin temperature can be processed and used to train anxiety level classification models. In this paper, we combine VRET with mental stress detection and highlight potential uses of this kind of VRET system. We discuss and present a framework for anxiety level recognition, which is part of our developed cloud-based VRET system. Physiological signals of 30 participants were collected during VRET-based public speaking anxiety treatment sessions. The acquired data were used to train a four-level anxiety recognition model (where the levels 'low', 'mild', 'moderate', and 'high' refer to levels of anxiety rather than to separate classes of anxiety disorder). We achieved 80.1% cross-subject accuracy (using leave-one-subject-out cross-validation) and 86.3% accuracy (using 10 × 10-fold cross-validation) with the signal fusion-based support vector machine (SVM) classifier.
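To make the evaluation protocol concrete, the following is a minimal sketch (not the authors' code) of a signal fusion-based SVM classifier evaluated with leave-one-subject-out cross-validation, written in Python with scikit-learn. The feature count, window count, and randomly generated data are illustrative assumptions; in the actual study the features would be extracted from the recorded GSR, BVP, and skin-temperature signals.

```python
# Minimal sketch: feature-level fusion of GSR, BVP and skin-temperature features
# followed by an RBF-kernel SVM with leave-one-subject-out cross-validation.
# Feature dimensions, window counts and the synthetic data are placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_subjects, windows_per_subject = 30, 40   # 30 participants, as in the study; windows per subject assumed
n_features = 18                            # e.g. 6 statistical features per signal x 3 signals (assumed)

# Placeholder feature matrix: one row per signal window, columns are the fused
# GSR/BVP/temperature features; y holds the four anxiety levels (0=low ... 3=high).
X = rng.normal(size=(n_subjects * windows_per_subject, n_features))
y = rng.integers(0, 4, size=n_subjects * windows_per_subject)
groups = np.repeat(np.arange(n_subjects), windows_per_subject)  # subject id for each window

# Standardize the fused features, then classify with an SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Cross-subject evaluation: each fold leaves one participant's windows out entirely.
loso = LeaveOneGroupOut()
scores = cross_val_score(clf, X, y, groups=groups, cv=loso)
print(f"LOSO accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

With real extracted features in X, the same pipeline could also be scored with a repeated 10-fold split (e.g. `RepeatedStratifiedKFold`) to mirror the within-subject 10 × 10-fold result reported in the abstract.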
Pages: 19