Facial Emotion Recognition in Immersive Virtual Reality: A Systematic Literature Review

Cited by: 5
Authors
Ortmann, Thorben [1 ,2 ]
Wang, Qi [1 ]
Putzar, Larissa [2 ]
Affiliations
[1] Univ West Scotland, Paisley, Renfrew, Scotland
[2] Hamburg Univ Appl Sci, Hamburg, Germany
Source
PROCEEDINGS OF THE 16TH ACM INTERNATIONAL CONFERENCE ON PERVASIVE TECHNOLOGIES RELATED TO ASSISTIVE ENVIRONMENTS, PETRA 2023 | 2023
Keywords
facial expressions; emotion recognition; virtual reality; head-mounted display; affective computing; review; EXPRESSION RECOGNITION; CIRCUMPLEX MODEL
DOI
10.1145/3594806.3594861
CLC number
TP3 [computing technology; computer technology]
Discipline classification code
0812
Abstract
With the broader adoption of virtual reality (VR), objective physiological measurements to automatically assess a user's emotional state are gaining importance. Emotions affect human behavior, perception, cognition, and decision-making. Recognizing them allows the analysis of VR experiences and enables systems to react to and interact with a user's emotions. Facial expressions are among the most potent and natural signals for recognizing emotions. Automatic facial expression recognition (FER) typically relies on facial images. However, users in immersive VR environments wear head-mounted displays (HMDs), which occlude almost the entire upper half of the face. This occlusion severely limits the capabilities of conventional FER methods. We address this emerging challenge with our systematic literature review. To our knowledge, it is the first review of FER in immersive VR scenarios where HMDs partially occlude a user's face. We identified 256 related works and included 21 for detailed analysis. Our review provides a comprehensive overview of the state of the art and draws conclusions for future research.
Pages: 77-82
Number of pages: 6
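The occlusion problem described in the abstract can be illustrated with a small, hypothetical Python sketch (not taken from the paper under review): it blanks out roughly the upper half of a face image, the region an HMD typically covers, before the image would reach a conventional image-based FER model. The mask_upper_face helper and the 0.45 occlusion ratio are illustrative assumptions, not values reported by the authors.

    import numpy as np

    def mask_upper_face(image: np.ndarray, occlusion_ratio: float = 0.45) -> np.ndarray:
        """Zero out the upper portion of a face image to mimic HMD occlusion.

        occlusion_ratio is an illustrative assumption; HMDs cover roughly the
        upper half of the face (eyes, brows, and part of the nose).
        """
        occluded = image.copy()
        cut = int(image.shape[0] * occlusion_ratio)
        occluded[:cut, ...] = 0  # simulate the HMD-covered region
        return occluded

    if __name__ == "__main__":
        # Stand-in for a cropped face image (height x width x RGB, uint8).
        face = np.random.randint(0, 256, size=(224, 224, 3), dtype=np.uint8)
        occluded_face = mask_upper_face(face)
        hidden = (occluded_face == 0).all(axis=-1).mean()
        print(f"Fraction of pixels hidden by the simulated HMD: {hidden:.0%}")

Any image-based FER model fed occluded_face would lose most eye and brow cues, which is exactly the limitation of conventional FER methods that the review surveys.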