Cybersickness Prediction from Integrated HMD's Sensors: A Multimodal Deep Fusion Approach using Eye-tracking and Head-tracking Data

Cited by: 47
Authors
Islam, Rifatul [1 ]
Desai, Kevin [1 ]
Quarles, John [1 ]
Affiliations
[1] Univ Texas San Antonio, San Antonio, TX 78249 USA
Source
2021 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR 2021) | 2021
Funding
US National Science Foundation (NSF);
Keywords
Cybersickness Prediction; Visually induced motion sickness; Eye-tracking; Multimodal Deep Fusion Network; MOTION; EXPOSURE; SYMPTOMS;
DOI
10.1109/ISMAR52148.2021.00017
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cybersickness prediction is one of the significant research challenges for real-time cybersickness reduction. Researchers have proposed different approaches for predicting cybersickness from bio-physiological data (e.g., heart rate, breathing rate, electroencephalogram). However, collecting bio-physiological data often requires external sensors, limiting locomotion and 3D-object manipulation during the virtual reality (VR) experience. Limited research has been done to predict cybersickness from the data readily available from the integrated sensors in head-mounted displays (HMDs) (e.g., head-tracking, eye-tracking, motion features), allowing free locomotion and 3D-object manipulation. This research proposes a novel deep fusion network to predict cybersickness severity from heterogeneous data readily available from the integrated HMD sensors. We extracted 1755 stereoscopic videos, eye-tracking, and head-tracking data along with the corresponding self-reported cybersickness severity collected from 30 participants during their VR gameplay. We applied several deep fusion approaches with the heterogeneous data collected from the participants. Our results suggest that cybersickness can be predicted with an accuracy of 87.77% and a root-mean-square error of 0.51 when using only eye-tracking and head-tracking data. We concluded that eye-tracking and head-tracking data are well suited for a standalone cybersickness prediction framework.
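The abstract describes fusing heterogeneous HMD sensor streams (eye-tracking and head-tracking features) in a deep network that outputs a cybersickness severity estimate. As a rough illustration of the general idea, the sketch below shows a minimal late-fusion forward pass: each modality is encoded by its own small branch, the latent vectors are concatenated, and a final layer maps the fused representation to a severity score. This is not the paper's architecture; all layer sizes, weights, and function names here are illustrative assumptions, with random weights standing in for trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature dimensions (illustrative, not from the paper).
eye_dim, head_dim, latent = 6, 4, 8

# Random weights stand in for trained parameters.
We, be = rng.normal(size=(eye_dim, latent)), np.zeros(latent)
Wh, bh = rng.normal(size=(head_dim, latent)), np.zeros(latent)
Wf, bf = rng.normal(size=(2 * latent, 1)), np.zeros(1)


def encode(x, w, b):
    # One hidden layer with ReLU: maps a modality into a latent vector.
    return np.maximum(x @ w + b, 0.0)


def predict_severity(eye_feats, head_feats):
    z_eye = encode(eye_feats, We, be)
    z_head = encode(head_feats, Wh, bh)
    fused = np.concatenate([z_eye, z_head], axis=-1)  # late fusion by concatenation
    # Sigmoid squashes the fused score into (0, 1) as a severity estimate.
    return 1.0 / (1.0 + np.exp(-(fused @ Wf + bf)))


# One synthetic sample per modality.
eye = rng.normal(size=eye_dim)
head = rng.normal(size=head_dim)
score = predict_severity(eye, head)
```

Concatenation-based late fusion is only one of several fusion strategies the abstract's phrase "several deep fusion approaches" could cover; early fusion (concatenating raw features before any encoder) and intermediate fusion are common alternatives.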
Pages: 31-40
Page count: 10