Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Cited by: 11
Authors
Drakopoulos, Panagiotis [1]
Koulieris, George-Alex [2]
Mania, Katerina [1]
Affiliations
[1] Tech Univ Crete, Sch Elect & Comp Engn, Univ Campus Kounoupidiana, Khania 73100, Greece
[2] Univ Durham, Math Sci & Comp Sci Bldg, Upper Mountjoy Campus,Stockton Rd, Durham DH1 3LE, England
Keywords
Mobile VR; eye tracking; robust pupil detection; gaze
DOI
10.1145/3456875
Chinese Library Classification (CLC)
TP31 [Computer Software]
Subject Classification Codes
081202; 0835
Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) currently rely on uncomfortable head tracking that controls a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present a novel mobile VR eye tracking methodology that uses only the eye images captured by the front-facing (selfie) camera through the headset's lens, without any hardware modifications. Our system first enhances the low-contrast, poorly lit eye images with a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once; by reducing the iris search space, it increases iris tracking speed on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. A thin bezel of light displayed at the top edge of the screen provides constant illumination. A confidence metric estimates the probability of successful iris detection. Calibration and a linear gaze mapping between the estimated iris centroid and physical pixels on the screen yield low-latency, real-time iris tracking. A formal study confirmed that our system's accuracy is comparable to that of eye trackers in commercial VR headsets in the central part of the headset's field of view. In a VR game, users completed tasks as quickly with gaze-driven interaction as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
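To make the final step of the abstract concrete, the sketch below illustrates one way the calibration and linear gaze mapping could work: an affine map from the estimated iris centroid (camera pixels) to physical screen pixels is fitted by least squares over a few calibration fixations and then applied to new centroids. This is a minimal sketch under stated assumptions; the affine form, the 5-point calibration grid, and all function and variable names are illustrative and not taken from the authors' implementation.

```python
# Illustrative sketch (not the paper's code): fit an affine map from iris
# centroids to screen pixels using least squares, then apply it at runtime.
import numpy as np


def fit_gaze_mapping(iris_points: np.ndarray, screen_points: np.ndarray) -> np.ndarray:
    """Fit a 2x3 affine matrix A such that screen ~ A @ [x, y, 1]^T.

    iris_points   : (N, 2) iris centroids recorded while the user fixates
                    known calibration targets.
    screen_points : (N, 2) corresponding target positions in screen pixels.
    """
    ones = np.ones((iris_points.shape[0], 1))
    X = np.hstack([iris_points, ones])              # (N, 3) homogeneous inputs
    # Least-squares solve of X @ A.T = screen_points.
    A_T, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return A_T.T                                    # (2, 3) affine map


def map_gaze(A: np.ndarray, iris_xy: np.ndarray) -> np.ndarray:
    """Map a single iris centroid to an estimated on-screen gaze point."""
    return A @ np.array([iris_xy[0], iris_xy[1], 1.0])


if __name__ == "__main__":
    # Hypothetical 5-point calibration (centre plus four corners); the numbers
    # are made up and only demonstrate the shape of the data.
    iris = np.array([[312, 240], [280, 210], [344, 210], [280, 270], [344, 270]], dtype=float)
    targets = np.array([[720, 720], [200, 200], [1240, 200], [200, 1240], [1240, 1240]], dtype=float)
    A = fit_gaze_mapping(iris, targets)
    print(map_gaze(A, np.array([330.0, 225.0])))    # estimated screen pixel
```

In practice the calibration targets would be rendered inside the headset and the iris centroids would come from the iris-fitting stage described in the abstract; the mapping itself then reduces to an ordinary least-squares solve.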
Pages: 20
Related Papers (50 records)
  • [1] Front Camera Eye Tracking for Mobile VR
    Drakopoulos, Panagiotis
    Koulieris, George Alex
    Mania, Katerina
    2020 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES WORKSHOPS (VRW 2020), 2020, : 643 - 644
  • [2] Leveling the Playing Field: A Comparative Reevaluation of Unmodified Eye Tracking as an Input and Interaction Modality for VR
    Fernandes, Ajoy S.
    Murdison, T. Scott
    Proulx, Michael J.
    IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2023, 29 (05) : 2269 - 2279
  • [3] Power-efficient and shift-robust eye-tracking sensor for portable VR headsets
    Katrychuk, Dmytro
    Griffith, Henry K.
    Komogortsev, Oleg V.
    ETRA 2019: 2019 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, 2019,
  • [4] Multimodal Biometric Authentication for VR/AR using EEG and Eye Tracking
    Krishna, Vrishab
    Ding, Yi
    Xu, Aiwen
    Hollerer, Tobias
    ICMI'19: ADJUNCT OF THE 2019 INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2019,
  • [5] Eye Tracking and Measurement of Eye Rotation Using a Small Camera Installed roughly next to the Eye
    Hoshino, Kiyoshi
    Noguchi, Yuki
    Ono, Nayuta
    2020 13TH INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION (HSI), 2020, : 255 - 260
  • [6] Using Eye Tracking to Investigate Interaction Between Humans and Virtual Agents
    Amorese, Terry
    Greco, Claudia
    Cuciniello, Marialucia
    Buono, Carmela
    Palmero, Cristina
    Buch-Cardona, Pau
    Escalera, Sergio
    Torres, Maria Ines
    Cordasco, Gennaro
    Esposito, Anna
    2022 IEEE CONFERENCE ON COGNITIVE AND COMPUTATIONAL ASPECTS OF SITUATION MANAGEMENT, COGSIMA, 2022, : 125 - 132
  • [7] A preliminary study using a web camera based eye tracking to assess novelty reaction allowing user interaction
    Beltran, Jessica
    Rios-Vazquez, Isaac
    Sanchez-Cortez, Ambar S.
    Navarro, Rene F.
    Maldonado-Cano, Luis A.
    Garcia-Vazquez, Mireya S.
    MEXIHC 2018: PROCEEDINGS OF THE 7TH MEXICAN CONFERENCE ON HUMAN-COMPUTER INTERACTION, 2018,
  • [8] Detection of Relative Afferent Pupillary Defects Using Eye Tracking and a VR Headset
    Bruegger, Dominik
    Grabe, Hilary M.
    Vicini, Rino
    Dysli, Muriel
    Lussi, David
    Abegg, Mathias
    TRANSLATIONAL VISION SCIENCE & TECHNOLOGY, 2023, 12 (06):
  • [9] Using Eye Tracking as Human Computer Interaction Interface
    Schmidt, Holger
    Zimmermann, Gottfried
    HCI INTERNATIONAL 2015 - POSTERS' EXTENDED ABSTRACTS, PT I, 2015, 528 : 523 - 527
  • [10] Multimodal mapping of spatial attention for unilateral spatial neglect in VR: a proof of concept study using eye-tracking and mobile EEG
    Eudave, Luis
    Vourvopoulos, Athanasios
    VIRTUAL REALITY, 2025, 29 (01)