Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera

Cited by: 11
Authors
Drakopoulos, Panagiotis [1 ]
Koulieris, George-alex [2 ]
Mania, Katerina [1 ]
Affiliations
[1] Tech Univ Crete, Sch Elect & Comp Engn, Univ Campus Kounoupidiana, Khania 73100, Greece
[2] Univ Durham, Math Sci & Comp Sci Bldg, Upper Mountjoy Campus,Stockton Rd, Durham DH1 3LE, England
Keywords
Mobile VR; eye tracking; ROBUST PUPIL DETECTION; GAZE;
DOI
10.1145/3456875
CLC classification
TP31 [Computer Software];
Subject classification codes
081202 ; 0835 ;
Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) are currently based on uncomfortable head tracking controlling a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing time, and the need for hardware add-ons such as anti-reflective lens coating and infrared emitters. We present an innovative mobile VR eye tracking methodology utilizing only the eye images from the front-facing (selfie) camera through the headset's lens, without any modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements suppressing obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that is run only once. This increases the iris tracking speed by reducing the iris search space in mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. We display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric calculates the probability of successful iris detection. Calibration and linear gaze mapping between the estimated iris centroid and physical pixels on the screen results in low latency, real-time iris tracking. A formal study confirmed that our system's accuracy is similar to eye trackers in commercial VR headsets in the central part of the headset's field-of-view. In a VR game, gaze-driven user completion time was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze.
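The calibration step described in the abstract (a linear mapping between the estimated iris centroid and physical pixels on the screen) can be sketched as a least-squares affine fit over a few calibration targets. This is an illustrative reconstruction only, not the authors' implementation: the function names, the four-point calibration grid, and the screen resolution below are all assumptions.

```python
import numpy as np

def fit_linear_gaze_map(iris_centroids, screen_targets):
    """Fit gaze = [x, y, 1] @ A per axis via least squares.

    iris_centroids: (N, 2) iris centres in camera pixels.
    screen_targets: (N, 2) fixation targets in screen pixels.
    Returns A with shape (3, 2), an affine map for both screen axes.
    """
    X = np.hstack([iris_centroids, np.ones((len(iris_centroids), 1))])
    A, *_ = np.linalg.lstsq(X, screen_targets, rcond=None)
    return A

def map_gaze(A, centroid):
    """Map one iris centroid (x, y) to an estimated screen position."""
    x, y = centroid
    return np.array([x, y, 1.0]) @ A

# Toy calibration: iris centroids recorded while fixating four corner targets
# on a hypothetical 1080x1920 screen (values are illustrative).
iris = np.array([[100., 80.], [140., 80.], [100., 120.], [140., 120.]])
screen = np.array([[0., 0.], [1080., 0.], [0., 1920.], [1080., 1920.]])
A = fit_linear_gaze_map(iris, screen)
print(map_gaze(A, (120., 100.)))  # mid-range centroid maps to screen centre
```

With an exact affine relationship, as in this toy grid, the fit is perfect; real iris-centroid data would carry the detection noise the paper's confidence metric is designed to flag.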
Pages: 20
Related papers
50 records in total
  • [31] Anticipation and timing of turn-taking in dialogue interpreting A quantitative study using mobile eye-tracking data
    Vranjes, Jelena
    Oben, Bert
    TARGET-INTERNATIONAL JOURNAL OF TRANSLATION STUDIES, 2022, 34 (04) : 627 - 651
  • [32] 3D measuring method of head and eye tracking system using a single camera
    Nishida, Masataka
    Sakamoto, Kunio
    ELECTRO-OPTICAL AND INFRARED SYSTEMS: TECHNOLOGY AND APPLICATIONS III, 2006, 6395
  • [33] Using Eye Tracking to Explore Consumers' Visual Behavior According to Their Shopping Motivation in Mobile Environments
    Hwang, Yoon Min
    Lee, Kun Chang
    CYBERPSYCHOLOGY BEHAVIOR AND SOCIAL NETWORKING, 2017, 20 (07) : 442 - 447
  • [34] Compensation of Head Movements in Mobile Eye-Tracking Data Using an Inertial Measurement Unit
    Larsson, Linnea
    Schwaller, Andrea
    Holmqvist, Kenneth
    Nystrom, Marcus
    Stridh, Martin
    PROCEEDINGS OF THE 2014 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING (UBICOMP'14 ADJUNCT), 2014, : 1161 - 1167
  • [35] Mobile Device Eye Tracking on Dynamic Visual Contents using Edge Computing and Deep Learning
    Gunawardena, Nishan
    Ginige, Jeewani Anupama
    Javadi, Bahman
    Lui, Gough
    2022 ACM SYMPOSIUM ON EYE TRACKING RESEARCH AND APPLICATIONS, ETRA 2022, 2022,
  • [36] Study of a Social Robot's Appearance Using Interviews and a Mobile Eye-Tracking Device
    Dziergwa, Michal
    Frontkiewicz, Mirela
    Kaczmarek, Pawel
    Kedzierski, Jan
    Zagdanska, Marta
    SOCIAL ROBOTICS, ICSR 2013, 2013, 8239 : 170 - 179
  • [37] Game-Based Social Interaction Platform for Cognitive Assessment of Autism Using Eye Tracking
    Chien, Yi-Ling
    Lee, Chia-Hsin
    Chiu, Yen-Nan
    Tsai, Wen-Che
    Min, Yuan-Che
    Lin, Yang-Min
    Wong, Jui-Shen
    Tseng, Yi-Li
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2023, 31 : 749 - 758
  • [38] Zoned in or zoned out? Investigating immersion in slot machine gambling using mobile eye-tracking
    Murch, W. Spencer
    Limbrick-Oldfield, Eve H.
    Ferrari, Mario A.
    MacDonald, Kent I.
    Fooken, Jolande
    Cherkasova, Mariya V.
    Spering, Miriam
    Clark, Luke
    ADDICTION, 2020, 115 (06) : 1127 - 1138
  • [39] Orchestration Load Indicators and Patterns: In-the-Wild Studies Using Mobile Eye-Tracking
    Prieto, Luis P.
    Sharma, Kshitij
    Kidzinski, Lukasz
    Dillenbourg, Pierre
    IEEE TRANSACTIONS ON LEARNING TECHNOLOGIES, 2018, 11 (02): : 216 - 229
  • [40] PeepList: Adapting ex-post interaction with pervasive display content using eye tracking
    Kajan, Rudolf
    Herout, Adam
    Bednarik, Roman
    Povolny, Filip
    PERVASIVE AND MOBILE COMPUTING, 2016, 30 : 71 - 83