Ultra-Low Power Gaze Tracking for Virtual Reality

Cited by: 12
Authors
Li, Tianxing [1 ]
Liu, Qiang [1 ]
Zhou, Xia [1 ]
Affiliations
[1] Dartmouth Coll, Dept Comp Sci, Hanover, NH 03755 USA
Source
PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17) | 2017
Funding
National Science Foundation (USA)
Keywords
Gaze tracking; virtual reality; visible light sensing; EYE-MOVEMENT; TIME EYE; MODEL;
DOI
10.1145/3131672.3131682
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
Tracking user's eye fixation direction is crucial to virtual reality (VR): it eases user's interaction with the virtual scene and enables intelligent rendering to improve user's visual experiences and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting pupil's light absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791 μW in total and thus can be completely powered by a credit-card sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.
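The abstract outlines LiGaze's pipeline: photodiodes around the lens measure reflected screen light, screen-light dynamics are normalized out, and a lightweight regression maps the residual readings to a 3D gaze vector. The sketch below is a hypothetical illustration of that pipeline, not the authors' implementation: the photodiode count, the normalization by an estimated per-frame screen brightness, and the generic ridge regressor are all assumptions standing in for the paper's actual feature extraction and regression model.

    # Hypothetical sketch of a LiGaze-style inference pipeline (NOT the authors' code).
    # Assumptions: 8 photodiodes ringed around the VR lens, a per-frame screen-brightness
    # estimate is available, and a generic ridge regressor stands in for the paper's
    # "lightweight regression algorithm".
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    N_DIODES = 8       # photodiodes around the lens (assumed count)
    N_SAMPLES = 2000   # calibration samples with gaze known from a reference tracker

    # Synthetic calibration data: reflected-light readings and ground-truth 3D gaze vectors.
    readings = rng.uniform(0.0, 1.0, size=(N_SAMPLES, N_DIODES))  # raw photodiode outputs
    screen_lum = rng.uniform(0.5, 1.5, size=(N_SAMPLES, 1))       # per-frame screen brightness
    gaze = rng.normal(size=(N_SAMPLES, 3))
    gaze /= np.linalg.norm(gaze, axis=1, keepdims=True)           # unit 3D gaze vectors

    def features(raw, lum):
        """Normalize out screen-light dynamics so pupil-induced differences dominate."""
        norm = raw / lum                                 # compensate for screen brightness changes
        return norm - norm.mean(axis=1, keepdims=True)   # keep relative spatial differences

    model = Ridge(alpha=1.0).fit(features(readings, screen_lum), gaze)

    # Inference on one new frame: predict, then re-normalize to a unit gaze vector.
    new_raw = rng.uniform(0.0, 1.0, size=(1, N_DIODES))
    new_lum = np.array([[1.2]])
    pred = model.predict(features(new_raw, new_lum))
    gaze_dir = pred / np.linalg.norm(pred)
    print("estimated 3D gaze direction:", gaze_dir)

In practice the calibration targets would come from a reference eye tracker (the paper compares against FOVE), and the regression would be trained per user or across users, which is where the reported 6.3° versus 10.1° accuracy gap arises.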
Pages: 14