Ultra-Low Power Gaze Tracking for Virtual Reality

Cited by: 12
Authors
Li, Tianxing [1 ]
Liu, Qiang [1 ]
Zhou, Xia [1 ]
Affiliations
[1] Dartmouth Coll, Dept Comp Sci, Hanover, NH 03755 USA
Source
PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17) | 2017
Funding
U.S. National Science Foundation
Keywords
Gaze tracking; virtual reality; visible light sensing; EYE-MOVEMENT; TIME EYE; MODEL;
DOI
10.1145/3131672.3131682
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
Tracking a user's eye fixation direction is crucial to virtual reality (VR): it eases the user's interaction with the virtual scene and enables intelligent rendering to improve the user's visual experience and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791 μW in total and thus can be completely powered by a credit-card sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.
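To make the abstract's pipeline concrete, the sketch below shows one way a lightweight regression could map photodiode readings to a 3D gaze vector. This is not the authors' implementation: the sensor count, the brightness normalization, the closed-form ridge regression, and the synthetic calibration data are all illustrative assumptions.

```python
# Illustrative sketch only: infer a 3D gaze vector from photodiode readings
# with closed-form ridge regression. Sensor count, normalization, and the
# synthetic calibration data are assumptions, not LiGaze's actual pipeline.
import numpy as np

NUM_PHOTODIODES = 16   # hypothetical ring of photodiodes around the VR lens
RIDGE_LAMBDA = 1e-3    # regularization strength (assumed)

def normalize(readings, screen_brightness):
    """Divide out screen-light intensity so features reflect pupil-related
    changes in reflected light rather than screen dynamics (simplified)."""
    return readings / np.maximum(screen_brightness, 1e-6)

def fit_ridge(features, gaze_vectors, lam=RIDGE_LAMBDA):
    """Closed-form ridge regression: W = (X^T X + lam I)^-1 X^T Y."""
    X = np.hstack([features, np.ones((features.shape[0], 1))])  # add bias term
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ gaze_vectors)               # (d+1, 3)

def predict_gaze(W, readings, screen_brightness):
    """Map one frame of photodiode readings to a unit 3D gaze direction."""
    x = np.append(normalize(readings, screen_brightness), 1.0)  # add bias
    g = x @ W
    return g / np.linalg.norm(g)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic calibration set: random sensor readings and gaze directions.
    n = 200
    readings = rng.uniform(0.2, 1.0, size=(n, NUM_PHOTODIODES))
    brightness = rng.uniform(0.5, 1.0, size=(n, 1))
    gaze = rng.normal(size=(n, 3))
    gaze /= np.linalg.norm(gaze, axis=1, keepdims=True)

    W = fit_ridge(normalize(readings, brightness), gaze)
    print("estimated gaze direction:", predict_gaze(W, readings[0], brightness[0, 0]))
```

A closed-form linear model is used here only because it is cheap enough to run on a microcontroller-class budget; the paper itself reports a lightweight regression algorithm without this sketch implying which one.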
Pages: 14