Ultra-Low Power Gaze Tracking for Virtual Reality

Cited: 12
Authors
Li, Tianxing [1 ]
Liu, Qiang [1 ]
Zhou, Xia [1 ]
Affiliations
[1] Dartmouth Coll, Dept Comp Sci, Hanover, NH 03755 USA
Source
PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17) | 2017
Funding
National Science Foundation (USA)
Keywords
Gaze tracking; virtual reality; visible light sensing; EYE-MOVEMENT; TIME EYE; MODEL;
DOI
10.1145/3131672.3131682
Chinese Library Classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
Tracking a user's eye fixation direction is crucial to virtual reality (VR): it eases the user's interaction with the virtual scene and enables intelligent rendering to improve the user's visual experience and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791 μW in total and thus can be completely powered by a credit-card sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.
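The abstract's inference step, a lightweight regression mapping photodiode readings to a 3D gaze vector, could be sketched as follows. This is a minimal illustration with synthetic data: the diode count, the ridge-regularized linear model, and the calibration setup are assumptions for exposition, not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical calibration data: each row holds one sample of reflected-light
# intensities from m photodiodes ringed around the VR lens; targets are unit
# 3D gaze vectors collected during a calibration sequence.
rng = np.random.default_rng(0)
n_samples, n_diodes = 200, 16
X = rng.random((n_samples, n_diodes))            # photodiode readings
Y = rng.normal(size=(n_samples, 3))
Y /= np.linalg.norm(Y, axis=1, keepdims=True)    # ground-truth gaze vectors

# Ridge regression: W = (X^T X + lam*I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_diodes), X.T @ Y)

def infer_gaze(readings: np.ndarray) -> np.ndarray:
    """Map one photodiode reading vector to a unit 3D gaze vector."""
    g = readings @ W
    return g / np.linalg.norm(g)

gaze = infer_gaze(X[0])   # 3D unit gaze estimate for one sample
```

A closed-form linear fit like this keeps inference to a single matrix-vector product per frame, which is consistent with the sub-milliwatt compute budget the paper reports.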
Pages: 14