Ultra-Low Power Gaze Tracking for Virtual Reality

Cited: 12
Authors
Li, Tianxing [1 ]
Liu, Qiang [1 ]
Zhou, Xia [1 ]
Affiliations
[1] Dartmouth Coll, Dept Comp Sci, Hanover, NH 03755 USA
Source
PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17) | 2017
Funding
US National Science Foundation;
Keywords
Gaze tracking; virtual reality; visible light sensing; EYE-MOVEMENT; TIME EYE; MODEL;
DOI
10.1145/3131672.3131682
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Tracking user's eye fixation direction is crucial to virtual reality (VR): it eases user's interaction with the virtual scene and enables intelligent rendering to improve user's visual experiences and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting pupil's light absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791 μW in total and thus can be completely powered by a credit-card sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.
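The abstract describes mapping reflected-light readings from a handful of photodiodes around the lens to a 3D gaze vector via a lightweight regression. A minimal sketch of that idea, assuming a simple linear ridge model trained on synthetic calibration data (the paper's actual regressor, sensor count, and calibration procedure may differ; all shapes and numbers below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

N_DIODES = 8      # hypothetical: photodiodes ringing one VR lens
N_SAMPLES = 500   # hypothetical: per-user calibration frames

# Synthetic calibration data: photodiode intensity vectors -> 3D gaze vectors.
X = rng.random((N_SAMPLES, N_DIODES))
true_W = rng.normal(size=(N_DIODES, 3))
y = X @ true_W + 0.01 * rng.normal(size=(N_SAMPLES, 3))

# Ridge regression in closed form: W = (X^T X + lam*I)^-1 X^T y.
# Cheap enough to fit and evaluate on a microcontroller-class budget.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(N_DIODES), X.T @ y)

def infer_gaze(readings: np.ndarray) -> np.ndarray:
    """Infer a unit-length 3D gaze vector from one frame of photodiode readings."""
    g = readings @ W
    return g / np.linalg.norm(g)

gaze = infer_gaze(X[0])
print(gaze.shape)  # (3,)
```

At runtime only the matrix-vector product `readings @ W` and a normalization are needed per frame, which is consistent with the sub-milliwatt computation budget the abstract reports.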
Pages: 14
Related Papers
50 records
  • [1] Ultra-Low Power Gaze Tracking for Virtual Reality
    Li, Tianxing
    Akosah, Emmanuel S.
    Liu, Qiang
    Zhou, Xia
    PROCEEDINGS OF THE 15TH ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS (SENSYS'17), 2017,
  • [2] Ultra-Low Power Gaze Tracking for Virtual Reality
    Li, Tianxing
    Akosah, Emmanuel S.
    Liu, Qiang
    Zhou, Xia
    PROCEEDINGS OF THE 23RD ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING (MOBICOM '17), 2017, : 490 - 492
  • [3] Gaze Tracking for Eye-Hand Coordination Training Systems in Virtual Reality
    Mutasim, Aunnoy K.
    Stuerzlinger, Wolfgang
    Batmaz, Anil Ufuk
    CHI'20: EXTENDED ABSTRACTS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2020,
  • [4] Towards Mitigating the Eye Gaze Tracking Uncertainty in Virtual Reality
    Ryabinin, Konstantin
    Chuprina, Svetlana
    COMPUTATIONAL SCIENCE, ICCS 2022, PT IV, 2022, : 623 - 636
  • [5] Perceptual self-position estimation based on gaze tracking in virtual reality
    Liu, Hongmei
    Qin, Huabiao
    VIRTUAL REALITY, 2022, 26 (01) : 269 - 278
  • [6] Eye Tracking in Virtual Reality
    Clay, Viviane
    Koenig, Peter
    Koenig, Sabine
    JOURNAL OF EYE MOVEMENT RESEARCH, 2019, 12 (01):
  • [8] Gaze-based Kinaesthetic Interaction for Virtual Reality
    Li, Zhenxing
    Akkil, Deepak
    Raisamo, Roope
    INTERACTING WITH COMPUTERS, 2020, 32 (01) : 17 - 32
  • [9] Gaze Analysis and Prediction in Virtual Reality
    Hu, Zhiming
    2020 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES WORKSHOPS (VRW 2020), 2020, : 543 - 544
  • [10] Gaze Behavior in Social Fear Conditioning: An Eye-Tracking Study in Virtual Reality
    Reichenberger, Jonas
    Pfaller, Michael
    Muehlberger, Andreas
    FRONTIERS IN PSYCHOLOGY, 2020, 11