Eye tracking for public displays in the wild

Cited by: 27
Authors
Zhang, Yanxia [1 ]
Chong, Ming Ki [1 ]
Muller, Jorg [2 ]
Bulling, Andreas [3 ]
Gellersen, Hans [1 ]
Affiliations
[1] Univ Lancaster, Lancaster, England
[2] Aarhus Univ, Aarhus, Denmark
[3] Max Planck Inst Informat, D-66123 Saarbrücken, Germany
Keywords
Eye tracking; Gaze interaction; Public displays; Scrolling; Calibration-free; In-the-wild study; Deployment; Gaze tracking
DOI
10.1007/s00779-015-0866-8
CLC Classification Number
TP [automation technology, computer technology]
Discipline Classification Code
0812
Abstract
In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready, and a natural indicator of the user's interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of the horizontal gaze direction and maps this input to rate-controlled navigation of horizontally arranged content. We have evaluated GazeHorizon through a series of field studies, culminating in a 4-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We realised that since eye movements are subtle, users cannot learn gaze interaction from only observing others and as a result guidance is required.
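The abstract describes the core technique: a single off-the-shelf web camera estimates the horizontal gaze direction, and that estimate drives rate-controlled navigation of horizontally arranged content. The sketch below is not the authors' implementation; it only illustrates one plausible form of such a mapping in Python. The normalised gaze input, dead zone and maximum speed are illustrative assumptions, not values reported in the paper.

    # Minimal sketch (assumed, not from the paper): mapping a person-independent
    # horizontal gaze estimate in [-1, 1] to rate-controlled scrolling of
    # horizontally arranged content.

    DEAD_ZONE = 0.2      # |gaze| below this keeps the content still (assumed value)
    MAX_SPEED = 600.0    # pixels per second at full left/right gaze (assumed value)


    def scroll_velocity(gaze_x: float) -> float:
        """Map a horizontal gaze estimate (-1 = far left, +1 = far right)
        to a scrolling velocity in pixels per second (rate control)."""
        if abs(gaze_x) < DEAD_ZONE:
            return 0.0
        # Rescale the range outside the dead zone so speed ramps up smoothly.
        sign = 1.0 if gaze_x > 0 else -1.0
        magnitude = (abs(gaze_x) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
        return sign * magnitude * MAX_SPEED


    def update_offset(offset: float, gaze_x: float, dt: float) -> float:
        """Advance the horizontal scroll offset by one frame of rate-controlled input."""
        return offset + scroll_velocity(gaze_x) * dt


    if __name__ == "__main__":
        # Simulated gaze samples at ~30 fps: the user glances right, then back to centre.
        samples = [0.0, 0.1, 0.5, 0.9, 0.9, 0.4, 0.1, 0.0]
        offset, dt = 0.0, 1.0 / 30.0
        for g in samples:
            offset = update_offset(offset, g, dt)
            print(f"gaze={g:+.1f}  offset={offset:7.2f}px")

In a rate-controlled mapping of this kind, looking further to one side scrolls faster, while gaze near the centre leaves the content still, which fits the walk-up-and-use, calibration-free interaction the abstract describes.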
Pages: 967-981
Page count: 15
Related papers
50 records in total
  • [1] Eye tracking for public displays in the wild
    Yanxia Zhang
    Ming Ki Chong
    Jörg Müller
    Andreas Bulling
    Hans Gellersen
    Personal and Ubiquitous Computing, 2015, 19 : 967 - 981
  • [2] Eye tracking in the wild
    Hansen, DW
    Pece, AEC
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2005, 98 (01) : 155 - 181
  • [3] GazeLabel: A Cost-free Data Labeling System with Public Displays using Eye-tracking
    Liu, Zhiyuan
    Qiao, Feitong
    Long, Haotian
    Li, Guang
    SENSYS'18: PROCEEDINGS OF THE 16TH CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS, 2018, : 343 - 344
  • [4] Video-based eye tracking for autostereoscopic displays
    Chen, YS
    Su, CH
    Chen, JH
    Chen, CS
    Hung, YP
    Fuh, CS
    OPTICAL ENGINEERING, 2001, 40 (12) : 2726 - 2734
  • [5] Display Blindness? Looking Again at the Visibility of Situated Displays using Eye Tracking
    Dalton, Nicholas S.
    Collins, Emily
    Marshall, Paul
    CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2015, : 3889 - 3898
  • [6] EyeVote in the Wild: Do Users bother Correcting System Errors on Public Displays?
    Khamis, Mohamed
    Trotter, Ludwig
    Tessmann, Markus
    Dannhart, Christina
    Bulling, Andreas
    Alt, Florian
    15TH INTERNATIONAL CONFERENCE ON MOBILE AND UBIQUITOUS MULTIMEDIA (MUM 2016), 2016, : 57 - 62
  • [7] Controlling In-the-Wild Evaluation Studies of Public Displays
    Claes, Sandy
    Wouters, Niels
    Slegers, Karin
    Moere, Andrew Vande
    CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, 2015, : 81 - 84
  • [8] Moving parallax barrier design for eye-tracking autostereoscopic displays
    Yi, Sang-Yi
    Chae, Ho-Byung
    Lee, Seung-Hyun
    2008 3DTV-CONFERENCE: THE TRUE VISION - CAPTURE, TRANSMISSION AND DISPLAY OF 3D VIDEO, 2008, : 145 - 148
  • [9] Eye Tracking Evaluation of User Experience on Large-Scale Displays
    Schall, Andrew
    UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION: ACCESS TO TODAY'S TECHNOLOGIES, PT I, 2015, 9175 : 98 - 108
  • [10] ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays
    Kapp, Sebastian
    Barz, Michael
    Mukhametov, Sergey
    Sonntag, Daniel
    Kuhn, Jochen
    SENSORS, 2021, 21 (06)