Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR

Cited: 0
|
Authors
Ren, Yunlei [1 ]
Zhang, Yan [1 ]
Liu, Zhitao [1 ]
Xie, Ning [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Ctr Future Media, Sch Comp Sci & Engn, Chengdu, Peoples R China
Funding
National Key R&D Program of China;
Keywords
Keyboards; Production facilities; Bayes methods; Task analysis; Performance evaluation; Prediction algorithms; Visualization; Augmented reality; text entry; multi-modal interaction; eye-hand coordination; Bayesian process; Fitts' law; primacy effect; TEXT ENTRY; COORDINATION; INPUT;
DOI
10.1109/TVCG.2024.3372106
CLC Number
TP31 [Computer Software];
Discipline Code
081202; 0835;
Abstract
Nowadays, AR HMDs are widely used in scenarios such as intelligent manufacturing and digital factories. In a factory environment, fast and accurate text input is crucial for operators' efficiency and task-completion quality. However, traditional AR keyboards may not meet this requirement, and noisy environments are unsuitable for voice input. In this article, we introduce Eye-Hand Typing, an intelligent AR keyboard. We leverage the speed advantage of eye gaze and use a Bayesian process driven by gaze-point information to infer users' text input intentions. We improve the underlying keyboard algorithm without changing user input habits, thereby increasing factory users' text input speed and accuracy. In real-time use, while the user's gaze point is on the keyboard, the Bayesian process predicts the characters, vocabulary, or commands the user is most likely to input next, based on the position and duration of the gaze point and the input history. The system then enlarges and highlights the recommended input options, improving input efficiency. A user study showed that, compared with the current HoloLens 2 system keyboard, Eye-Hand Typing reduced input error rates by 28.31% and improved text input speed by 14.5%. It also outperformed a gaze-only technique, being 43.05% more accurate and 39.55% faster, with no significant increase in eye fatigue. Users also reported positive preferences.
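The abstract describes a Bayesian process that combines gaze position, dwell duration, and input history to rank likely key presses. The paper's actual model is not reproduced here; the sketch below is a minimal illustration of that general idea, assuming a Gaussian gaze-position likelihood around hypothetical key centers and an illustrative history-based prior (all names, key coordinates, and probability values are invented for the example).

```python
import math

# Hypothetical key centers on a virtual AR keyboard (normalized units).
KEY_CENTERS = {"a": (0.0, 0.0), "s": (1.0, 0.0), "d": (2.0, 0.0)}

# Assumed prior from input history (e.g., a character-level language
# model); these values are illustrative, not from the paper.
HISTORY_PRIOR = {"a": 0.5, "s": 0.3, "d": 0.2}

def gaussian_likelihood(gaze, center, sigma):
    """P(gaze | key): isotropic 2D Gaussian around the key center."""
    dx, dy = gaze[0] - center[0], gaze[1] - center[1]
    return math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))

def posterior_over_keys(gaze, dwell, base_sigma=0.7, tau=0.3):
    """P(key | gaze, history) proportional to P(gaze | key) * prior(key).

    As one plausible way to use dwell time, a longer fixation narrows
    the likelihood (higher confidence in the gaze position).
    """
    sigma = base_sigma / (1.0 + dwell / tau)
    scores = {k: gaussian_likelihood(gaze, c, sigma) * HISTORY_PRIOR[k]
              for k, c in KEY_CENTERS.items()}
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

# Gaze lands near key "s" after a 250 ms fixation.
posterior = posterior_over_keys(gaze=(0.9, 0.1), dwell=0.25)
best = max(posterior, key=posterior.get)  # candidate to enlarge/highlight
```

The posterior's top-ranked key is what a system like the one described would enlarge and highlight; the dwell-dependent sigma is just one assumption for how fixation duration could sharpen the prediction.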
Pages: 2496 - 2506
Page count: 11
Related Papers
10 entries
  • [1] How We Type: Eye and Finger Movement Strategies in Mobile Typing
    Jiang, Xinhui
    Li, Yang
    Jokinen, Jussi P. P.
    Hirvola, Viet Ba
    Oulasvirta, Antti
    Ren, Xiangshi
    PROCEEDINGS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'20), 2020,
  • [2] Interaction between the premotor processes of eye and hand movements: Possible mechanism underlying eye-hand coordination
    Hiraoka, Koichi
    Kurata, Naoatsu
    Sakaguchi, Masato
    Nonaka, Kengo
    Matsumoto, Naoto
    SOMATOSENSORY AND MOTOR RESEARCH, 2014, 31 (01) : 49 - 55
  • [3] Dynamic Bayesian Adjustment of Dwell Time for Faster Eye Typing
    Pi, Jimin
    Koljonen, Paul A.
    Hu, Yong
    Shi, Bertram E.
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2020, 28 (10) : 2315 - 2324
  • [4] Coordinating Eye-Hand Action via Partially-Observable Markov Decision Processes
    Cheng Yanyun
    Zhu Songhao
    Liang Zhiwei
    Fan Lili
    PROCEEDINGS OF THE 31ST CHINESE CONTROL CONFERENCE, 2012, : 3969 - 3973
  • [5] Coordinated Flexibility: How Initial Gaze Position Modulates Eye-Hand Coordination and Reaching
    Adam, Jos J.
    Buetti, Simona
    Kerzel, Dirk
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-HUMAN PERCEPTION AND PERFORMANCE, 2012, 38 (04) : 891 - 901
  • [6] Eye-hand re-coordination: A pilot investigation of gaze and reach biofeedback in chronic stroke
    Rizzo, John-Ross
    Beheshti, Mahya
    Shafieesabet, Azadeh
    Fung, James
    Hosseini, Maryam
    Rucker, Janet C.
    Snyder, Lawrence H.
    Hudson, Todd E.
    MATHEMATICAL MODELLING IN MOTOR NEUROSCIENCE: STATE OF THE ART AND TRANSLATION TO THE CLINIC. GAZE ORIENTING MECHANISMS AND DISEASE, 2019, 249 : 361 - 374
  • [7] Object Manipulation Method Using Eye Gaze and Hand-held Controller in AR Space
    Ishibashi, Ryo
    Kawaguchi, Ikkaku
    28TH ACM SYMPOSIUM ON VIRTUAL REALITY SOFTWARE AND TECHNOLOGY, VRST 2022, 2022,
  • [8] Seeing is believing: AR-assisted blind area assembly to support hand–eye coordination
    Shuo Feng
    Weiping He
    Shaohua Zhang
    Mark Billinghurst
    The International Journal of Advanced Manufacturing Technology, 2022, 119 : 8149 - 8158
  • [9] Seeing is believing: AR-assisted blind area assembly to support hand-eye coordination
    Feng, Shuo
    He, Weiping
    Zhang, Shaohua
    Billinghurst, Mark
    INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2022, 119 (11-12) : 8149 - 8158
  • [10] Automated Assessment of Eye-hand Coordination Skill using a Vertical Tracing Task on a Gaze-sensitive Human Computer Interaction Platform for children with Autism
    Rane D.
    Singh M.
    Lahiri U.
    Proceedings of the ACM on Human-Computer Interaction, 2024, 8 (ETRA)