Assessing the data quality of AdHawk MindLink eye-tracking glasses

Cited by: 4
Authors
Huang, Zehao [1]
Duan, Xiaoting [1]
Zhu, Gancheng [1]
Zhang, Shuai [1]
Wang, Rong [1]
Wang, Zhiguo [1]
Affiliations
[1] Zhejiang Univ, Ctr Psychol Sci, 148 Tianmushan Rd, Hangzhou 310028, Peoples R China
Keywords
Eye-tracking glasses; MEMS; Data quality; Accuracy; Precision; Gaze tracking
DOI
10.3758/s13428-023-02310-2
Chinese Library Classification (CLC)
B841 [Research methods in psychology]
Discipline classification code
040201
Abstract
Most commercially available eye-tracking devices rely on video cameras and image-processing algorithms to track gaze. However, emerging technologies are entering the field, making high-speed, cameraless eye tracking more accessible. In this study, a series of tests was conducted to compare the data quality of MEMS-based eye-tracking glasses (AdHawk MindLink) with that of three widely used camera-based eye-tracking devices (EyeLink Portable Duo, Tobii Pro Glasses 2, and SMI Eye Tracking Glasses 2). The data quality measures assessed in these tests included accuracy, precision, data loss, and system latency. The results suggest that, overall, the data quality of the eye-tracking glasses was lower than that of the desktop EyeLink Portable Duo eye tracker. Among the eye-tracking glasses, the accuracy and precision of the MindLink were higher than or on par with those of the Tobii Pro Glasses 2 and SMI Eye Tracking Glasses 2. The system latency of the MindLink was approximately 9 ms, significantly lower than that of the camera-based eye-tracking devices found in VR goggles. These results suggest that the MindLink eye-tracking glasses show promise for research applications where high sampling rates and low latency are preferred.
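The abstract refers to the standard data quality measures used to compare the trackers: accuracy (the angular offset between recorded gaze and a known target), precision (commonly reported as the root mean square of sample-to-sample displacements), and data loss (the proportion of invalid samples). The Python sketch below illustrates these conventional definitions on synthetic gaze samples; it is not the authors' analysis code, and the function names and simulated data are illustrative assumptions.

import numpy as np

# NOTE: illustrative helpers based on generic textbook definitions,
# not the analysis pipeline used in the paper.

def accuracy(gaze_deg, target_deg):
    # Mean angular offset (deg) between recorded gaze and the known target,
    # using a small-angle Euclidean approximation on (x, y) in visual degrees.
    offsets = np.hypot(gaze_deg[:, 0] - target_deg[0],
                       gaze_deg[:, 1] - target_deg[1])
    return np.nanmean(offsets)

def precision_rms_s2s(gaze_deg):
    # Root-mean-square of sample-to-sample angular displacements (deg).
    steps = np.diff(gaze_deg, axis=0)
    return np.sqrt(np.nanmean(np.sum(steps ** 2, axis=1)))

def data_loss(gaze_deg):
    # Proportion of samples flagged as invalid (NaN) by the tracker.
    return np.mean(np.any(np.isnan(gaze_deg), axis=1))

# Synthetic example: 100 gaze samples (deg) while fixating a target at (0, 0).
rng = np.random.default_rng(0)
gaze = rng.normal(loc=[0.4, -0.2], scale=0.1, size=(100, 2))

print(f"accuracy:  {accuracy(gaze, np.array([0.0, 0.0])):.3f} deg")
print(f"precision: {precision_rms_s2s(gaze):.3f} deg (RMS-S2S)")
print(f"data loss: {data_loss(gaze):.1%}")

System latency, the fourth measure in the abstract, cannot be derived from the gaze stream alone and is typically measured with external instrumentation, so it is omitted from this sketch.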
Pages: 5771-5787
Number of pages: 17