Eye Tracking, Usability, and User Experience: A Systematic Review

Cited by: 31
Authors
Novak, Jakub Stepan [1]
Masner, Jan [1]
Benda, Petr [1]
Simek, Pavel [1]
Merunka, Vojtech [2]
Affiliations
[1] Czech Univ Life Sci Prague, Dept Informat Technol, Prague, Czech Republic
[2] Czech Univ Life Sci Prague, Dept Informat Engn, Prague, Czech Republic
Keywords
Eye tracking; UX; machine learning; usability; EMOTION RECOGNITION; COMPLEXITY; MOVEMENTS; DECISIONS; MOUSE; MODEL; GAZE
DOI
10.1080/10447318.2023.2221600
Chinese Library Classification (CLC) Number
TP3 [Computing Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Usability and user experience (UX) are growing concerns not only in application development but in anything designed to be used by people. Evaluating UX is, by nature, highly subjective and time-consuming. This article focuses on eye tracking, usability, and user experience from a general point of view, with an emphasis on automatic data processing. In recent years, new technological approaches have emerged to quantify usability-testing data and improve process automation. Eye-tracking technology is an effective way to analyze users' interaction with a product, allowing researchers to uncover usability issues and even to leverage machine learning to recognize emotions linked to users' interactions. Existing research on these three main topics was explored methodically. For this review, 1,988 theme-related articles were retrieved in an extensive search; 144 were selected after meticulous screening, of which 90 were included in the systematic review. The outcomes reveal a significant shift toward more technologically advanced evaluation of user experience and usability across various domains. The review identifies several opportunities for future research and under-explored areas connecting user experience, eye tracking, and machine learning in products focused on identifying problem patterns.
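To make the kind of automatic data processing mentioned above concrete, the sketch below (illustrative only, not taken from the reviewed studies) derives simple fixation features from raw gaze samples using a dispersion-threshold (I-DT) detector; features like these are typical inputs for the machine-learning models that usability and emotion-recognition studies build on. The function name detect_fixations, the thresholds, and the synthetic gaze trace are all assumptions made for the demonstration.

```python
import numpy as np

def detect_fixations(x, y, t, dispersion_px=35.0, min_duration_s=0.10):
    """Group gaze samples into fixations with a dispersion-threshold (I-DT) rule.

    Returns a list of (start_time, duration, centroid_x, centroid_y) tuples.
    """
    fixations = []
    i, n = 0, len(t)

    def dispersion(a, b):
        # Dispersion of samples a..b (inclusive): (max x - min x) + (max y - min y).
        return ((x[a:b + 1].max() - x[a:b + 1].min())
                + (y[a:b + 1].max() - y[a:b + 1].min()))

    while i < n:
        # Start with the smallest window covering the minimum fixation duration.
        j = i
        while j < n and t[j] - t[i] < min_duration_s:
            j += 1
        if j >= n:
            break
        if dispersion(i, j) <= dispersion_px:
            # Grow the window while the samples stay spatially compact.
            while j + 1 < n and dispersion(i, j + 1) <= dispersion_px:
                j += 1
            fixations.append((t[i], t[j] - t[i],
                              float(x[i:j + 1].mean()), float(y[i:j + 1].mean())))
            i = j + 1
        else:
            # No fixation starts here; slide the window forward by one sample.
            i += 1
    return fixations

# Synthetic 60 Hz gaze trace: two dwell points separated by a saccade.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1 / 60)
x = np.where(t < 1.0, 400.0, 900.0) + rng.normal(0, 2, t.size)
y = 300.0 + rng.normal(0, 2, t.size)

fix = detect_fixations(x, y, t)
# Aggregate features of the kind fed to downstream classifiers.
features = {
    "fixation_count": len(fix),
    "mean_fixation_duration_s": float(np.mean([f[1] for f in fix])) if fix else 0.0,
}
print(features)
```

In practice, such per-recording (or per-area-of-interest) features, for example fixation count, mean fixation duration, and saccade amplitude, would be passed to a classifier to flag usability problems or infer user state.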
Pages: 4484-4500
Page count: 17
Cited References
105 in total
[1]   Towards Automatic Capturing of Traceability Links by Combining Eye Tracking and Interaction Data [J].
Ahrens, Maike .
2020 28TH IEEE INTERNATIONAL REQUIREMENTS ENGINEERING CONFERENCE (RE'20), 2020, :434-439
[2]   Convolutional Neural Network-Based Methods for Eye Gaze Estimation: A Survey [J].
Akinyelu, Andronicus A. ;
Blignaut, Pieter .
IEEE ACCESS, 2020, 8 :142581-142605
[3]  
Amershi S, 2007, 2007 INTERNATIONAL CONFERENCE ON INTELLIGENT USER INTERFACES, P72
[4]   How Reliably Do Eye Parameters Indicate Internal Versus External Attentional Focus? [J].
Annerer-Walcher, Sonja ;
Ceh, Simon M. ;
Putze, Felix ;
Kampen, Marvin ;
Korner, Christof ;
Benedek, Mathias .
COGNITIVE SCIENCE, 2021, 45 (04)
[5]   Comparative Study of User Experience Evaluation Techniques Based on Mouse and Gaze Tracking [J].
Aviz, Igor Leonardo ;
Souza, Kennedy Edson ;
Ribeiro, Elison ;
de Mello Junior, Harold ;
Seruffo, Marcos Cesar da R. .
WEBMEDIA 2019: PROCEEDINGS OF THE 25TH BRAZILIAN SYMPOSIUM ON MULTIMEDIA AND THE WEB, 2019, :53-56
[6]  
Barreto A., 2005, USER STRESS DETECTIO
[7]   Toward Real-Time System Adaptation Using Excitement Detection from Eye Tracking [J].
Ben Abdessalem, Hamdi ;
Chaouachi, Maher ;
Boukadida, Marwa ;
Frasson, Claude .
INTELLIGENT TUTORING SYSTEMS (ITS 2019), 2019, 11528 :214-223
[8]   Reconstructing User's Attention on the Web through Mouse Movements and Perception-Based Content Identification [J].
Boi, Paolo ;
Fenu, Gianni ;
Spano, Lucio Davide ;
Vargiu, Valentino .
ACM TRANSACTIONS ON APPLIED PERCEPTION, 2016, 13 (03)
[9]   On the necessity of adaptive eye movement classification in conditionally automated driving scenarios [J].
Braunagel, Christian ;
Geisler, David ;
Stolzmann, Wolfgang ;
Rosenstiel, Wolfgang ;
Kasneci, Enkelejda .
2016 ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS (ETRA 2016), 2016, :19-26
[10]   Learning Visual Importance for Graphic Designs and Data Visualizations [J].
Bylinskii, Zoya ;
Kim, Nam Wook ;
O'Donovan, Peter ;
Alsheikh, Sami ;
Madan, Spandan ;
Pfister, Hanspeter ;
Durand, Fredo ;
Russell, Bryan ;
Hertzmann, Aaron .
UIST'17: PROCEEDINGS OF THE 30TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY, 2017, :57-69