Human-Robot Interaction Based on Gaze Gestures for the Drone Teleoperation

Cited by: 0
Authors
Yu, Mingxin [1,2]
Lin, Yingzi [2]
Schmidt, David [2]
Wang, Xiangzhou [1]
Wang, Yu [1]
Affiliations
[1] Beijing Inst Technol, Sch Automat, Beijing 100081, Peoples R China
[2] Northeastern Univ, Intelligent Human Machine Syst Lab, Boston, MA 02115 USA
Source
JOURNAL OF EYE MOVEMENT RESEARCH | 2014, Vol. 7, No. 4
Funding
U.S. National Science Foundation
Keywords
Human-Robot Interaction; Teleoperation; Gaze Gestures; Object Selection; Gaze-controlled Interfaces; Eye Movements
DOI
Not available
CLC Number
R77 (Ophthalmology)
Discipline Code
100212
Abstract
Teleoperation is widely used to perform tasks in dangerous or unreachable environments by replacing humans with controlled agents, and human-robot interaction (HRI) is central to it. Conventional HRI input devices include the keyboard, mouse, and joystick. However, these devices are unsuitable for users with physical disabilities, and operating several of them simultaneously by hand also increases the mental workload of able-bodied users. This study therefore presents an HRI approach based on gaze tracking with an eye tracker. Object selection is of great importance and occurs at high frequency during HRI control, so this paper introduces gaze gestures as an object selection strategy for drone teleoperation. To validate the performance of the gaze-gesture selection strategy, we collected both objective and subjective measurements: drone control performance, comprising mean task completion time and mean error rate, served as the objective measurements, while an analysis of participant perception served as the subjective measurement. The results show that the gaze-gesture selection strategy has great potential as an additional HRI modality for agent teleoperation.
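One common way to implement gaze gestures of the kind the abstract describes is to divide the screen into regions and treat an ordered sequence of gaze regions as a gesture. The sketch below is illustrative only and not taken from the paper: the 3x3 grid, the screen resolution, and the example stroke pattern are all assumptions.

```python
# Minimal sketch (assumptions, not the paper's method): a gaze gesture is
# recognized as an ordered sequence of screen regions visited by the gaze.

GRID = 3                  # assumed: screen divided into a 3x3 grid of regions
WIDTH, HEIGHT = 1920, 1080  # assumed screen resolution in pixels

def region_of(x, y):
    """Map a gaze sample (pixel coordinates) to a grid-cell index 0..8."""
    col = min(int(x / WIDTH * GRID), GRID - 1)
    row = min(int(y / HEIGHT * GRID), GRID - 1)
    return row * GRID + col

def to_strokes(samples):
    """Collapse consecutive samples in the same region into one stroke."""
    strokes = []
    for x, y in samples:
        r = region_of(x, y)
        if not strokes or strokes[-1] != r:
            strokes.append(r)
    return strokes

def matches_gesture(samples, pattern):
    """True if the gaze trace visits exactly the given region sequence."""
    return to_strokes(samples) == list(pattern)

# Example: a left-to-right sweep along the middle row (regions 3, 4, 5)
trace = [(100, 540), (120, 560), (960, 540), (1800, 540)]
print(matches_gesture(trace, (3, 4, 5)))  # True for this trace
```

Because a gesture requires a deliberate multi-region sweep, this style of selection also sidesteps the Midas-touch problem that plagues dwell-based gaze interfaces, which is one reason gesture-based selection is attractive for continuous control tasks such as drone teleoperation.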
Pages: 14