Fast human-computer interaction by combining gaze pointing and face gestures

Cited by: 27
Authors
Rozado D. [1 ]
Niu J. [2 ]
Lochner M. [2 ]
Affiliations
[1] College of Enterprise and Development, Otago Polytechnic, Fourth Street, Dunedin
[2] CSIRO Digital Productivity Flagship, 15 College Road, Hobart
Keywords
Accessibility; Accessible computing; Assistive technologies; Face tracking; Gaze interaction; Human factors; Human factors and ergonomics; Human performance; Human-computer interaction;
DOI
10.1145/3075301
Abstract
In this work, we show how our open-source accessibility software, the FaceSwitch, can help motor-impaired users interact with a computer efficiently and hands-free. The FaceSwitch enhances gaze interaction with video-based face-gesture interaction. The resulting multimodal system allows interaction with a user interface by means of gaze pointing for target selection and facial gestures for target-specific action commands. The FaceSwitch maps facial gestures to specific mouse or keyboard events such as left mouse click, right mouse click, or page scroll down; facial gestures thus serve the purpose of mechanical switches. With this multimodal interaction paradigm, the user gazes at the object in the user interface with which they want to interact and then triggers a target-specific action by performing a face gesture. Through a rigorous user study, we obtained quantitative evidence suggesting that our proposed interaction paradigm improves on the performance of traditional accessibility options, such as gaze-only interaction or gaze combined with a single mechanical switch, while coming close to traditional mouse-based interaction in terms of speed and accuracy. We make the FaceSwitch software freely available to the community so the output of our research can help the target audience. © 2017 ACM.
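The core idea described in the abstract — binding detected face gestures to target-specific input events on the currently gazed object — can be sketched as follows. This is an illustrative sketch only, not the FaceSwitch implementation; the gesture names and event bindings are assumptions.

```python
# Illustrative sketch of the FaceSwitch idea: face gestures act as switches
# that trigger input events on whatever the user is currently gazing at.
# Gesture names and event bindings below are hypothetical, not from the paper's code.

GESTURE_EVENTS = {
    "mouth_open": "left_click",
    "eyebrow_raise": "right_click",
    "smile": "scroll_down",
}

def dispatch(gesture: str, gaze_target: str) -> str:
    """Resolve a detected face gesture into an action on the gazed-at target."""
    event = GESTURE_EVENTS.get(gesture)
    if event is None:
        # Unrecognized gestures are ignored rather than mis-triggering events.
        return f"ignored gesture {gesture!r}"
    return f"{event} on {gaze_target}"

print(dispatch("mouth_open", "OK button"))  # left_click on OK button
print(dispatch("wink", "OK button"))        # ignored gesture 'wink'
```

The dictionary-based dispatch mirrors the paper's mapping of gestures to mouse/keyboard events; a real system would feed `dispatch` from a video-based gesture classifier and a gaze tracker.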