A Kinect-based 3D hand-gesture interface for 3D databases

Cited by: 0
Authors
Raul Herrera-Acuña
Vasileios Argyriou
Sergio A. Velastin
Affiliations
[1] Kingston University, Digital Research Centre, Faculty of Science, Engineering and Computing
[2] Universidad de Santiago de Chile, Department of Informatics Engineering
Source
Journal on Multimodal User Interfaces | 2015, Volume 9
Keywords
Human-computer interaction; Hand gesture; Programming and developing interfaces; 3D data representation
Abstract
The use of natural interfaces significantly improves aspects of human-computer interaction and, consequently, productivity and overall performance. In this paper we present a novel framework for interacting with data elements presented in a 3D space. The system provides two interaction mechanisms, based on 2D and 3D gestures, that rely on data provided by the Kinect sensor together with hand-detection and gesture-interpretation algorithms. The proposed architecture is analysed, showing that 3D interaction with information is feasible and offers advantages over 2D interaction on the same problem. Finally, two sets of experiments were performed to evaluate the 2D and 3D interaction styles based on natural interfaces, focusing on traditional interaction with 3D databases.
Pages: 121-139
Page count: 18
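
As a rough illustration of the kind of processing a Kinect-based gesture pipeline like the one described in the abstract relies on, the sketch below shows a minimal, hypothetical depth-threshold hand segmentation on a Kinect-style depth frame. It is not the authors' implementation: the function names, the 150 mm depth band, and the assumption that the hand is the object closest to the sensor are all illustrative, and the example runs on a synthetic frame so it needs only NumPy.

```python
import numpy as np

# Minimal sketch (not the authors' method): segment the hand as the region
# closest to the sensor in a Kinect-style depth frame, then use its centroid
# as a 2D pointer. Thresholds and names are illustrative assumptions.

def segment_hand(depth_mm: np.ndarray, band_mm: int = 150) -> np.ndarray:
    """Boolean mask of pixels within `band_mm` of the nearest valid depth."""
    valid = depth_mm > 0                      # Kinect reports 0 for unknown depth
    nearest = depth_mm[valid].min()           # closest valid point, assumed to be the hand
    return valid & (depth_mm <= nearest + band_mm)

def hand_centroid(mask: np.ndarray):
    """Centroid of the hand mask in pixel coordinates, or None if empty."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

if __name__ == "__main__":
    # Synthetic 480x640 depth frame: background at 2 m, a 'hand' blob at 0.8 m.
    frame = np.full((480, 640), 2000, dtype=np.uint16)
    frame[200:280, 300:360] = 800
    mask = segment_hand(frame)
    print(hand_centroid(mask))                # approximately (329.5, 239.5)
```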