Real-time 3D Hand Gesture Based Mobile Interaction Interface

Cited by: 3
Authors
Che, Yunlong [1 ]
Song, Yuxiang [1 ]
Qi, Yue [1 ,2 ,3 ]
Affiliations
[1] State Key Lab Virtual Real Technol & Syst, Beijing, Peoples R China
[2] Peng Cheng Lab, Shenzhen, Peoples R China
[3] Beihang Univ, Qingdao Res Inst, Qingdao, Peoples R China
Source
ADJUNCT PROCEEDINGS OF THE 2019 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR-ADJUNCT 2019) | 2019
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
3D Hand pose estimation; Hand gesture recognition; human-mobile interaction; Augmented Reality; Interaction interface; Human-centered computing; Human computer interaction (HCI); Interaction techniques; Gestural input; Computing methodologies; Artificial intelligence; Computer vision; Computer vision problems;
DOI
10.1109/ISMAR-Adjunct.2019.00-41
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Hand gesture recognition is a challenging problem for natural human-computer interaction (HCI). We address this problem by introducing a real-time human-mobile interaction interface based on a depth sensor. Our interface consists of two components: 3D hand pose estimation and hand-skeleton-state-based gesture description. First, we propose a 3D hand pose estimation method that combines learning-based pose initialization with physics-based model fitting, estimating the hand pose in every frame in which the hand appears in the depth camera's field of view. We then map the estimated pose to a gesture, e.g., open or close, using a hand-skeleton-state-based method. With the tracked hand gesture, common operations such as 'Touch', 'Grasp', and 'Hold' can be performed stably and smoothly through a mid-air interface. Our main contribution is the combination of 3D hand pose estimation with hand gesture tracking, together with the detailed implementation of an interaction application system.
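To make the described pipeline concrete, the sketch below shows one plausible way a skeleton-state-based gesture layer could sit on top of a per-frame 3D pose estimate: per-finger flexion angles classify the hand as open or closed, and transitions of that state are turned into 'Grasp' and 'Hold' events. The joint naming scheme, thresholds, and event logic are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch (not taken from the paper): mapping a per-frame 3D hand
# skeleton to an open/close state and then to interaction events. Joint names,
# thresholds, and the event logic are illustrative assumptions.
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def finger_flexion(joints, finger):
    """Bend angle (radians) at one finger's middle joint; 0 = fully extended."""
    mcp, pip, tip = (joints[f"{finger}_{j}"] for j in ("mcp", "pip", "tip"))
    v1, v2 = pip - mcp, tip - pip
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def hand_state(joints, bend_thresh=1.0):
    """Label the skeleton 'close' if at least four fingers are strongly bent."""
    bent = sum(finger_flexion(joints, f) > bend_thresh for f in FINGERS)
    return "close" if bent >= 4 else "open"

class GestureTracker:
    """Turns the per-frame open/close sequence into 'Grasp' and 'Hold' events."""
    def __init__(self, hold_frames=30):
        self.prev, self.closed_for, self.hold_frames = "open", 0, hold_frames

    def update(self, state):
        event = "None"
        if self.prev == "open" and state == "close":
            event = "Grasp"          # open -> close transition
        elif state == "close":
            self.closed_for += 1
            if self.closed_for >= self.hold_frames:
                event = "Hold"       # closed long enough to count as a hold
        else:
            self.closed_for = 0      # hand reopened
        self.prev = state
        return event

# Per frame (hypothetical pose-estimator output: {joint_name: np.array([x, y, z])}):
#   state = hand_state(joints)
#   event = tracker.update(state)
# A 'Touch' event could additionally be raised when, e.g., the index fingertip
# crosses a virtual surface, but that depends on the application's scene setup.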
Pages: 228-232
Number of pages: 5
Related Papers
50 records in total
[31]   Real Time Hand Gesture Recognition for Human Computer Interaction [J].
Agrawal, Rishabh ;
Gupta, Nikita .
2016 IEEE 6TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTING (IACC), 2016, :470-475
[32]   Real-Time Hand Gesture Recognition: A Comprehensive Review of Techniques, Applications, and Challenges [J].
Mohamed, Aws Saood ;
Hassan, Nidaa Flaih ;
Jamil, Abeer Salim .
CYBERNETICS AND INFORMATION TECHNOLOGIES, 2024, 24 (03) :163-181
[33]   Real-time 3D video-based MR remote collaboration using gesture cues and virtual replicas [J].
Zhang, Xiangyu ;
Bai, Xiaoliang ;
Zhang, Shusheng ;
He, Weiping ;
Wang, Peng ;
Wang, Zhuo ;
Yan, Yuxiang ;
Yu, Quan .
INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2022, 121 (11-12) :7697-7719
[34]   A Real-time Hand Gesture Recognition Algorithm For an Embedded System [J].
You, Lei ;
Wang, Hongpeng ;
Tan, Dianxiong ;
Wangjue .
2014 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION (IEEE ICMA 2014), 2014, :901-905
[35]   Real-Time Hand Gesture Recognition using Motion Tracking [J].
Pun, Chi-Man ;
Zhu, Hong-Min ;
Feng, Wei .
International Journal of Computational Intelligence Systems, 2011, 4 (2) :277-286
[36]   Survey on 3D Hand Gesture Recognition [J].
Cheng, Hong ;
Yang, Lu ;
Liu, Zicheng .
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2016, 26 (09) :1659-1673
[37]   Real-time fingertip localization conditioned on hand gesture classification [J].
Suau, Xavier ;
Alcoverro, Marcel ;
Lopez-Mendez, Adolfo ;
Ruiz-Hidalgo, Javier ;
Casas, Josep R. .
IMAGE AND VISION COMPUTING, 2014, 32 (08) :522-532
[38]   Real-Time Hand Gesture Recognition using Motion Tracking [J].
Pun, Chi-Man ;
Zhu, Hong-Min .
INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2011, 4 (02) :277-286
[39]   Improving Real-Time Hand Gesture Recognition with Semantic Segmentation [J].
Benitez-Garcia, Gibran ;
Prudente-Tixteco, Lidia ;
Castro-Madrid, Luis Carlos ;
Toscano-Medina, Rocio ;
Olivares-Mercado, Jesus ;
Sanchez-Perez, Gabriel ;
Villalba, Luis Javier Garcia .
SENSORS, 2021, 21 (02) :1-16
[40]   A gesture control system for intuitive 3D interaction with virtual objects [J].
Manders, Corey ;
Farbiz, Farzam ;
Yin, Tang Ka ;
Yuan, Miaolong ;
Guan, Chua Gim .
COMPUTER ANIMATION AND VIRTUAL WORLDS, 2010, 21 (02) :117-129