User-Adaptable Hand Pose Estimation Technique for Human-Robot Interaction

Cited by: 1
Authors
Causo, Albert [1 ]
Ueda, Etsuko [3 ]
Takemura, Kentaro [2 ]
Matsumoto, Yoshio [4 ]
Takamatsu, Jun [2 ]
Ogasawara, Tsukasa [2 ]
Affiliations
[1] Nara Inst Sci & Technol NAIST, Robot Lab, Dept Informat Sci, 8916-5 Takayama Cho, Ikoma City, Nara 6300192, Japan
[2] Nara Inst Sci & Technol NAIST, Ikoma City, Nara 6300192, Japan
[3] Nara Sangyo Univ, Fac Informat, Sango, Nara 6368503, Japan
[4] Natl Inst Adv Ind Sci & Technol, Intelligent Syst Inst, Tsukuba, Ibaraki 3058568, Japan
Keywords
human-robot interaction; hand model calibration; vision-based hand pose estimation;
DOI
10.20965/jrm.2009.p0739
Chinese Library Classification
TP24 [Robotics];
Subject Classification Codes
080202; 1405;
Abstract
Hand pose estimation using a multi-camera system allows natural, non-contact interfacing, unlike bulky data gloves. To let any user operate the system regardless of gender or physical differences such as hand size, we propose individualizing the hand model using only the multiple cameras. From a calibration motion, our method estimates the finger link lengths as well as the hand shape by minimizing the gap between the hand model and the observation. We confirmed the feasibility of our proposal by comparing 1) actual and estimated link lengths and 2) hand pose estimation results obtained with our calibrated hand model, with a generic prior hand model, and with data glove measurements.
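To make the calibration idea in the abstract concrete, the following is a minimal sketch of fitting finger link lengths by minimizing the gap between model-predicted and observed fingertip positions over a calibration motion. The planar three-link finger model, the assumption of known joint angles per frame, and the use of scipy.optimize.least_squares are illustrative simplifications, not the authors' actual formulation.

# Illustrative sketch only: planar link-length calibration by least squares.
import numpy as np
from scipy.optimize import least_squares

def fingertip(link_lengths, joint_angles):
    # Forward kinematics of a planar 3-link finger (MCP, PIP, DIP joints).
    l1, l2, l3 = link_lengths
    t1, t2, t3 = joint_angles
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2) + l3 * np.cos(t1 + t2 + t3)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2) + l3 * np.sin(t1 + t2 + t3)
    return np.array([x, y])

def residuals(link_lengths, angle_frames, observed_tips):
    # Stack the model-vs-observation gaps over all calibration frames.
    preds = np.array([fingertip(link_lengths, a) for a in angle_frames])
    return (preds - observed_tips).ravel()

# Synthetic calibration motion: true lengths and a few noisy observed frames.
true_lengths = np.array([0.045, 0.025, 0.018])  # metres (hypothetical)
angle_frames = np.radians([[10, 20, 10], [30, 40, 20], [50, 60, 30], [70, 30, 15]])
observed_tips = np.array([fingertip(true_lengths, a) for a in angle_frames])
observed_tips += np.random.default_rng(0).normal(0.0, 1e-4, observed_tips.shape)

# Fit link lengths starting from a generic (prior) hand model.
fit = least_squares(residuals, x0=[0.04, 0.03, 0.02],
                    args=(angle_frames, observed_tips), bounds=(0.0, 0.1))
print("estimated link lengths:", fit.x)

In the same spirit, the paper's method would additionally estimate hand-shape parameters and use image observations from the multiple cameras rather than directly measured fingertip positions.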
Pages: 739 - 748
Page count: 10
Related Papers
50 records in total
  • [21] Finger identification and hand posture recognition for human-robot interaction
    Yin, Xiaoming
    Xie, Ming
    IMAGE AND VISION COMPUTING, 2007, 25 (08) : 1291 - 1300
  • [22] Implementation of Gaze Estimation in Dialogue to Human-Robot Interaction
    Somashekarappa, Vidya
    2022 10TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS, ACIIW, 2022,
  • [23] How to include User eXperience in the design of Human-Robot Interaction
    Prati, Elisa
    Peruzzini, Margherita
    Pellicciari, Marcello
    Raffaeli, Roberto
    ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2021, 68
  • [24] Multimodal QOL Estimation During Human-Robot Interaction
    Nakagawa, Satoshi
    Kuniyoshi, Yasuo
    2024 IEEE INTERNATIONAL CONFERENCE ON DIGITAL HEALTH, ICDH 2024, 2024, : 23 - 32
  • [25] Teaching a Robot to Draw: Hand Gesture Demonstration Based on Human-robot Interaction
    Yang, Limei
    Li, Zhihao
    Lei, Qujiang
    Xu, Jie
    Deng, Yunfu
    Zhong, Yuxin
    TWELFTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2019), 2020, 11433
  • [26] User feedback in human-robot interaction: Prosody, gaze and timing
    Skantze, Gabriel
    Oertel, Catharine
    Hjalmarsson, Anna
    14TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2013), VOLS 1-5, 2013, : 1900 - 1904
  • [27] Respiratory distress estimation in human-robot interaction scenario
    Alvarado, Eduardo
    Grageda, Nicolas
    Luzanto, Alejandro
    Mahu, Rodrigo
    Wuth, Jorge
    Mendoza, Laura
    Stern, Richard
    Yoma, Nestor Becerra
    INTERSPEECH 2023, 2023, : 1763 - 1767
  • [28] Human-robot interaction
    Murphy, R. R.
    Nomura, T.
    Billard, A.
    Burke, J. L.
    IEEE ROBOTICS AND AUTOMATION MAGAZINE, 2010, 17 (02) : 85 - 89
  • [29] Human-robot interaction
    Kosuge, K
    Hirata, Y
    IEEE ROBIO 2004: PROCEEDINGS OF THE IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS, 2004, : 8 - 11
  • [30] Real-time Hand Pose Tracking and Classification for Natural Human-Robot Control
    Lima, Bruno
    Junior, Givanildo L. N.
    Amaral, Lucas
    Vieira, Thales
    Ferreira, Bruno
    Vieira, Tiago
    PROCEEDINGS OF THE 14TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS (VISAPP), VOL 5, 2019, : 832 - 839