Human-Robot Interaction Using Three-Dimensional Gestures

Cited by: 1
Authors:
Ponmani, K. [1]
Sridharan, S. [1]
Affiliations:
[1] Easwari Engineering College, Dept. of ECE, Chennai, Tamil Nadu, India
Keywords:
Robotic arm; Leap Motion controller; Human-computer interaction; Three-dimensional pointing gesture recognition; Application programming interface (API); Internet of Things (IoT)
DOI:
10.1007/978-981-10-8575-8_8
CLC number:
TP3 [computing technology; computer technology]
Discipline code:
0812
Abstract:
In the future of manufacturing, robots are expected to replace most human workers to achieve faster production. Industry already relies on automated robotic arms that perform their assigned tasks with great speed and pinpoint accuracy. However, fields such as medicine, space research, and military robotics require robot arms to be controlled manually so that they can handle objects that human hands cannot. To approach the speed and accuracy of automated robots in such settings, this paper proposes an approach based on Leap Motion technology, a recent development in human-computer interaction. A real-time gesture recognition method based on three-dimensional pointing is designed for human-robot interaction using a Leap Motion controller. The system lets the robot interpret a human partner's 3D pointing gestures in real time: the robot arm captures the gestures and performs the corresponding task immediately. The speed of the robot is controlled by the position of the user's hand relative to the head, and the status of the robot's movement is displayed on an LCD. The goal of this work is to provide a practical real-time 3D pointing gesture recognition system for human-robot interaction.
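The abstract describes a concrete pipeline: a Leap Motion controller tracks the hand, 3D pointing gestures drive a robot arm, and hand position modulates the arm's speed. The paper does not include code, so the following is a minimal illustrative sketch in Python against the legacy Leap Motion SDK (Leap.Controller, Frame.hands, and Hand.palm_position are real API members); the serial port, the "x y z speed" command format, and the height-to-speed scaling are assumptions, and the paper's head-relative speed control is approximated here by palm height above the sensor.

import time
import Leap      # legacy Leap Motion Python SDK (v2.x, Python 2 era)
import serial    # pyserial; assumed transport to the arm's microcontroller

PORT = "/dev/ttyUSB0"   # hypothetical serial port of the arm controller
MAX_SPEED = 255         # hypothetical 8-bit speed range

def palm_to_speed(palm_y_mm, lo=100.0, hi=400.0):
    # Map palm height above the sensor (mm) to 0..MAX_SPEED.
    # The paper scales speed by hand position relative to the head;
    # height above the Leap controller is used as a stand-in here.
    frac = (palm_y_mm - lo) / (hi - lo)
    frac = min(max(frac, 0.0), 1.0)   # clamp to [0, 1]
    return int(frac * MAX_SPEED)

def main():
    controller = Leap.Controller()
    arm = serial.Serial(PORT, 9600, timeout=1)
    while True:
        frame = controller.frame()            # latest tracking frame
        if not frame.hands.is_empty:
            hand = frame.hands[0]
            pos = hand.palm_position          # Leap.Vector, millimetres
            speed = palm_to_speed(pos.y)
            # Hypothetical one-line command understood by the arm firmware.
            arm.write(("%d %d %d %d\n" % (pos.x, pos.y, pos.z, speed)).encode())
        time.sleep(0.05)                      # ~20 Hz update rate

if __name__ == "__main__":
    main()

In the actual system, the microcontroller on the arm side would parse such commands, drive the servos accordingly, and update the LCD status display mentioned in the abstract.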
Pages: 67-76
Number of pages: 10
Related papers (50 in total):
  • [1] Human-Robot Interaction Using Pointing Gestures
    Tolgyessy, Michal
    Dekan, Martin
    Hubinsky, Peter
    ISCSIC'18: PROCEEDINGS OF THE 2ND INTERNATIONAL SYMPOSIUM ON COMPUTER SCIENCE AND INTELLIGENT CONTROL, 2018
  • [2] Pantomimic Gestures for Human-Robot Interaction
    Burke, Michael
    Lasenby, Joan
    IEEE TRANSACTIONS ON ROBOTICS, 2015, 31 (05) : 1225 - 1237
  • [3] Conversational Gestures in Human-Robot Interaction
    Bremner, Paul
    Pipe, Anthony
    Melhuish, Chris
    Fraser, Mike
    Subramanian, Sriram
    2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 1645+
  • [4] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    AI & SOCIETY, 2010, 25 (02) : 155 - 168
  • [5] Integration of Gestures and Speech in Human-Robot Interaction
    Meena, Raveesh
    Jokinen, Kristiina
    Wilcock, Graham
    3RD IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFOCOMMUNICATIONS (COGINFOCOM 2012), 2012, : 673 - 678
  • [6] Pointing Gestures for Human-Robot Interaction with the Humanoid Robot Digit
    Lorentz, Viktor
    Weiss, Manuel
    Hildebrand, Kristian
    Boblan, Ivo
    2023 32ND IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, RO-MAN, 2023, : 1886 - 1892
  • [7] An Underwater Human-Robot Interaction Using Hand Gestures for Fuzzy Control
    Jiang, Yu
    Peng, Xianglong
    Xue, Mingzhu
    Wang, Chong
    Qi, Hong
    INTERNATIONAL JOURNAL OF FUZZY SYSTEMS, 2021, 23 (06) : 1879 - 1889
  • [8] Visual recognition of pointing gestures for human-robot interaction
    Nickel, Kai
    Stiefelhagen, Rainer
    IMAGE AND VISION COMPUTING, 2007, 25 (12) : 1875 - 1884
  • [9] Recognizing Touch Gestures for Social Human-Robot Interaction
    Altuglu, Tugce Balli
    Altun, Kerem
    ICMI'15: PROCEEDINGS OF THE 2015 ACM INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, 2015, : 407 - 413
  • [10] Understanding and learning of gestures through human-robot interaction
    Kuno, Y
    Murashima, T
    Shimada, N
    Shirai, Y
    2000 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2000), VOLS 1-3, PROCEEDINGS, 2000, : 2133 - 2138