Human-Robot Interaction Using Three-Dimensional Gestures

Cited by: 1
|
Authors
Ponmani, K. [1 ]
Sridharan, S. [1 ]
Affiliations
[1] Easwari Engn Coll, Dept ECE, Chennai, Tamil Nadu, India
Keywords
Robotic arm; Leap motion controller; Human-computer interaction; Three-dimensional pointing gesture recognition; Application programming interface (API); Internet of Things (IoT);
DOI
10.1007/978-981-10-8575-8_8
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In the future of manufacturing industries, robots are expected to replace most human workers to achieve faster production. Industrial settings already use automated robotic arms that carry out assigned tasks with remarkable speed and pinpoint accuracy. However, areas such as medicine, space research, and military robotics require robot arms that are manually controlled to operate on objects that cannot be handled by human hands. In order to achieve speed and accuracy comparable to automated robots, a new approach based on leap motion technology, a recent development in the human-computer interaction area, is proposed. Three-dimensional gestures are used, and a 3D pointing gesture recognition method is devised for real-time human-robot interaction using the Leap Motion controller. The controller enables the robot to interpret the human's 3D pointing gestures in real time: the robot arm captures the gestures shown by the human partner and performs the corresponding task immediately. The position of the human's hand relative to the head is used to control the speed of the robot, and the status of the robot's movement can be viewed on an LCD. This proposal aims to develop a practical real-time 3D pointing gesture recognition system for human-robot interaction.
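The abstract does not give implementation details, but as a rough sketch of the control loop it describes, the Python snippet below polls a palm position through the older Leap Motion SDK v2 Python bindings (Leap.Controller, frame.hands, palm_position) and maps it to a hypothetical robot-arm command. The scale and palm_to_command helpers, the interpolation ranges, the assumed head-height reference used for speed control, and the printed status line standing in for the LCD output are all assumptions, not the authors' method.

import time

import Leap  # Leap Motion SDK v2 Python bindings (assumed available)


def scale(value, in_min, in_max, out_min, out_max):
    # Linearly map value from [in_min, in_max] to [out_min, out_max], clamped.
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / float(in_max - in_min)


def palm_to_command(palm, head_height_mm=400.0):
    # Convert a palm position (millimetres, Leap coordinate frame) into
    # hypothetical servo angles and a speed value; all ranges are assumptions.
    base_angle = scale(palm.x, -200, 200, 0, 180)   # left/right sweep -> base rotation
    reach = scale(palm.z, -250, 100, 0, 180)        # forward/back -> arm extension
    # Hand height relative to an assumed head height scales the movement speed.
    speed = scale(palm.y - head_height_mm, -300.0, 0.0, 20, 100)
    return {"base": int(base_angle), "reach": int(reach), "speed": int(speed)}


def main():
    controller = Leap.Controller()
    while True:
        frame = controller.frame()
        if not frame.hands.is_empty:
            cmd = palm_to_command(frame.hands[0].palm_position)
            # A real system would forward cmd to the arm controller (e.g. over
            # serial) and echo the same status string to the LCD; printed here.
            print("STATUS base=%(base)d reach=%(reach)d speed=%(speed)d" % cmd)
        time.sleep(0.05)  # poll at roughly 20 Hz


if __name__ == "__main__":
    main()

Recognition of the pointing gesture itself (as opposed to raw position mapping) would sit on top of this loop; it is omitted because the paper's recognition method is not detailed in the abstract.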
Pages: 67-76
Number of pages: 10
Related papers
50 records in total
  • [41] Human-Robot Interaction
    Sethumadhavan, Arathi
    ERGONOMICS IN DESIGN, 2012, 20 (03): 27+
  • [42] Human-robot interaction
    Murphy R.R.
    Nomura T.
    Billard A.
    Burke J.L.
    IEEE Robotics and Automation Magazine, 2010, 17 (02): 85 - 89
  • [43] Human-robot interaction
    Kosuge, K
    Hirata, Y
    IEEE ROBIO 2004: PROCEEDINGS OF THE IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS, 2004, : 8 - 11
  • [44] Human-robot interaction
    Sidobre, Daniel
    Broquère, Xavier
    Mainprice, Jim
    Burattini, Ernesto
    Finzi, Alberto
    Rossi, Silvia
    Staffa, Mariacarla
    Springer Tracts in Advanced Robotics, 2012, 80 (STAR): 123 - 172
  • [45] Human-Robot Interaction
    Ivaldi, Serena
    Pateraki, Maria
    ERCIM NEWS, 2018, (114): 6 - 7
  • [46] Human Robot Interaction Using Dynamic Hand Gestures
    Zafar, Zuhair
    Salazar, Daniel Alejandro
    Al-Darraji, Salah
    Urukalo, Djordje
    Berns, Karsten
    Rodic, Aleksandar
    ADVANCES IN SERVICE AND INDUSTRIAL ROBOTICS, 2018, 49 : 649 - 656
  • [47] Using Proxemics to Evaluate Human-Robot Interaction
    Feil-Seifer, David
    Mataric, Maja
    PROCEEDINGS OF THE 5TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI 2010), 2010, : 143 - 144
  • [48] Human-Robot Interaction using Mixed Reality
    Gallala, Abir
    Hichri, Bassem
    Plapper, Peter
    INTERNATIONAL CONFERENCE ON ELECTRICAL, COMPUTER AND ENERGY TECHNOLOGIES (ICECET 2021), 2021, : 1387 - 1392
  • [49] Emotion Recognition in Human-Robot Interaction Using the NAO Robot
    Valagkouti, Iro Athina
    Troussas, Christos
    Krouska, Akrivi
    Feidakis, Michalis
    Sgouropoulou, Cleo
    COMPUTERS, 2022, 11 (05)
  • [50] Robot Trajectory Modification Using Human-Robot Force Interaction
    Liu, Hung-Shen
    Song, Kai-Tai
    2017 56TH ANNUAL CONFERENCE OF THE SOCIETY OF INSTRUMENT AND CONTROL ENGINEERS OF JAPAN (SICE), 2017, : 1417 - 1420