Development of a 3D Relative Motion Method for Human-Robot Interaction Assessment

Cited by: 5
Authors
Ballen-Moreno, Felipe [1 ]
Bautista, Margarita [2 ]
Provot, Thomas [3 ,4 ]
Bourgain, Maxime [3 ,4 ]
Cifuentes, Carlos A. [5 ]
Munera, Marcela [2 ]
Affiliations
[1] Vrije Univ Brussel, Dept Mech Engn, Robot & Multibody Mech R&MM Res Grp, B-1050 Brussels, Belgium
[2] Colombian Sch Engn Julio Garavito, Dept Biomed Engn, Bogota 111166, Colombia
[3] EPF Grad Sch Engn, F-92330 Sceaux, France
[4] Arts & Metiers Inst Technol, Inst Biomecan Humaine Georges Charpak, F-75013 Paris, France
[5] Univ Rosario, Sch Engn Sci & Technol, Bogota 111711, Colombia
Keywords
exoskeleton; human-robot interaction; relative motion
DOI
10.3390/s22062411
Chinese Library Classification
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
Exoskeletons have been assessed through qualitative and quantitative features known as performance indicators. Among these, ergonomic indicators have typically been studied in isolation, leaving a lack of methodologies to analyze and assess physical interfaces. To address this gap, this work presents a three-dimensional relative motion assessment method. The method quantifies the difference in orientation between the user's limb and the exoskeleton link, providing a deeper understanding of the human-robot interaction. To this end, the AGoRA exoskeleton was configured in a resistive mode and assessed using an optoelectronic system. The assessment quantified a considerable difference in orientation, reaching a maximum value of 41.1 degrees in the sagittal plane, and extended the understanding of the human-robot interaction across the three principal human planes. Furthermore, the proposed method establishes a performance indicator for the physical interfaces of an exoskeleton.
Pages: 17
Related Papers
50 records total
  • [1] A Robot Navigation Method Based on Human-Robot Interaction for 3D Environment Mapping
    Zhao, Lijun
    Li, Xiaoyu
    Sun, Zhenye
    Wang, Ke
    Yang, Chenguang
    2017 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (RCAR), 2017, : 409 - 414
  • [2] Realtime 3D Segmentation for Human-Robot Interaction
    Ueckermann, Andre
    Haschke, Robert
    Ritter, Helge
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013, : 2136 - 2143
  • [3] 3D Gesture Recognition and Adaptation for Human-Robot Interaction
    Al Mahmud, Jubayer
    Das, Bandhan Chandra
    Shin, Jungpil
    Hasib, Khan Md
    Sadik, Rifat
    Mridha, M. F.
    IEEE ACCESS, 2022, 10 : 116485 - 116513
  • [4] 3D Pointing Gesture Recognition for Human-Robot Interaction
    Lai, Yuhui
    Wang, Chen
    Li, Yanan
    Ge, Shuzhi Sam
    Huang, Deqing
    PROCEEDINGS OF THE 28TH CHINESE CONTROL AND DECISION CONFERENCE (2016 CCDC), 2016, : 4959 - 4964
  • [5] Safe Human-Robot Interaction using 3D Sensor
    Graf, Juergen
    Woern, Heinz
    AUTOMATION 2009, 2009, 2067 : 445 - 447
  • [6] Pointing and Commanding Gesture Recognition in 3D for Human-Robot Interaction
    Rahman, Abid
    Al Mahmud, Jubayer
    Hasanuzzaman, Md.
    2018 INTERNATIONAL CONFERENCE ON INNOVATION IN ENGINEERING AND TECHNOLOGY (ICIET), 2018,
  • [7] 3D Printed Soft Skin for Safe Human-Robot Interaction
    Kim, Joohyung
    Alspach, Alexander
    Yamane, Katsu
    2015 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2015, : 2419 - 2425
  • [8] Human-Robot Interaction through 3D Vision and Force Control
    Jevtic, Aleksandar
    Doisy, Guillaume
    Bodiroza, Sasa
    Edan, Yael
    Hafner, Verena V.
    HRI'14: PROCEEDINGS OF THE 2014 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2014, : 102 - 102
  • [9] Detecting and tracking of 3D face pose for human-robot interaction
    Dornaika, Fadi
    Raducanu, Bogdan
    2008 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-9, 2008, : 1716 - +
  • [10] Human-Robot Co Working by HMM Based 3D Human Motion Recognition
    Pehlivan, Alp Burak
    Oztop, Erhan
    2014 22ND SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2014, : 1547 - 1550