Human-Robot Interaction Using Learning from Demonstrations and a Wearable Glove with Multiple Sensors

Cited by: 3
Authors
Singh, Rajmeet [1 ]
Mozaffari, Saeed [1 ]
Akhshik, Masoud [1 ]
Ahamed, Mohammed Jalal [1 ]
Rondeau-Gagne, Simon [2 ]
Alirezaee, Shahpour [1 ]
Affiliations
[1] Univ Windsor, Mech Automot & Mat Engn Dept, Windsor, ON N9B 3P4, Canada
[2] Univ Windsor, Dept Chem & Biochem, Windsor, ON N9B 3P4, Canada
Keywords
robotic grasping; human-robot interaction; inertia; pressure; flexi sensors; wearable devices; learning from demonstration;
DOI
10.3390/s23249780
CLC Classification Number
O65 [Analytical Chemistry];
Subject Classification Code
070302; 081704;
Abstract
Human-robot interaction is of the utmost importance as it enables seamless collaboration and communication between humans and robots, leading to enhanced productivity and efficiency. It involves gathering data from humans, transmitting the data to a robot for execution, and providing feedback to the human. To perform complex tasks, such as robotic grasping and manipulation, which require both human intelligence and robotic capabilities, effective interaction modes are required. To address this issue, we use a wearable glove to collect relevant data from a human demonstrator for improved human-robot interaction. Accelerometer, pressure, and flexi sensors were embedded in the wearable glove to measure motion and force information for handling objects of different sizes, materials, and conditions. A machine learning algorithm is proposed to recognize grasp orientation and position, based on the multi-sensor fusion method.
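The abstract describes feature-level fusion of accelerometer, pressure, and flexi-sensor readings feeding a learned grasp classifier. Below is a minimal sketch of that idea, assuming a hypothetical channel layout (3 accelerometer axes, 5 pressure readings, 5 bend values) and a generic scikit-learn RandomForestClassifier trained on synthetic data; the feature layout and classifier choice are illustrative assumptions, not the algorithm proposed in the paper.

# Minimal illustrative sketch of multi-sensor fusion for grasp classification.
# Assumptions (not from the paper): each glove sample concatenates 3 accelerometer
# axes, 5 fingertip pressure readings, and 5 flex-sensor bend values into one
# 13-dimensional feature vector; a RandomForest stands in for the (unspecified
# here) learning-from-demonstration pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def fuse_sample(accel, pressure, flex):
    """Feature-level fusion: concatenate the raw sensor channels."""
    return np.concatenate([accel, pressure, flex])

# Synthetic data standing in for recorded glove demonstrations.
rng = np.random.default_rng(0)
n_demos = 200
X = np.stack([
    fuse_sample(rng.normal(size=3), rng.random(5), rng.random(5))
    for _ in range(n_demos)
])
y = rng.integers(0, 4, size=n_demos)  # e.g. four grasp orientation classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))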
Pages: 17
Related Papers
50 records in total
  • [1] Wearable Sensors for Human-Robot Walking Together
    Moschetti, Alessandra
    Cavallo, Filippo
    Esposito, Dario
    Penders, Jacques
    Di Nuovo, Alessandro
    ROBOTICS, 2019, 8 (02)
  • [2] MoVEInt: Mixture of Variational Experts for Learning Human-Robot Interactions From Demonstrations
    Prasad, Vignesh
    Kshirsagar, Alap
    Koert, Dorothea
    Stock-Homburg, Ruth
    Peters, Jan
    Chalvatzaki, Georgia
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (07): 6043 - 6050
  • [3] The Effect of Multiple Robot Interaction on Human-Robot Interaction
    Yang, Jeong-Yean
    Kwon, Dong-Soo
    2012 9TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2012, : 30 - 33
  • [4] COSPI: Contextual sensors for human-robot interaction
    Ishii, K
    Imai, M
    PROCEEDINGS OF THE TENTH IASTED INTERNATIONAL CONFERENCE ON ROBOTICS AND APPLICATIONS, 2004, : 136 - 141
  • [5] Using the Rhythm of Nonverbal Human-Robot Interaction as a Signal for Learning
    Andry, Pierre
    Blanchard, Arnaud
    Gaussier, Philippe
    IEEE TRANSACTIONS ON AUTONOMOUS MENTAL DEVELOPMENT, 2011, 3 (01) : 30 - 42
  • [6] Quantifying Hypothesis Space Misspecification in Learning From Human-Robot Demonstrations and Physical Corrections
    Bobu, Andreea
    Bajcsy, Andrea
    Fisac, Jaime F.
    Deglurkar, Sampada
    Dragan, Anca D.
    IEEE TRANSACTIONS ON ROBOTICS, 2020, 36 (03) : 835 - 854
  • [7] Wearable Devices for the Assessment of Cognitive Effort for Human-Robot Interaction
    Villani, Valeria
    Righi, Massimiliano
    Sabattini, Lorenzo
    Secchi, Cristian
    IEEE SENSORS JOURNAL, 2020, 20 (21) : 13047 - 13056
  • [8] Human-Robot Interaction Based on use of Capacitive Sensors
    Sekoranja, Bojan
    Basic, Denis
    Svaco, Marko
    Suligoj, Filip
    Jerbic, Bojan
    24TH DAAAM INTERNATIONAL SYMPOSIUM ON INTELLIGENT MANUFACTURING AND AUTOMATION, 2013, 2014, 69 : 464 - 468
  • [9] Emotionally Assisted Human-Robot Interaction Using a Wearable Device for Reading Facial Expressions
    Gruebler, Anna
    Berenz, Vincent
    Suzuki, Kenji
    ADVANCED ROBOTICS, 2012, 26 (10) : 1143 - 1159
  • [10] Incremental learning of gestures for human-robot interaction
    Okada, Shogo
    Kobayashi, Yoichi
    Ishibashi, Satoshi
    Nishida, Toyoaki
    AI & SOCIETY, 2010, 25 (02) : 155 - 168