A three-dimensional mapping and virtual reality-based human-robot interaction for collaborative space exploration

Cited by: 10
Authors
Xiao, Junhao [1 ,2 ]
Wang, Pan [3 ]
Lu, Huimin [1 ]
Zhang, Hui [1 ]
Affiliations
[1] Natl Univ Def Technol, Dept Automat, 137 Yanwachi Rd, Changsha 410073, Peoples R China
[2] Univ Lincoln, Sch Comp Sci, Lincoln, England
[3] China Aerodynam Res & Dev Ctr, Changsha, Peoples R China
Funding
U.S. National Science Foundation; National Key Research and Development Program;
Keywords
Human-robot space exploration; human-robot interaction; 3D mapping; virtual reality; rescue robotics; SIMULTANEOUS LOCALIZATION; FRAMEWORK;
DOI
10.1177/1729881420925293
CLC Number
TP24 [Robotics]
Subject Classification Codes
080202; 1405;
Abstract
Human-robot interaction is a vital part of human-robot collaborative space exploration: it bridges the high-level decision-making and path-planning intelligence of the human with the accurate sensing and modelling ability of the robot. However, most conventional human-robot interaction approaches rely on video streams for the operator to understand the robot's surroundings, which provides poor situational awareness and leaves the operator stressed and fatigued. This research aims to improve efficiency and make the interaction more natural for human-robot collaboration. We present a human-robot interaction method based on real-time mapping and online virtual reality visualization, implemented and verified for rescue robotics. At the robot side, a dense point cloud map is built in real time by tightly coupled LiDAR-IMU fusion; the resulting map is then converted into a three-dimensional normal distributions transform (NDT) representation. The three-dimensional NDT map is transmitted incrementally over a wireless link to the remote control station, where it is rendered in virtual reality using parameterized ellipsoid cells. The operator controls the robot in one of three modes. In complex areas, the operator uses interactive devices to issue low-level motion commands. In less cluttered regions, the operator can instead specify a path or even a target point, which the robot then follows or navigates to autonomously; these two modes rely more on the robot's autonomy. Thanks to the virtual reality visualization, the operator gains a more comprehensive understanding of the space to be explored, so that the human's high-level decision-making and path-planning intelligence and the robot's accurate sensing and modelling ability can be well integrated as a whole.
Although the method is proposed for rescue robots, it can also be used in other out-of-sight, teleoperation-based human-robot collaboration systems, including but not limited to manufacturing, space, undersea, surgery, agriculture and military operations.
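The core of the NDT representation described in the abstract is fitting one Gaussian per voxel of the point cloud, whose mean and covariance then parameterize an ellipsoid cell for rendering. A minimal sketch of that step is shown below; it is an illustration only, not the authors' implementation, and all function and parameter names (`ndt_cells`, `cell_size`, `min_points`) are hypothetical:

```python
import numpy as np

def ndt_cells(points, cell_size=1.0, min_points=5):
    """Voxelize a point cloud and fit a Gaussian to each occupied voxel.

    Returns a list of (mean, axes, radii) tuples, where `axes` are the
    covariance eigenvectors and `radii` the per-axis standard deviations --
    enough to draw each cell as an oriented ellipsoid in a VR scene.
    """
    # Bucket points by integer voxel index.
    cells = {}
    for p in points:
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append(p)

    result = []
    for pts in cells.values():
        if len(pts) < min_points:  # too few points for a stable covariance
            continue
        pts = np.asarray(pts)
        mean = pts.mean(axis=0)
        cov = np.cov(pts, rowvar=False)
        # Eigendecomposition: eigenvectors orient the ellipsoid and the
        # square roots of the eigenvalues give its 1-sigma semi-axis lengths.
        eigvals, eigvecs = np.linalg.eigh(cov)
        result.append((mean, eigvecs, np.sqrt(np.clip(eigvals, 0.0, None))))
    return result
```

Transmitting only the per-cell mean and covariance (rather than the dense point cloud) is what makes the incremental wireless transfer in the paper's pipeline lightweight.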
Pages: 10
Related Papers (50 records)
  • [41] Virtual Reality Platform to Develop and Test Applications on Human-Robot Social Interaction
    Bottega, Jair Augusto
    Steinmetz, Raul
    Kolling, Alisson Henrique
    Kich, Victor Augusto
    De Jesus, Junior Costa
    Grando, Ricardo Bedin
    Tello Gamarra, Daniel Fernando
    2022 LATIN AMERICAN ROBOTICS SYMPOSIUM (LARS), 2022 BRAZILIAN SYMPOSIUM ON ROBOTICS (SBR), AND 2022 WORKSHOP ON ROBOTICS IN EDUCATION (WRE), 2022, : 7 - 12
  • [42] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Rosen, Eric
    Groechel, Thomas
    Walker, Michael E.
    Chang, Christine T.
    Forde, Jessica Zosa
    HRI '21: COMPANION OF THE 2021 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2021, : 721 - 723
  • [43] Virtual Reality Applications for Enhancing Human-Robot Interaction: A Gesture Recognition Perspective
    Sabbella, Sandeep Reddy
    Kaszuba, Sara
    Leotta, Francesco
    Nardi, Daniele
    PROCEEDINGS OF THE 23RD ACM INTERNATIONAL CONFERENCE ON INTELLIGENT VIRTUAL AGENTS, IVA 2023, 2023,
  • [44] Use of Virtual Reality for the Evaluation of Human-Robot Interaction Systems in Complex Scenarios
    Villani, Valeria
    Capelli, Beatrice
    Sabattini, Lorenzo
    2018 27TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (IEEE RO-MAN 2018), 2018, : 422 - 427
  • [45] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Wozniak, Maciej K.
    Pascher, Max
    Ikeda, Bryce
    Luebbers, Matthew B.
    Jena, Ayesha
    COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 1361 - 1363
  • [46] Human-robot collaboration in precise positioning of a three-dimensional object
    Wojtara, Tytus
    Uchihara, Masafumi
    Murayama, Hideyuki
    Shimoda, Shingo
    Sakai, Satoshi
    Fujimoto, Hideo
    Kimura, Hidenori
    AUTOMATICA, 2009, 45 (02) : 333 - 342
  • [47] Virtual, Augmented, and Mixed Reality for Human-robot Interaction: A Survey and Virtual Design Element Taxonomy
    Walker, Michael
    Phung, Thao
    Chakraborti, Tathagata
    Williams, Tom
    Szafir, Daniel
    ACM TRANSACTIONS ON HUMAN-ROBOT INTERACTION, 2023, 12 (04)
  • [48] Virtual reality-based digital twins for greenhouses: A focus on human interaction
    Slob, Naftali
    Hurst, William
    van de Zedde, Rick
    Tekinerdogan, Bedir
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2023, 208
  • [49] Study on Construction of Three-Dimensional Interaction Virtual Reality of Corridor Based on Computer Vision
    Liu, Xiaofan
    Wang, Jinye
    Huang, Shengxin
    Zhang, Tingting
    Lu, Jinjin
    IEEE ACCESS, 2023, 11 : 10639 - 10653
  • [50] Simulation of three-dimensional deformation of skin in human-robot interaction tasks based on the mass-spring-damper model
    Zhai, Jingmei
    Zhang, Hao
    Qinghua Daxue Xuebao/Journal of Tsinghua University, 2024, 64 (10): : 1706 - 1716