A three-dimensional mapping and virtual reality-based human-robot interaction for collaborative space exploration

Cited: 10
Authors
Xiao, Junhao [1 ,2 ]
Wang, Pan [3 ]
Lu, Huimin [1 ]
Zhang, Hui [1 ]
Affiliations
[1] Natl Univ Def Technol, Dept Automat, 137 Yanwachi Rd, Changsha 410073, Peoples R China
[2] Univ Lincoln, Sch Comp Sci, Lincoln, England
[3] China Aerodynam Res & Dev Ctr, Changsha, Peoples R China
Funding
US National Science Foundation; National Key Research and Development Program of China;
Keywords
Human-robot space exploration; human-robot interaction; 3D mapping; virtual reality; rescue robotics; SIMULTANEOUS LOCALIZATION; FRAMEWORK;
DOI
10.1177/1729881420925293
CLC Number
TP24 [Robotics];
Subject Classification Number
080202 ; 1405 ;
Abstract
Human-robot interaction is a vital part of human-robot collaborative space exploration: it bridges the high-level decision-making and path-planning intelligence of the human and the accurate sensing and modelling ability of the robot. However, most conventional human-robot interaction approaches rely on video streams for the operator to understand the robot's surroundings, which provides poor situational awareness and leaves the operator stressed and fatigued. This research aims to improve efficiency and promote a natural level of interaction for human-robot collaboration. We present a human-robot interaction method based on real-time mapping and online virtual reality visualization, implemented and verified for rescue robotics. At the robot side, a dense point cloud map is built in real time by tightly coupled LiDAR-IMU fusion; the resulting map is further transformed into a three-dimensional normal distributions transform (NDT) representation. Wireless communication is employed to transmit the three-dimensional NDT map to the remote control station incrementally. At the remote control station, the received map is rendered in virtual reality using parameterized ellipsoid cells. The operator controls the robot in three modes. In complex areas, the operator can use interactive devices to give low-level motion commands. In less cluttered, more structured regions, the operator can instead specify a path or even a target point, after which the robot follows the path or navigates to the target point autonomously; in other words, these latter two modes rely more on the robot's autonomy. By virtue of virtual reality visualization, the operator gains a more comprehensive understanding of the space to be explored. In this way, the high-level decision-making and path-planning intelligence of the human and the accurate sensing and modelling ability of the robot are integrated as a whole. Although the method is proposed for rescue robots, it can also be used in other out-of-sight teleoperation-based human-robot collaboration systems, including but not limited to manufacturing, space, undersea, surgery, agriculture and military operations.
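As a rough illustration of the pipeline the abstract describes, the sketch below voxelizes a point cloud into NDT cells (one Gaussian per voxel) and converts each cell's covariance into ellipsoid parameters via eigendecomposition, the standard way parameterized ellipsoid cells are obtained for rendering. This is a minimal Python/NumPy sketch, not the authors' implementation; the function names (build_ndt_map, ellipsoid_parameters) and all parameter values (cell_size, min_points, scale) are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict

def build_ndt_map(points, cell_size=1.0, min_points=5):
    """Fit one Gaussian (mean, covariance) per voxel of the point cloud,
    i.e. the 3D NDT representation described in the abstract."""
    # Voxelize the cloud: each point goes to the cell containing it.
    buckets = defaultdict(list)
    for p, key in zip(points, np.floor(points / cell_size).astype(int)):
        buckets[tuple(key)].append(p)
    ndt = {}
    for key, pts in buckets.items():
        pts = np.asarray(pts)
        if len(pts) < min_points:
            continue  # too few samples for a stable covariance estimate
        ndt[key] = (pts.mean(axis=0), np.cov(pts, rowvar=False))
    return ndt

def ellipsoid_parameters(mu, cov, scale=2.0):
    """Turn one NDT cell into render-ready ellipsoid parameters."""
    # Eigendecomposition of the symmetric covariance gives the principal
    # axes; square roots of the eigenvalues give the semi-axis lengths,
    # scaled to the desired confidence region.
    eigvals, eigvecs = np.linalg.eigh(cov)
    radii = scale * np.sqrt(np.clip(eigvals, 0.0, None))
    return mu, radii, eigvecs  # center, semi-axes, rotation matrix

# Usage with a synthetic stand-in for one LiDAR scan:
scan = np.random.rand(2000, 3) * 10.0
for key, (mu, cov) in build_ndt_map(scan, cell_size=0.5).items():
    center, radii, rotation = ellipsoid_parameters(mu, cov)
    # (center, radii, rotation) would be handed to the VR renderer
    # as one ellipsoid cell.
```

Transmitting only the per-cell (mean, covariance) pairs rather than raw points is what makes the incremental wireless map transfer described in the abstract practical: each cell is a fixed, small payload regardless of how many points fell into it.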
Pages: 10