A three-dimensional mapping and virtual reality-based human-robot interaction for collaborative space exploration

Cited by: 10
Authors
Xiao, Junhao [1 ,2 ]
Wang, Pan [3 ]
Lu, Huimin [1 ]
Zhang, Hui [1 ]
Affiliations
[1] Natl Univ Def Technol, Dept Automat, 137 Yanwachi Rd, Changsha 410073, Peoples R China
[2] Univ Lincoln, Sch Comp Sci, Lincoln, England
[3] China Aerodynam Res & Dev Ctr, Changsha, Peoples R China
Funding
U.S. National Science Foundation; National Key Research and Development Program;
Keywords
Human-robot space exploration; human-robot interaction; 3D mapping; virtual reality; rescue robotics; SIMULTANEOUS LOCALIZATION; FRAMEWORK;
DOI
10.1177/1729881420925293
CLC Classification Number
TP24 [Robotics];
Subject Classification Code
080202 ; 1405 ;
Abstract
Human-robot interaction is a vital part of human-robot collaborative space exploration: it bridges the high-level decision-making and path-planning intelligence of the human with the accurate sensing and modelling ability of the robot. However, most conventional human-robot interaction approaches rely on video streams for the operator to understand the robot's surroundings, which provides poor situational awareness and leaves the operator stressed and fatigued. This research aims to improve efficiency and promote a more natural level of interaction for human-robot collaboration. We present a human-robot interaction method based on real-time mapping and online virtual reality visualization, which is implemented and verified for rescue robotics. At the robot side, a dense point cloud map is built in real time by tightly coupled LiDAR-IMU fusion; the resulting map is further transformed into a three-dimensional normal distributions transform representation. Wireless communication is employed to transmit the three-dimensional normal distributions transform map to the remote control station incrementally. At the remote control station, the received map is rendered in virtual reality using parameterized ellipsoid cells. The operator controls the robot in three modes. In complex areas, the operator can use interactive devices to give low-level motion commands. In more structured regions, the operator can specify a path or even a target point, after which the robot follows the path or navigates to the target point autonomously; in other words, these two modes rely more on the robot's autonomy. By virtue of virtual reality visualization, the operator gains a more comprehensive understanding of the space to be explored, so the high-level decision-making and path-planning intelligence of the human and the accurate sensing and modelling ability of the robot can be well integrated as a whole. Although the method is proposed for rescue robots, it can also be used in other out-of-sight, teleoperation-based human-robot collaboration systems, including but not limited to manufacturing, space, undersea, surgery, agriculture and military operations.
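To make the mapping-to-visualization pipeline of the abstract concrete, the following is a minimal sketch, not the authors' implementation: it voxelizes a point cloud into a three-dimensional normal distributions transform map (per-cell mean and covariance) and converts each cell into the center, semi-axis lengths and orientation of an ellipsoid that a virtual reality client could render. The cell size, minimum point count, scale factor and function names are illustrative assumptions.

import numpy as np

def build_ndt_map(points, cell_size=0.5, min_points=5):
    # Group points into a voxel grid; each occupied cell stores the sample
    # mean and covariance of its points (the normal distributions transform).
    cells = {}
    keys = np.floor(points / cell_size).astype(int)
    for key in np.unique(keys, axis=0):
        pts = points[np.all(keys == key, axis=1)]
        if len(pts) < min_points:          # too few points for a stable covariance
            continue
        cells[tuple(key)] = (pts.mean(axis=0), np.cov(pts, rowvar=False))
    return cells

def ellipsoid_params(mean, cov, scale=2.0):
    # Turn one NDT cell into (center, semi-axis lengths, rotation matrix)
    # so the remote station can render it as a parameterized ellipsoid.
    eigvals, eigvecs = np.linalg.eigh(cov)           # principal axes of the Gaussian
    radii = scale * np.sqrt(np.maximum(eigvals, 0))  # semi-axis lengths
    return mean, radii, eigvecs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(5000, 3))               # stand-in for one LiDAR scan
    ndt = build_ndt_map(cloud)
    # Incremental transmission could then send only cells whose key is new or
    # whose statistics changed since the last update sent to the control station.
    for key, (mean, cov) in list(ndt.items())[:3]:
        center, radii, rotation = ellipsoid_params(mean, cov)
        print(key, center.round(2), radii.round(2))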
Pages: 10
Related Papers (50 in total)
  • [1] A Human-Robot Collaborative System for Robust Three-Dimensional Mapping
    Du, Jianhao
    Sheng, Weihua
    Liu, Meiqin
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2018, 23 (05) : 2358 - 2368
  • [2] A virtual reality-based immersive teleoperation system for remote human-robot collaborative manufacturing
    Wan, Ke
    Li, Chengxi
    Lo, Fo-Sing
    Zheng, Pai
    MANUFACTURING LETTERS, 2024, 41 : 43 - 50
  • [3] Virtual Reality-based Human-Robot Interaction for Remote Pick-and-Place Tasks
    Xu, Wei
    Huf, Tim
    Ye, Siyang
    Sanchez, Joshua Rafael R.
    Rose, Darius
    Tung, Harry
    Tong, Yuang
    Hatcher, Jack
    Klein, Matthew
    Morales, Eric
    Guo, Davy
    Hsu, Yusam
    Peng, Haonan
    Assadian, Zubin A.
    Raiti, John
    COMPANION OF THE 2024 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2024 COMPANION, 2024, : 1148 - 1152
  • [4] Functional Mapping for Human-Robot Collaborative Exploration
    Keshavdas, Shanker
    Kruijff, Geert-Jan M.
    2012 IEEE INTERNATIONAL SYMPOSIUM ON SAFETY, SECURITY, AND RESCUE ROBOTICS (SSRR), 2012,
  • [5] A human-robot interaction system based on 3D mapping and virtual reality
    Zhang H.
    Wang P.
    Xiao J.-H.
    Lu H.-M.
    Kongzhi yu Juece/Control and Decision, 2018, 33 (11): : 1975 - 1982
  • [6] Human-Robot Interaction Using Three-Dimensional Gestures
    Ponmani, K.
    Sridharan, S.
    INTELLIGENT EMBEDDED SYSTEMS, ICNETS2, VOL II, 2018, 492 : 67 - 76
  • [7] USING VIRTUAL REALITY TO TEST HUMAN-ROBOT INTERACTION DURING A COLLABORATIVE TASK
    Etzi, Roberta
    Huang, Siyuan
    Scurati, Giulia Wally
    Lyu, Shilei
    Ferrise, Francesco
    Gallace, Alberto
    Gaggioli, Andrea
    Chirico, Alice
    Carulli, Marina
    Bordegoni, Monica
    PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2019, VOL 1, 2020,
  • [8] Understanding Human-Robot Interaction in Virtual Reality
    Liu, Oliver
    Rakita, Daniel
    Mutlu, Bilge
    Gleicher, Michael
    2017 26TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2017, : 751 - 757
  • [9] A Simulator for Human-Robot Interaction in Virtual Reality
    Murnane, Mark
    Higgins, Padraig
    Saraf, Monali
    Ferraro, Francis
    Matuszek, Cynthia
    Engel, Don
    2021 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS (VRW 2021), 2021, : 470 - 471
  • [10] Exploration of two safety strategies in human-robot collaborative manufacturing using Virtual Reality
    Vosniakos, George-Christopher
    Ouillon, Lucas
    Matsas, Elias
    29TH INTERNATIONAL CONFERENCE ON FLEXIBLE AUTOMATION AND INTELLIGENT MANUFACTURING (FAIM 2019): BEYOND INDUSTRY 4.0: INDUSTRIAL ADVANCES, ENGINEERING EDUCATION AND INTELLIGENT MANUFACTURING, 2019, 38 : 524 - 531