The collaborative mapping and navigation based on visual SLAM in UAV platform

Cited by: 0
Authors
Wang C. [1 ]
Luo B. [1 ]
Li C. [1 ]
Wang W. [1 ]
Yin L. [1 ]
Zhao Q. [1 ]
Affiliations
[1] State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan
Source
Cehui Xuebao/Acta Geodaetica et Cartographica Sinica | 2020 / Vol. 49 / No. 06
Keywords
Air-ground collaboration; Autonomous navigation; Salient closed boundaries; Unmanned aerial vehicle; Visual SLAM;
DOI
10.11947/j.AGCS.2020.20190145
Abstract
Because their working space is confined to the ground, ground robots are severely limited in environmental perception. By exploiting the viewpoint advantages of aerial robots, collaboration between aerial and ground robots has become a mainstream research direction. This paper proposes a collaborative mapping and navigation scheme based on visual SLAM on a UAV platform, which uses the wide-area perception afforded by the UAV's aerial view to help the ground robot construct an environmental model quickly and to improve its ability to map and navigate in challenging, unknown environments. The scheme first builds a real-time detection and tracking thread for salient closed boundaries and proposes a novel visual SLAM solution for UAV mapping that combines point features, line features, and salient closed boundaries; compared with traditional schemes, incorporating closed boundaries greatly improves the mapping result. Secondly, the ground robot automatically plans a global path from the initial global map obtained by the aerial robot. While moving, it updates the initial UAV map with its onboard laser sensor, and continuous re-planning of the path enables the ground robot to avoid collisions with obstacles. Simulation experiments and real-world experiments were carried out to verify the feasibility and advantages of the proposed scheme. The results show that it significantly improves the mapping result and realizes the complete collaborative mapping and navigation pipeline, improving the ability of ground robots to navigate and map autonomously in challenging unknown areas. However, the method is less effective in complex situations such as densely distributed obstacles and uneven terrain, and the 2D navigation implementation has clear limitations. Future work should fuse additional sensors such as LiDAR and IMU, improve tasks such as depth estimation and pose estimation to build accurate three-dimensional occupancy grid maps, and further design a three-dimensional air-ground collaborative mapping and navigation scheme. © 2020, Surveying and Mapping Press. All rights reserved.
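As a hedged illustration of the ground-side navigation loop summarized in the abstract (global planning on the UAV-derived map, laser-based map updates, and continuous re-planning), the sketch below plans a path with A* on a 2D occupancy grid and re-plans when laser-observed obstacles block it. This is a minimal assumption-based example, not the authors' implementation: the grid encoding, the A* planner, and the `update_and_replan` helper are hypothetical stand-ins for the planning and map-update components described in the paper.

```python
# Minimal sketch (not the authors' code): global planning on an aerial
# occupancy grid, laser-based map updates, and re-planning when blocked.
# Assumes a 2D grid where 0 = free and 1 = occupied.
import heapq


def astar(grid, start, goal):
    """4-connected A* shortest path on a 2D occupancy grid."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]
    g = {start: 0}
    parent = {}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in parent:
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                cost = g[cur] + 1
                if cost < g.get(nxt, float("inf")):
                    g[nxt] = cost
                    parent[nxt] = cur
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])  # Manhattan heuristic
                    heapq.heappush(open_set, (cost + h, nxt))
    return None  # no collision-free path exists


def update_and_replan(grid, path, laser_hits, goal):
    """Mark laser-observed obstacle cells and re-plan if the current path is blocked."""
    for r, c in laser_hits:                # cells the ground robot's laser reports as occupied
        grid[r][c] = 1
    if any(grid[r][c] == 1 for r, c in path):
        return astar(grid, path[0], goal)  # re-plan from the path's start cell
    return path


if __name__ == "__main__":
    # Toy aerial map: 0 = free, 1 = obstacle seen from the UAV's viewpoint.
    aerial_grid = [
        [0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0],
    ]
    path = astar(aerial_grid, (0, 0), (2, 3))
    # The ground robot's laser later discovers an obstacle the UAV missed.
    path = update_and_replan(aerial_grid, path, [(0, 2)], (2, 3))
    print(path)
```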
Pages: 767-776
Number of pages: 9