DRO-SLAM: real-time object-aware SLAM for navigation robots and autonomous driving in dynamic environments

Cited by: 0
Authors
Zixian W. [1 ]
Miao Z. [1 ]
Danfeng Y. [1 ]
Affiliations
[1] School of Computer Science, Beijing University of Posts and Telecommunications, Beijing
Keywords
data association; dynamic environment; object tracking; simultaneous localization and mapping (SLAM); stereo vision;
DOI
10.19682/j.cnki.1005-8885.2022.1011
Abstract
Traditional simultaneous localization and mapping (SLAM) methods are mostly built on the assumption of an ideal static environment, which does not hold in real-world dynamic scenes. This paper proposes dynamic real-time object-aware SLAM (DRO-SLAM), a visual SLAM system that performs localization and mapping while simultaneously tracking moving objects, both indoors and outdoors. It combines object detection, oriented FAST and rotated BRIEF (ORB) feature points, and optical flow assistance to track multiple dynamic objects, estimate their poses, and remove them during dense point cloud reconstruction. Evaluation on public datasets and comparison with other methods show that the proposed algorithm offers competitive real-time performance and accuracy while providing additional functionality. DRO-SLAM provides a lightweight, deployable solution for autonomous navigation that supplies richer environmental information, such as vehicles and pedestrians, to the navigation stack. © 2023, THE JOURNAL OF CHINA UNIVERSITIES OF POSTS AND TELECOMMUNICATIONS.
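As a rough illustration of the pipeline the abstract describes (detecting potentially dynamic objects, checking whether they are actually moving with optical flow, and keeping only static ORB features for tracking and dense reconstruction), a minimal Python/OpenCV sketch follows. It is not the authors' implementation: detect_objects() is a hypothetical stand-in for any object detector, the motion test omits camera ego-motion compensation, and the threshold is arbitrary.

```python
# Minimal sketch (not the DRO-SLAM implementation): reject ORB feature points
# that lie on detected, actually-moving objects before they reach tracking and
# dense reconstruction. Assumes OpenCV; detect_objects() is hypothetical.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=2000)

def detect_objects(frame):
    """Hypothetical placeholder: return (x, y, w, h) boxes for potentially
    dynamic classes such as vehicles and pedestrians."""
    return []

def static_orb_features(prev_gray, curr_gray, curr_frame, motion_thresh=1.5):
    keypoints, descriptors = orb.detectAndCompute(curr_gray, None)
    if descriptors is None:
        return [], None

    boxes = detect_objects(curr_frame)

    # Track current keypoints back into the previous frame with sparse
    # optical flow; the displacement magnitude approximates apparent motion.
    # (A real system would first compensate camera ego-motion so that only
    # residual, object-induced motion is measured.)
    pts = np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)
    prev_pts, status, _ = cv2.calcOpticalFlowPyrLK(curr_gray, prev_gray, pts, None)
    motion = np.linalg.norm(prev_pts - pts, axis=2).ravel()

    keep_kps, keep_desc = [], []
    for kp, desc, mag, ok in zip(keypoints, descriptors, motion, status.ravel()):
        x, y = kp.pt
        on_object = any(bx <= x <= bx + bw and by <= y <= by + bh
                        for bx, by, bw, bh in boxes)
        # Keep points off detected objects, or on objects that are currently
        # not moving; drop the rest as dynamic outliers.
        if ok and (not on_object or mag < motion_thresh):
            keep_kps.append(kp)
            keep_desc.append(desc)
    return keep_kps, (np.array(keep_desc) if keep_desc else None)
```

Per the abstract, the retained features would feed the SLAM tracking and dense point cloud reconstruction, while the rejected object regions would be handled by the multi-object tracking and pose estimation components.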
Pages: 14 - 24
Page count: 10
Related Papers
50 records in total
  • [1] DRO-SLAM: real-time object-aware SLAM for navigation robots and autonomous driving in dynamic environments
    Wang Zixian
    Zhang Miao
    Yan Danfeng
    The Journal of China Universities of Posts and Telecommunications, 2023, 30 (03) : 14 - 24
  • [2] Real-time monocular object SLAM
    Gálvez-López, Dorian
    Salas, Marta
    Tardós, Juan D.
    Montiel, J. M. M.
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2016, 75 : 435 - 449
  • [3] Real-Time Monocular Object-Model Aware Sparse SLAM
    Hosseinzadeh, Mehdi
    Li, Kejie
    Latif, Yasir
    Reid, Ian
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 7123 - 7129
  • [4] RSV-SLAM: Toward Real-Time Semantic Visual SLAM in Indoor Dynamic Environments
    Habibpour, Mobin
    Nemati, Alireza
    Meghdari, Ali
    Taheri, Alireza
    Nazari, Shima
    INTELLIGENT SYSTEMS AND APPLICATIONS, VOL 2, INTELLISYS 2023, 2024, 823 : 832 - 844
  • [5] A Real-Time Dynamic Object Segmentation Framework for SLAM System in Dynamic Scenes
    Chang, Jianfang
    Dong, Na
    Li, Donghui
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2021, 70
  • [6] Real-time Simultaneous Localization and Mapping (SLAM) for Vision-based Autonomous Navigation
    Lim, Hyon
    Lim, Jongwoo
    Kim, H. Jin
    TRANSACTIONS OF THE KOREAN SOCIETY OF MECHANICAL ENGINEERS A, 2015, 39 (05) : 483 - 489
  • [7] Real-Time Depth and Inertial Fusion for Local SLAM on Dynamic Legged Robots
    Camurri, Marco
    Bazeille, Stephane
    Caldwell, Darwin G.
    Semini, Claudio
    2015 IEEE INTERNATIONAL CONFERENCE ON MULTISENSOR FUSION AND INTEGRATION FOR INTELLIGENT SYSTEMS (MFI), 2015, : 259 - 264
  • [8] CFP-SLAM: A Real-time Visual SLAM Based on Coarse-to-Fine Probability in Dynamic Environments
    Hu, Xinggang
    Zhang, Yunzhou
    Cao, Zhenzhong
    Ma, Rong
    Wu, Yanmin
    Deng, Zhiqiang
    Sun, Wenkai
    2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022, : 4399 - 4406
  • [9] Strong-SLAM: real-time RGB-D visual SLAM in dynamic environments based on StrongSORT
    Huang, Wei
    Zou, Chunlong
    Yun, Juntong
    Jiang, Du
    Huang, Li
    Liu, Ying
    Jiang, Guo Zhang
    Xie, Yuanmin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (12)
  • [10] Real-time map building and navigation for autonomous robots in unknown environments
    Oriolo, G
    Ulivi, G
    Vendittelli, M
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1998, 28 (03): : 316 - 333