Cross transformer for LiDAR-based loop closure detection

Times Cited: 0
Authors
Zheng, Rui [1]
Ren, Yang [1]
Zhou, Qi [2]
Ye, Yibin [1]
Zeng, Hui [1,3]
Affiliations
[1] Univ Sci & Technol Beijing, Beijing Engn Res Ctr Ind Spectrum Imaging, Sch Automat & Elect Engn, Beijing 100083, Peoples R China
[2] Harbin Engn Univ, Southampton Ocean Engn Joint Inst, Harbin 150001, Peoples R China
[3] Univ Sci & Technol Beijing, Shunde Innovat Sch, Foshan 528399, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; Loop closure; LiDAR; SLAM; Transformer; Place recognition; Localization; Descriptor; Vision;
DOI
10.1007/s00138-024-01629-w
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Loop closure detection, also known as place recognition, is a key component of simultaneous localization and mapping (SLAM) systems: it recognizes previously visited locations and reduces the drift error accumulated by odometry. Current vision-based methods are susceptible to variations in illumination and viewpoint, which limits their generalization ability and robustness. In this paper, we therefore propose CrossT-Net (Cross Transformer Net), a novel cross-attention-based loop closure detection network for LiDAR. CrossT-Net directly estimates the similarity between two frames by leveraging multi-class information maps, including range, intensity, and normal maps, to comprehensively characterize environmental features. A Siamese Encoder Net with shared parameters extracts the features of each frame, and a Cross Transformer module captures intra-frame context and inter-frame correlations through self-attention and cross-attention mechanisms. In the final stage, an Overlap Estimation Module predicts the point cloud overlap between the two frames. Experimental results on several benchmark datasets demonstrate that the proposed method outperforms existing methods in precision and recall and generalizes well to different road environments. The implementation is available at: https://github.com/Bryan-ZhengRui/CrossT-Net_Pytorch.
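The authors' actual implementation is in the linked PyTorch repository; purely as a rough illustration of the pipeline the abstract describes (shared-weight Siamese encoder, self- plus cross-attention, overlap regression), the following is a minimal PyTorch sketch. All names (SiameseEncoder, CrossTransformerBlock, OverlapHead), the 5-channel input layout (range, intensity, 3-channel normals), and all layer sizes are illustrative assumptions, not the paper's architecture.

import torch
import torch.nn as nn

class SiameseEncoder(nn.Module):
    # Shared-weight CNN: maps a (B, 5, H, W) stack of range/intensity/normal
    # maps to a (B, N, dim) token sequence. Channel count is an assumption.
    def __init__(self, in_channels=5, dim=128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, dim, 3, stride=2, padding=1),
        )
    def forward(self, x):
        f = self.backbone(x)                 # (B, dim, H/8, W/8)
        return f.flatten(2).transpose(1, 2)  # (B, N, dim)

class CrossTransformerBlock(nn.Module):
    # Self-attention within each frame (intra-frame context), then
    # cross-attention between frames (inter-frame correlation).
    def __init__(self, dim=128, heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
    def forward(self, a, b):
        a = self.norm1(a + self.self_attn(a, a, a)[0])
        b = self.norm1(b + self.self_attn(b, b, b)[0])
        a2 = self.norm2(a + self.cross_attn(a, b, b)[0])  # a attends to b
        b2 = self.norm2(b + self.cross_attn(b, a, a)[0])  # b attends to a
        return a2, b2

class OverlapHead(nn.Module):
    # Pools the fused token sequences and regresses a scalar overlap in [0, 1],
    # standing in for the paper's Overlap Estimation Module.
    def __init__(self, dim=128):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(),
                                 nn.Linear(dim, 1), nn.Sigmoid())
    def forward(self, a, b):
        pooled = torch.cat([a.mean(dim=1), b.mean(dim=1)], dim=-1)
        return self.mlp(pooled).squeeze(-1)

# Usage: two projected LiDAR frames pass through one shared encoder,
# one cross-transformer block, and the overlap head.
enc, xformer, head = SiameseEncoder(), CrossTransformerBlock(), OverlapHead()
frame1 = torch.randn(1, 5, 64, 900)  # e.g. a 64-beam scan projected to 2D
frame2 = torch.randn(1, 5, 64, 900)
a, b = enc(frame1), enc(frame2)      # Siamese: identical weights for both
a, b = xformer(a, b)
print(head(a, b))                    # predicted overlap score

Treating overlap as a regression target (rather than a binary loop/no-loop label) follows the OverlapNet-style formulation the paper builds on; how CrossT-Net weights or stacks these stages is defined only in the repository above.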
Pages: 15