Inter-Scale Similarity Guided Cost Aggregation for Stereo Matching

Cited by: 0
Authors
Li, Pengxiang [1 ]
Yao, Chengtang [1 ]
Jia, Yunde [1 ,2 ]
Wu, Yuwei [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci, Beijing Lab Intelligent Informat Technol, Beijing 100081, Peoples R China
[2] Shenzhen MSU BIT Univ, Guangdong Lab Machine Percept & Intelligent Comp, Shenzhen 518172, Peoples R China
Keywords
Costs; Three-dimensional displays; Deconvolution; Interpolation; Optimization; Kernel; Aggregates; Stereo matching; cost aggregation; content-aware upsampling
DOI
10.1109/TCSVT.2024.3453965
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Stereo matching aims to estimate 3D geometry by computing disparity from a rectified image pair. Most deep-learning-based stereo matching methods aggregate multi-scale cost volumes computed by downsampling and achieve good performance. However, their effectiveness in fine-grained areas is limited by the significant detail loss incurred during downsampling and by the fixed weights used in upsampling. In this paper, we propose an inter-scale similarity-guided cost aggregation method that dynamically upsamples the cost volumes according to image content. The method consists of two modules: inter-scale similarity measurement and stereo-content-aware cost aggregation. Specifically, inter-scale similarity measurement generates similarity guidance from feature maps at adjacent scales. The guidance, derived from both the reference and target images, is then used to aggregate the cost volumes from low resolution to high resolution via stereo-content-aware cost aggregation. We further split the 3D aggregation into 1D disparity aggregation and 2D spatial aggregation to reduce the computational cost. Experimental results on various benchmarks (e.g., SceneFlow, KITTI, Middlebury, and ETH3D two-view) show that our method achieves consistent performance gains across multiple models (e.g., PSM-Net, HSM-Net, CF-Net, FastAcv, and FastAcvPlus). The code can be found at https://github.com/Pengxiang-Li/issga-stereo.
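To make the core idea concrete, here is a minimal NumPy sketch of similarity-guided cost-volume upsampling under assumed simplifications: dot-product similarity between a high-resolution feature and its low-resolution neighborhood, softmax-normalized into interpolation weights, and a fixed 2x scale factor. The function name and all details are illustrative, not the authors' implementation (see the linked repository for that).

```python
import numpy as np

def interscale_similarity_upsample(cost_lr, feat_hr, feat_lr, k=3):
    """Upsample a low-res cost volume (D, h, w) to (D, 2h, 2w).

    For each high-res pixel, compute dot-product similarity between its
    feature and the features of its k x k low-res neighborhood, softmax
    the similarities into weights, and blend the neighborhood's cost
    vectors with those content-aware weights.
    """
    D, h, w = cost_lr.shape
    _, H, W = feat_hr.shape
    out = np.zeros((D, H, W))
    r = k // 2
    for y in range(H):
        for x in range(W):
            cy, cx = y // 2, x // 2  # corresponding low-res location
            sims, vals = [], []
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    ny, nx = cy + dy, cx + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        sims.append(feat_hr[:, y, x] @ feat_lr[:, ny, nx])
                        vals.append(cost_lr[:, ny, nx])
            wgt = np.exp(np.array(sims) - np.max(sims))  # stable softmax
            wgt /= wgt.sum()
            out[:, y, x] = (np.array(vals) * wgt[:, None]).sum(axis=0)
    return out
```

Because the weights sum to one per pixel, a constant low-res cost volume upsamples to the same constant; in non-constant regions the blend follows feature similarity rather than fixed bilinear weights. The paper's further split of 3D aggregation into 1D disparity and 2D spatial passes would apply on top of volumes like `out`.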
Pages: 134-147 (14 pages)