Global stereo matching algorithm based on disparity range estimation

Cited by: 1
Authors
Li, Jing [1 ]
Zhao, Hong [1 ]
Gu, Feifei [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, State Key Lab Mfg Syst Engn, Xian 710049, Shaanxi, Peoples R China
[2] Chinese Acad Sci, Shenzhen Inst Adv Technol, Guangdong Prov Key Lab Robot & Intelligent Syst, Guangzhou 518055, Guangdong, Peoples R China
Source
APPLICATIONS OF DIGITAL IMAGE PROCESSING XL | 2017, Vol. 10396
Keywords
Disparity range estimation; Baseline; Disparity map; Stereo matching;
DOI
10.1117/12.2277503
Chinese Library Classification (CLC)
O43 [Optics];
Discipline codes
070207; 0803;
Abstract
Global stereo matching algorithms achieve high accuracy in disparity map estimation, but their optimization process remains prohibitively time-consuming, especially for high-resolution image pairs with a large baseline. To improve the computational efficiency of global algorithms, this paper proposes a disparity range estimation scheme for global stereo matching on rectified stereo images. The projective geometry of a parallel binocular stereo setup is analyzed to reveal a relationship between the disparities at each pixel of rectified stereo images captured with different baselines; this relationship allows a predicted disparity map for a long-baseline setting to be obtained quickly from the map estimated at a short baseline. The predicted disparity map then determines a drastically reduced disparity search range at each pixel under the long-baseline setting. This disparity range estimation scheme is incorporated into graph cuts with expansion moves to estimate the precise disparity map, which greatly reduces the computational cost without loss of matching accuracy compared to the traditional algorithm, especially for dense global stereo matching. Experimental results on the Middlebury stereo datasets demonstrate the validity and efficiency of the proposed algorithm.
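The baseline scaling the abstract relies on follows from the standard rectified-stereo relation d = fB/Z: at fixed depth Z and focal length f, disparity grows linearly with baseline B, so a short-baseline disparity map can seed per-pixel search ranges for a long-baseline pair. A minimal sketch of that idea follows; the function name, the fixed `margin` parameter, and the toy disparity values are illustrative assumptions, not details from the paper:

```python
import numpy as np

def predict_disparity_range(d_small, b_small, b_long, margin=2):
    """Predict per-pixel disparity search ranges for a long-baseline pair.

    For rectified parallel stereo, d = f*B/Z, so at each pixel
    d_long ~= d_small * (b_long / b_small). A small margin around the
    prediction absorbs estimation error in the short-baseline map.
    (margin is a hypothetical tuning knob, not from the paper.)
    """
    d_pred = d_small * (b_long / b_small)          # scaled prediction
    d_min = np.floor(d_pred).astype(int) - margin  # lower search bound
    d_max = np.ceil(d_pred).astype(int) + margin   # upper search bound
    return np.maximum(d_min, 0), d_max             # disparities are >= 0

# Toy example: 2x2 short-baseline disparity map, baseline doubled.
d_small = np.array([[4.0, 5.0], [6.0, 7.0]])
d_lo, d_hi = predict_disparity_range(d_small, b_small=1.0, b_long=2.0)
```

Instead of sweeping the full disparity range at every pixel, a global optimizer (e.g. graph cuts with expansion moves) then only needs to consider labels inside `[d_lo, d_hi]` per pixel, which is the source of the reported speed-up.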
Pages: 9
Related Papers
50 records in total
[31]   Stereo matching on images based on volume fusion and disparity space attention [J].
Liao, Lyuchao; Zeng, Jiemao; Lai, Taotao; Xiao, Zhu; Zou, Fumin; Fujita, Hamido.
ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 136
[32]   Stereo matching algorithm based on multiscale fusion [J].
Xu, X.; Wu, J.
1600, Science Press (33): 182-187
[33]   A stereo matching algorithm based on color segments [J].
Cai, XP; Zhou, DX; Li, GH; Zhuang, ZW.
2005 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, VOLS 1-4, 2005: 973-978
[34]   A method of stereo matching based on genetic algorithm [J].
Lü, CH; An, P; Zhang, ZY.
THIRD INTERNATIONAL SYMPOSIUM ON MULTISPECTRAL IMAGE PROCESSING AND PATTERN RECOGNITION, PTS 1 AND 2, 2003, 5286: 879-882
[35]   Stereo Matching Based On Election Campaign Algorithm [J].
Xie, Qing Hua; Zhang, Xiang Wei; Lv, Wen Ge; Cheng, Si Yuan.
2016 INTERNATIONAL CONFERENCE ON MECHATRONICS, MANUFACTURING AND MATERIALS ENGINEERING (MMME 2016), 2016, 63
[36]   Improved Stereo Matching Algorithm Based on PSMNet [J].
Liu, J.; Feng, Y.; Ji, G.; Yan, F.; Zhu, S.
Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2020, 48 (01): 60-69 and 83
[37]   A Normalized Disparity Loss for Stereo Matching Networks [J].
Chen, Shuya; Xiang, Zhiyu; Xu, Peng; Zhao, Xijun.
IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (01): 33-40
[38]   Outlier detection and disparity refinement in stereo matching [J].
Dong, Qicong; Feng, Jieqing.
JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2019, 60: 380-390
[39]   LOCAL STEREO MATCHING USING MOTION CUE AND MODIFIED CENSUS IN VIDEO DISPARITY ESTIMATION [J].
Lee, Zucheul; Khoshabeh, Ramsin; Juang, Jason; Nguyen, Truong Q.
2012 PROCEEDINGS OF THE 20TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2012: 1114-1118
[40]   Stereo-LiDAR Fusion by Semi-Global Matching With Discrete Disparity-Matching Cost and Semidensification [J].
Yao, Yasuhiro; Ishikawa, Ryoichi; Oishi, Takeshi.
IEEE ROBOTICS AND AUTOMATION LETTERS, 2025, 10 (05): 4548-4555