Distributed Feature Matching for Robust Object Localization in Robotic Manipulation

Citations: 0
Authors
Singh, Puran [1 ]
Rattan, Munish [1 ]
Grewal, Narwant Singh [1 ]
Aggarwal, Geetika [2 ]
Affiliations
[1] Guru Nanak Dev Engn Coll, Dept Elect & Commun Engn, Ludhiana 141006, Punjab, India
[2] Teesside Univ, Dept Engn, Middlesbrough TS1 3BX, England
Source
IEEE ACCESS | 2024, Vol. 12
Keywords
Cameras; Robot vision systems; Feature extraction; Automation; Robot kinematics; Three-dimensional displays; Object recognition; Location awareness; Training; Machine learning algorithms; feature matching; robotic automation; bin-picking; monocular vision;
DOI
10.1109/ACCESS.2024.3482428
CLC Classification
TP [Automation Technology; Computer Technology];
Subject Classification
0812 ;
Abstract
Feature matching algorithms are used to recognize the position of flat objects or surfaces in an image, particularly for controlling autonomous robot arms performing pick-and-place operations under monocular vision guidance. A problem arises when the object surface is not flat or when the detected feature points lie on planes of different heights. The error is much more prominent when the object is placed away from the center of the camera view, which leads to projection parallax and distorts the apparent surface geometry. The algorithm proposed in this paper identifies horizontal planes of different heights and applies feature matching to each plane individually, in a distributed way, to find the accurate position of the object. The method requires only two images of the object for training and then locates the object in a single image; this enables 3D model matching with a monocular camera alone, without machine learning techniques that require a large dataset of training images. The algorithm works best for multi-planar 3D objects that have several feature points distributed over horizontal planes at different heights. The results are compared with a recent contour-based feature matching method that addressed a similar problem.
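The plane-wise matching idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes matched keypoint pairs have already been obtained (e.g. with a standard feature detector) and that each match carries a hypothetical plane label assigning it to one height plane. Estimating a displacement per plane and then fusing the estimates avoids mixing parallax-shifted points from different heights into a single fit.

```python
import numpy as np

def localize_by_planes(train_pts, scene_pts, plane_ids):
    """Estimate an object's 2D image displacement plane by plane.

    train_pts, scene_pts : (N, 2) arrays of matched keypoint coordinates
                           (training image -> scene image).
    plane_ids            : (N,) integer labels assigning each match to a
                           horizontal height plane (assumed known here).
    Returns the fused displacement, weighted by matches per plane.
    """
    train_pts = np.asarray(train_pts, dtype=float)
    scene_pts = np.asarray(scene_pts, dtype=float)
    plane_ids = np.asarray(plane_ids)

    shifts, weights = [], []
    for p in np.unique(plane_ids):
        mask = plane_ids == p
        # Features on one height plane share a consistent shift, so
        # estimating the displacement per plane sidesteps the parallax
        # between planes at different heights.
        shifts.append((scene_pts[mask] - train_pts[mask]).mean(axis=0))
        weights.append(mask.sum())

    # Fuse the per-plane estimates, weighting by match count.
    return np.average(shifts, axis=0, weights=weights)
```

In practice each per-plane group would feed a robust homography or similarity fit rather than a mean shift, but the grouping step is the essence of the distributed scheme.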
Pages: 161679-161687
Page count: 9
Related Papers
(50 total)
[31]   Research on an Improved Stepwise Feature Matching Algorithm for UAV Indoor Localization [J].
He, Yong ;
He, Xiaochuan .
IEEE ACCESS, 2025, 13 :67323-67333
[32]   Feature Extraction and Matching Algorithms to Improve Localization Accuracy for Mobile Robots [J].
Kang, Sin-Won ;
Bae, Sang-Hyeon ;
Kuc, Tae-Yong .
2020 20TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS), 2020, :991-994
[33]   DHM-Net: Deep Hypergraph Modeling for Robust Feature Matching [J].
Chen, Shunxing ;
Xiao, Guobao ;
Guo, Junwen ;
Wu, Qiangqiang ;
Ma, Jiayi .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 :6002-6015
[34]   Robust Image Matching Based on Image Feature and Depth Information Fusion [J].
Yan, Zhiqiang ;
Wang, Hongyuan ;
Ning, Qianhao ;
Lu, Yinxi .
MACHINES, 2022, 10 (06)
[35]   Grid-Guided Sparse Laplacian Consensus for Robust Feature Matching [J].
Xia, Yifan ;
Ma, Jiayi .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2025, 34 :1367-1381
[36]   Spatiotemporal Feature Enhancement Network for Blur Robust Underwater Object Detection [J].
Zhou, Hao ;
Qi, Lu ;
Huang, Hai ;
Yang, Xu ;
Yang, Jing .
IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05) :1814-1828
[37]   Object detection and recognition by using enhanced Speeded Up Robust Feature [J].
Al-asadi, Tawfiq A. ;
Obaid, Ahmed J. .
INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2016, 16 (04) :66-71
[38]   Object Detection using Template and HOG Feature Matching [J].
Sultana, Marjia ;
Ahmed, Tasniya ;
Chakraborty, Partha ;
Khatun, Mahmuda ;
Hasan, Md Rakib ;
Uddin, Mohammad Shorif .
INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2020, 11 (07) :233-238
[39]   Object Recognition as Many-to-Many Feature Matching [J].
Demirci, M. Fatih ;
Shokoufandeh, Ali ;
Keselman, Yakov ;
Bretzner, Lars ;
Dickinson, Sven .
International Journal of Computer Vision, 2006, 69 :203-222