The next-best-view for workpiece localization in robot workspace

Cited by: 2
Authors
Hu, Jie [1 ]
Pagilla, Prabhakar R. [1 ]
Darbha, Swaroop [1 ]
Affiliations
[1] Texas A&M Univ, Dept Mech Engn, College Stn, TX 77843 USA
Source
2021 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM) | 2021
Keywords
workpiece localization; robotics; manufacturing; next-best-view;
DOI
10.1109/AIM46487.2021.9517657
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812 ;
Abstract
Workpiece localization is the process of obtaining the location of a workpiece in a reference frame of a robotic workspace. The location (position and orientation) is represented by the transformation between a local frame attached to the workpiece and the specified reference frame in the workspace. In this work, we study the workpiece localization problem without two commonly adopted restrictive assumptions: that the data used to calculate the transformation is readily available, and that the correspondence between the data sets used in the calculation is known. The goal is to automate the localization process, from efficient data collection through determining the workpiece location in the workspace. We describe a strategy comprising the following aspects: predicting the correspondence between the measured data and the workpiece CAD model data; generating representative vectors that aid in determining the next-best-view for collecting new information about the workpiece's location; evaluating a search region to find the next sensor location that satisfies both the robot kinematics and the sensor field-of-view constraints while giving the maximum view gain; and calculating the rigid body transformation from the local frame to the world frame to localize the workpiece. Numerical simulation and experimental results for the proposed strategy are presented and discussed.
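The final step named in the abstract, computing the rigid body transformation from the local (workpiece/CAD) frame to the world frame from corresponded point pairs, is commonly solved in closed form by the SVD-based (Kabsch) least-squares method. The paper's exact formulation is not reproduced here; the following is a minimal sketch of that standard technique, with the function name `rigid_transform` chosen for illustration:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q.

    P, Q: (N, 3) arrays of corresponded points (model frame, world frame).
    Returns rotation R (3x3) and translation t (3,) minimizing
    sum_i || R @ P[i] + t - Q[i] ||^2 (Kabsch/SVD method).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)      # centroids of each set
    H = (P - cP).T @ (Q - cQ)                    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

Given measured sensor points expressed in the world frame and their (predicted) corresponding CAD model points, `R` and `t` then give the workpiece pose directly.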
Pages: 1201-1206
Number of pages: 6