Active Visuo-Tactile Point Cloud Registration for Accurate Pose Estimation of Objects in an Unknown Workspace

Cited by: 7
Authors
Murali, Prajval Kumar [1 ,2 ]
Gentner, Michael [1 ,3 ]
Kaboli, Mohsen [1 ,4 ]
Affiliations
[1] BMW Grp, Munich, Germany
[2] Univ Glasgow, Glasgow, Lanark, Scotland
[3] Tech Univ Munich, Munich, Germany
[4] Radboud Univ, Donders Inst Brain & Cognit, Nijmegen, Netherlands
Source
2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021
DOI
10.1109/IROS51168.2021.9636877
CLC Classification
TP [Automation & Computer Technology];
Discipline Code
0812;
Abstract
This paper proposes a novel active visuo-tactile methodology for the accurate estimation of the time-invariant SE(3) pose of objects by autonomous robotic manipulators. The robot, equipped with tactile sensors on its gripper, is guided by a visual estimate to actively explore and localize objects in an unknown workspace. It reasons over multiple potential actions and executes the one that maximizes information gain to update its current belief about the object. We formulate pose estimation as a linear translation-invariant quaternion filter (TIQF) by decoupling the estimation of translation and rotation and expressing the measurement and update models in linear form. Because acquiring each tactile measurement is time-consuming, we perform pose estimation sequentially on very sparse point clouds (<= 15 points). Furthermore, the proposed method is computationally efficient enough to run an exhaustive uncertainty-based active touch selection strategy in real time, without trading information gain against execution time. We evaluate the performance of our approach extensively in simulation and on a real robotic system.
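The decoupling idea behind the TIQF can be illustrated compactly: centered (or pairwise-differenced) correspondences are invariant to translation, so rotation can be solved first from a constraint that is linear in the quaternion, and translation is then recovered from the centroids. The sketch below is a minimal batch illustration in plain NumPy, not the paper's sequential Kalman filter; Davenport's q-method stands in for the linear-quaternion measurement model, and all function names are illustrative assumptions rather than the authors' API.

```python
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def estimate_rotation(src_c, dst_c):
    """Rotation from centered (translation-invariant) correspondences.

    The constraint dst_c[i] = R @ src_c[i] is linear in the quaternion,
    so the optimal q is the top eigenvector of a 4x4 symmetric matrix
    (Davenport's q-method), solved here in closed form.
    """
    B = dst_c.T @ src_c                    # 3x3 attitude-profile matrix
    S = B + B.T
    z = np.array([B[2, 1] - B[1, 2],
                  B[0, 2] - B[2, 0],
                  B[1, 0] - B[0, 1]])
    K = np.zeros((4, 4))
    K[0, 0] = np.trace(B)
    K[0, 1:] = z
    K[1:, 0] = z
    K[1:, 1:] = S - np.trace(B) * np.eye(3)
    vals, vecs = np.linalg.eigh(K)         # eigenvalues in ascending order
    q = vecs[:, np.argmax(vals)]           # unit quaternion (w, x, y, z)
    return quat_to_rot(q)

def register(src, dst):
    """Decoupled registration: rotation first, then translation."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    R = estimate_rotation(src - mu_s, dst - mu_d)  # translation drops out
    t = mu_d - R @ mu_s                            # recover translation
    return R, t
```

In the paper's sequential setting, each new tactile contact point updates a quaternion state with an associated covariance; it is that covariance which the exhaustive information-gain action selection evaluates over candidate touches.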
Pages: 2838-2844
Page count: 7
Related Papers
26 records in total
  • [1] Active Visuo-Tactile Interactive Robotic Perception for Accurate Object Pose Estimation in Dense Clutter
    Murali, Prajval Kumar
    Dutta, Anirvan
    Gentner, Michael
    Burdet, Etienne
    Dahiya, Ravinder
    Kaboli, Mohsen
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02): : 4686 - 4693
  • [2] A Visuo-Tactile Control Framework for Manipulation and Exploration of Unknown Objects
    Li, Qiang
    Haschke, Robert
    Ritter, Helge
    2015 IEEE-RAS 15TH INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS), 2015, : 610 - 615
  • [3] Shared visuo-tactile interactive perception for robust object pose estimation
    Murali, Prajval Kumar
    Porr, Bernd
    Kaboli, Mohsen
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2024,
  • [4] Pose Estimation by Key Points Registration in Point Cloud
    Zhang, Weiyi
    Qi, Chenkun
    2019 3RD INTERNATIONAL SYMPOSIUM ON AUTONOMOUS SYSTEMS (ISAS 2019), 2019, : 65 - 68
  • [5] Grasping pose generation method for unknown objects based on point cloud sampling weight estimation
    Cai, Z.-H.
    Yang, L.
    Huang, Z.-F.
    Kongzhi yu Juece/Control and Decision, 2023, 38 (10): : 2859 - 2866
  • [6] Active Touch Point Selection with Travel Cost in Tactile Exploration for Fast Shape Estimation of Unknown Objects
    Matsubara, Takamitsu
    Shibata, Kotaro
    Sugimoto, Kenji
    2016 IEEE INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2016, : 1115 - 1120
  • [7] The Pose Estimation of Mobile Robot Based on Improved Point Cloud Registration
    Miao, Yanzi
    Liu, Yang
    Ma, Hongbin
    Jin, Huijie
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2016, 13
  • [8] Accurate Pose Estimation of the Texture-Less Objects With Known CAD Models via Point Cloud Matching
    Li, Hai
    Zeng, Qingfu
    Zhuang, Tingda
    Huang, Yanjiang
    Zhang, Xianmin
    IEEE SENSORS JOURNAL, 2023, 23 (21) : 26259 - 26268
  • [9] Research on Unknown Space Target Pose Estimation Method Based on Point Cloud
    Zhang, Huan
    Zhang, Yang
    Feng, Qingjuan
    Zhang, Kebei
    IEEE ACCESS, 2024, 12 : 149381 - 149390
  • [10] Active tactile exploration with uncertainty and travel cost for fast shape estimation of unknown objects
    Matsubara, Takamitsu
    Shibata, Kotaro
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2017, 91 : 314 - 326