Towards affordance detection for robot manipulation using affordance for parts and parts for affordance

Cited by: 0
Authors
Safoura Rezapour Lakani
Antonio J. Rodríguez-Sánchez
Justus Piater
Affiliations
[1] Universität Innsbruck
Source
Autonomous Robots | 2019 / Vol. 43
Keywords
Affordances; Part segmentation; RGB-D perception; Supervised learning;
DOI
Not available
Abstract
As robots start to interact with their environments, they need to reason about the affordances of objects in those environments. In most cases, affordances can be inferred only from parts of objects, such as the blade of a knife for cutting or the head of a hammer for pounding. We propose an RGB-D part-based affordance detection method where the parts are obtained based on the affordances as well. We show that affordance detection benefits from a part-based object representation since parts are distinctive and generalizable to novel objects. We compare our method with other state-of-the-art affordance detection methods on a benchmark dataset (Myers et al., International Conference on Robotics and Automation (ICRA), 2015), outperforming these methods by an average of 14% on novel object instances. Furthermore, we apply our affordance detection method to a robotic grasping scenario to demonstrate that the robot is able to perform grasps after detecting the affordances.
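
As a rough illustration of the kind of pipeline the abstract describes (segment an RGB-D scene into parts, then assign each part an affordance label with a supervised classifier), the Python sketch below wires together off-the-shelf components. Everything in it is an assumption made for illustration, not the authors' method: DBSCAN stands in for the paper's affordance-driven part segmentation, the 9-D part descriptor and the random-forest classifier are placeholders, and the training data are synthetic.

"""
Illustrative sketch only: a minimal part-based affordance pipeline in the
spirit of the abstract above, NOT the authors' implementation.  Assumed
simplifications: parts come from plain Euclidean clustering (DBSCAN) rather
than affordance-driven part segmentation, features are simple geometric and
colour statistics, and the classifier is trained on random synthetic data.
"""
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier


def segment_parts(points, eps=0.05, min_samples=10):
    """Cluster the 3-D coordinates of an (N, 6) x-y-z-r-g-b cloud into candidate parts."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points[:, :3])
    return [points[labels == k] for k in np.unique(labels) if k != -1]


def part_features(part):
    """9-D descriptor per part: bounding-box extent, spatial spread, mean colour."""
    xyz, rgb = part[:, :3], part[:, 3:]
    extent = xyz.max(axis=0) - xyz.min(axis=0)
    return np.concatenate([extent, xyz.std(axis=0), rgb.mean(axis=0)])


rng = np.random.default_rng(0)

# Toy supervised training set: random 9-D descriptors with random affordance labels.
train_X = rng.normal(size=(100, 9))
train_y = rng.choice(["grasp", "cut", "pound"], size=100)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(train_X, train_y)

# Synthetic "point cloud": three Gaussian blobs standing in for object parts.
centers = rng.uniform(0.0, 1.0, size=(3, 3))
xyz = np.vstack([c + rng.normal(scale=0.02, size=(300, 3)) for c in centers])
rgb = rng.uniform(0.0, 1.0, size=(xyz.shape[0], 3))
cloud = np.hstack([xyz, rgb])

# Predict one affordance label per detected part.
for part in segment_parts(cloud):
    print(f"{len(part):4d} points -> {clf.predict(part_features(part)[None, :])[0]}")

A real system would replace the synthetic data with labelled RGB-D part examples (e.g., from the Myers et al. benchmark) and a segmentation that is itself informed by the affordance labels, which is the central idea of the paper.
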
Pages: 1155 - 1172
Page count: 17
Related papers (50 in total)
  • [31] Semantic Labeling of 3D Point Clouds with Object Affordance for Robot Manipulation
    Kim, David Inkyu
    Sukhatme, Gaurav S.
    2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2014: 5578 - 5584
  • [32] Affordance Discovery using Simulated Exploration
    Allevato, Adam
    Thomaz, Andrea
    Pryor, Mitch
    PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS AND MULTIAGENT SYSTEMS (AAMAS '18), 2018: 2174 - 2176
  • [33] Object affordance detection with boundary-preserving network for robotic manipulation tasks
    Yin, Congcong
    Zhang, Qiuju
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (20): 17963 - 17980
  • [34] Learning Visual Object Categories for Robot Affordance Prediction
    Sun, Jie
    Moore, Joshua L.
    Bobick, Aaron
    Rehg, James M.
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2010, 29 (2-3): 174 - 197
  • [36] The learning and use of traversability affordance using range images on a mobile robot
    Ugur, Emre
    Dogar, Mehmet R.
    Cakmak, Maya
    Sahin, Erol
    PROCEEDINGS OF THE 2007 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-10, 2007: 1721+
  • [37] Affordance-based human-robot interaction
    Moratz, Reinhard
    Tenbrink, Thora
    TOWARDS AFFORDANCE-BASED ROBOT CONTROL, 2008, 4760: 63+
  • [38] Does it help a robot navigate to call navigability an affordance?
    Hertzberg, Joachim
    Lingemann, Kai
    Loerken, Christopher
    Nuechter, Andreas
    Stiene, Stefan
    TOWARDS AFFORDANCE-BASED ROBOT CONTROL, 2008, 4760: 16+
  • [39] Affordance Triggering For Arbitrary States Based on Robot Exploring
    Yi, Chang'an
    Zheng, Guofei
    Bi, Sheng
    Luo, Ronghua
    Yin, Pengshuai
    Xu, Xinshi
    Min, Huaqing
    2017 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (IEEE ROBIO 2017), 2017: 1856 - 1861
  • [40] A Novel Formalization For Robot Cognition Based on Affordance Model
    Yi, Chang'an
    Min, Huaqing
    Luo, Ronghua
    Zhong, Zhipeng
    Shen, Xiaowen
    2012 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO 2012), 2012