Affordance Transfer Learning for Human-Object Interaction Detection

Cited by: 74
Authors
Hou, Zhi [1 ]
Yu, Baosheng [1 ]
Qiao, Yu [2 ,3 ]
Peng, Xiaojiang [4 ]
Tao, Dacheng [1 ]
Affiliations
[1] Univ Sydney, Fac Engn, Sch Comp Sci, Sydney, NSW, Australia
[2] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen, Peoples R China
[3] Shanghai AI Lab, Shanghai, Peoples R China
[4] Shenzhen Technol Univ, Shenzhen, Peoples R China
Source
2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021 | 2021
Funding
Australian Research Council;
Keywords
DOI
10.1109/CVPR46437.2021.00056
CLC Classification Number
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Reasoning about human-object interactions (HOI) is essential for deeper scene understanding, while object affordances (or functionalities) are crucial for humans to discover unseen HOIs with novel objects. Inspired by this, we introduce an affordance transfer learning approach that jointly detects HOIs with novel objects and recognizes affordances. Specifically, HOI representations can be decoupled into a combination of affordance and object representations, making it possible to compose novel interactions by combining affordance representations with novel object representations from additional images, i.e., transferring affordances to novel objects. With the proposed affordance transfer learning, the model can also infer the affordances of novel objects from known affordance representations. The proposed method can thus be used to 1) improve the performance of HOI detection, especially for HOIs with unseen objects, and 2) infer the affordances of novel objects. Experimental results on two datasets, HICO-DET and HOI-COCO (built from V-COCO), demonstrate significant improvements over recent state-of-the-art methods for both HOI detection and object affordance detection.
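A minimal sketch of the composition step described in the abstract, assuming a hypothetical feature layout in which each HOI feature is the concatenation of an affordance (verb) half and an object half; the dimensions, class count, and function names below are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

FEAT_DIM = 1024   # hypothetical size of each representation half
NUM_VERBS = 117   # e.g. the number of verb (affordance) classes in HICO-DET

# Shared classifier over concatenated [affordance | object] features.
verb_classifier = nn.Linear(2 * FEAT_DIM, NUM_VERBS)

def compose_novel_hoi(affordance_feat, novel_object_feat):
    """Pair a known affordance representation with an object representation
    extracted from an additional image, yielding a synthetic HOI feature for
    a verb-object combination never seen during training (hypothetical helper)."""
    return torch.cat([affordance_feat, novel_object_feat], dim=-1)

# Illustrative usage: affordance halves come from annotated HOI examples,
# object features from object-only crops; the composed features are scored
# by the same classifier, transferring the affordance to the novel object.
aff = torch.randn(8, FEAT_DIM)
obj = torch.randn(8, FEAT_DIM)
logits = verb_classifier(compose_novel_hoi(aff, obj))   # shape: (8, NUM_VERBS)
```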
Pages: 495-504
Number of pages: 10