Dual Bipartite Graph Learning: A General Approach for Domain Adaptive Object Detection

Cited by: 36
Authors
Chen, Chaoqi [1 ]
Li, Jiongcheng [2 ]
Zheng, Zebiao [2 ]
Huang, Yue [2 ]
Ding, Xinghao [2 ]
Yu, Yizhou [1 ]
Affiliations
[1] Univ Hong Kong, Hong Kong, Peoples R China
[2] Xiamen Univ, Xiamen, Fujian, Peoples R China
Source
2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021) | 2021
DOI
10.1109/ICCV48922.2021.00270
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Domain Adaptive Object Detection (DAOD) relieves the reliance on large-scale annotated data by transferring knowledge learned from a labeled source domain to a new, unlabeled target domain. Recent DAOD approaches resort to local feature alignment via domain adversarial training in conjunction with ad-hoc detection pipelines to achieve feature adaptation. However, these methods are limited to specific types of object detectors and do not explore cross-domain topological relations. In this paper, we first formulate DAOD as an open-set domain adaptation problem, in which foregrounds (pixels or regions) can be seen as the "known class" while backgrounds (pixels or regions) are referred to as the "unknown class". To this end, we present a new and general perspective for DAOD, named Dual Bipartite Graph Learning (DBGL), which captures cross-domain interactions at both the pixel level and the semantic level by increasing the distinction between foregrounds and backgrounds and by modeling cross-domain dependencies among different semantic categories. Experiments reveal that the proposed DBGL, in conjunction with both one-stage and two-stage detectors, exceeds state-of-the-art performance on standard DAOD benchmarks.
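The abstract's core idea is message passing on a bipartite graph whose two node sets come from the source and target domains. As an illustration only (the function name, the use of dot-product affinities, and the fully connected edge set are assumptions, not the authors' implementation), one round of cross-domain message passing might be sketched as:

```python
import numpy as np

def bipartite_message_passing(src_feats, tgt_feats):
    """One round of message passing on a fully connected bipartite graph.

    Edges connect every source node to every target node; edge weights are
    softmax-normalized dot-product similarities, so each target node
    aggregates the source features most relevant to it, and vice versa.
    """
    # Affinity between every source/target node pair: shape (Ns, Nt).
    affinity = src_feats @ tgt_feats.T

    def softmax(x, axis):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    # Target nodes aggregate source features (normalize over sources),
    # source nodes aggregate target features (normalize over targets).
    tgt_updated = softmax(affinity, axis=0).T @ src_feats  # (Nt, d)
    src_updated = softmax(affinity, axis=1) @ tgt_feats    # (Ns, d)
    return src_updated, tgt_updated

src = np.random.default_rng(0).normal(size=(4, 8))  # 4 source nodes
tgt = np.random.default_rng(1).normal(size=(3, 8))  # 3 target nodes
s2, t2 = bipartite_message_passing(src, tgt)
print(s2.shape, t2.shape)  # (4, 8) (3, 8)
```

In DBGL the nodes would be pixel-level or category-level (semantic-level) representations; this sketch only shows the bipartite aggregation pattern itself.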
Pages: 2683-2692
Page count: 10