Open Set Domain Adaptation: Theoretical Bound and Algorithm

Cited by: 128
Authors
Fang, Zhen [1 ]
Lu, Jie [1 ]
Liu, Feng [1 ]
Xuan, Junyu [1 ]
Zhang, Guangquan [1 ]
Affiliations
[1] Univ Technol Sydney, Fac Engn & Informat Technol, Ctr Artificial Intelligence, Ultimo, NSW 2007, Australia
Funding
Australian Research Council;
Keywords
Target recognition; Task analysis; Training; Prediction algorithms; Support vector machines; Random variables; Learning systems; Domain adaptation; machine learning; open set recognition; transfer learning; REGULARIZATION; FRAMEWORK;
DOI
10.1109/TNNLS.2020.3017213
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The aim of unsupervised domain adaptation is to leverage the knowledge in a labeled (source) domain to improve a model's learning performance in an unlabeled (target) domain; the basic strategy is to mitigate the effects of discrepancies between the two distributions. Most existing algorithms can only handle unsupervised closed set domain adaptation (UCSDA), i.e., the setting where the source and target domains are assumed to share the same label set. In this article, we target a more challenging but realistic setting: unsupervised open set domain adaptation (UOSDA), where the target domain has unknown classes that are not found in the source domain. This is the first study to provide a learning bound for open set domain adaptation, which we do by theoretically investigating the risk of the target classifier on unknown classes. The proposed learning bound has a special term, namely, the open set difference, which reflects the risk of the target classifier on unknown classes. Furthermore, we present a novel and theoretically guided unsupervised algorithm for open set domain adaptation, called distribution alignment with open difference (DAOD), which is based on regularizing this open set difference bound. Experiments on several benchmark data sets show the superior performance of the proposed UOSDA method compared with state-of-the-art methods in the literature.
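The open-set setting described above, where the target domain contains classes never seen in the source domain, requires the classifier to label some samples as "unknown" at prediction time. As a minimal illustrative sketch only (generic softmax-confidence thresholding, not the paper's DAOD algorithm; the `threshold` value and function name are assumptions for illustration):

```python
import numpy as np

def predict_open_set(logits, threshold=0.5):
    """Assign a known-class label only when the top softmax probability
    exceeds `threshold`; otherwise return -1 ("unknown" class).

    Generic open-set rejection by confidence thresholding -- an
    illustration of the problem setting, not the DAOD method."""
    # Softmax with the max-subtraction trick for numerical stability.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    top_prob = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    labels[top_prob < threshold] = -1  # reject low-confidence samples
    return labels

# A confident sample keeps its class; near-uniform logits are rejected.
logits = np.array([[5.0, 0.0, 0.0],   # confident: class 0
                   [0.1, 0.0, 0.1]])  # ambiguous: unknown (-1)
print(predict_open_set(logits))  # [ 0 -1]
```

Methods such as DAOD go further than this fixed threshold: they bound and regularize the risk on unknown classes (the open set difference) while aligning the source and target distributions over the shared classes.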
Pages: 4309-4322
Page count: 14
Related Papers
56 records
[1]  
Baktashmotlagh M., 2019, ICLR
[2]  
Belkin M, 2006, J MACH LEARN RES, V7, P2399
[3]  
Ben-David S., 2006, Advances in Neural Information Processing Systems, V19
[4]   Open Set Domain Adaptation [J].
Busto, Pau Panareda ;
Gall, Juergen .
2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, :754-763
[5]   Graph Regularized Nonnegative Matrix Factorization for Data Representation [J].
Cai, Deng ;
He, Xiaofei ;
Han, Jiawei ;
Huang, Thomas S. .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2011, 33 (08) :1548-1560
[6]  
Quinonero-Candela J., 2009, Dataset Shift in Machine Learning
[7]   Partial Transfer Learning with Selective Adversarial Networks [J].
Cao, Zhangjie ;
Long, Mingsheng ;
Wang, Jianmin ;
Jordan, Michael I. .
2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, :2724-2732
[8]   Domain Adaption via Feature Selection on Explicit Feature Map [J].
Deng, Wan-Yu ;
Lendasse, Amaury ;
Ong, Yew-Soon ;
Tsang, Ivor Wai-Hung ;
Chen, Lin ;
Zheng, Qing-Hua .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (04) :1180-1190
[9]  
Fang Z., 2019, P INT JOINT C NEUR N, P1
[10]   Scatter Component Analysis: A Unified Framework for Domain Adaptation and Domain Generalization [J].
Ghifary, Muhammad ;
Balduzzi, David ;
Kleijn, W. Bastiaan ;
Zhang, Mengjie .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2017, 39 (07) :1414-1430