Domain Neural Adaptation
Cited by: 14
Authors:
Chen, Sentao [1]; Hong, Zijie [2]; Harandi, Mehrtash [3,4]; Yang, Xiaowei [2]
Affiliations:
[1] Shantou Univ, Dept Comp Sci, Shantou 515063, Peoples R China
[2] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
[3] Monash Univ, Dept Elect & Comp Syst Engn, Melbourne, Vic 3800, Australia
[4] CSIRO Data 61, Eveleigh, NSW 2015, Australia
Funding:
National Natural Science Foundation of China;
Keywords:
Adaptation models;
Probability distribution;
DNA;
Neural networks;
Kernel;
Hilbert space;
Data models;
Domain adaptation;
joint distribution matching;
neural network;
relative chi-square (RCS) divergence;
reproducing kernel Hilbert space (RKHS);
EMBEDDINGS;
NETWORK;
KERNEL;
DOI:
10.1109/TNNLS.2022.3151683
Chinese Library Classification (CLC):
TP18 [Theory of Artificial Intelligence];
Discipline classification codes:
081104 ;
0812 ;
0835 ;
1405 ;
Abstract:
Domain adaptation is concerned with the problem of generalizing a classification model to a target domain with little or no labeled data, by leveraging the abundant labeled data from a related source domain. The source and target domains possess different joint probability distributions, which makes model generalization challenging. In this article, we introduce domain neural adaptation (DNA): an approach that exploits a nonlinear deep neural network to 1) match the source and target joint distributions in the network activation space and 2) learn the classifier in an end-to-end manner. Specifically, we employ the relative chi-square divergence to compare the two joint distributions, and show that the divergence can be estimated by seeking the maximal value of a quadratic functional over the reproducing kernel Hilbert space. The analytic solution to this maximization problem enables us to explicitly express the divergence estimate as a function of the neural network mapping. We optimize the network parameters to minimize the estimated joint distribution divergence and the classification loss, yielding a classification model that generalizes well to the target domain. Empirical results on several visual datasets demonstrate that our solution is statistically better than its competitors.
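The abstract describes a concrete training recipe: estimate the relative chi-square divergence between the two joint distributions in closed form over an RKHS, and minimize it jointly with the source classification loss. The PyTorch sketch below is a rough illustration only, assuming a RuLSIF-style closed-form estimator of the relative Pearson (chi-square) divergence and a Gaussian kernel; the network architecture, kernel bandwidth, hyperparameters, and all helper names (gaussian_kernel, relative_chi_square, DNAStyleModel, training_step) are assumptions for illustration, not the authors' released implementation.

```python
# Minimal, hypothetical sketch of the idea in the abstract: match the source/target
# joint distributions of (activations, class posteriors) with a kernel-based
# relative chi-square divergence estimate, while minimizing the source loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_kernel(x, centers, sigma):
    # K[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))
    d2 = torch.cdist(x, centers).pow(2)
    return torch.exp(-d2 / (2 * sigma ** 2))

def relative_chi_square(src, tgt, alpha=0.5, sigma=1.0, lam=1e-3):
    """Closed-form (RuLSIF-style) estimate of the relative chi-square divergence
    between the empirical distributions of the src and tgt feature batches."""
    centers = src                      # use source samples as kernel centers
    Ks = gaussian_kernel(src, centers, sigma)
    Kt = gaussian_kernel(tgt, centers, sigma)
    n_s, n_t = src.size(0), tgt.size(0)
    H = alpha * Ks.t() @ Ks / n_s + (1 - alpha) * Kt.t() @ Kt / n_t
    h = Ks.mean(dim=0)
    # Analytic maximizer of the regularized quadratic functional.
    theta = torch.linalg.solve(H + lam * torch.eye(H.size(0), device=src.device), h)
    # Plug the maximizer back in to obtain the divergence estimate.
    return h @ theta - 0.5 * theta @ H @ theta - 0.5

class DNAStyleModel(nn.Module):
    def __init__(self, feat_dim=256, n_classes=31):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.LazyLinear(feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        z = self.backbone(x)
        return z, self.classifier(z)

def training_step(model, x_s, y_s, x_t, trade_off=1.0):
    z_s, logits_s = model(x_s)
    z_t, logits_t = model(x_t)
    # Represent each domain's joint distribution by (activation, class posterior).
    joint_s = torch.cat([z_s, F.softmax(logits_s, dim=1)], dim=1)
    joint_t = torch.cat([z_t, F.softmax(logits_t, dim=1)], dim=1)
    # Source classification loss + joint distribution matching term.
    return F.cross_entropy(logits_s, y_s) + trade_off * relative_chi_square(joint_s, joint_t)
```

Because the divergence estimate is an explicit, differentiable function of the network activations (the linear solve is differentiable), gradients flow back through both terms, so the classifier and the distribution-matching objective can be trained end to end as the abstract describes.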
Pages: 8630 - 8641
Number of pages: 12