Domain adaptation via Multi-Layer Transfer Learning

Cited by: 23
Authors
Pan, Jianhan [1 ]
Hu, Xuegang [1 ]
Li, Peipei [1 ]
Li, Huizong [1 ]
He, Wei [1 ]
Zhang, Yuhong [1 ]
Lin, Yaojin [2 ]
Affiliations
[1] Hefei Univ Technol, Hefei 230009, Peoples R China
[2] Minnan Normal Univ, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Transfer Learning; Non-Negative Matrix Tri-Factorization; Multi-Layer; Cross-domain classification; FUZZY SYSTEM;
DOI
10.1016/j.neucom.2015.12.097
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Transfer learning, which leverages labeled data in a source domain to train an accurate classifier for classification tasks in a target domain, has recently attracted extensive research interest, and its effectiveness has been demonstrated by many studies. Previous approaches adopt a common strategy: they model the shared structure as a bridge across different domains by reducing distribution divergences. However, those approaches entirely ignore the specific latent spaces, which can be utilized to learn non-shared concepts. Only specific latent spaces contain the specific latent factors; without them, distinct concepts cannot be learned effectively. Additionally, learning latent factors in only one latent feature space layer may miss those in the other layers, and the missing latent factors may also help to model the latent structure shared as the bridge. This paper proposes a novel transfer learning method, Multi-Layer Transfer Learning (MLTL). MLTL first generates specific latent feature spaces. Second, it combines these specific latent feature spaces with the common latent feature space into one latent feature space layer. Third, it generates multiple layers to learn the corresponding distributions on different layers simultaneously, exploiting their pluralism: learning the distributions on one layer helps to learn the distributions on the others. Furthermore, an iterative algorithm based on Non-Negative Matrix Tri-Factorization is proposed to solve the optimization problem. Comprehensive experiments demonstrate that MLTL significantly outperforms state-of-the-art learning methods on topic and sentiment classification tasks. (C) 2016 Elsevier B.V. All rights reserved.
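The iterative algorithm the abstract refers to builds on Non-Negative Matrix Tri-Factorization (NMTF), which approximates a non-negative data matrix X as the product F S Gᵀ of three non-negative factors. The paper's own multi-layer optimization is not reproduced here; the sketch below shows only the standard NMTF building block with multiplicative updates, as an assumed minimal illustration (the function name `nmtf` and the rank parameters `k`, `l` are illustrative, not from the paper):

```python
import numpy as np

def nmtf(X, k, l, n_iter=200, eps=1e-9, seed=0):
    """Non-negative matrix tri-factorization: X ~ F @ S @ G.T,
    where F is (n, k), S is (k, l), G is (m, l), all non-negative.
    Uses standard multiplicative updates that keep factors non-negative
    and monotonically reduce the Frobenius reconstruction error."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    F = rng.random((n, k)) + eps   # row (sample) cluster indicators
    S = rng.random((k, l)) + eps   # association between row/column factors
    G = rng.random((m, l)) + eps   # column (feature) cluster indicators
    for _ in range(n_iter):
        # Multiplicative updates; eps in denominators avoids division by zero.
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
    return F, S, G
```

In a transfer-learning setting of the kind the abstract describes, the column factor G would capture latent feature concepts (shared or domain-specific), while F captures document memberships per domain; MLTL stacks such factorizations across layers.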
Pages: 10-24
Number of pages: 15
References
36 items
[1]  
Blitzer J., 2007, P 45 ANN M ASS COMP, P440
[2]  
Blitzer J., 2006, P 2006 C EMP METH NA, P120, DOI 10.3115/1610075.1610094
[3]  
Boyd S, 2004, CONVEX OPTIMIZATION
[4]  
Chen Z., 2013, P 23 INT JOINT C ART, P1280
[5]   Transfer learning for activity recognition: a survey [J].
Cook, Diane ;
Feuz, Kyle D. ;
Krishnan, Narayanan C. .
KNOWLEDGE AND INFORMATION SYSTEMS, 2013, 36 (03) :537-556
[6]  
Dai W., 2007, P 24 INT C MACH LEAR, P193, DOI 10.1145/1273496.1273521
[7]  
Dai WY, 2007, KDD-2007 PROCEEDINGS OF THE THIRTEENTH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, P210
[8]   Generalized Hidden-Mapping Ridge Regression, Knowledge-Leveraged Inductive Transfer Learning for Neural Networks, Fuzzy Systems and Kernel Methods [J].
Deng, Zhaohong ;
Choi, Kup-Sze ;
Jiang, Yizhang ;
Wang, Shitong .
IEEE TRANSACTIONS ON CYBERNETICS, 2014, 44 (12) :2585-2599
[9]   Knowledge-Leverage-Based Fuzzy System and Its Modeling [J].
Deng, Zhaohong ;
Jiang, Yizhang ;
Chung, Fu-Lai ;
Ishibuchi, Hisao ;
Wang, Shitong .
IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2013, 21 (04) :597-609
[10]   Knowledge-Leverage-Based TSK Fuzzy System Modeling [J].
Deng, Zhaohong ;
Jiang, Yizhang ;
Choi, Kup-Sze ;
Chung, Fu-Lai ;
Wang, Shitong .
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2013, 24 (08) :1200-1212