Domain adaptation via Multi-Layer Transfer Learning

Cited by: 23
Authors
Pan, Jianhan [1 ]
Hu, Xuegang [1 ]
Li, Peipei [1 ]
Li, Huizong [1 ]
He, Wei [1 ]
Zhang, Yuhong [1 ]
Lin, Yaojin [2 ]
Affiliations
[1] Hefei Univ Technol, Hefei 230009, Peoples R China
[2] Minnan Normal Univ, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Transfer Learning; Non-Negative Matrix Tri-Factorization; Multi-Layer; Cross-domain classification; FUZZY SYSTEM;
DOI
10.1016/j.neucom.2015.12.097
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Transfer learning, which leverages labeled data in a source domain to train an accurate classifier for classification tasks in a target domain, has recently attracted extensive research interest, and its effectiveness has been demonstrated by many studies. Previous approaches adopt a common strategy: they model the shared structure as a bridge across domains by reducing distribution divergences. However, these approaches ignore the domain-specific latent spaces that can be used to learn non-shared concepts. Only specific latent spaces contain the specific latent factors without which distinct concepts cannot be learned effectively. Additionally, learning latent factors in only one latent feature-space layer may miss those in other layers, and these missing latent factors can also help model the shared latent structure that serves as the bridge. This paper proposes a novel transfer learning method, Multi-Layer Transfer Learning (MLTL). MLTL first generates specific latent feature spaces. Second, it combines these specific latent feature spaces with the common latent feature space into one latent feature-space layer. Third, it generates multiple layers to learn the corresponding distributions on different layers simultaneously, exploiting their pluralism: learning the distributions on one layer helps learn the distributions on the others. Furthermore, an iterative algorithm based on Non-Negative Matrix Tri-Factorization is proposed to solve the optimization problem. Comprehensive experiments demonstrate that MLTL significantly outperforms state-of-the-art learning methods on topic and sentiment classification tasks. (C) 2016 Elsevier B.V. All rights reserved.
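The abstract builds on Non-Negative Matrix Tri-Factorization (NMTF), which decomposes a non-negative document-term matrix X into F S Gᵀ, where F clusters rows (documents/concepts), G clusters columns (words), and S links the two. The sketch below shows generic NMTF via standard multiplicative updates minimizing ||X − F S Gᵀ||²_F; it is a minimal illustration of the underlying factorization only, not the multi-layer MLTL algorithm from the paper, and the function name `nmtf` and its parameters are illustrative choices.

```python
import numpy as np

def nmtf(X, k_row, k_col, n_iter=200, eps=1e-9, seed=0):
    """Non-negative matrix tri-factorization: X ~= F @ S @ G.T.

    Generic multiplicative-update sketch (squared Frobenius loss),
    not the MLTL-specific solver described in the paper.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k_row))   # row (document/concept) factors
    S = rng.random((k_row, k_col))  # association matrix
    G = rng.random((n, k_col))   # column (word) factors
    for _ in range(n_iter):
        # Each update keeps factors non-negative and does not
        # increase the reconstruction error.
        F *= (X @ G @ S.T) / (F @ (S @ G.T @ G @ S.T) + eps)
        G *= (X.T @ F @ S) / (G @ (S.T @ F.T @ F @ S) + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ (G.T @ G) + eps)
    return F, S, G
```

MLTL, per the abstract, goes further by maintaining both shared and domain-specific latent spaces and coupling factorizations across multiple layers, but each layer's subproblem has this tri-factor form.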
Pages: 10-24 (15 pages)