Multitask transfer learning with kernel representation

Cited by: 5
Authors
Zhang, Yulu [1 ]
Ying, Shihui [1 ]
Wen, Zhijie [1 ]
Affiliations
[1] Shanghai Univ, Sch Sci, Dept Math, Shanghai, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2022, Vol. 34, Issue 15
Funding
National Natural Science Foundation of China;
Keywords
Multitask transfer learning; Kernel representation; Task relation; Sparse regularization; ADAPTATION; MODEL;
DOI
10.1007/s00521-022-07126-3
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In many real-world applications, collecting and labeling data is expensive and time-consuming, so there is a need to obtain a high-performance learner by leveraging data or knowledge from other domains. Transfer learning is a promising approach to this problem. In this paper, we propose a multitask transfer learning method that improves the performance of the target learner by transferring knowledge from related source tasks. First, we formulate the target learner as a nonlinear function approximated by a linear combination of eigenfunctions. To transfer knowledge from the source tasks, we constrain the target model to be a linear combination of the source models, following prior work. However, knowledge from some source tasks may not be useful for adaptation, so we add a sparse constraint to the objective function to select the related source tasks. Unlike previous transfer learning methods, our method transfers knowledge by jointly learning the source tasks and the target task, and the sparse constraint selects the source tasks associated with the target task. Empirically, the method exhibits protection against negative transfer. Finally, we compare the proposed method with three single-task learning methods and six state-of-the-art multitask learning methods on two data sets. Relative to the second-best results, the nMSE of our method improves by 10.85% with a training size of 100 on the SARCOS data set and by 4.26% with a training ratio of 20% on the Isolet data set. The experimental results show that the proposed method effectively improves the performance of the target task by transferring knowledge from the related source tasks.
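The abstract's formulation can be illustrated concretely: the target learner is written as f_T(x) = sum_s beta_s * f_s(x), where each f_s is a kernel model trained on source task s and an L1 penalty on beta selects the related sources. Below is a minimal, hypothetical Python sketch of that idea (not the authors' code): for simplicity it trains the source models and the combination weights in two separate stages, whereas the paper learns the source tasks and the target task jointly. All function names, data, and parameter values here are illustrative assumptions.

import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic tasks (assumption): only the first two sources relate to the target.
def make_task(fn, n=200):
    X = rng.uniform(-3, 3, size=(n, 1))
    y = fn(X).ravel() + 0.1 * rng.standard_normal(n)
    return X, y

source_fns = [np.sin, np.cos, lambda x: x ** 2, np.abs]
sources = [make_task(fn) for fn in source_fns]

# Stage 1: fit one kernel model per source task (a stand-in for the paper's
# eigenfunction expansion of each nonlinear source learner).
source_models = [
    KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5).fit(X, y) for X, y in sources
]

# Small labeled target sample, drawn from a mixture of the first two sources.
Xt, yt = make_task(lambda x: 0.7 * np.sin(x) + 0.3 * np.cos(x), n=30)

# Stage 2: stack source predictions on the target data and learn sparse
# combination weights; the L1 penalty shrinks the weights of weakly
# predictive sources toward zero, which is what guards against negative
# transfer.
F = np.column_stack([m.predict(Xt) for m in source_models])
combiner = Lasso(alpha=1e-3, fit_intercept=False).fit(F, yt)
print("source weights:", np.round(combiner.coef_, 3))

# Target predictor: a sparse linear combination of the source models.
def predict_target(X):
    return np.column_stack([m.predict(X) for m in source_models]) @ combiner.coef_

On such synthetic data the learned weights should concentrate on the related sources while the others shrink toward zero, mirroring the source-selection behavior the abstract attributes to the sparse constraint.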
Pages: 12709 - 12721
Page count: 13
Related Papers
50 items in total
  • [1] Multitask transfer learning with kernel representation
    Zhang, Yulu
    Ying, Shihui
    Wen, Zhijie
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (15): 12709 - 12721
  • [2] The Benefit of Multitask Representation Learning
    Maurer, Andreas
    Pontil, Massimiliano
    Romera-Paredes, Bernardino
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [3] Learning rates of multitask kernel methods
    Sun, Haoming
    Zhang, Haizhang
    MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2023, 46 (09) : 11212 - 11228
  • [4] Transfer Feature Representation via Multiple Kernel Learning
    Wang, Wei
    Wang, Hao
    Zhang, Chen
    Xu, Fanjiang
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 3073 - 3079
  • [5] Domain Adaptation Transfer Learning by Kernel Representation Adaptation
    Chen, Xiaoyi
    Lengelle, Regis
    PATTERN RECOGNITION APPLICATIONS AND METHODS, 2018, 10857 : 45 - 61
  • [6] Multitask Learning Using Regularized Multiple Kernel Learning
    Gonen, Mehmet
    Kandemir, Melih
    Kaski, Samuel
    NEURAL INFORMATION PROCESSING, PT II, 2011, 7063 : 500 - 509
  • [7] Provable Benefit of Multitask Representation Learning in Reinforcement Learning
    Cheng, Yuan
    Feng, Songtao
    Yang, Jing
    Zhang, Hong
    Liang, Yingbin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [8] Pareto-Path Multitask Multiple Kernel Learning
    Li, Cong
    Georgiopoulos, Michael
    Anagnostopoulos, Georgios C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (01) : 51 - 61
  • [9] Scalable Multitask Representation Learning for Scene Classification
    Lapin, Maksim
    Schiele, Bernt
    Hein, Matthias
    2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 1434 - 1441