A Learner-Independent Knowledge Transfer Approach to Multi-task Learning

Cited by: 0
Authors
Shaoning Pang
Fan Liu
Youki Kadobayashi
Tao Ban
Daisuke Inoue
Affiliations
[1] Unitec Institute of Technology,Department of Computing
[2] Auckland University of Technology,School of Computing and Mathematical Sciences
[3] Nara Institute of Science and Technology,Graduate School of Information Science
[4] National Institute of Information and Communications Technology, Cybersecurity Laboratory
Source
Cognitive Computation | 2014, Vol. 6
Keywords
Multi-task learning; Knowledge transfer; Learner-independent multi-task learning; Minimum enclosing ball
DOI: not available
Abstract
This paper proposes a learner-independent multi-task learning (MTL) scheme in which knowledge transfer (KT) operates independently of the learner. In the proposed KT approach, we use minimum enclosing balls (MEBs) as knowledge carriers to extract and transfer knowledge from one task to another. Since the knowledge carried by an MEB can be decomposed into raw data, it can be incorporated into any learner as additional training data for a new learning task to improve the learning rate. The effectiveness and robustness of the proposed KT are evaluated on multi-task pattern recognition problems derived from synthetic datasets, UCI datasets, and real face image datasets, using classifiers from different disciplines for MTL. The experimental results show that multi-task learners using KT via MEB carriers outperform learners without KT, and that the approach applies successfully to different classifiers such as k-nearest neighbor and support vector machines.
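To make the knowledge-carrier idea concrete, the sketch below computes an approximate minimum enclosing ball of a point set using the Badoiu-Clarkson core-set iteration. This is an illustrative stand-in, not the paper's implementation (the authors may well use a kernel/SVDD-style MEB solver); the function name and step schedule are assumptions for the example.

```python
import numpy as np

def minimum_enclosing_ball(points, iterations=200):
    """Approximate the minimum enclosing ball (MEB) of `points` via the
    Badoiu-Clarkson iteration: repeatedly step the centre toward the
    current farthest point with a shrinking 1/(k+1) step size."""
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)                         # initial centre guess
    for k in range(1, iterations + 1):
        d = np.linalg.norm(pts - c, axis=1)      # distances to current centre
        farthest = pts[np.argmax(d)]             # point farthest from centre
        c = c + (farthest - c) / (k + 1)         # shrinking step toward it
    radius = np.linalg.norm(pts - c, axis=1).max()
    return c, radius
```

In the spirit of the paper, the ball `(c, radius)` summarizing a source task could then be decomposed back into raw points and appended to the training set of a related task, regardless of which classifier that task uses.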
Pages: 304-320 (16 pages)
Related Papers (50 in total)
  • [2] Multi-Task Learning with Knowledge Transfer for Facial Attribute Classification
    Fanhe, Xiaohui
    Guo, Jie
    Huang, Zheng
    Qiu, Weidong
    Zhang, Yuele
    2019 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT), 2019, : 877 - 882
  • [3] A Multi-Task Learning Approach for Recommendation based on Knowledge Graph
    Yan, Cairong
    Liu, Shuai
    Zhang, Yanting
    Wang, Zijian
    Wang, Pengwei
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [4] A Multi-Task and Transfer Learning based Approach for MOS Prediction
    Tian, Xiaohai
    Fu, Kaiqi
    Gao, Shaojun
    Gu, Yiwei
    Wang, Kai
    Li, Wei
    Ma, Zejun
    INTERSPEECH 2022, 2022, : 5438 - 5442
  • [5] Knowledge Transfer in Multi-Task Deep Reinforcement Learning for Continuous Control
    Xu, Zhiyuan
    Wu, Kun
    Che, Zhengping
    Tang, Jian
    Ye, Jieping
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [6] MULTI-TASK DISTILLATION: TOWARDS MITIGATING THE NEGATIVE TRANSFER IN MULTI-TASK LEARNING
    Meng, Ze
    Yao, Xin
    Sun, Lifeng
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 389 - 393
  • [7] Independent Component Alignment for Multi-Task Learning
    Senushkin, Dmitry
    Patakin, Nikolay
    Kuznetsov, Arseny
    Konushin, Anton
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 20083 - 20093
  • [8] Automating Knowledge Transfer with Multi-Task Optimization
    Scott, Eric O.
    De Jong, Kenneth A.
    2019 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2019, : 2252 - 2259
  • [9] Online Knowledge Distillation for Multi-task Learning
    Jacob, Geethu Miriam
    Agarwal, Vishal
    Stenger, Bjorn
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 2358 - 2367
  • [10] A multi-task transfer learning method with dictionary learning
    Zheng, Xin
    Lin, Luyue
    Liu, Bo
    Xiao, Yanshan
    Xiong, Xiaoming
    KNOWLEDGE-BASED SYSTEMS, 2020, 191