A Learner-Independent Knowledge Transfer Approach to Multi-task Learning

Cited by: 0
Authors
Shaoning Pang
Fan Liu
Youki Kadobayashi
Tao Ban
Daisuke Inoue
Affiliations
[1] Unitec Institute of Technology, Department of Computing
[2] Auckland University of Technology, School of Computing and Mathematical Sciences
[3] Nara Institute of Science and Technology, Graduate School of Information Science
[4] National Institute of Information and Communications Technology, Cybersecurity Laboratory
Source
Cognitive Computation | 2014, Volume 6
Keywords
Multi-task learning; Knowledge transfer; Learner-independent multi-task learning; Minimum enclosing ball
DOI
Not available
Abstract
This paper proposes a learner-independent multi-task learning (MTL) scheme in which knowledge transfer (KT) operates independently of the learner. In the proposed KT approach, minimum enclosing balls (MEBs) serve as knowledge carriers that extract and transfer knowledge from one task to another. Since the knowledge carried by an MEB can be decomposed into raw data, it can be incorporated into any learner as additional training data for a new learning task to improve the learning rate. The effectiveness and robustness of the proposed KT are evaluated on multi-task pattern recognition problems derived from synthetic datasets, UCI datasets, and real face image datasets, using classifiers from different disciplines for MTL. The experimental results show that multi-task learners using KT via MEB carriers outperform learners without KT, and that the scheme applies successfully to different classifiers such as k-nearest neighbor and support vector machines.
Pages: 304-320 (16 pages)
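
To make the idea in the abstract concrete, the following is a minimal Python sketch of MEB-based knowledge transfer between two classification tasks. It is not the authors' implementation: it assumes the Badoiu-Clarkson approximation for the MEB, uses per-class core-set points as the knowledge decomposed back into raw data, and uses a k-nearest-neighbor classifier as the (arbitrary) target learner; names such as approx_meb and meb_carrier are illustrative only.

# Minimal sketch: MEB-based knowledge transfer between two classification tasks.
# Assumptions: Badoiu-Clarkson MEB approximation, per-class core sets as the
# transferred "raw data", kNN as the (arbitrary) target learner.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def approx_meb(X, n_iter=30):
    """Approximate the minimum enclosing ball of X (Badoiu-Clarkson).
    Returns the center, radius, and indices of the core-set points."""
    c = X[0].astype(float).copy()
    core = {0}
    for i in range(1, n_iter + 1):
        j = int(np.argmax(np.linalg.norm(X - c, axis=1)))  # farthest point from center
        core.add(j)
        c += (X[j] - c) / (i + 1)                          # pull the center toward it
    r = float(np.max(np.linalg.norm(X - c, axis=1)))
    return c, r, sorted(core)

def meb_carrier(X_src, y_src):
    """Extract a compact per-class MEB core set from the source task;
    these points are the carried knowledge decomposed into raw data."""
    Xs, ys = [], []
    for label in np.unique(y_src):
        Xc = X_src[y_src == label]
        _, _, core = approx_meb(Xc)
        Xs.append(Xc[core])
        ys.append(np.full(len(core), label))
    return np.vstack(Xs), np.concatenate(ys)

# Source task: plenty of labeled data; target task: only a few labeled samples.
rng = np.random.default_rng(0)
X_src = rng.normal(0.0, 1.0, (300, 2)); y_src = (X_src[:, 0] > 0).astype(int)
X_tgt = rng.normal(0.2, 1.0, (20, 2));  y_tgt = (X_tgt[:, 0] > 0).astype(int)

# Transfer: train an off-the-shelf learner on target data plus the carrier data.
X_kt, y_kt = meb_carrier(X_src, y_src)
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(np.vstack([X_tgt, X_kt]), np.concatenate([y_tgt, y_kt]))

Because the transferred knowledge is just additional labeled points, the same carrier data could be fed unchanged to a support vector machine or any other learner, which is the sense in which the scheme is learner independent.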