A Survey on Multi-Task Learning

Cited by: 906
Authors
Zhang, Yu [1 ,2 ]
Yang, Qiang [3 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Comp Sci & Engn, Shenzhen 518055, Guangdong, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Guangdong, Peoples R China
[3] Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Keywords
Task analysis; Training; Computational modeling; Classification algorithms; Transfer learning; Supervised learning; Data models; Multi-task learning; machine learning; artificial intelligence; DEEP NEURAL-NETWORKS; MULTIPLE TASKS; CLASSIFICATION; MODEL; ALGORITHMS; REGRESSION; FRAMEWORK; TRACKING; SPARSITY; RECOVERY
DOI
10.1109/TKDE.2021.3070203
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Multi-Task Learning (MTL) is a machine learning paradigm whose aim is to leverage useful information contained in multiple related tasks to improve the generalization performance of all of them. In this paper, we survey MTL from the perspectives of algorithmic modeling, applications, and theoretical analyses. For algorithmic modeling, we first give a definition of MTL and then classify MTL algorithms into five categories: the feature learning approach, the low-rank approach, the task clustering approach, the task relation learning approach, and the decomposition approach, discussing the characteristics of each. To further improve the performance of learning tasks, MTL can be combined with other learning paradigms, including semi-supervised learning, active learning, unsupervised learning, reinforcement learning, multi-view learning, and graphical models. For settings where the number of tasks is large or the data dimensionality is high, we review online, parallel, and distributed MTL models, as well as dimensionality reduction and feature hashing, highlighting their computational and storage advantages. Many real-world applications use MTL to boost their performance, and we review representative works. Finally, we present theoretical analyses and discuss several future directions for MTL.
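The core idea the abstract describes, that related tasks can share information during training, can be illustrated with a minimal sketch. The example below is not an algorithm from the survey itself; it is a generic mean-regularized multi-task least-squares model (in the spirit of the task relation learning approaches the survey categorizes), with all data, names, and hyperparameters invented for illustration. Each task fits its own weight vector, while a penalty pulls every task's weights toward the average across tasks.

```python
import numpy as np

# Illustrative sketch: mean-regularized multi-task linear regression.
# Each task t minimizes its own squared loss plus a coupling penalty
# (lam/2) * ||w_t - w_bar||^2 that pulls w_t toward the task average.
# All data and constants here are synthetic, chosen for the demo.

rng = np.random.default_rng(0)
d, n, T = 5, 40, 3                       # features, samples per task, tasks
w_true = rng.normal(size=d)              # common signal shared by all tasks
tasks = []
for _ in range(T):
    X = rng.normal(size=(n, d))
    y = X @ (w_true + 0.1 * rng.normal(size=d))  # small task-specific shift
    tasks.append((X, y))

W = np.zeros((T, d))                     # one weight vector per task
lam, lr = 1.0, 0.1
for _ in range(1000):
    w_bar = W.mean(axis=0)               # current task average
    for t, (X, y) in enumerate(tasks):
        # gradient of the task loss plus the coupling penalty
        grad = X.T @ (X @ W[t] - y) / n + lam * (W[t] - w_bar)
        W[t] -= lr * grad

# With coupling, the learned task vectors stay close to one another
spread = float(np.linalg.norm(W - W.mean(axis=0), axis=1).max())
print(round(spread, 3))
```

Setting `lam = 0` recovers three independent regressions; increasing it trades task-specific fit for shared structure, which is the generalization benefit MTL aims for when tasks are genuinely related.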
Pages: 5586-5609
Page count: 24