The Benefit of Multitask Representation Learning

Cited by: 0
Authors
Maurer, Andreas [1 ]
Pontil, Massimiliano [2 ,3 ]
Romera-Paredes, Bernardino [4 ]
Affiliations
[1] Adalbertstr 55, D-80799 Munich, Germany
[2] Ist Italiano Tecnol, I-16163 Genoa, Italy
[3] UCL, Dept Comp Sci, London WC1E 6BT, England
[4] Univ Oxford, Dept Engn Sci, Oxford OX1 3PJ, England
Keywords
learning-to-learn; multitask learning; representation learning; statistical learning theory; transfer learning; multiple tasks; inequalities
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
We discuss a general method to learn data representations from multiple tasks. We provide a justification for this method in both settings of multitask learning and learning-to-learn. The method is illustrated in detail in the special case of linear feature learning. Conditions on the theoretical advantage offered by multitask representation learning over independent task learning are established. In particular, focusing on the important example of half-space learning, we derive the regime in which multitask representation learning is beneficial over independent task learning, as a function of the sample size, the number of tasks and the intrinsic data dimensionality. Other potential applications of our results include multitask feature learning in reproducing kernel Hilbert spaces and multilayer, deep networks.
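To make the linear feature learning setting concrete, here is a minimal numerical sketch, assuming squared loss, a shared K-dimensional linear representation B, and per-task weight vectors w_t fitted by alternating ridge regressions. This is an illustration of the setting, not the paper's algorithm; the function names (fit_shared_subspace, ridge), the regularizer lam, and the synthetic data are assumptions made for this example.

import numpy as np

def fit_shared_subspace(Xs, ys, K, lam=1e-2, iters=30, seed=0):
    # Minimize  sum_t ||X_t B^T w_t - y_t||^2 + lam * (||B||_F^2 + sum_t ||w_t||^2)
    # by exact block coordinate descent: each step below solves its block's
    # ridge problem in closed form, so the objective decreases monotonically.
    rng = np.random.default_rng(seed)
    d, T = Xs[0].shape[1], len(Xs)
    B = rng.standard_normal((K, d)) / np.sqrt(d)  # shared linear representation
    W = np.zeros((T, K))                          # per-task weight vectors w_t
    for _ in range(iters):
        # With B fixed, each task reduces to a K-dimensional ridge regression
        # on the shared features Z_t = X_t B^T.
        for t, (X, y) in enumerate(zip(Xs, ys)):
            Z = X @ B.T
            W[t] = np.linalg.solve(Z.T @ Z + lam * np.eye(K), Z.T @ y)
        # With the w_t fixed, the prediction w_t^T B x_i is linear in vec(B);
        # sample i of task t contributes the coefficient row kron(w_t, x_i),
        # so the shared B solves a single (K*d)-dimensional ridge regression.
        A = np.vstack([np.kron(W[t], X) for t, X in enumerate(Xs)])
        r = np.concatenate(ys)
        B = np.linalg.solve(A.T @ A + lam * np.eye(K * d), A.T @ r).reshape(K, d)
    return B, W

def ridge(X, y, lam=1e-2):
    # Independent-task baseline: a full d-dimensional ridge regression per task.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Toy data in the regime the abstract describes: many related tasks (T = 40),
# few samples per task (n = 15 < d = 50), all tasks sharing a K = 3 dimensional
# subspace of the input space.
rng = np.random.default_rng(1)
d, K, T, n = 50, 3, 40, 15
B_true = rng.standard_normal((K, d))
wts = [rng.standard_normal(K) for _ in range(T)]
Xs = [rng.standard_normal((n, d)) for _ in range(T)]
ys = [X @ B_true.T @ w + 0.1 * rng.standard_normal(n) for X, w in zip(Xs, wts)]

B, W = fit_shared_subspace(Xs, ys, K=K)

Xte = [rng.standard_normal((500, d)) for _ in range(T)]
yte = [X @ B_true.T @ w for X, w in zip(Xte, wts)]
mtl = np.mean([np.mean((X @ B.T @ w - y) ** 2) for X, w, y in zip(Xte, W, yte)])
ind = np.mean([np.mean((X @ ridge(Xtr, ytr) - y) ** 2)
               for X, Xtr, ytr, y in zip(Xte, Xs, ys, yte)])
print(f"multitask MSE {mtl:.4f} vs independent ridge MSE {ind:.4f}")

On such synthetic data the shared representation typically attains a far smaller held-out error than fitting each task independently, matching the regime of many related tasks, few samples per task, and low intrinsic dimensionality in which the abstract locates the benefit.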
Pages: 32
Related Papers
50 in total
  • [1] The benefit of multitask representation learning
    Maurer, Andreas
    Pontil, Massimiliano
    Romera-Paredes, Bernardino
    Journal of Machine Learning Research, 2016, 17
  • [2] Provable Benefit of Multitask Representation Learning in Reinforcement Learning
    Cheng, Yuan
    Feng, Songtao
    Yang, Jing
    Zhang, Hong
    Liang, Yingbin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022
  • [3] Multitask transfer learning with kernel representation
    Zhang, Yulu
    Ying, Shihui
    Wen, Zhijie
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (15): 12709 - 12721
  • [4] Scalable Multitask Representation Learning for Scene Classification
    Lapin, Maksim
    Schiele, Bernt
    Hein, Matthias
    2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 1434 - 1441
  • [5] Holistic Representation Learning for Multitask Trajectory Anomaly Detection
    Stergiou, Alexandros
    De Weerdt, Brent
    Deligiannis, Nikos
    2024 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION, WACV 2024, 2024, : 6715 - 6725
  • [6] Multitask Representation Learning With Multiview Graph Convolutional Networks
    Huang, Hong
    Song, Yu
    Wu, Yao
    Shi, Jia
    Xie, Xia
    Jin, Hai
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 983 - 995
  • [7] ESSR: Evolving Sparse Sharing Representation for Multitask Learning
    Zhang, Yayu
    Qian, Yuhua
    Ma, Guoshuai
    Liang, Xinyan
    Liu, Guoqing
    Zhang, Qingfu
    Tang, Ke
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2024, 28 (03) : 748 - 762
  • [8] Understanding Inverse Scaling and Emergence in Multitask Representation Learning
    Ildiz, M. Emrullah
    Zhao, Zhe
    Oymak, Samet
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [9] Multitask Representation Learning for Multimodal Estimation of Depression Level
    Qureshi, Syed Arbaaz
    Saha, Sriparna
    Hasanuzzaman, Mohammed
    Dias, Gael
    IEEE INTELLIGENT SYSTEMS, 2019, 34 (05) : 45 - 52