Learning rates of multitask kernel methods

Cited by: 0
Authors
Sun, Haoming [1 ]
Zhang, Haizhang [1 ,2 ]
Affiliations
[1] Sun Yat-sen Univ, Sch Math Zhuhai, Zhuhai 519082, Peoples R China
[2] Sun Yat-sen Univ, Guangdong Prov Key Lab Computat Sci, Guangzhou 510000, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
coefficient regularization; learning rates; matrix-valued reproducing kernels; multitask learning; regularization networks; vector-valued reproducing kernel Hilbert spaces
DOI: 10.1002/mma.9176
Chinese Library Classification
O29 (Applied Mathematics)
Subject Classification Code
070104
Abstract
Learning theory aims at building a solid mathematical foundation for machine learning. A core objective of learning theory is to estimate the learning rates of various learning algorithms in order to analyze their generalization ability. So far, most such research has focused on single-task kernel methods; there is little parallel work on the learning rates of multitask kernel methods. We present an analysis of the learning rates for multitask regularization networks and multitask l2-norm coefficient regularization. Compared to existing work on learning rate estimates for multitask regularization networks, our study is more broadly applicable in that we do not require the regression function to lie in the vector-valued reproducing kernel Hilbert space of the chosen matrix-valued reproducing kernel. Our work on the learning rate of multitask l2-norm coefficient regularization is new. For both methods, our results reveal a quantitative dependency of the learning rates on the number of tasks, which is also new in the literature.
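To fix ideas, the multitask regularization network studied in the abstract can be sketched as vector-valued kernel ridge regression with a matrix-valued kernel. The sketch below assumes a separable kernel K(x, x') = k(x, x') B, with k a scalar Gaussian kernel and B a T x T positive semidefinite matrix coupling the T tasks; the function names, the Gaussian choice of k, and the separable form are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def multitask_regularization_network(X, Y, B, lam, gamma=1.0):
    """Illustrative multitask regularization network: vector-valued
    kernel ridge regression with the separable matrix-valued kernel
    K(x, x') = k(x, x') * B, where k is a Gaussian kernel and B is a
    T x T positive semidefinite task-coupling matrix.

    X : (n, d) inputs, Y : (n, T) task outputs. Returns a predictor.
    """
    n = X.shape[0]
    # Scalar Gaussian kernel Gram matrix k(x_i, x_j)
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Separable kernel: the (nT) x (nT) Gram matrix is the Kronecker
    # product K ⊗ B, matching the sample-major stacking of Y below.
    G = np.kron(K, B)
    y = Y.reshape(-1)
    # Regularization-network solution: (G + n*lam*I) c = y
    c = np.linalg.solve(G + n * lam * np.eye(G.shape[0]), y)

    def predict(Xnew):
        sq_new = np.sum(Xnew ** 2, axis=1)
        Knew = np.exp(-gamma * (sq_new[:, None] + sq[None, :]
                                - 2 * Xnew @ X.T))
        return (np.kron(Knew, B) @ c).reshape(len(Xnew), -1)

    return predict
```

With B = I the tasks decouple into T independent single-task ridge regressions; an off-diagonal B shares information across tasks, which is the regime where a dependency of learning rates on the number of tasks becomes relevant.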
Pages: 11212-11228 (17 pages)
Related Papers (50 total)
  • [1] On the Benefits of Large Learning Rates for Kernel Methods
    Beugnot, Gaspard
    Mairal, Julien
    Rudi, Alessandro
    CONFERENCE ON LEARNING THEORY, VOL 178, 2022, 178 : 254 - 282
  • [3] Multitask transfer learning with kernel representation
    Zhang, Yulu
    Ying, Shihui
    Wen, Zhijie
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (15): 12709 - 12721
  • [4] Multitask Learning Using Regularized Multiple Kernel Learning
    Gonen, Mehmet
    Kandemir, Melih
    Kaski, Samuel
    NEURAL INFORMATION PROCESSING, PT II, 2011, 7063 : 500 - 509
  • [5] Pareto-Path Multitask Multiple Kernel Learning
    Li, Cong
    Georgiopoulos, Michael
    Anagnostopoulos, Georgios C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (01) : 51 - 61
  • [6] Multitask Kernel-based Learning with Logic Constraints
    Diligenti, Michelangelo
    Gori, Marco
    Maggini, Marco
    Rigutini, Leonardo
    ECAI 2010 - 19TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2010, 215 : 433 - 438
  • [7] Inferring latent task structure for Multitask Learning by Multiple Kernel Learning
    Widmer, Christian
    Toussaint, Nora C.
    Altun, Yasemin
    Raetsch, Gunnar
    BMC BIOINFORMATICS, 2010, 11
  • [9] Efficient Multitask Multiple Kernel Learning With Application to Cancer Research
    Rahimi, Arezou
    Gonen, Mehmet
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (09) : 8716 - 8728
  • [10] A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems
    Li, Cong
    Georgiopoulos, Michael
    Anagnostopoulos, Georgios C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (07) : 1287 - 1297