Finding Age Path of Self-Paced Learning

Cited by: 4
Authors
Gu, Bin [1 ]
Zhai, Zhou [2 ]
Li, Xiang [3 ]
Huang, Heng [4 ]
Affiliations
[1] Mohamed bin Zayed University of Artificial Intelligence, Department of Machine Learning, Abu Dhabi, United Arab Emirates
[2] Nanjing University of Information Science and Technology, School of Computer and Software, Nanjing, China
[3] University of Western Ontario, Department of Computer Science, London, ON, Canada
[4] University of Pittsburgh, Department of Electrical and Computer Engineering, Pittsburgh, PA, USA
Source
2021 21st IEEE International Conference on Data Mining (ICDM 2021), 2021
Keywords
Self-paced learning; age parameter; solution path algorithm; regularization path
DOI
10.1109/ICDM51629.2021.00025
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Self-paced learning (SPL) is an emerging topic in machine learning research and is often formulated as a biconvex problem. The choice of the age parameter in SPL controls the learning pace and is crucial for achieving optimal performance. Traditionally, the age parameter is programmed to increase at a fixed rate while the SPL problem is solved with the alternative optimization strategy (AOS). However, this simple heuristic is likely to miss the optimal age parameter, especially when efficiency is a major concern. To address this problem, we propose a solution path method, APSPL, which can track the optimal solutions of SPL as the age parameter changes (the age path). Specifically, we replace the original biconvex problem with a difference-of-convex (DC) formulation, which enables us to derive the path-following algorithm. For better efficiency, our algorithm uses a decremental and incremental training strategy to avoid retraining from scratch at different age values. We theoretically prove that the solutions produced by APSPL are the same as those generated by traditional SPL solvers, and we provide a finite-time convergence proof for APSPL. To demonstrate the applicability of APSPL, we present an extension for semi-supervised classification. To the best of our knowledge, APSPL is the first solution path algorithm for self-paced learning. Experimental results on a variety of benchmark datasets not only verify the effectiveness and efficiency of APSPL over traditional SPL, but also show the advantage of using the optimal age parameter.
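The abstract does not spell out the SPL objective, so for context, a standard SPL formulation (following Kumar et al., 2010; the exact objective used in this paper may differ) in which the age parameter \lambda appears is

\min_{\mathbf{w},\, \mathbf{v} \in [0,1]^n} \; \sum_{i=1}^{n} v_i \, L\big(y_i, f(\mathbf{x}_i; \mathbf{w})\big) + r(\mathbf{w}) - \lambda \sum_{i=1}^{n} v_i,

where the v_i are per-sample weights and r(\mathbf{w}) is a regularizer. Under the hard weighting scheme, for fixed \mathbf{w} the optimal weights have the closed form v_i^* = 1 if L_i < \lambda and v_i^* = 0 otherwise, so increasing \lambda (the model's "age") gradually admits harder examples. APSPL tracks the optimal solution as \lambda varies, instead of re-solving the problem at a fixed schedule of age values.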
Pages: 151-160
Number of pages: 10