RFN: A Random-Feature Based Newton Method for Empirical Risk Minimization in Reproducing Kernel Hilbert Spaces

Cited by: 2
Authors
Chang, Ting-Jui [1 ]
Shahrampour, Shahin [1 ]
Affiliation
[1] Northeastern Univ, Dept Mech & Ind Engn, Boston, MA 02115 USA
Keywords
Newton method; optimization algorithms; risk minimization; Hessian approximation; random features; OPTIMIZATION METHODS; CONVERGENCE;
DOI
10.1109/TSP.2022.3219993
CLC Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Codes
0808; 0809;
Abstract
In supervised learning with kernel methods, we often encounter a large-scale finite-sum minimization over a reproducing kernel Hilbert space (RKHS). Large-scale finite-sum problems can be solved using efficient variants of Newton's method, where the Hessian is approximated via sub-samples of data. In an RKHS, however, the dependence of the penalty function on the kernel makes standard sub-sampling approaches inapplicable, since the Gram matrix is not readily available in a low-rank form. In this paper, we observe that for this class of problems, one can naturally use kernel approximation to speed up the Newton method. Focusing on randomized features for kernel approximation, we provide a novel second-order algorithm that enjoys local superlinear convergence and global linear convergence (with high probability). We derive a theoretical lower bound on the number of random features required for the approximate Hessian to be close to the true Hessian in the norm sense. Our numerical experiments on real-world data verify the efficiency of our method compared to several benchmarks.
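The idea described in the abstract, replacing the kernel with randomized features so the Newton system becomes a small dense solve, can be illustrated with a minimal sketch. This is not the authors' RFN algorithm: it assumes random Fourier features for a Gaussian kernel and a squared loss with ridge penalty, for which the objective is quadratic and one Newton step is exact; the names `rff_features` and `rfn_step` are illustrative.

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier features: z(x) = sqrt(2/s) * cos(W x + b),
    # so z(x)^T z(y) approximates the Gaussian kernel k(x, y).
    s = W.shape[0]
    return np.sqrt(2.0 / s) * np.cos(X @ W.T + b)

def rfn_step(Z, y, w, lam):
    # One Newton step for the regularized least-squares objective
    # f(w) = (1/2n) ||Z w - y||^2 + (lam/2) ||w||^2 in feature space.
    n, s = Z.shape
    grad = Z.T @ (Z @ w - y) / n + lam * w
    # The Hessian is only s x s, independent of the sample size n.
    H = Z.T @ Z / n + lam * np.eye(s)
    return w - np.linalg.solve(H, grad)

rng = np.random.default_rng(0)
n, d, s = 200, 5, 50
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0])
W = rng.standard_normal((s, d))          # Gaussian kernel, unit bandwidth
b = rng.uniform(0.0, 2.0 * np.pi, s)
Z = rff_features(X, W, b)
w = rfn_step(Z, y, np.zeros(s), lam=1e-2)
```

Because the loss here is quadratic, the gradient vanishes after a single step; for general smooth losses the Hessian would be re-weighted by the loss curvature at each iterate, which is where the paper's sub-sampling-free approximation matters.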
Pages: 5308-5319
Page count: 12
References
46 records
[1] Agarwal N., 2017, J MACH LEARN RES, V18.
[2] [Anonymous], RIC TYP CLASS.
[3] [Anonymous], CARD DIS DAT.
[4] [Anonymous], 2015, Proceedings of the 28th International Conference on Neural Information Processing Systems-Volume.
[5] [Anonymous], 2012, AISTATS.
[6] Avron H., 2017, PR MACH LEARN RES, V70.
[7] Blackard J.A., Dean D.J. Comparative accuracies of artificial neural networks and discriminant analysis in predicting forest cover types from cartographic variables. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 1999, 24(03):131-151.
[8] Bollapragada R., Byrd R.H., Nocedal J. Exact and inexact subsampled Newton methods for optimization. IMA JOURNAL OF NUMERICAL ANALYSIS, 2019, 39(02):545-578.
[9] Bordes A., 2009, J MACH LEARN RES, V10, P1737.
[10] Bottou L., Curtis F.E., Nocedal J. Optimization Methods for Large-Scale Machine Learning. SIAM REVIEW, 2018, 60(02):223-311.