Accelerating Cross-Validation in Multinomial Logistic Regression with l1-Regularization

Cited by: 0
Authors
Obuchi, Tomoyuki [1 ]
Kabashima, Yoshiyuki [1 ]
Affiliations
[1] Tokyo Inst Technol, Dept Math & Comp Sci, Meguro Ku, 2-12-1 Ookayama, Tokyo, Japan
Keywords
classification; multinomial logistic regression; cross-validation; linear perturbation; self-averaging approximation; FEEDFORWARD NEURAL-NETWORKS; MEAN-FIELD APPROACH; ALGORITHMS; REGULARIZATION; EXPECTATION; MODELS; BOUNDS;
DOI
Not available
Chinese Library Classification (CLC) number
TP [Automation Technology; Computer Technology]
Discipline classification code
0812
Abstract
We develop an approximate formula for evaluating a cross-validation estimator of predictive likelihood for multinomial logistic regression regularized by an l1-norm. This allows us to avoid the repeated optimizations required when cross-validation is conducted literally; hence, the computational time can be significantly reduced. The formula is derived through a perturbative approach exploiting the largeness of the data size and the model dimensionality. An extension to elastic net regularization is also addressed. The usefulness of the approximate formula is demonstrated on simulated data and on the ISOLET dataset from the UCI Machine Learning Repository. MATLAB and Python codes implementing the approximate formula are distributed publicly (Obuchi, 2017; Takahashi and Obuchi, 2017).
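The paper's perturbative formula for l1-regularized multinomial logistic regression is too involved for a short sketch, but the economy it buys ("one fit instead of n refits") can be illustrated in the simplest analogous setting: for ridge (l2) linear regression, the leave-one-out residuals follow exactly from a single fit via the leverage values h_ii of the regularized hat matrix. The sketch below is a pure-Python illustration of that idea, not the paper's released code; all function names are invented for this example.

```python
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in reversed(range(n)):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def ridge_fit(X, y, lam):
    """beta = (X'X + lam*I)^{-1} X'y."""
    Xt = transpose(X)
    G = matmul(Xt, X)
    for i in range(len(G)):
        G[i][i] += lam
    rhs = [sum(Xt[i][j] * y[j] for j in range(len(y))) for i in range(len(G))]
    return solve(G, rhs)

def predict(x, beta):
    return sum(a * b for a, b in zip(x, beta))

def loo_literal(X, y, lam):
    """Literal leave-one-out CV: n separate refits (the cost to be avoided)."""
    errs = []
    for i in range(len(y)):
        beta = ridge_fit(X[:i] + X[i + 1:], y[:i] + y[i + 1:], lam)
        errs.append(y[i] - predict(X[i], beta))
    return errs

def loo_shortcut(X, y, lam):
    """Single fit; LOO residual_i = residual_i / (1 - h_ii),
    with h_ii = x_i' (X'X + lam*I)^{-1} x_i (exact for ridge)."""
    beta = ridge_fit(X, y, lam)
    Xt = transpose(X)
    G = matmul(Xt, X)
    for i in range(len(G)):
        G[i][i] += lam
    errs = []
    for i, xi in enumerate(X):
        gi = solve(G, list(xi))  # (X'X + lam*I)^{-1} x_i
        h = sum(a * b for a, b in zip(xi, gi))
        errs.append((y[i] - predict(xi, beta)) / (1.0 - h))
    return errs

X = [[1.0, 0.5], [1.0, 1.5], [1.0, 2.0], [1.0, 3.5], [1.0, 4.0]]
y = [1.1, 1.9, 2.4, 3.8, 4.3]
lit = loo_literal(X, y, 0.1)
fast = loo_shortcut(X, y, 0.1)
# For ridge the shortcut is exact, so lit and fast agree to machine precision.
```

In the paper's setting the loss is not quadratic and the l1 penalty is non-smooth, so the analogous single-fit estimate is only approximate and requires the perturbative and self-averaging arguments developed there; the ridge case above merely shows the structure of the speed-up.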
Pages: 30