A Novel Regularization Paradigm for the Extreme Learning Machine

Cited by: 7
Authors
Zhang, Yuao [1 ]
Dai, Yunwei [1 ]
Wu, Qingbiao [1 ]
Affiliations
[1] Zhejiang Univ, Sch Math Sci, 38 Zheda Rd, Hangzhou 310027, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Extreme learning machine (ELM); Robustness; Generalization; Convexity; Convergence analysis; RIDGE-REGRESSION; ELM; SCHEME;
DOI
10.1007/s11063-023-11248-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Due to its fast training speed and powerful approximation capability, the extreme learning machine (ELM) has attracted considerable attention in recent years. However, the basic ELM still has drawbacks, such as a tendency to over-fit and susceptibility to noisy data. By adding a regularization term to the basic ELM, the regularized extreme learning machine (R-ELM) dramatically improves generalization and stability. In the R-ELM, choosing an appropriate regularization parameter is critical, since it regulates the trade-off between the fitting and generalization capabilities of the model. In this paper, we propose the regularized functional extreme learning machine (RF-ELM), which employs a regularization functional instead of a preset regularization parameter to choose appropriate regularization parameters adaptively. The regularization functional is defined in terms of the output weights, and a successive approximation iterative algorithm solves for the output weights, so their values are obtained simultaneously at each iteration step. We also develop a parallel version of RF-ELM (PRF-ELM) to handle big-data tasks. Furthermore, analyses of convexity and convergence ensure the validity of the model training. Finally, experiments on function approximation and on UCI repository datasets, with and without noisy data, demonstrate the superiority and competitiveness of the proposed models.
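The baseline the abstract builds on can be sketched in a few lines. The following is a minimal R-ELM sketch with a fixed ridge parameter `lam` (the paper's RF-ELM instead replaces `lam` with a regularization functional of the output weights, solved by successive approximation; that iterative scheme, and all function/parameter names here, are illustrative assumptions, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_relm(X, y, n_hidden=50, lam=1e-4):
    """Regularized ELM: random hidden layer, closed-form output weights.

    Output weights solve the ridge system
        beta = (H^T H + lam * I)^{-1} H^T y,
    where H is the hidden-layer output matrix.
    """
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)       # random hidden biases
    H = np.tanh(X @ W + b)                  # hidden-layer output matrix
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: approximate sin(x) on [0, 2*pi]
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = train_relm(X, y)
err = np.max(np.abs(predict(X, W, b, beta) - y))
```

Because only `beta` is fitted, training reduces to one linear solve, which is what gives ELMs their speed; the `lam * I` term is what the abstract credits for improved generalization and stability on noisy data.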
Pages: 7009-7033 (25 pages)