A Novel Regularization Paradigm for the Extreme Learning Machine
Cited by: 7
Authors:
Zhang, Yuao [1]; Dai, Yunwei [1]; Wu, Qingbiao [1]
Affiliations:
[1] Zhejiang Univ, Sch Math Sci, 38 Zheda Rd, Hangzhou 310027, Zhejiang, Peoples R China
Funding:
National Natural Science Foundation of China;
Keywords:
Extreme learning machine (ELM);
Robustness;
Generalization;
Convexity;
Convergence analysis;
RIDGE-REGRESSION;
ELM;
SCHEME;
DOI:
10.1007/s11063-023-11248-7
Chinese Library Classification:
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405;
Abstract:
Due to its fast training speed and powerful approximation capability, the extreme learning machine (ELM) has attracted considerable attention in recent years. However, the basic ELM still has some drawbacks, such as a tendency to over-fit and a susceptibility to noisy data. By adding a regularization term to the basic ELM, the regularized extreme learning machine (R-ELM) can dramatically improve its generalization and stability. In the R-ELM, choosing an appropriate regularization parameter is critical, since it regulates the fitting and generalization capabilities of the model. In this paper, we propose the regularized functional extreme learning machine (RF-ELM), which employs a regularization functional instead of a preset regularization parameter, so that appropriate regularization parameters are chosen adaptively. The regularization functional is defined in terms of the output weights, and a successive approximation iterative algorithm is used to solve for the output weights, so their values are obtained simultaneously at each iteration step. We also develop a parallel version of RF-ELM (PRF-ELM) for big data tasks. Furthermore, analyses of convexity and convergence ensure the validity of the model training. Finally, experiments on function approximation and on UCI repository datasets, with and without noisy data, demonstrate the superiority and competitiveness of the proposed models.
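To make the baseline the abstract builds on concrete, the following is a minimal sketch of a standard R-ELM with a *fixed* regularization parameter C. It is not the paper's RF-ELM: the paper replaces the preset parameter with a regularization functional of the output weights solved by successive approximation, which is not reproduced here. All names (`relm_train`, `relm_predict`, the choice of `tanh` activation, the value of `C`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relm_train(X, T, n_hidden=50, C=1e3, seed=0):
    """Regularized ELM (R-ELM) baseline with a fixed regularization
    parameter C. The paper's RF-ELM instead chooses the regularization
    adaptively via a functional of the output weights (not shown here)."""
    rng = np.random.default_rng(seed)
    # Core ELM idea: input weights W and biases b are random and stay fixed.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Ridge solution for the output weights:
    # beta = (H^T H + I / C)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def relm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy function-approximation task: fit y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)
W, b, beta = relm_train(X, T)
mse = float(np.mean((relm_predict(X, W, b, beta) - T) ** 2))
print(mse)
```

Only the output weights `beta` are trained, via a single linear solve; this is what gives ELM its fast training speed, and the `I / C` term is the regularization that the R-ELM adds to the basic ELM.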
Pages: 7009-7033
Page count: 25