A deep residual compensation extreme learning machine and applications

Cited by: 34
Authors
Chen, Yinghao [1 ]
Xie, Xiaoliang [2 ]
Zhang, Tianle [1 ]
Bai, Jiaxian [3 ]
Hou, Muzhou [1 ]
Affiliations
[1] Cent South Univ, Coll Math & Stat, Changsha 410083, Peoples R China
[2] Hunan Univ Technol & Business, Sch Math & Stat, Changsha, Peoples R China
[3] Hunan Univ, Coll Finance & Stat, Changsha, Peoples R China
Keywords
airfoil self-noise; deep residual compensation extreme learning machine; extreme learning machine; gold price forecasting; regression problem; AIC MODEL SELECTION; OPTIMIZATION; REDUCTION; ALGORITHM;
DOI
10.1002/for.2663
Chinese Library Classification (CLC)
F [Economics];
Subject classification code
02;
Abstract
The extreme learning machine (ELM) is a machine learning algorithm for training single-hidden-layer feedforward neural networks. The weights between the input layer and the hidden layer and the bias of each hidden neuron are initialized randomly, after which the output-weight matrix of the hidden layer is computed by the least squares method. This efficient training procedure makes ELM widely applicable to classification, regression, and other tasks. However, because the residual still contains unexploited information, ELM can leave relatively large prediction errors. In this paper, a deep residual compensation extreme learning machine (DRC-ELM), a multilayer model for regression, is presented. The first layer is the basic ELM layer, which yields an approximation of the objective function by learning the characteristics of the sample. The remaining layers are residual compensation layers: each constructs a feature mapping from the input layer to the output of the layer above it and corrects, layer by layer, the residual of the prediction obtained in the previous layer. The model is applied to two practical problems: gold price forecasting and airfoil self-noise prediction. Experiments with DRC-ELM using 50, 100, and 200 residual compensation layers show that it achieves better generalization and robustness than the classical ELM, improved ELM variants such as GA-RELM and OS-ELM, and other traditional machine learning algorithms such as the support vector machine (SVM) and the back-propagation neural network (BPNN).
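The following is a minimal Python sketch of the layer-by-layer idea described in the abstract, not the authors' implementation: it assumes each residual compensation layer is itself an ELM fitted by least squares to the current residual from the original inputs, and the names DRCELM, elm_fit, n_hidden, and n_comp_layers are illustrative choices.

    import numpy as np

    def elm_fit(X, T, n_hidden, rng):
        # Basic ELM layer: random input weights and biases, least-squares output weights.
        W = rng.standard_normal((X.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W + b)            # hidden-layer feature mapping
        beta = np.linalg.pinv(H) @ T      # Moore-Penrose least-squares solution
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    class DRCELM:
        # Sketch of a deep residual compensation ELM for regression (assumed structure).
        def __init__(self, n_hidden=50, n_comp_layers=50, seed=0):
            self.n_hidden = n_hidden
            self.n_comp_layers = n_comp_layers
            self.rng = np.random.default_rng(seed)
            self.layers = []

        def fit(self, X, y):
            residual = y.reshape(-1, 1)
            # One basic ELM layer plus n_comp_layers residual compensation layers.
            for _ in range(1 + self.n_comp_layers):
                W, b, beta = elm_fit(X, residual, self.n_hidden, self.rng)
                self.layers.append((W, b, beta))
                # Pass the still-unexplained part of the target to the next layer.
                residual = residual - elm_predict(X, W, b, beta)
            return self

        def predict(self, X):
            # Sum the base prediction and all layer-by-layer corrections.
            pred = np.zeros((X.shape[0], 1))
            for W, b, beta in self.layers:
                pred += elm_predict(X, W, b, beta)
            return pred.ravel()

Under these assumptions, each added layer can only reduce the training residual, which mirrors the paper's description of correcting the previous layer's prediction layer by layer.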
Pages: 986-999
Page count: 14