Recursive reduced kernel based extreme learning machine for aero-engine fault pattern recognition

Cited by: 27
Authors
You, Cheng-Xin [1 ,3 ]
Huang, Jin-Quan [1 ,2 ]
Lu, Feng [1 ,2 ,3 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Energy & Power Engn, Jiangsu Prov Key Lab Aerosp Power Syst, Nanjing 210016, Peoples R China
[2] Collaborat Innovat Ctr Adv Aeroengine, Beijing 100191, Peoples R China
[3] Aviat Ind Corp China, Aviat Motor Control Syst Inst, Wuxi 214063, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Extreme learning machine; Kernel method; Sparseness; Reduced technique; Aero-engine; Fault pattern recognition; REGRESSION; ENSEMBLE;
DOI
10.1016/j.neucom.2016.06.069
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Kernel based extreme learning machine (K-ELM) achieves better generalization performance than basic ELM with fewer tuned parameters in most applications. However, the original K-ELM lacks sparseness, so the model scale grows linearly with the sample size. This paper focuses on the sparsity of K-ELM and proposes the recursive reduced kernel based extreme learning machine (RR-KELM). The proposed algorithm chooses the samples that contribute most to the target function to constitute the kernel dictionary, while still considering all the constraints generated by the whole training set. As a result, it simplifies the model structure and realizes the sparseness of K-ELM. Experimental results on benchmark datasets show that, for both regression and classification problems, RR-KELM produces a more compact model structure and better real-time performance than other methods. The application of RR-KELM to aero-engine fault pattern recognition is also given in this paper. The simulation results demonstrate that RR-KELM achieves a high recognition rate on aero-engine fault patterns based on measurable aero-engine parameters. (C) 2016 Elsevier B.V. All rights reserved.
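The reduced-kernel idea described in the abstract can be sketched as follows. This is an illustrative greedy reduction over an RBF kernel dictionary, not the paper's exact RR-KELM recursion: the function names, the residual-correlation scoring rule, and all parameter values below are assumptions for illustration. The key property it shares with the abstract's description is that only a small dictionary of samples enters the model, while the least-squares fit still uses the constraints from the whole training set.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of A and the rows of B.
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def rr_kelm_fit(X, Y, n_dict=20, C=100.0, gamma=1.0):
    # Illustrative greedy reduced K-ELM (NOT the paper's exact algorithm):
    # repeatedly add the sample whose kernel column correlates most with the
    # current residual, then re-solve a ridge-regularized least squares using
    # all N training constraints, so the model stays compact (|D| << N).
    N = X.shape[0]
    K = rbf_kernel(X, X, gamma)          # N x N kernel over the full training set
    chosen, remaining = [], list(range(N))
    resid = Y.copy()
    for _ in range(n_dict):
        # Score each candidate dictionary column against the residual.
        scores = [abs(K[:, j] @ resid) for j in remaining]
        j = remaining.pop(int(np.argmax(scores)))
        chosen.append(j)
        Kd = K[:, chosen]                # N x |D| reduced kernel matrix
        # Regularized least squares over ALL N constraints (1/C acts as ridge).
        beta = np.linalg.solve(Kd.T @ Kd + np.eye(len(chosen)) / C, Kd.T @ Y)
        resid = Y - Kd @ beta
    return X[chosen], beta               # dictionary samples and output weights

def rr_kelm_predict(Xd, beta, Xq, gamma=1.0):
    # Prediction only needs kernels against the small dictionary, not all N samples.
    return rbf_kernel(Xq, Xd, gamma) @ beta
```

Because prediction touches only the `n_dict` dictionary samples, inference cost no longer grows with the training-set size, which is the real-time advantage the abstract claims for RR-KELM over the dense K-ELM solution.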
Pages: 1038-1045
Page count: 8
Related papers
30 entries in total
[1] [Anonymous], REDUCED KERNEL EXTRE
[2] Bai, Zuo; Huang, Guang-Bin; Wang, Danwei; Wang, Han; Westover, M. Brandon. Sparse Extreme Learning Machine for Classification [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2014, 44(10): 1858-1870
[3] de Kruif, BJ; de Vries, TJA. Pruning error minimization in least squares support vector machines [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14(03): 696-702
[4] DeCastro J., 2008, 44 AIAA ASME SAE ASE, DOI 10.2514/6.2008-4579
[5] Dreiseitl, S; Ohno-Machado, L. Logistic regression and artificial neural network classification models: a methodology review [J]. JOURNAL OF BIOMEDICAL INFORMATICS, 2002, 35(5-6): 352-359
[6] Er MJ, 2014, IEEE IJCNN, P770, DOI 10.1109/IJCNN.2014.6889397
[7] HANSEN, LK; SALAMON, P. NEURAL NETWORK ENSEMBLES [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1990, 12(10): 993-1001
[8] Hoegaerts L, 2004, LECT NOTES COMPUT SC, V3316, P1247
[9] Huang, Gao; Huang, Guang-Bin; Song, Shiji; You, Keyou. Trends in extreme learning machines: A review [J]. NEURAL NETWORKS, 2015, 61: 32-48
[10] Huang, Guang-Bin; Chen, Lei. Convex incremental extreme learning machine [J]. NEUROCOMPUTING, 2007, 70(16-18): 3056-3062