Symbolic Regression Based Extreme Learning Machine Models for System Identification

Cited by: 8
Authors
Kokturk-Guzel, Basak Esin [1]
Beyhan, Selami [1]
Affiliations
[1] Izmir Democracy University, Department of Electrical & Electronics Engineering, Gursel Aksel Bulvari 14, Izmir, Turkey
Keywords
Symbolic regression; Extreme learning machine; System identification; Parsimonious model
DOI
10.1007/s11063-021-10465-2
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Reproducible machine learning models with few parameters and fast optimization are preferred in embedded system design for artificial intelligence applications. Owing to its implementation advantages, symbolic regression with genetic programming has been used for data modeling. In addition, extreme learning machines achieve acceptable performance by virtue of their random learning strategy. In this paper, symbolic-regression-based extreme learning machine models are proposed for system identification. The symbolic regression layer, built from mathematical operators and basis functions, is constructed randomly rather than evolved by genetic programming, whereas the output weights are optimized via least-squares estimation, as in extreme learning machines. Consequently, implementable, efficient, and easily designed models are obtained for future applications. Comparative results show that the proposed models yield smaller mean-squared error and minimum description length values than the models from the literature.
Pages: 1565-1578
Number of pages: 14
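
As the abstract describes, the method replaces the genetic-programming search of classical symbolic regression with a randomly constructed layer of symbolic features, and then fits only the output weights by least squares, as in an extreme learning machine. The following minimal Python sketch illustrates that general idea under assumptions of our own: the operator sets (UNARY, BINARY), the pairwise feature construction in random_symbolic_layer, and the toy data are hypothetical and are not taken from the paper.

```python
# Minimal sketch of a symbolic-regression-style ELM (hypothetical construction,
# not the authors' exact formulation): a hidden layer of randomly generated
# symbolic features is fixed, and only the output weights are fit by least squares.
import numpy as np

rng = np.random.default_rng(0)

# Assumed candidate unary basis functions and binary operators.
UNARY = [np.sin, np.cos, np.tanh, lambda z: z, lambda z: z**2]
BINARY = [np.add, np.multiply]

def random_symbolic_layer(n_inputs, n_nodes, rng):
    """Generate n_nodes random symbolic features over random input pairs."""
    nodes = []
    for _ in range(n_nodes):
        f = UNARY[rng.integers(len(UNARY))]        # random basis function
        g = BINARY[rng.integers(len(BINARY))]      # random binary operator
        i, j = rng.integers(n_inputs, size=2)      # random pair of inputs
        w = rng.normal(size=2)                     # random input weights
        b = rng.normal()                           # random bias
        nodes.append((f, g, i, j, w, b))
    return nodes

def hidden_matrix(X, nodes):
    """Evaluate all random symbolic features on the data matrix X."""
    cols = [f(g(w[0] * X[:, i], w[1] * X[:, j]) + b)
            for f, g, i, j, w, b in nodes]
    return np.column_stack(cols)

# Toy identification data: a nonlinear function of two inputs plus noise.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=200)

nodes = random_symbolic_layer(n_inputs=2, n_nodes=20, rng=rng)
H = hidden_matrix(X, nodes)
beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # least-squares output weights
print("training MSE:", np.mean((H @ beta - y) ** 2))
```

Because the random symbolic layer is fixed once generated, model selection (for example by a minimum description length criterion, as compared in the paper) reduces to choosing the number of random symbolic nodes and re-solving the linear least-squares problem for the output weights.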