An Interpretable Constructive Algorithm for Incremental Random Weight Neural Networks and Its Application

Cited by: 2
Authors
Nan, Jing [1 ,2 ]
Dai, Wei [3 ]
Yuan, Guan [3 ]
Zhou, Ping [4 ,5 ]
Affiliations
[1] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221116, Jiangsu, Peoples R China
[2] Singapore Univ Technol & Design, Engn Prod Dev Pillar, Singapore 487372, Singapore
[3] China Univ Min & Technol, Sch Informat & Control Engn, Sch Comp Sci & Technol, Engn Res Ctr Digitizat Mine, Minist Educ, Xuzhou 221116, Jiangsu, Peoples R China
[4] Northeastern Univ, State Key Lab Synthet Automat Proc Ind, Shenyang 110819, Peoples R China
[5] Minist Educ, Key Lab Coal Proc & Efficient Utilizat, Xuzhou 221116, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Data modeling; interpretable constructive algorithm; neural networks (NNs); random algorithms; spatial geometric information; STOCHASTIC CONFIGURATION NETWORKS; FUNCTION APPROXIMATION;
DOI
10.1109/TII.2024.3423487
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
In this article, we aim to offer an interpretable learning paradigm for incremental random weight neural networks (IRWNNs). IRWNNs have become an active research direction in neural network algorithms owing to their ease of deployment and fast learning speed. However, existing IRWNNs have difficulty explaining how hidden nodes (parameters) affect the convergence of the network residuals. To address this gap, this article proposes an interpretable constructive algorithm (ICA). Specifically, we first conduct a spatial geometric analysis of the network construction process and establish the spatial geometric relationship between the network residuals and the hidden parameters, which visualizes the influence of the hidden parameters on residual convergence. Second, based on this spatial geometric relationship and a node pool strategy, an interpretable control strategy with spatial geometry information is established to obtain hidden parameters conducive to the convergence of the network residuals. In addition, to enable ICA to handle complex big-data tasks, this article proposes a lightweight, low-complexity variant, namely ICA+. Finally, it is proved theoretically that the proposed ICA and ICA+ possess the universal approximation property. Experimental results on two real-world datasets and seven benchmark datasets demonstrate the advantages of ICA and ICA+ in terms of fast learning, good generalization, and compactness of the network structure.
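The constructive procedure summarized in the abstract can be illustrated with a minimal Python/NumPy sketch. It assumes a sigmoid activation, a one-dimensional regression target, and a cosine-of-angle criterion between the current residual and each candidate node's output as a stand-in for the paper's spatial geometric selection rule; the function build_irwnn_geometric, the node pool size, and all other parameter names are illustrative and are not the authors' implementation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def build_irwnn_geometric(X, y, max_nodes=50, pool_size=30, tol=1e-3, seed=0):
    """Incrementally add random hidden nodes, picking from a candidate pool
    the node whose output vector is geometrically closest (smallest angle)
    to the current residual; output weights are refit by least squares."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    H = np.empty((n, 0))            # hidden-layer output matrix
    params = []                     # (w, b) of accepted hidden nodes
    residual = y.astype(float).copy()

    for _ in range(max_nodes):
        best_cos, best = -1.0, None
        # node pool strategy: draw several random candidates, keep the one
        # best aligned with the residual direction
        for _ in range(pool_size):
            w = rng.uniform(-1.0, 1.0, size=d)
            b = rng.uniform(-1.0, 1.0)
            h = sigmoid(X @ w + b)
            cos = abs(residual @ h) / (
                np.linalg.norm(residual) * np.linalg.norm(h) + 1e-12)
            if cos > best_cos:
                best_cos, best = cos, (w, b, h)
        w, b, h = best
        params.append((w, b))
        H = np.column_stack([H, h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
        residual = y - H @ beta
        if np.linalg.norm(residual) < tol:
            break
    return params, beta

The cosine criterion makes the selection step interpretable in the spirit of the abstract: a candidate whose output points nearly along the residual can remove more of it in one step, so the angle directly explains why a hidden node is accepted or rejected.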
Pages: 13622-13632
Page count: 11