A sparse deep belief network with efficient fuzzy learning framework

Cited by: 59
Authors
Wang, Gongming [1 ]
Jia, Qing-Shan [1 ]
Qiao, Junfei [2 ]
Bi, Jing [2 ]
Liu, Caixia [3 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[3] Peking Univ, Dept Environm Engn, Beijing 100871, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep belief network; Deep learning; Sparse representation; Fuzzy neural network; Nonlinear system modeling; NONLINEAR-SYSTEM IDENTIFICATION; NEURAL-NETWORKS; CLASSIFICATION; REPRESENTATION; STABILITY; ALGORITHM;
DOI
10.1016/j.neunet.2019.09.035
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep belief network (DBN) is one of the most feasible ways to realize the deep learning (DL) technique, and it has attracted increasing attention in nonlinear system modeling. However, DBN cannot provide satisfactory results in terms of learning speed, modeling accuracy and robustness, mainly because of dense representation and gradient diffusion. To address these problems and to promote the development of DBN-based cross-models, we propose a Sparse Deep Belief Network with Fuzzy Neural Network (SDBFNN) for nonlinear system modeling. In this framework, the sparse DBN serves as a pre-training technique that realizes fast weight initialization and produces feature vectors; the sparsity counteracts the dense representation and thereby improves robustness. A fuzzy neural network is then developed for supervised modeling so as to eliminate gradient diffusion, taking the obtained feature vector as its input. As a novel cross-model, SDBFNN combines the advantages of the pre-training technique and the fuzzy neural network to improve modeling capability, and its convergence is also analyzed. Experiments on a benchmark problem and a practical wastewater-treatment problem demonstrate the superiority of SDBFNN: the extensive results show that it outperforms existing methods in learning speed, modeling accuracy and robustness. (C) 2019 Elsevier Ltd. All rights reserved.
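To make the two-stage idea in the abstract concrete, the following is a minimal, self-contained Python sketch of that workflow: a restricted Boltzmann machine pre-trained with contrastive divergence plus a sparsity penalty yields hidden-layer feature vectors, and a simple TSK-style fuzzy regressor is then fitted on those features by least squares. This is only an illustrative approximation of the described structure, not the authors' SDBFNN; the class names (SparseRBM, FuzzyRegressor) and all hyperparameters (rho, sparse_lr, n_rules, width) are assumptions made for this example.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SparseRBM:
    # Bernoulli RBM trained with one-step contrastive divergence (CD-1);
    # a sparsity term nudges the mean hidden activation toward a small
    # target rho, standing in for the sparse pre-training stage.
    def __init__(self, n_vis, n_hid, lr=0.05, rho=0.05, sparse_lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))
        self.b_v = np.zeros(n_vis)
        self.b_h = np.zeros(n_hid)
        self.lr, self.rho, self.sparse_lr = lr, rho, sparse_lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def fit(self, X, epochs=20, batch=32):
        for _ in range(epochs):
            for i in range(0, len(X), batch):
                v0 = X[i:i + batch]
                ph0 = self.hidden_probs(v0)
                h0 = (rng.random(ph0.shape) < ph0).astype(float)
                pv1 = sigmoid(h0 @ self.W.T + self.b_v)
                ph1 = self.hidden_probs(pv1)
                n = len(v0)
                self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / n
                self.b_v += self.lr * (v0 - pv1).mean(axis=0)
                self.b_h += self.lr * (ph0 - ph1).mean(axis=0)
                # sparsity penalty: pull average hidden activity toward rho
                self.b_h += self.sparse_lr * (self.rho - ph0.mean(axis=0))
        return self

class FuzzyRegressor:
    # TSK-style fuzzy model: Gaussian memberships around rule centres,
    # normalised firing strengths, linear consequents fitted by least squares.
    def __init__(self, n_rules=10, width=1.0):
        self.n_rules, self.width = n_rules, width

    def _firing(self, Z):
        d2 = ((Z[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / (2.0 * self.width ** 2))
        return w / (w.sum(axis=1, keepdims=True) + 1e-12)

    def _design(self, Z):
        W = self._firing(Z)
        return np.hstack([W * Z[:, [j]] for j in range(Z.shape[1])] + [W])

    def fit(self, Z, y):
        idx = rng.choice(len(Z), self.n_rules, replace=False)
        self.centers = Z[idx]  # rule centres drawn from the data
        self.theta, *_ = np.linalg.lstsq(self._design(Z), y, rcond=None)
        return self

    def predict(self, Z):
        return self._design(Z) @ self.theta

# toy usage: pre-train on the raw inputs, then fit the fuzzy stage on the
# hidden-layer feature vectors, mirroring the two-stage structure sketched above
X = rng.random((500, 16))
y = np.sin(X[:, :4].sum(axis=1)) + 0.05 * rng.standard_normal(500)
features = SparseRBM(n_vis=16, n_hid=8).fit(X).hidden_probs(X)
model = FuzzyRegressor(n_rules=12, width=0.5).fit(features, y)
print("train RMSE:", np.sqrt(np.mean((model.predict(features) - y) ** 2)))

Solving the fuzzy consequent parameters in closed form, rather than backpropagating through a deep stack, reflects the abstract's motivation of avoiding gradient diffusion in the supervised stage.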
Pages: 430-440
Number of pages: 11