Low-Complexity Nonlinear Adaptive Filter Based on a Pipelined Bilinear Recurrent Neural Network

Cited by: 38
Authors
Zhao, Haiquan [1]
Zeng, Xiangping [2]
He, Zhengyou [3]
Affiliations
[1] SW Jiaotong Univ, Sch Elect Engn, Chengdu 610031, Peoples R China
[2] SW Jiaotong Univ, Sch Informat Sci & Technol, Chengdu 610031, Peoples R China
[3] SW Jiaotong Univ, Sch Elect Engn, Chengdu 610031, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2011, Vol. 22, No. 9
Funding
U.S. National Science Foundation;
Keywords
Bilinear recurrent neural network; pipelined architecture; pipelined recurrent neural network; real-time recurrent learning; Volterra filter; TRAINABLE AMPLITUDE; LEARNING ALGORITHM; MPEG VIDEO; PREDICTION; IDENTIFICATION; BACKPROPAGATION; EQUALIZATION; SPEECH;
DOI
10.1109/TNN.2011.2161330
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
To reduce the computational complexity of the bilinear recurrent neural network (BLRNN), a novel low-complexity nonlinear adaptive filter based on a pipelined bilinear recurrent neural network (PBLRNN) is presented in this paper. The PBLRNN, inheriting the modular architecture of the pipelined RNN proposed by Haykin and Li, comprises a number of BLRNN modules cascaded in a chained form, each implemented as a small-scale BLRNN with internal dynamics. Because the modules of the PBLRNN can be executed simultaneously in a pipelined, parallel fashion, computational efficiency is significantly improved. Moreover, the nesting of modules further improves the performance of the PBLRNN. To suit the modular architecture, a modified adaptive-amplitude real-time recurrent learning algorithm is derived based on the gradient descent approach. Extensive simulations are carried out to evaluate the performance of the PBLRNN on nonlinear system identification, nonlinear channel equalization, and chaotic time series prediction. The experimental results show that the PBLRNN provides considerably better performance than the single BLRNN and RNN models.
Pages: 1494-1507
Number of pages: 14
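
Illustrative sketch: the abstract describes a chained cascade of small BLRNN modules, each fed a few tap-delayed input samples plus the delayed output of the neighbouring module, with all modules executable in parallel. The Python fragment below is a minimal sketch of that modular structure under simplifying assumptions; the class names (BLRNNModule, PBLRNN), the parameters (taps, rec_dim), the single-output tanh module equation, and the weight initialization are all hypothetical choices, and the paper's modified adaptive-amplitude real-time recurrent learning rule is not implemented, so the forward pass runs untrained.

import numpy as np


class BLRNNModule:
    """One small bilinear recurrent module (simplified, hypothetical form).

    The pre-activation combines a linear term in the current inputs x, a linear
    term in the module's own delayed outputs, and bilinear products of the two:
        s = a.x + b.y_hist + y_hist.C.x,    y = tanh(s)
    """

    def __init__(self, in_dim, rec_dim, rng):
        self.a = 0.1 * rng.standard_normal(in_dim)              # feedforward weights
        self.b = 0.1 * rng.standard_normal(rec_dim)             # recurrent weights
        self.C = 0.1 * rng.standard_normal((rec_dim, in_dim))   # bilinear weights
        self.y_hist = np.zeros(rec_dim)                         # delayed module outputs

    def step(self, x):
        s = self.a @ x + self.b @ self.y_hist + self.y_hist @ self.C @ x
        y = np.tanh(s)
        self.y_hist = np.roll(self.y_hist, 1)   # shift the output delay line
        self.y_hist[0] = y
        return y


class PBLRNN:
    """Chained cascade of identical small BLRNN modules, in the spirit of the
    pipelined RNN of Haykin and Li: module m is fed `taps` delayed input samples
    plus the one-step-delayed output of module m+1, and the first module
    (index 0 here) produces the overall filter output."""

    def __init__(self, n_modules=3, taps=4, rec_dim=2, seed=0):
        rng = np.random.default_rng(seed)
        self.modules = [BLRNNModule(taps + 1, rec_dim, rng) for _ in range(n_modules)]
        self.taps = taps
        self.prev_outputs = np.zeros(n_modules)   # module outputs from the last step

    def step(self, x_window):
        """x_window: recent input samples, newest first,
        length >= taps + number of modules - 1."""
        new_outputs = np.zeros(len(self.modules))
        for m, module in enumerate(self.modules):
            # Each module only needs delayed quantities (a zero stands in for the
            # missing neighbour of the last module), so all modules could be
            # stepped concurrently -- the pipelined parallelism the abstract cites.
            y_next = self.prev_outputs[m + 1] if m + 1 < len(self.modules) else 0.0
            x = np.concatenate(([y_next], x_window[m:m + self.taps]))
            new_outputs[m] = module.step(x)
        self.prev_outputs = new_outputs
        return new_outputs[0]


# Toy usage: run the untrained filter as a one-step predictor of a noisy sine.
if __name__ == "__main__":
    net = PBLRNN(n_modules=3, taps=4, rec_dim=2)
    rng = np.random.default_rng(1)
    signal = np.sin(0.3 * np.arange(200)) + 0.01 * rng.standard_normal(200)
    window = 4 + 3 - 1                       # taps + n_modules - 1 past samples
    for n in range(window, 200):
        y_hat = net.step(signal[n - 1::-1][:window])   # predict signal[n]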