Nonlinear Identification Using Orthogonal Forward Regression With Nested Optimal Regularization

Cited by: 18
Authors
Hong, Xia [1 ]
Chen, Sheng [2 ,3 ]
Gao, Junbin [4 ]
Harris, Chris J. [2 ]
Affiliations
[1] Univ Reading, Sch Syst Engn, Reading RG6 6AY, Berks, England
[2] Univ Southampton, Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
[3] King Abdulaziz Univ, Fac Engn, Jeddah 21589, Saudi Arabia
[4] Charles Sturt Univ, Sch Comp & Math, Bathurst, NSW 2795, Australia
Keywords
Cross validation (CV); forward regression; identification; leave-one-out (LOO) errors; nonlinear system; regularization; least squares; model construction; algorithm; optimization; selection; representations
DOI
10.1109/TCYB.2015.2389524
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each RBF kernel has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each associated with one kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE criterion is adopted for model selection as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed new OFR algorithm is likewise capable of producing a very sparse RBF model with excellent generalization performance. Unlike the LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel widths, the proposed new OFR algorithm optimizes both the kernel widths and the regularization parameters within a single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples demonstrate the effectiveness of this new approach in comparison with the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
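To make the nested selection concrete, the following is a minimal, illustrative Python sketch of the idea summarized in the abstract, not the authors' algorithm: candidate Gaussian RBF centres are taken from the training points, and each forward-regression step picks the centre, kernel width, and regularization parameter that jointly minimize the LOO mean square error. The paper's efficient incremental orthogonalization and closed-form LOOMSE updates are replaced here by a direct hat-matrix computation and a simple grid search; the function names (rbf_column, loo_mse, ofr_loomse), the width/regularization grids, and the stopping tolerance are all assumptions made for illustration.

```python
import numpy as np

def rbf_column(X, centre, width):
    """Gaussian RBF responses of every sample in X to a single centre."""
    d2 = np.sum((X - centre) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_mse(Phi, y, lam_vec):
    """LOO mean square error of locally regularized least squares y ~ Phi w,
    one ridge parameter per column, via the standard shortcut e_i / (1 - h_ii)."""
    A = Phi.T @ Phi + np.diag(lam_vec)
    w = np.linalg.solve(A, Phi.T @ y)
    H = Phi @ np.linalg.solve(A, Phi.T)          # hat (smoother) matrix
    e_loo = (y - Phi @ w) / (1.0 - np.diag(H))
    return float(np.mean(e_loo ** 2)), w

def ofr_loomse(X, y, widths, lams, max_terms=10, tol=1e-8):
    """Greedy forward selection of RBF centres (taken from the data points);
    each new term gets its own width and regularization parameter, chosen by
    grid search to minimize the LOOMSE of the growing model."""
    n = X.shape[0]
    chosen, cols, lam_list = [], [], []
    prev_loomse = np.inf
    for _ in range(max_terms):
        best = None                              # (loomse, idx, width, lam, column)
        for i in range(n):
            if i in chosen:
                continue
            for width in widths:
                col = rbf_column(X, X[i], width)
                Phi = np.column_stack(cols + [col])
                for lam in lams:
                    loomse, _ = loo_mse(Phi, y, np.array(lam_list + [lam]))
                    if best is None or loomse < best[0]:
                        best = (loomse, i, width, lam, col)
        if best is None or prev_loomse - best[0] < tol:
            break                                # stop once the LOOMSE stops improving
        prev_loomse = best[0]
        chosen.append(best[1])
        cols.append(best[4])
        lam_list.append(best[3])
    return chosen, lam_list, prev_loomse

# Toy usage (illustrative only): fit a noisy 1-D nonlinear map.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
centres, lambdas, loomse = ofr_loomse(X, y, widths=[0.3, 0.6, 1.2],
                                      lams=[1e-6, 1e-3, 1e-1], max_terms=8)
```

The LOOMSE both ranks candidate terms and terminates the model growth, which is the mechanism the abstract describes; the exhaustive grid search here stands in for the per-term width and regularization optimization performed analytically within the paper's OFR loop.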
Pages: 2925-2936
Number of pages: 12