A statistical self-organizing learning system for remote sensing classification

Cited by: 21
Authors
Chi, HM [1 ]
Ersoy, OK [1 ]
Affiliation
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2005, Vol. 43, No. 8
Keywords
artificial neural networks; hyperspectral image; classification; machine learning; overfitting; support vector machine (SVM); t-test;
DOI
10.1109/TGRS.2005.851188
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline codes
0708; 070902;
Abstract
A new learning system called a statistical self-organizing learning system (SSOLS), combining functional-link neural networks, statistical hypothesis testing, and self-organization of a number of enhancement nodes, is introduced for remote sensing applications. Its structure consists of two stages, a mapping stage and a learning stage. The input training vectors are initially mapped to the enhancement vectors in the mapping stage by multiplying with a random matrix, followed by pointwise nonlinear transformations. Starting with only one enhancement node, the enhancement layer incrementally adds an extra node in each iteration. The optimum dimension of the enhancement layer is determined by using an efficient leave-one-out cross-validation method. In this way, the number of enhancement nodes is also learned automatically. A t-test algorithm can also be applied to the mapping stage to mitigate the effect of overfitting and to further reduce the number of enhancement nodes required, resulting in a more compact network. In the learning stage, both the input vectors and the enhancement vectors are fed into a least squares learning module to obtain the estimated output vectors. This is made possible by making the output layer linear. In addition, several SSOLSs can be trained independently in parallel to form a consensual SSOLS, whose final output is a linear combination of the outputs of each SSOLS module. The SSOLS is simple, fast to compute, and suitable for remote sensing applications.
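The mapping-plus-least-squares core described in the abstract can be sketched as below for a fixed number of enhancement nodes. This is a minimal illustration, not the paper's exact method: the Gaussian random matrix, the tanh nonlinearity, and the function names are assumptions, and the self-organizing node search, t-test pruning, and consensual combination are omitted.

```python
import numpy as np

def ssols_fit(X, Y, n_enh, seed=0):
    """Sketch of the SSOLS mapping and learning stages (assumed details).

    X: (n_samples, n_features) input vectors
    Y: (n_samples, n_outputs) target vectors
    n_enh: number of enhancement nodes (learned automatically in the paper)
    """
    rng = np.random.default_rng(seed)
    # Mapping stage: multiply inputs by a random matrix, then apply a
    # pointwise nonlinearity (tanh assumed here) to get enhancement vectors.
    W = rng.standard_normal((X.shape[1], n_enh))
    H = np.tanh(X @ W)
    # Learning stage: feed both input vectors and enhancement vectors
    # into a linear output layer solved by least squares.
    A = np.hstack([X, H])
    beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return W, beta

def ssols_predict(X, W, beta):
    # Apply the same random mapping, then the learned linear output layer.
    A = np.hstack([X, np.tanh(X @ W)])
    return A @ beta
```

Because the output layer is linear, training reduces to a single least squares solve, which is what makes the system fast to compute; the paper's leave-one-out cross-validation would wrap this fit to choose `n_enh`.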
Pages: 1890-1900 (11 pages)
References (23 items)
[1] [Anonymous], 2012, Methods of Multivariate Analysis
[2] Benediktsson, JA; Sveinsson, JR; Ersoy, OK; Swain, PH. Parallel consensual neural networks [J]. IEEE Transactions on Neural Networks, 1997, 8(1): 54-64.
[4] Casella, G., 2021, STAT INFERENCE
[5] Chen, CLP; Wan, JZ. A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction [J]. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 1999, 29(1): 62-72.
[6] Chen, CLP; LeClair, SR; Pao, YH. An incremental adaptive implementation of functional-link processing for function approximation, time-series prediction, and system identification [J]. Neurocomputing, 1998, 18(1-3): 11-31.
[7] CHI HM, 2002, P 2002 INT ENG SYST, V11
[8] COWEN CC, 1996, LINEAR ALGEBRA ENG S
[9] Fukunaga, K., 1990, INTRO STAT PATTERN R
[10] Haykin, S., 1999, NEURAL NETWORK COMPR