Multimodal pattern recognition by modular neural network

Cited by: 3
Authors
Yang, SL [1 ]
Chang, KC [1 ]
Affiliations
[1] George Mason Univ, Sch Informat Technol & Engn, Ctr Excellence Command Control Commun & Intellige, Fairfax, VA 22030 USA
Keywords
multilayer perceptron; neural network; modular; multimodal distribution; automatic target recognition; learning; local minimum;
DOI
10.1117/1.602016
Chinese Library Classification
O43 [Optics]
Discipline Classification Codes
070207; 0803
Abstract
Multilayer perceptrons (MLPs) have been widely applied to pattern recognition. It is found that when the data have a multimodal distribution, a standard MLP is hard to train and a valid neural network classifier is difficult to obtain. We propose a two-phase learning modular (TLM) neural network architecture to tackle this problem. The basic idea is to transform the multimodal distribution into a known and more learnable distribution, and then use a standard MLP to classify the new data. The transformation is accomplished by decomposing the input feature space into several subspaces and training several MLPs with the samples in each subset. We verified this idea with a two-class classification example, applied the TLM to inverse synthetic aperture radar (ISAR) automatic target recognition (ATR), and compared its performance with that of the MLP. Experiments show that the MLP is difficult to train, and that its performance depends strongly on the number of training samples as well as on the architecture parameters. The TLM, on the other hand, is much easier to train, yields better performance, and its performance is more robust. (C) 1998 Society of Photo-Optical Instrumentation Engineers.
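The sketch below illustrates the two-phase modular idea described in the abstract: decompose the training data into subsets, train one MLP module per subset, and then train a standard MLP on the transformed (module-output) representation. It is a minimal sketch assuming scikit-learn; the per-class k-means decomposition, the use of concatenated module probabilities as the transformed features, and all layer sizes are illustrative assumptions, not the paper's exact TLM formulation.

```python
# Minimal TLM-style sketch (illustrative assumptions, not the authors' exact method).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier


def decompose(X, y, n_modes=2, seed=0):
    """Split the training set into n_modes subsets.

    Each class is clustered separately into its modes, and the k-th mode of
    every class is grouped into subset k, so every subset contains all classes
    but covers only one mode per class (an assumed decomposition strategy).
    """
    subsets = [[] for _ in range(n_modes)]
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        modes = KMeans(n_clusters=n_modes, n_init=10,
                       random_state=seed).fit_predict(X[idx])
        for k in range(n_modes):
            subsets[k].extend(idx[modes == k].tolist())
    return [np.asarray(s) for s in subsets]


def fit_tlm(X, y, n_modes=2, seed=0):
    # Phase 1: train one MLP module per subset; each module sees a
    # less multimodal portion of the data and learns a local boundary.
    modules = []
    for idx in decompose(X, y, n_modes, seed):
        m = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=seed)
        m.fit(X[idx], y[idx])
        modules.append(m)

    # Transformation: map every sample to the concatenated module outputs,
    # which should follow a simpler, more learnable distribution.
    Z = np.hstack([m.predict_proba(X) for m in modules])

    # Phase 2: a standard MLP classifies the transformed data.
    gate = MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=seed)
    gate.fit(Z, y)
    return modules, gate


def predict_tlm(modules, gate, X):
    Z = np.hstack([m.predict_proba(X) for m in modules])
    return gate.predict(Z)


# Hypothetical usage:
# X_train, y_train, X_test = ...   # e.g. ISAR features and target labels
# modules, gate = fit_tlm(X_train, y_train, n_modes=3)
# y_hat = predict_tlm(modules, gate, X_test)
```

On a synthetic two-class problem whose classes each contain several modes, each phase-1 module only needs to learn a local decision boundary, which is the intuition behind transforming the multimodal distribution into one that a standard MLP can learn more easily.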
Pages: 650-659
Page count: 10