Single-layered complex-valued neural network for real-valued classification problems

Cited by: 131
Authors
Amin, Md. Faijul [1 ]
Murase, Kazuyuki [1 ,2 ]
Affiliations
[1] Univ Fukui, Dept Human & Artificial Intelligence Syst, Grad Sch Engn, Fukui 9108507, Japan
[2] Univ Fukui, Res & Educ Program Life Sci, Fukui 9108507, Japan
Keywords
Activation function; Classification; Complex-valued neural networks; Generalization; Phase-encoding; Backpropagation algorithm
DOI
10.1016/j.neucom.2008.04.006
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper presents a model of complex-valued neuron (CVN) for real-valued classification problems, introducing two new activation functions. In this CVN model, each real-valued input is encoded into a phase between 0 and π of a complex number of unity magnitude and multiplied by a complex-valued weight. The weighted sum of inputs is then fed to an activation function. Both proposed activation functions map complex values into real values, and their role is to divide the net-input (weighted-sum) space into multiple regions representing the classes of input patterns. Gradient-based learning rules are derived for each activation function. The ability of such a CVN is discussed and tested on two-class problems, such as two- and three-input Boolean problems and symmetry detection in binary sequences. We show that the CVN with either activation function can form proper decision boundaries for these linear and nonlinear problems. For solving n-class problems, a complex-valued neural network (CVNN) consisting of n CVNs is also studied, in which the neuron exhibiting the largest output determines the output class. We tested such single-layered CVNNs on several real-world benchmark problems. The results show that the classification ability of the single-layered CVNN on unseen data is comparable to that of a conventional real-valued neural network (RVNN) with one hidden layer. Moreover, the CVNN converges much faster than the RVNN in most cases. (C) 2008 Elsevier B.V. All rights reserved.
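For illustration only, the sketch below (Python/NumPy, not taken from the paper) shows how the phase encoding, the complex weighted sum, and the winner-take-all decision over n CVNs described in the abstract could be wired together. The complex-to-real activation used here (the magnitude of the net input) and the min-max scaling of the inputs are placeholders, since the two activation functions proposed in the paper are not given in this record.

import numpy as np

def phase_encode(x, x_min, x_max):
    """Map real-valued features onto phases in [0, pi] on the unit circle."""
    theta = np.pi * (x - x_min) / (x_max - x_min)
    return np.exp(1j * theta)          # unit-magnitude complex numbers

def cvn_output(x, w, b, x_min, x_max):
    """One CVN: complex weighted sum of phase-encoded inputs, then a
    placeholder complex-to-real activation (|net|)."""
    z = phase_encode(x, x_min, x_max)
    net = np.dot(w, z) + b             # complex net input
    return np.abs(net)                 # real-valued output

def predict(x, weights, biases, x_min, x_max):
    """n-class single-layered CVNN: one CVN per class; the neuron with the
    largest real-valued output gives the predicted class."""
    outputs = [cvn_output(x, w, b, x_min, x_max) for w, b in zip(weights, biases)]
    return int(np.argmax(outputs))

# Toy usage: 3 inputs, 2 classes, random complex weights (illustration only)
rng = np.random.default_rng(0)
weights = [rng.normal(size=3) + 1j * rng.normal(size=3) for _ in range(2)]
biases = [0 + 0j, 0 + 0j]
x = np.array([0.2, 0.7, 0.5])
print(predict(x, weights, biases, x_min=0.0, x_max=1.0))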
Pages: 945-955
Number of pages: 11