Convergence analysis of a discrete Hopfield neural network with delay and its application to knowledge refinement

Cited by: 9
Authors
Tsang, Eric C. C.
Qiu, S. S.
Yeung, Daniel S.
Affiliations
[1] Hong Kong Polytech Univ, Dept Comp, Kowloon, Peoples R China
[2] S China Univ Technol, Coll Automat Sci & Engn, Guangzhou, Guangdong, Peoples R China
Keywords
discrete Hopfield neural network; delay; convergence; stable state;
DOI
10.1142/S0218001407005491
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper investigates convergence theorems for a Discrete Hopfield Neural Network (DHNN) with delay. We present two updating rules, one for serial mode and one for parallel mode, which converge faster than all existing updating rules. We prove that a DHNN with delay converges to a stable state in serial mode if the weight matrix of the no-delay term is symmetric, and in parallel mode if that matrix is symmetric and non-negative definite. The convergence condition for a DHNN without delay can be relaxed from requiring a symmetric matrix to the weaker condition of a quasi-symmetric matrix. These results extend both the existing convergence results for a DHNN without delay and our previous findings. Using the new network structure and its convergence theorems, we propose a local search algorithm for combinatorial optimization. We also relate the maximum value of a bivariate energy function to the stable states of a DHNN with delay, which generalizes Hopfield's energy function. Moreover, for the serial mode we give the relationship between the convergence of the energy function and the convergence of the corresponding network. One application is presented to demonstrate the higher rate of convergence and the classification accuracy of our algorithm.
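The paper's specific updating rules are not reproduced in this record, but the serial-mode dynamics it describes can be sketched in the standard form of a DHNN with a one-step delay term. The sketch below is an illustrative assumption, not the authors' rule: each neuron is updated in turn by a sign threshold applied to the no-delay weights `W0` times the current state plus the delay weights `W1` times the previous state, and a state is declared stable when it repeats across consecutive sweeps. The function name and the stopping criterion are choices made here for illustration.

```python
import numpy as np

def serial_dhnn_with_delay(W0, W1, b, x_init, max_sweeps=100):
    """Illustrative serial-mode run of a discrete Hopfield network with delay.

    Assumed update rule (not the paper's exact algorithm):
        x_i <- sgn( W0[i] . x  +  W1[i] . x_prev  +  b[i] ),
    where x is the current state and x_prev the state one sweep earlier.
    The paper's theorem says serial mode converges when W0 is symmetric.
    """
    x_prev = np.array(x_init, dtype=int)  # x(t-1)
    x = x_prev.copy()                     # x(t)
    for _ in range(max_sweeps):
        old = x.copy()
        for i in range(len(x)):           # serial mode: one neuron at a time
            h = W0[i] @ x + W1[i] @ x_prev + b[i]
            x[i] = 1 if h >= 0 else -1
        # Stable state: x(t+1) = x(t) = x(t-1)
        if np.array_equal(x, old) and np.array_equal(x, x_prev):
            return x
        x_prev = old
    return x
```

With a symmetric `W0`, positive coupling between two neurons drives them to agree; for example, `W0 = [[0, 1], [1, 0]]` with zero delay weights and bias maps the state `[1, -1]` to the stable state `[-1, -1]`.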
Pages: 515-541 (27 pages)