Foundations of Implementing the Competitive Layer Model by Lotka-Volterra Recurrent Neural Networks

Cited by: 53
Authors
Yi, Zhang [1 ]
Affiliation
[1] Sichuan Univ, Coll Comp Sci, Machine Intelligence Lab, Chengdu 610065, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2010, Vol. 21, No. 3
Funding
U.S. National Science Foundation;
Keywords
Competitive layer model (CLM); convergence; energy functions; Lotka-Volterra recurrent neural networks (LV RNNs); minimum points; stable attractors; LINEAR TRANSFER-FUNCTIONS; WINNERS-SHARE-ALL; SENSORY SEGMENTATION; FEATURE BINDING; FORBIDDEN SETS; VISUAL-CORTEX; NEURONS; MULTISTABILITY; OPTIMIZATION; ACTIVATION;
DOI
10.1109/TNN.2009.2039758
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The competitive layer model (CLM) can be described by an optimization problem, which can in turn be formulated through an energy function, called the CLM energy function, on the nonnegative orthant. The set of minimum points of the CLM energy function forms the set of solutions of the CLM problem, so solving the CLM problem means finding these minimum points. Recurrent neural networks (RNNs) can be used to implement the CLM and thereby solve the CLM problem; the key is to make the set of minimum points of the CLM energy function correspond exactly to the set of stable attractors of the network. This paper proposes using Lotka-Volterra RNNs (LV RNNs) to implement the CLM. The contribution of this paper is to establish foundations for implementing the CLM by LV RNNs, in three main parts. The first part concerns the CLM energy function: necessary and sufficient conditions for its minimum points are established through detailed study. The second part concerns the convergence of the proposed LV RNN model: the trajectories of interest are proven to be convergent. The third part is the most important: it proves that the set of stable attractors of the proposed LV RNN exactly equals the set of minimum points of the CLM energy function in the nonnegative orthant. Thus, LV RNNs can be used to solve the CLM problem. It is believed that, by establishing such rigorous basic theory, further interesting applications of the CLM can be found.
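To make the attractor-based computation concrete, the following is a minimal, hypothetical sketch of Lotka-Volterra RNN dynamics of the generic form dx_i/dt = x_i (h_i + Σ_j W_ij x_j), integrated by forward Euler on the nonnegative orthant. The weight matrix, inputs, step size, and winner-take-all setup are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def simulate_lv_rnn(W, h, x0, dt=0.01, steps=20000):
    """Integrate generic LV RNN dynamics dx/dt = x * (h + W @ x)
    from x0, keeping the state in the nonnegative orthant.
    (Illustrative forward-Euler scheme; not the paper's construction.)"""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * x * (h + W @ x)
        x = np.maximum(x, 0.0)  # clip to the nonnegative orthant
    return x

# Two mutually competing neurons: cross-inhibition (-2) stronger than
# self-inhibition (-1), so the dynamics select a single winner.
W = np.array([[-1.0, -2.0],
              [-2.0, -1.0]])
h = np.array([1.0, 1.0])

# The neuron with the larger initial activity wins; the trajectory
# converges toward the stable attractor [1, 0].
x = simulate_lv_rnn(W, h, x0=[0.6, 0.4])
```

In this toy setting the stable attractors are the two single-winner states, illustrating the correspondence the paper establishes between stable attractors of the LV RNN and minimum points of the energy function.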
Pages: 494-507
Number of pages: 14