Cooperative information maximization with Gaussian activation functions for self-organizing maps

Cited by: 23
Authors
Kamimura, Ryotaro [1 ]
Affiliation
[1] Tokai Univ, Informat Sci Lab, Ctr Informat Technol, Kanagawa 2591292, Japan
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2006, Vol. 17, No. 4
Keywords
competition; cooperation; entropy maximization; Gaussian function; mutual information maximization; self-organizing maps (SOMs);
D O I
10.1109/TNN.2006.875984
CLC number
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we propose a new information-theoretic method to produce explicit self-organizing maps (SOMs). Competition is realized by maximizing mutual information between input patterns and competitive units. Competitive unit outputs are computed by a Gaussian function of the distance between input patterns and competitive units; a property of this function is that a neuron tends to fire more strongly as the distance becomes smaller. Cooperation is realized by taking into account the firing rates of neighboring neurons. We applied our method to uniform-distribution learning, chemical compound classification, and road classification. Experimental results confirmed that cooperation processes can significantly increase the information content in input patterns. When cooperative operations are not effective in increasing information, entropy maximization is used in addition to mutual information maximization. Experimental results showed that entropy maximization can increase information and yield clearer SOMs, because competitive units are forced to be equally used on average.
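The core quantities in the abstract (Gaussian firing rates as a function of distance, and the mutual information between input patterns and competitive units) can be sketched in a few lines of NumPy. This is an illustrative reconstruction under standard definitions, not the paper's exact algorithm; the width parameter `sigma`, the 3x3 map size, and the uniform inputs are assumptions for the example.

```python
import numpy as np

def firing_probabilities(X, W, sigma=1.0):
    """p(j|x): normalized Gaussian of the distance between inputs and unit weights."""
    # Squared Euclidean distances, shape (n_patterns, n_units).
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    g = np.exp(-d2 / (2.0 * sigma ** 2))      # smaller distance -> stronger firing
    return g / g.sum(axis=1, keepdims=True)   # normalize over competitive units

def mutual_information(P):
    """I(units; patterns) = H(units) - H(units | patterns)."""
    p_j = P.mean(axis=0)                                   # average firing rate p(j)
    H = -(p_j * np.log(p_j + 1e-12)).sum()                 # entropy of unit usage
    H_cond = -(P * np.log(P + 1e-12)).sum(axis=1).mean()   # mean conditional entropy
    return H - H_cond

rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 2))   # inputs from a uniform distribution
W = rng.uniform(size=(9, 2))     # weights of a 3x3 map of competitive units
P = firing_probabilities(X, W, sigma=0.2)
print(mutual_information(P))
```

Maximizing the entropy term `H` alone pushes the units toward equal average use, which is the mechanism the abstract credits for producing clearer maps.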
Pages: 909-918
Page count: 10