Efficient Hopfield pattern recognition on a scale-free neural network

Cited by: 98
Authors
Stauffer, D [1 ]
Aharony, A
Costa, LD
Adler, J
Affiliations
[1] Tel Aviv Univ, Raymond & Beverly Sackler Fac Exact Sci, Sch Phys & Astron, IL-69978 Tel Aviv, Israel
[2] Univ Cologne, Inst Theoret Phys, D-50923 Cologne, Germany
[3] Univ Sao Paulo, IFSC, Cybernet Vis Res Grp, BR-13560 Sao Carlos, SP, Brazil
[4] Technion Israel Inst Technol, Dept Phys, IL-32000 Haifa, Israel
Keywords
DOI
10.1140/epjb/e2003-00114-7
CLC number
O469 [Condensed Matter Physics]
Subject classification code
070205
Abstract
Neural networks are supposed to recognise blurred images (or patterns) of N pixels (bits) each. Application of the network to an initial blurred version of one of P pre-assigned patterns should converge to the correct pattern. In the "standard" Hopfield model, the N "neurons" are connected to each other via N² bonds which contain the information on the stored patterns. Thus computer time and memory in general grow with N². The Hebb rule assigns synaptic coupling strengths proportional to the overlap of the stored patterns at the two coupled neurons. Here we simulate the Hopfield model on the Barabási-Albert scale-free network, in which each newly added neuron is connected to only m other neurons, and at the end the number of neurons with q neighbours decays as 1/q³. Although the quality of retrieval decreases for small m, we find good associative memory for 1 ≪ m ≪ N. Hence, these networks gain a factor N/m ≫ 1 in computer memory and time.
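The construction the abstract describes combines two standard ingredients: Hebb couplings J_ij ∝ Σ_μ ξ_i^μ ξ_j^μ kept only on the bonds of a Barabási-Albert graph, and zero-temperature asynchronous retrieval dynamics in which each neuron aligns with its local field. The following is a minimal Python sketch of that setup, not the authors' code; the sizes N, m, P, the use of networkx's graph generator, and all helper names are illustrative assumptions.

```python
import numpy as np
import networkx as nx  # assumed available; used only to build the BA graph

# Illustrative sizes: N neurons, each new node attached to m others, P patterns.
N, m, P = 1000, 20, 5
rng = np.random.default_rng(0)

G = nx.barabasi_albert_graph(N, m, seed=0)
patterns = rng.choice([-1, 1], size=(P, N))  # stored patterns xi[mu][i]

# Hebb rule restricted to the graph's edges: J_ij ∝ sum_mu xi_i^mu xi_j^mu.
# (Normalisation is irrelevant for sign-based dynamics.)
J = {}
for i, j in G.edges():
    J[(i, j)] = np.dot(patterns[:, i], patterns[:, j])

neighbors = {i: list(G.neighbors(i)) for i in range(N)}

def coupling(i, j):
    # Edges are stored once per pair, so look up both orientations.
    return J.get((i, j), J.get((j, i)))

def recall(state, sweeps=10):
    """Asynchronous zero-temperature dynamics: align each spin with its field."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = sum(coupling(i, j) * s[j] for j in neighbors[i])
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

# Start from a blurred copy of pattern 0 (flip 10% of its bits) and retrieve.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
final = recall(probe)
print(f"overlap with stored pattern: {np.dot(final, patterns[0]) / N:.3f}")
```

Because couplings are stored only on the roughly N·m edges of the graph rather than all N² pairs, memory and per-sweep update cost scale as N·m, which is the factor N/m saving the abstract claims.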
Pages: 395-399
Page count: 5