Collective behavior of a small-world recurrent neural system with scale-free distribution

Citations: 105
Authors
Deng, Zhidong [1 ]
Zhang, Yi
Affiliations
[1] Tsinghua Univ, Dept Comp Sci, Beijing 100084, Peoples R China
[2] Tsinghua Univ, State Key Intelligent Technol & Syst, Beijing 100084, Peoples R China
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2007, Vol. 18, Issue 05
Funding
National Natural Science Foundation of China;
Keywords
echo state network (ESN); local preferential attachments; recurrent neural networks (RNNs); scale-free; small world; time-series prediction;
DOI
10.1109/TNN.2007.894082
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper proposes a scale-free highly clustered echo state network (SHESN). We designed the SHESN to include a naturally evolving state reservoir according to incremental growth rules that account for the following features: 1) short characteristic path length, 2) high clustering coefficient, 3) scale-free distribution, and 4) hierarchical and distributed architecture. This new state reservoir contains a large number of internal neurons that are sparsely interconnected in the form of domains. Each domain comprises one backbone neuron and a number of local neurons around this backbone. Such a natural and efficient recurrent neural system essentially interpolates between the completely regular Elman network and the completely random echo state network (ESN) proposed by Jaeger et al. We investigated the collective characteristics of the proposed complex network model. We also successfully applied it to challenging problems such as the Mackey-Glass (MG) dynamic system and laser time-series prediction. Compared to the ESN, our experimental results show that the SHESN model has a significantly enhanced echo state property and better performance in approximating highly complex nonlinear dynamics. In short, this large-scale dynamic complex network reflects natural characteristics of biological neural systems in many respects, such as power-law degree distribution, the small-world property, and hierarchical architecture, and should offer strong computing power, fast signal propagation, and coherent synchronization.
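The reservoir construction described in the abstract can be illustrated with a toy sketch. This is an assumption-laden simplification in Python/NumPy: it wires sparsely coupled "backbone" hubs, attaches a ring of local neurons to each hub (high local clustering), and applies the standard ESN spectral-radius scaling; all sizes and probabilities are invented for illustration and do not reproduce the authors' actual incremental growth rules.

```python
import numpy as np

# Illustrative toy only: "backbone plus local domain" reservoir topology,
# NOT the SHESN growth algorithm. All parameters below are assumptions.
rng = np.random.default_rng(0)

n_backbone = 10        # hub ("backbone") neurons
n_local_per_hub = 20   # local neurons attached around each backbone
n = n_backbone * (1 + n_local_per_hub)

W = np.zeros((n, n))

# Sparse random coupling among backbone neurons (long-range links).
hubs = np.arange(n_backbone)
for i in hubs:
    for j in hubs:
        if i != j and rng.random() < 0.3:
            W[i, j] = rng.normal()

# Each local neuron connects bidirectionally to its backbone hub and to its
# neighbor on a ring within the same domain (high local clustering).
for h in hubs:
    start = n_backbone + h * n_local_per_hub
    domain = np.arange(start, start + n_local_per_hub)
    for k, u in enumerate(domain):
        W[u, h] = rng.normal()
        W[h, u] = rng.normal()
        v = domain[(k + 1) % n_local_per_hub]
        W[u, v] = rng.normal()
        W[v, u] = rng.normal()

# Scale to spectral radius 0.9 (< 1), the usual condition associated
# with the echo state property.
rho = max(abs(np.linalg.eigvals(W)))
W *= 0.9 / rho

# One pass of the standard leakless reservoir update:
# x(t+1) = tanh(W x(t) + W_in u(t)), driven by a toy sinusoidal input.
W_in = rng.normal(size=(n, 1)) * 0.1
x = np.zeros(n)
for u_t in np.sin(np.arange(50) * 0.2):
    x = np.tanh(W @ x + (W_in * u_t).ravel())

print(n, round(float(max(abs(np.linalg.eigvals(W)))), 3))
```

In an actual ESN application, a linear readout would then be trained on the collected states `x`; the point of the sketch is only the domain-structured sparse topology and the spectral scaling.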
Pages: 1364-1375
Page count: 12
Related Papers
50 records
  • [41] Degree distribution and robustness of cooperative communication network with scale-free model
    Wang Jian-Rong
    Wang Jian-Ping
    He Zhen
    Xu Hai-Tao
    CHINESE PHYSICS B, 2015, 24 (06)
  • [42] Indirect Reciprocity with Contagious Reputation in Large-Scale Small-World Networks
    Neumann, Markus
    JASSS-THE JOURNAL OF ARTIFICIAL SOCIETIES AND SOCIAL SIMULATION, 2020, 23 (04): : 1 - 19
  • [44] Influence of Blurred Ways on Pattern Recognition of a Scale-Free Hopfield Neural Network
    Chang Wen-Li
    COMMUNICATIONS IN THEORETICAL PHYSICS, 2010, 53 (01) : 195 - 199
  • [45] The early history and emergence of molecular functions and modular scale-free network behavior
    Aziz, M. Fayez
    Caetano-Anolles, Kelsey
    Caetano-Anolles, Gustavo
    SCIENTIFIC REPORTS, 2016, 6
  • [46] Synergetic behavior in the cascading failure propagation of scale-free coupled map lattices
    Bao, Z. J.
    Cao, Y. J.
    Ding, L. J.
    Wang, G. Z.
    Han, Z. X.
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2008, 387 (23) : 5922 - 5929
  • [47] Critical behavior of Ising models with random long-range (small-world) interactions
    Zhang, X.
    Novotny, M. A.
    BRAZILIAN JOURNAL OF PHYSICS, 2006, 36 (3A) : 664 - 671
  • [48] Collective behavior of large-scale neural networks with GPU acceleration
    Jingyi Qu
    Rubin Wang
    Cognitive Neurodynamics, 2017, 11 : 553 - 563
  • [50] Scale-Free Single Image Deraining Via Visibility-Enhanced Recurrent Wavelet Learning
    Yang, Wenhan
    Liu, Jiaying
    Yang, Shuai
    Gu, Zongming
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (06) : 2948 - 2961