A more powerful Random Neural Network model in supervised learning applications

Cited by: 0
Authors
Basterrech, Sebastian [1]
Rubino, Gerardo [2]
Affiliations
[1] VSB Tech Univ Ostrava, IT4Innovat, Ostrava, Czech Republic
[2] INRIA, Rennes, France
Source
2013 INTERNATIONAL CONFERENCE OF SOFT COMPUTING AND PATTERN RECOGNITION (SOCPAR) | 2013
Keywords
Random Neural Network; Supervised Learning; Pattern Recognition; Numerical Optimization; Gradient Descent
DOI
(not available)
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Since the early 1990s, Random Neural Networks (RNNs) have gained importance in the Neural Networks and Queueing Networks communities. RNNs are inspired by biological neural networks, and they are also an extension of open Jackson networks in Queueing Theory. In 1993, a learning algorithm of gradient type was introduced in order to use RNNs in supervised learning tasks. This method considers only the weight connections among the neurons as adjustable parameters; all other parameters are held fixed during training. The RNN model has been successfully applied in several types of problems, such as supervised learning, pattern recognition, optimization, image processing, and associative memory. In this contribution we present a modification of the classic model obtained by extending the set of adjustable parameters. The modification increases the potential of the RNN model in supervised learning tasks while keeping the same network topology and the same time complexity of the algorithm. We describe the new equations implementing a gradient descent learning technique for the model.
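For background on the classic model the abstract builds on: in Gelenbe's Random Neural Network, the steady-state activation q_i of neuron i solves the nonlinear fixed-point system q_i = T+_i / (r_i + T-_i), where T+ and T- are the total positive and negative signal arrival rates and r_i is the neuron's firing rate. A minimal NumPy sketch of this fixed-point computation follows; it covers only the classic steady-state equations, not the paper's modified model, and all numeric rates below are illustrative values chosen for the example.

```python
import numpy as np

def rnn_steady_state(W_plus, W_minus, Lambda, lam, max_iter=500, tol=1e-10):
    """Fixed-point solution of the classic RNN signal-flow equations.

    W_plus[i, j]  : rate of positive (excitatory) signals from neuron i to j
    W_minus[i, j] : rate of negative (inhibitory) signals from neuron i to j
    Lambda, lam   : exogenous positive / negative signal arrival rates
    Returns q, the steady-state firing probabilities (neuron activations).
    """
    # Total firing rate of each neuron: r_i = sum_j (w+_{ij} + w-_{ij}).
    # Assumed strictly positive here so the denominator never vanishes.
    r = W_plus.sum(axis=1) + W_minus.sum(axis=1)
    q = np.zeros(len(Lambda))
    for _ in range(max_iter):
        T_plus = Lambda + q @ W_plus    # total positive signal arrival rates
        T_minus = lam + q @ W_minus     # total negative signal arrival rates
        # q_i = T+_i / (r_i + T-_i), clamped to [0, 1] as in the model
        q_new = np.minimum(T_plus / (r + T_minus), 1.0)
        if np.max(np.abs(q_new - q)) < tol:
            return q_new
        q = q_new
    return q

# Tiny 3-neuron example with arbitrary, illustrative rates
W_plus = np.array([[0.0, 0.3, 0.2], [0.1, 0.0, 0.4], [0.2, 0.1, 0.0]])
W_minus = np.array([[0.0, 0.1, 0.1], [0.2, 0.0, 0.1], [0.1, 0.2, 0.0]])
Lambda = np.array([0.5, 0.2, 0.1])
lam = np.array([0.1, 0.1, 0.1])
q = rnn_steady_state(W_plus, W_minus, Lambda, lam)
```

In the 1993 gradient-type algorithm the abstract cites, only the entries of W_plus and W_minus are adjusted by gradient descent on a squared-error cost over the output activations q; the paper's contribution is to enlarge this set of trainable parameters while keeping the same topology and time complexity.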
Pages: 201-206
Page count: 6