A more powerful Random Neural Network model in supervised learning applications

Times Cited: 0
Authors
Basterrech, Sebastian [1 ]
Rubino, Gerardo [2 ]
Affiliations
[1] VSB Tech Univ Ostrava, IT4Innovat, Ostrava, Czech Republic
[2] INRIA, Rennes, France
Source
2013 INTERNATIONAL CONFERENCE OF SOFT COMPUTING AND PATTERN RECOGNITION (SOCPAR) | 2013
Keywords
Random Neural Network; Supervised Learning; Pattern Recognition; Numerical Optimization; Gradient Descent
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Since the early 1990s, Random Neural Networks (RNNs) have gained importance in the Neural Networks and Queueing Networks communities. RNNs are inspired by biological neural networks and can also be viewed as an extension of open Jackson networks in Queueing Theory. In 1993, a gradient-type learning algorithm was introduced to make RNNs usable in supervised learning tasks. This method treats only the connection weights among the neurons as adjustable parameters; all other parameters are held fixed during training. The RNN model has been successfully applied to several kinds of problems, such as supervised learning, pattern recognition, optimization, image processing, and associative memory. In this contribution we present a modification of the classic model obtained by extending the set of adjustable parameters. The modification increases the potential of the RNN model in supervised learning tasks while keeping the same network topology and the same time complexity of the algorithm. We describe the new equations implementing a gradient descent learning technique for the model.
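For context on "the classic model": each neuron i of a Gelenbe-style RNN has a steady-state activation q_i obtained as the fixed point of a nonlinear system. A minimal sketch in the notation commonly used for this model follows; the record does not reproduce the paper's equations, so the symbols (exterior excitatory/inhibitory arrival rates \Lambda_i and \lambda_i, firing rate r_i, excitatory/inhibitory weights w_{ji}^+ and w_{ji}^-) are assumptions, not quotations:

    q_i = \frac{\lambda_i^{+}}{r_i + \lambda_i^{-}}, \qquad
    \lambda_i^{+} = \Lambda_i + \sum_j q_j\, w_{ji}^{+}, \qquad
    \lambda_i^{-} = \lambda_i + \sum_j q_j\, w_{ji}^{-}

The 1993 gradient algorithm descends a quadratic error in the activations q_i, adjusting only the weights w^+ and w^-; the modification announced in the abstract enlarges the set of trainable parameters while keeping the same topology and per-iteration cost. As an illustration only, a toy fixed-point iteration for the activations (assuming NumPy; the function name, arguments, and convergence settings are hypothetical, not from the paper):

    import numpy as np

    def rnn_activations(W_plus, W_minus, Lambda, lam, r, iters=200):
        # Iterate the fixed-point equations q_i = lambda_i^+ / (r_i + lambda_i^-).
        # W_plus[j, i] and W_minus[j, i] play the roles of w_ji^+ and w_ji^-.
        q = np.zeros_like(r, dtype=float)
        for _ in range(iters):
            lam_plus = Lambda + q @ W_plus    # total excitatory arrival rate at each neuron
            lam_minus = lam + q @ W_minus     # total inhibitory arrival rate at each neuron
            q = np.clip(lam_plus / (r + lam_minus), 0.0, 1.0)  # activations are probabilities
        return q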
Pages: 201-206
Number of pages: 6