Parallel learning by multitasking neural networks

Cited by: 3
Authors
Agliari, Elena [1 ]
Alessandrelli, Andrea [2 ,5 ]
Barra, Adriano [3 ,5 ]
Ricci-Tersenghi, Federico [4 ,5 ,6 ]
Affiliations
[1] Sapienza Univ Roma, Dipartimento Matemat, Piazzale Aldo Moro 5, I-00185 Rome, Italy
[2] Univ Pisa, Dipartimento Informat, Lungarno Antonio Pacinotti 43, I-56126 Pisa, Italy
[3] Univ Salento, Dipartimento Matemat & Fis, Via Arnesano, I-73100 Lecce, Italy
[4] Sapienza Univ Roma, Dipartimento Fis, Piazzale Aldo Moro 2, I-00185 Rome, Italy
[5] Ist Nazl Fis Nucl, Sez Roma1 & Lecce, Lecce, Italy
[6] CNR, Nanotec, Rome Unit, I-00185 Rome, Italy
Source
JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT | 2023, Vol. 2023, No. 11
Keywords
machine learning; computational neuroscience; optimization over networks; systems neuroscience; MODEL;
DOI
10.1088/1742-5468/ad0a86
Chinese Library Classification (CLC)
O3 [Mechanics];
Discipline classification codes
08; 0801;
Abstract
Parallel learning, namely the simultaneous learning of multiple patterns, constitutes a modern challenge for neural networks. While this cannot be accomplished by standard Hebbian associative neural networks, in this paper we show how the multitasking Hebbian network (a variation on the theme of the Hopfield model, working on sparse datasets) is naturally able to perform this complex task. We focus on systems processing in parallel a finite (up to logarithmic growth in the size of the network) number of patterns, mirroring the low-storage setting of standard associative neural networks. When the patterns to be reconstructed are mildly diluted, the network handles them hierarchically, distributing the amplitudes of their signals as power laws with respect to the pattern information content (hierarchical regime), while, for strong dilution, the signals pertaining to all the patterns are simultaneously raised with the same strength (parallel regime). Further, we prove that the training protocol (either supervised or unsupervised) neither alters the multitasking performance nor changes the thresholds for learning. We also highlight (analytically and by Monte Carlo simulations) that a standard cost function (i.e. the Hamiltonian) used in statistical mechanics exhibits the same minima as a standard loss function (i.e. the sum of squared errors) used in machine learning.
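To make the setting concrete, the following minimal sketch (our own illustration, not the authors' code) assumes the standard multitasking Hebbian construction with pattern entries in {-1, 0, +1}: it builds a low-storage network with diluted patterns, runs zero-temperature Monte Carlo dynamics, and prints the Mattis overlaps. All parameter names and values (N, K, dilution, n_sweeps) are illustrative choices, not taken from the paper.

# Minimal sketch (not the authors' code): a low-storage multitasking Hebbian
# network with K diluted patterns whose entries take values in {-1, 0, +1},
# standard Hebbian couplings, and zero-temperature asynchronous dynamics.
import numpy as np

rng = np.random.default_rng(0)

N = 1000          # number of neurons (illustrative value)
K = 4             # finite number of patterns (low-storage regime)
dilution = 0.5    # probability that a pattern entry is zero (sparse dataset)
n_sweeps = 10     # Monte Carlo sweeps

# Diluted patterns: P(0) = dilution, P(+1) = P(-1) = (1 - dilution)/2
xi = rng.choice([-1, 0, 1], size=(K, N),
                p=[(1 - dilution) / 2, dilution, (1 - dilution) / 2])

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-interaction
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

# Initial state: pattern 1 with its zero entries filled by random spins
sigma = np.where(xi[0] != 0, xi[0], rng.choice([-1, 1], size=N)).astype(float)

# Zero-temperature Glauber dynamics: sigma_i <- sign(sum_j J_ij sigma_j)
for _ in range(n_sweeps):
    for i in rng.permutation(N):
        h = J[i] @ sigma
        if h != 0.0:
            sigma[i] = np.sign(h)

# Mattis overlaps m_mu = (1/N) sum_i xi_i^mu sigma_i: with non-trivial dilution,
# several overlaps are non-negligible at once (parallel/hierarchical retrieval)
m = xi @ sigma / N
print("Mattis overlaps:", np.round(m, 3))

Pushing the dilution parameter toward 1 should move the network from the hierarchical toward the parallel regime described in the abstract, while dilution = 0 should recover ordinary single-pattern Hopfield retrieval.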
Pages: 38