Parallel learning by multitasking neural networks

Cited by: 3
Authors
Agliari, Elena [1 ]
Alessandrelli, Andrea [2 ,5 ]
Barra, Adriano [3 ,5 ]
Ricci-Tersenghi, Federico [4 ,5 ,6 ]
Affiliations
[1] Sapienza Univ Roma, Dipartimento Matemat, Piazzale Aldo Moro 5, I-00185 Rome, Italy
[2] Univ Pisa, Dipartimento Informat, Lungarno Antonio Pacinotti 43, I-56126 Pisa, Italy
[3] Univ Salento, Dipartimento Matemat & Fis, Via Arnesano, I-73100 Lecce, Italy
[4] Sapienza Univ Roma, Dipartimento Fis, Piazzale Aldo Moro 2, I-00185 Rome, Italy
[5] Ist Nazl Fis Nucl, Sez Roma1 & Lecce, Lecce, Italy
[6] CNR, Nanotec, Rome Unit, I-00185 Rome, Italy
Source
JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT | 2023, Vol. 2023, No. 11
Keywords
machine learning; computational neuroscience; optimization over networks; systems neuroscience; MODEL;
DOI
10.1088/1742-5468/ad0a86
CLC classification
O3 [Mechanics]
Discipline codes
08; 0801
Abstract
Parallel learning, namely the simultaneous learning of multiple patterns, constitutes a modern challenge for neural networks. While this cannot be accomplished by standard Hebbian associative neural networks, in this paper we show how the multitasking Hebbian network (a variation on the theme of the Hopfield model, working on sparse datasets) is naturally able to perform this complex task. We focus on systems processing in parallel a finite (up to logarithmic growth in the size of the network) number of patterns, mirroring the low-storage setting of standard associative neural networks. When the patterns to be reconstructed are mildly diluted, the network handles them hierarchically, distributing the amplitudes of their signals as power laws w.r.t. the pattern information content (hierarchical regime), while, for strong dilution, the signals pertaining to all the patterns are raised simultaneously with the same strength (parallel regime). Further, we prove that the training protocol (whether supervised or unsupervised) neither alters the multitasking performance nor changes the thresholds for learning. We also highlight (analytically and by Monte Carlo simulations) that a standard cost function (i.e. the Hamiltonian) used in statistical mechanics exhibits the same minima as a standard loss function (i.e. the sum of squared errors) used in machine learning.
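The parallel regime summarized above can be illustrated with a minimal sketch. This is not the authors' code: the disjoint-support construction, the variable names, and the overlap normalization are all illustrative assumptions standing in for the paper's sparse (diluted) patterns. With diluted patterns taking values in {-1, 0, +1} on non-overlapping supports, the Hebbian network decouples into independent sub-networks, so a superposition of all stored patterns is a fixed point of zero-temperature dynamics and every pattern is retrieved simultaneously with full strength:

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 120, 3                    # neurons, number of stored patterns (low storage)
block = N // K                   # idealized strong dilution: fully disjoint supports

# Diluted patterns xi in {-1, 0, +1}: pattern mu is random +-1 on its own
# block of neurons and 0 elsewhere
xi = np.zeros((K, N))
for mu in range(K):
    xi[mu, mu * block:(mu + 1) * block] = rng.choice([-1.0, 1.0], size=block)

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-interaction
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

# Candidate parallel-retrieval state: superposition of all K patterns
sigma = xi.sum(axis=0)

# One synchronous zero-temperature update: sigma_i -> sign(sum_j J_ij sigma_j)
sigma_new = np.sign(J @ sigma)

# Overlap of the updated state with each pattern, normalized on its support
overlaps = (xi @ sigma_new) / block
print(overlaps)                  # → [1. 1. 1.]
```

Because the supports are disjoint, the local field on each neuron is aligned with the single pattern active there, so the superposition is exactly stable and all K overlaps equal 1; with milder dilution the supports interfere and, as the abstract states, the signal amplitudes become hierarchical instead.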
Pages: 38