Parallel hyperparameter optimization of spiking neural networks

Cited by: 1
Authors
Firmin, Thomas [1 ]
Boulet, Pierre [1 ]
Talbi, El-Ghazali [1 ]
Affiliations
[1] Univ Lille, CNRS, UMR 9189, Cent Lille, Inria, CRIStAL, F-59000 Lille, France
Keywords
Spiking neural networks; Hyperparameter optimization; Parallel asynchronous optimization; Bayesian optimization; STDP; SLAYER; ON-CHIP; CLASSIFICATION; DEEPER; MODEL
DOI
10.1016/j.neucom.2024.128483
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Hyperparameter optimization of spiking neural networks (SNNs) is a difficult task that has not yet been deeply investigated in the literature. In this work, we designed a scalable constrained Bayesian optimization algorithm that prevents sampling in non-spiking areas of an efficient high-dimensional search space. These search spaces contain infeasible solutions that output no or only a few spikes during the training or testing phases; we call such networks "silent networks". Identifying them in advance is difficult, as many hyperparameters are highly correlated with the architecture and the dataset. We leverage silent networks by designing a spike-based early-stopping criterion to accelerate the optimization of SNNs trained by spike-timing-dependent plasticity and surrogate gradient. We parallelized the optimization algorithm asynchronously and ran large-scale experiments on a heterogeneous multi-GPU petascale architecture. Results show that by accounting for silent networks, we can design more flexible high-dimensional search spaces while maintaining good efficacy. The optimization algorithm was able to focus on networks with high performance by preventing the costly and worthless computation of silent networks.
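The spike-based early-stopping idea described in the abstract lends itself to a short illustration. The Python sketch below is a hypothetical, simplified rendering of that mechanism, not the authors' implementation: all names (SilentNetwork, train_with_spike_early_stopping, min_spikes_per_batch) and the -1.0 penalty value are assumptions, and a trivial Poisson spike simulator stands in for an SNN trained with STDP or SLAYER.

```python
# Hypothetical sketch (not the authors' released code): a spike-count early
# stop that flags "silent networks" so a constrained Bayesian optimizer can
# penalize them instead of paying for a full, worthless training run.
import numpy as np

class SilentNetwork(Exception):
    """Raised when a candidate SNN emits too few output spikes to be trainable."""

def train_with_spike_early_stopping(simulate_batch, n_batches=100,
                                    min_spikes_per_batch=1.0, warmup=10):
    """Run training batches and abort as soon as the mean output spike count
    over the last `warmup` batches drops below `min_spikes_per_batch`."""
    counts = []
    for step in range(n_batches):
        counts.append(float(np.sum(simulate_batch(step))))  # output-layer spikes
        if step + 1 >= warmup and np.mean(counts[-warmup:]) < min_spikes_per_batch:
            raise SilentNetwork(f"silent network detected after {step + 1} batches")
    return float(np.mean(counts))

def objective(hyperparams, rng=np.random.default_rng(0)):
    """Toy objective for an (asynchronous) Bayesian optimizer: silent trials
    return a penalty so the surrogate model learns to avoid that region."""
    def simulate_batch(step):
        # Dummy simulator: firing rate shrinks as the neuron threshold grows.
        rate = max(0.0, 1.5 - hyperparams["threshold"])
        return rng.poisson(rate, size=10)
    try:
        return train_with_spike_early_stopping(simulate_batch)
    except SilentNetwork:
        return -1.0  # constraint violation / penalized score

print(objective({"threshold": 0.5}))  # spiking configuration, trained fully
print(objective({"threshold": 2.0}))  # silent configuration, stopped early
```

Returning a fixed penalty for silent trials is only one simple way of exposing the feasibility constraint to the surrogate model; the paper's constrained Bayesian formulation handles this more carefully.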
Pages: 23
Related papers (50 in total)
  • [1] Hyperparameter Optimization for Convolutional Neural Networks with Genetic Algorithms and Bayesian Optimization
    Puentes G, David E.
    Barrios H, Carlos J.
    Navaux, Philippe O. A.
    2022 IEEE LATIN AMERICAN CONFERENCE ON COMPUTATIONAL INTELLIGENCE (LA-CCI), 2022, : 131 - 135
  • [2] Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
    Parsa, Maryam
    Kulkarni, Shruti R.
    Coletti, Mark
    Bassett, Jeffrey
    Mitchell, J. Parker
    Schuman, Catherine D.
    2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021), 2021, : 1225 - 1232
  • [3] Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks
    Hinz, Tobias
    Navarro-Guerrero, Nicolas
    Magg, Sven
    Wermter, Stefan
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2018, 17 (02)
  • [4] Spiking neural networks for biomedical signal analysis
    Choi, Sang Ho
    BIOMEDICAL ENGINEERING LETTERS, 2024, 14 (05) : 955 - 966
  • [5] Learning improvement of spiking neural networks with dynamic adaptive hyperparameter neurons
    Liang, Jiakai
    Wang, Chao
    Ma, De
    Li, Ruixue
    Yue, Keqiang
    Li, Wenjun
    APPLIED INTELLIGENCE, 2024, 54 (19) : 9158 - 9176
  • [6] Spiking Neural Networks for Computational Intelligence: An Overview
    Dora, Shirin
    Kasabov, Nikola
    BIG DATA AND COGNITIVE COMPUTING, 2021, 5 (04)
  • [7] Spiking Neural Networks and Their Applications: A Review
    Yamazaki, Kashu
    Vo-Ho, Viet-Khoa
    Bulsara, Darshan
    Le, Ngan
    BRAIN SCIENCES, 2022, 12 (07)
  • [8] Parallel Model for Spiking Neural Networks using MATLAB
    Mirsu, Radu
    Tiponut, Virgil
    2010 9TH INTERNATIONAL SYMPOSIUM ON ELECTRONICS AND TELECOMMUNICATIONS (ISETC), 2010, : 369 - 372
  • [9] Toward classifying small lung nodules with hyperparameter optimization of convolutional neural networks
    Lima, Lucas L.
    Ferreira Junior, Jose R.
    Oliveira, Marcelo C.
    COMPUTATIONAL INTELLIGENCE, 2021, 37 (04) : 1599 - 1618
  • [10] Multiagent Reinforcement Learning for Hyperparameter Optimization of Convolutional Neural Networks
    Iranfar, Arman
    Zapater, Marina
    Atienza, David
    IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, 2022, 41 (04) : 1034 - 1047