Parallel hyperparameter optimization of spiking neural networks

Cited by: 1
Authors
Firmin, Thomas [1 ]
Boulet, Pierre [1 ]
Talbi, El-Ghazali [1 ]
Affiliations
[1] Univ Lille, CNRS, Centrale Lille, Inria, UMR 9189 CRIStAL, F-59000 Lille, France
Keywords
Spiking neural networks; Hyperparameter optimization; Parallel asynchronous optimization; Bayesian optimization; STDP; SLAYER; On-chip; Classification; Deeper; Model
DOI
10.1016/j.neucom.2024.128483
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Hyperparameter optimization of spiking neural networks (SNNs) is a difficult task that has not yet been deeply investigated in the literature. In this work, we designed a scalable constrained Bayesian optimization algorithm that prevents sampling in non-spiking areas of an efficient high-dimensional search space. Such search spaces contain infeasible solutions that output no or only a few spikes during the training or testing phases; we call such a network a "silent network". Finding them is difficult, as many hyperparameters are highly correlated with the architecture and the dataset. We leverage silent networks by designing a spike-based early stopping criterion to accelerate the optimization of SNNs trained by spike-timing-dependent plasticity and surrogate gradient. We parallelized the optimization algorithm asynchronously and ran large-scale experiments on a heterogeneous multi-GPU petascale architecture. Results show that by considering silent networks, we can design more flexible high-dimensional search spaces while maintaining good efficacy. The optimization algorithm was able to focus on high-performing networks by preventing the costly and worthless computation of silent networks.
Pages: 23