Parallel hyperparameter optimization of spiking neural networks

Cited by: 1
Authors
Firmin, Thomas [1 ]
Boulet, Pierre [1 ]
Talbi, El-Ghazali [1 ]
Affiliations
[1] Univ Lille, CNRS, UMR 9189, Cent Lille, Inria, CRIStAL, F-59000 Lille, France
Keywords
Spiking neural networks; Hyperparameter optimization; Parallel asynchronous optimization; Bayesian optimization; STDP; SLAYER; ON-CHIP; CLASSIFICATION; DEEPER; MODEL
DOI
10.1016/j.neucom.2024.128483
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Hyperparameter optimization of spiking neural networks (SNNs) is a difficult task that has not yet been deeply investigated in the literature. In this work, we designed a scalable constrained Bayesian-based optimization algorithm that prevents sampling in the non-spiking areas of an efficient high-dimensional search space. Such search spaces contain infeasible solutions that output no spikes, or only a few, during the training or testing phases; we call such a network a "silent network". Finding them in advance is difficult, as many hyperparameters are highly correlated with the architecture and with the dataset. We leverage silent networks by designing a spike-based early-stopping criterion to accelerate the optimization of SNNs trained by spike-timing-dependent plasticity and surrogate gradient. We parallelized the optimization algorithm asynchronously and ran large-scale experiments on a heterogeneous multi-GPU petascale architecture. Results show that, by accounting for silent networks, we can design more flexible high-dimensional search spaces while maintaining good efficacy. The optimization algorithm was able to focus on high-performing networks by preventing the costly, wasteful computation of silent networks.
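To make the spike-based early-stopping idea concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes a hypothetical train_epoch callable that returns the epoch loss together with the total number of output spikes, and an illustrative min_spikes threshold.

def train_with_spike_early_stopping(train_epoch, num_epochs, min_spikes=1):
    """Train an SNN, stopping as soon as it proves to be a silent network.

    train_epoch -- callable returning (loss, total_output_spikes) for one epoch.
    Returns the training history and a flag marking the configuration as
    infeasible (silent) for the hyperparameter optimizer.
    """
    history = []
    for _ in range(num_epochs):
        loss, total_spikes = train_epoch()
        history.append((loss, total_spikes))
        if total_spikes < min_spikes:
            # Silent network: abort immediately so the constrained Bayesian
            # optimizer can mark this hyperparameter vector as infeasible
            # instead of spending the remaining epochs on it.
            return history, True
    return history, False

In the paper's setting, the returned infeasibility flag is the kind of signal a constrained Bayesian optimizer can use to steer sampling away from the non-spiking regions of the search space.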
Pages: 23