Bayesian Optimization for Sparse Neural Networks With Trainable Activation Functions

Cited by: 3
Authors
Fakhfakh, Mohamed [1 ,2 ]
Chaari, Lotfi [2 ]
Affiliations
[1] Univ Sfax, MIRACL, Sfax 3029, Tunisia
[2] Univ Toulouse, Toulouse INP, IRIT, F-31000 Toulouse, France
Keywords
Bayes methods; Task analysis; Standards; Data models; Training; Shape; Probability density function; Activation function; deep neural networks; optimization; MCMC; Hamiltonian dynamics;
DOI
10.1109/TPAMI.2024.3387073
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the literature on deep neural networks, there is considerable interest in developing activation functions that can enhance network performance. In recent years, there has been renewed scientific interest in activation functions that can be trained throughout the learning process, as they appear to improve network performance, especially by reducing overfitting. In this paper, we propose a trainable activation function whose parameters need to be estimated. A fully Bayesian model is developed to estimate, automatically and from the learning data, both the model weights and the activation function parameters. An MCMC-based optimization scheme is developed to perform the inference. The proposed method aims to solve the aforementioned problems and to reduce convergence time by using an efficient sampling scheme that guarantees convergence to the global maximum. The proposed scheme has been tested on diverse datasets, encompassing both classification and regression tasks, and implemented in various CNN architectures to demonstrate its versatility and effectiveness. Promising results demonstrate the usefulness of our approach in improving model accuracy, owing to the proposed activation function and the Bayesian estimation of its parameters.
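The abstract describes estimating a trainable activation function's parameters by MCMC sampling rather than backpropagation. As a minimal illustration of that idea (not the paper's actual model: the activation form, the toy data, the priors, and the use of a simple random-walk Metropolis sampler in place of the paper's Hamiltonian scheme are all assumptions made for this sketch), one can sample the parameter of a swish-like activation from its posterior on a tiny regression problem:

```python
import numpy as np

rng = np.random.default_rng(0)

def act(x, beta):
    # Hypothetical trainable activation: x * sigmoid(beta * x).
    # beta is the activation parameter to be estimated from data.
    return x / (1.0 + np.exp(-beta * x))

# Tiny one-hidden-layer regression network; weights are held fixed for
# clarity so that only the activation parameter beta is sampled.
X = rng.normal(size=(64, 1))
W1 = rng.normal(size=(1, 8))
W2 = rng.normal(size=(8, 1))
true_beta = 2.0
y = act(X @ W1, true_beta) @ W2 + 0.1 * rng.normal(size=(64, 1))

def log_post(beta):
    # Gaussian likelihood (noise std 0.1) plus a weak Gaussian prior on beta.
    pred = act(X @ W1, beta) @ W2
    return -0.5 * np.sum((y - pred) ** 2) / 0.1**2 - 0.5 * beta**2 / 10.0

# Random-walk Metropolis over beta (a simple stand-in for an MCMC scheme).
beta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(2000):
    prop = beta + 0.1 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        beta, lp = prop, lp_prop
    samples.append(beta)

est = np.mean(samples[1000:])  # posterior mean after burn-in
print(f"posterior mean beta ~ {est:.2f}")
```

In the paper's full model the network weights would be sampled jointly with the activation parameters, and the sampler would exploit Hamiltonian dynamics for efficiency; this sketch only shows the posterior-sampling principle on a single parameter.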
Pages: 6699-6712
Page count: 14