Convolution-layer parameters optimization in Convolutional Neural Networks

Cited by: 19
Authors
Chegeni, Milad Kohzadi [1 ]
Rashno, Abdolreza [1 ]
Fadaei, Sadegh [2 ]
Affiliations
[1] Lorestan Univ, Fac Engn, Dept Comp Engn, Khorramabad, Iran
[2] Univ Yasuj, Fac Engn, Dept Elect Engn, Yasuj, Iran
Keywords
Convolutional Neural Networks; Parameters optimization; Particle swarm optimization; Convolution filter dimension; PARTICLE SWARM OPTIMIZATION; AUTOENCODERS;
DOI
10.1016/j.knosys.2022.110210
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional Neural Networks (CNNs) are among the most important deep learning algorithms for classifying images based on their visual features. CNN architectures are made of convolution, pooling, and fully connected (FC) layers, whose parameters significantly affect classification performance. This paper proposes Convolution Parameters Optimization for CNNs, referred to as CPOCNN. To the best of our knowledge, this is the first optimization model that assigns adaptive upper bounds to the convolution parameters depending on the data dimension in the current layer and the number of layers remaining before the output layer. For this task, a comprehensive mathematical model is presented for constant CNN structures, and it is proven that a larger optimization space is explored than in all state-of-the-art methods. In the optimization process, the number of convolution filters and the type of pooling filters are selected randomly; the dimension of pooling filters, zero-padding, and stride are held constant. CPOCNN has been evaluated on 7 publicly available datasets and compared with 53 competitive CNN models with constant and optimized structures. The results show that CPOCNN not only outperforms state-of-the-art CNN methods but also enriches weak CNN models, improving their accuracies by more than 35%. The source code of this paper is available online at GitHub.1 (c) 2022 Elsevier B.V. All rights reserved.
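The abstract's core idea of an adaptive upper bound on a convolution filter's size, driven by the current data dimension and the number of remaining layers, can be sketched as follows. This is a minimal illustration under assumptions not taken from the paper: stride 1, no zero-padding (so a k x k filter maps a d x d input to (d - k + 1) x (d - k + 1)), and a hypothetical minimum filter size `min_filter` for each remaining layer; the paper's actual mathematical model may differ.

```python
def adaptive_upper_bound(current_dim, remaining_layers, min_filter=2):
    """Largest filter size usable at the current layer such that each of
    the remaining layers can still apply at least a min_filter x min_filter
    convolution and the final feature map stays at least 1 x 1.

    Assumes stride 1 and no zero-padding: d' = d - k + 1.
    """
    # Each remaining layer shrinks the spatial dimension by at least
    # (min_filter - 1), so solve d - k + 1 - remaining_layers*(min_filter - 1) >= 1.
    bound = current_dim - remaining_layers * (min_filter - 1)
    # Clamp so the bound never drops below the minimum usable filter size.
    return max(min_filter, bound)
```

For example, a 28 x 28 input with 2 layers still to come allows filters up to 26 x 26, while a 10 x 10 input with 4 remaining layers allows at most 6 x 6; a deeper or smaller-input network thus gets a tighter search space, which is the intuition behind bounding the optimization per layer.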
Pages: 13
References
53 records
[1]   Deep Learning: Parameter Optimization Using Proposed Novel Hybrid Bees Bayesian Convolutional Neural Network [J].
Alamri, Nawaf Mohammad H. ;
Packianather, Michael ;
Bigot, Samuel .
APPLIED ARTIFICIAL INTELLIGENCE, 2022, 36 (01)
[2]  
[Anonymous], 2015, 2015 International Joint Conference on Neural Networks (IJCNN)
[3]  
Aszemi NM, 2019, INT J ADV COMPUT SC, V10, P269
[4]   Classification of DNA damages on segmented comet assay images using convolutional neural network [J].
Atila, Umit ;
Baydilli, Yusuf Yargi ;
Sehirli, Eftal ;
Turan, Muhammed Kamil .
COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2020, 186
[5]   Optimizing Convolutional Neural Network Hyperparameters by Enhanced Swarm Intelligence Metaheuristics [J].
Bacanin, Nebojsa ;
Bezdan, Timea ;
Tuba, Eva ;
Strumberger, Ivana ;
Tuba, Milan .
ALGORITHMS, 2020, 13 (03)
[6]  
Baker B, 2017, Arxiv, DOI arXiv:1611.02167
[7]  
Bochinski E, 2017, IEEE IMAGE PROC, P3924
[8]   Invariant Scattering Convolution Networks [J].
Bruna, Joan ;
Mallat, Stephane .
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2013, 35 (08) :1872-1886
[9]   PCANet: A Simple Deep Learning Baseline for Image Classification? [J].
Chan, Tsung-Han ;
Jia, Kui ;
Gao, Shenghua ;
Lu, Jiwen ;
Zeng, Zinan ;
Ma, Yi .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (12) :5017-5032
[10]   Handling dropout probability estimation in convolution neural networks using meta-heuristics [J].
de Rosa, Gustavo H. ;
Papa, Joao P. ;
Yang, Xin-She .
SOFT COMPUTING, 2018, 22 (18) :6147-6156