KRR-CNN: kernels redundancy reduction in convolutional neural networks

Cited by: 17
Authors
Hssayni, El Houssaine [1 ]
Joudar, Nour-Eddine [2 ]
Ettaouil, Mohamed [1 ]
Affiliations
[1] Sidi Mohamed Ben Abdellah Univ, Fac Sci & Technol Fez, Dept Math, Modelling & Math Struct Lab, Fes, Morocco
[2] Mohammed V Univ Rabat, ENSAM, Dept Appl Math & Informat, M2CS,Res Ctr STIS, Rabat, Morocco
Source
Neural Computing & Applications | 2022, Vol. 34, No. 3
Keywords
Convolutional neural networks; Convolution kernel; Binary optimization; Genetic algorithm; Image classification; OPTIMIZATION; ALGORITHM;
DOI
10.1007/s00521-021-06540-3
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Convolutional neural networks (CNNs) are a promising tool for solving real-world problems. However, successful CNNs often require a large number of parameters, which leads to significant memory usage and higher computational cost and may produce undesirable phenomena, notably overfitting. Indeed, many kernels in a CNN are usually redundant and can be eliminated from the network while preserving its performance. In this work, we propose a new optimization model for kernel redundancy reduction in CNNs, named KRR-CNN. It consists of a minimization phase and an optimization phase. In the first, a dataset is used to train a specific CNN, yielding a learned CNN with optimal parameters. The latter are combined with a decision optimization model to eliminate kernels that did not contribute to the learning task. The optimization phase is carried out by an evolutionary genetic algorithm. The efficiency of KRR-CNN has been demonstrated by several experiments: the suggested model reduces kernel redundancy and achieves classification performance comparable to that of state-of-the-art CNNs.
Pages: 2443-2454
Page count: 12
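The abstract above outlines a two-phase procedure: first train the CNN, then decide which kernels to keep by solving a binary optimization problem with a genetic algorithm. The following is only a minimal, hypothetical sketch of that idea, not the authors' implementation: the toy model, the synthetic validation data, the fitness weight `lam`, and the GA settings are all illustrative assumptions.

```python
# Hypothetical sketch: search for a binary mask over one conv layer's kernels
# with a simple genetic algorithm, trading accuracy against kernels kept.
import copy
import random
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy stand-in for a trained CNN (assumption, not the paper's network)."""
    def __init__(self, n_kernels=16, n_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(1, n_kernels, 3, padding=1)
        self.head = nn.Linear(n_kernels, n_classes)

    def forward(self, x):
        h = torch.relu(self.conv(x))
        h = h.mean(dim=(2, 3))          # global average pooling
        return self.head(h)

def masked_accuracy(model, mask, x, y):
    """Zero out the conv kernels where mask[i] == 0 and measure accuracy."""
    pruned = copy.deepcopy(model)
    with torch.no_grad():
        m = torch.tensor(mask, dtype=torch.float32).view(-1, 1, 1, 1)
        pruned.conv.weight.mul_(m)
        pruned.conv.bias.mul_(m.view(-1))
        preds = pruned(x).argmax(dim=1)
    return (preds == y).float().mean().item()

def fitness(model, mask, x, y, lam=0.3):
    # Reward accuracy, penalize the fraction of kernels kept (lam is illustrative).
    return masked_accuracy(model, mask, x, y) - lam * sum(mask) / len(mask)

def genetic_search(model, x, y, n_kernels, pop=20, gens=30, p_mut=0.05):
    population = [[random.randint(0, 1) for _ in range(n_kernels)] for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda m: fitness(model, m, x, y), reverse=True)
        parents = scored[: pop // 2]                      # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_kernels)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # bit-flip mutation
            children.append(child)
        population = parents + children
    return max(population, key=lambda m: fitness(model, m, x, y))

if __name__ == "__main__":
    torch.manual_seed(0)
    model = SmallCNN()                                    # stands in for a learned CNN
    x_val = torch.randn(64, 1, 28, 28)                    # placeholder validation data
    y_val = torch.randint(0, 10, (64,))
    best = genetic_search(model, x_val, y_val, n_kernels=16)
    print("kept kernels:", sum(best), "/ 16")
```

In this sketch the fitness rewards validation accuracy and penalizes the fraction of kernels kept, so the search favors masks that drop kernels whose removal does not hurt classification, loosely mirroring the trade-off the abstract describes.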