Pruning neural networks with distribution estimation algorithms

Cited: 0
Authors
Cantú-Paz, E [1]
Affiliation
[1] Lawrence Livermore Natl Lab, Ctr Appl Sci Comp, Livermore, CA 94551 USA
Source
GENETIC AND EVOLUTIONARY COMPUTATION - GECCO 2003, PT I, PROCEEDINGS | 2003, Vol. 2723
Keywords
DOI
Not available
Chinese Library Classification
TP301 [Theory, Methods];
Discipline Code
081202 ;
Abstract
This paper describes the application of four evolutionary algorithms to the pruning of neural networks used in classification problems. Besides a simple genetic algorithm (GA), the paper considers three distribution estimation algorithms (DEAs): a compact GA, an extended compact GA, and the Bayesian Optimization Algorithm. The objective is to determine whether the DEAs present advantages over the simple GA in terms of accuracy or speed on this problem. The experiments considered a feedforward neural network trained with standard backpropagation and 15 public-domain and artificial data sets. In most cases, the pruned networks had accuracy equal to or better than the original fully connected networks. We found few differences in the accuracy of the networks pruned by the four EAs, but found large differences in the execution time. The results suggest that a simple GA with a small population might be the best algorithm for pruning networks on the data sets we tested.
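As an illustration of the approach the abstract describes, the sketch below evolves binary pruning masks with a small-population simple GA. This is not the paper's implementation: the "trained network" is a toy linear classifier with two near-zero weights, the data set is synthetic, and the fitness function (validation accuracy plus a small sparsity bonus) and all GA parameters are illustrative assumptions.

```python
import random

random.seed(1)

# Toy "trained" classifier: the last two weights are near-zero and prunable.
W = [2.0, -1.5, 0.001, -0.0005]

def predict(x, mask):
    # Apply the binary pruning mask to the weights, then threshold.
    s = sum(wi * mi * xi for wi, mi, xi in zip(W, mask, x))
    return 1 if s > 0 else 0

# Synthetic validation set: labels depend only on the first two features.
DATA = []
for _ in range(200):
    x = [random.uniform(-1, 1) for _ in range(4)]
    y = 1 if 2.0 * x[0] - 1.5 * x[1] > 0 else 0
    DATA.append((x, y))

def accuracy(mask):
    return sum(predict(x, mask) == y for x, y in DATA) / len(DATA)

def fitness(mask):
    # Favour equal accuracy with fewer connections (small sparsity bonus).
    return accuracy(mask) + 0.01 * (len(mask) - sum(mask))

def simple_ga(pop_size=10, generations=30, p_mut=0.1):
    # Individuals are binary masks over the network's connections.
    pop = [[random.randint(0, 1) for _ in range(len(W))] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                              # elitism
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:5], 2)            # truncation selection
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            child = [1 - g if random.random() < p_mut else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = simple_ga()
print("best mask:", best, "accuracy:", accuracy(best))
```

With these settings the GA typically converges to a mask that drops the two negligible weights while matching the accuracy of the unpruned classifier, mirroring the paper's observation that pruned networks can equal the original's accuracy. The DEAs the paper compares (compact GA, extended compact GA, BOA) would replace the crossover/mutation step with sampling from a learned distribution over masks.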
Pages: 790-800 (11 pages)
Related Papers (50 total)
  • [31] Time series forecasting by evolving artificial neural networks with genetic algorithms, differential evolution and estimation of distribution algorithm
    Peralta Donate, Juan
    Li, Xiaodong
    Gutiérrez Sánchez, Germán
    Sanchis de Miguel, Araceli
    NEURAL COMPUTING & APPLICATIONS, 2013, 22 (01) : 11 - 20
  • [33] Learning Bayesian networks in the space of structures by estimation of distribution algorithms
    Blanco, R
    Inza, I
    Larrañaga, P
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2003, 18 (02) : 205 - 220
  • [34] Estimation of K distribution parameters using neural networks
    Wachowiak, MP
    Smolíková, R
    Zurada, JM
    Elmaghraby, AS
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2002, 49 (06) : 617 - 620
  • [35] "Learning-Compression" Algorithms for Neural Net Pruning
    Carreira-Perpinan, Miguel A.
    Idelbayev, Yerlan
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 8532 - 8541
  • [36] Automatic Pruning Rate Derivation for Structured Pruning of Deep Neural Networks
    Sakai, Yasufumi
    Iwakawa, Akinori
    Tabaru, Tsuguchika
    Inoue, Atsuki
    Kawaguchi, Hiroshi
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 2561 - 2567
  • [37] Neural Passage Quality Estimation for Static Pruning
    Chang, Xuejun
    Mishra, Debabrata
    Macdonald, Craig
    MacAvaney, Sean
    PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 174 - 185
  • [38] Partition Pruning: Parallelization-Aware Pruning for Dense Neural Networks
    Shahhosseini, Sina
    Albaqsami, Ahmad
    Jasemi, Masoomeh
    Bagherzadeh, Nader
    2020 28TH EUROMICRO INTERNATIONAL CONFERENCE ON PARALLEL, DISTRIBUTED AND NETWORK-BASED PROCESSING (PDP 2020), 2020, : 307 - 311
  • [39] Sparse optimization guided pruning for neural networks
    Shi, Yong
    Tang, Anda
    Niu, Lingfeng
    Zhou, Ruizhi
    NEUROCOMPUTING, 2024, 574
  • [40] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)