Continuously evolving dropout with multi-objective evolutionary optimisation

Cited by: 5
Authors
Jiang, Pengcheng [1 ]
Xue, Yu [1 ]
Neri, Ferrante [2 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Comp Sci, Nanjing 210044, Peoples R China
[2] Univ Surrey, Dept Comp Sci, NICE Grp, Guildford GU2 7XH, England
Keywords
Genetic algorithms; Multi-objective optimisation; Deep neural networks; Over-fitting; Dropout; NEURAL-NETWORKS; ALGORITHM;
DOI
10.1016/j.engappai.2023.106504
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Category Code
0812 ;
Abstract
Dropout is an effective method of mitigating over-fitting while training deep neural networks (DNNs). This method consists of switching off (dropping) some of the neurons of the DNN and training it by keeping the remaining neurons active. This approach makes the DNN general and resilient to changes in its inputs. However, the probability that a neuron in a given layer is dropped, the 'dropout rate', is a hard-to-tune parameter that affects the performance of the trained model. Moreover, there is no reason, besides being more practical during parameter tuning, why the dropout rate should be the same for all neurons across a layer. This paper proposes a novel method to guide the dropout rate based on an evolutionary algorithm. In contrast to previous studies, we associate a dropout rate with each individual neuron of the network, thus allowing more flexibility in the training phase. The vector encoding the dropout rates for the entire network is interpreted as the candidate solution of a bi-objective optimisation problem, where the first objective is the error reduction due to a set of dropout rates for a given data batch, while the second objective is the distance of the used dropout rates from a pre-arranged constant. The second objective is used to control the dropout rates and prevent them from becoming too small, hence ineffective, or too large, thereby dropping too large a portion of the network. Experimental results show that the proposed method, namely GADropout, produces DNNs that consistently outperform DNNs designed by other dropout methods, some of them being modern advanced dropout methods representing the state-of-the-art. GADropout has been tested on multiple datasets and network architectures.
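The bi-objective formulation in the abstract can be illustrated with a minimal sketch. All names here (TARGET_RATE, batch_error, the toy surrogate objective) are illustrative assumptions, not the paper's implementation: each candidate solution is a vector of per-neuron dropout rates, objective 1 stands in for the batch error under those rates, and objective 2 penalises rates that drift from a pre-arranged constant.

```python
import random

TARGET_RATE = 0.5   # the pre-arranged constant of the second objective
N_NEURONS = 8       # toy network size (one rate per neuron)

def random_individual():
    # Candidate solution: one dropout rate per neuron in the network.
    return [random.uniform(0.0, 1.0) for _ in range(N_NEURONS)]

def batch_error(rates):
    # Objective 1 (to minimise). In the real method this would be the
    # error on a data batch when training with these dropout rates;
    # here a toy surrogate stands in so the sketch runs.
    return sum((r - 0.4) ** 2 for r in rates) / len(rates)

def rate_distance(rates):
    # Objective 2 (to minimise): mean distance from the pre-arranged
    # constant, discouraging rates that become too small (ineffective)
    # or too large (dropping too much of the network).
    return sum(abs(r - TARGET_RATE) for r in rates) / len(rates)

def dominates(a, b):
    # Standard Pareto dominance for minimisation of both objectives.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

random.seed(0)
population = [random_individual() for _ in range(20)]
objectives = [(batch_error(ind), rate_distance(ind)) for ind in population]
# Non-dominated individuals form the Pareto front a multi-objective
# evolutionary algorithm (e.g. a genetic algorithm) would refine.
front = [ind for ind, f in zip(population, objectives)
         if not any(dominates(g, f) for g in objectives if g != f)]
```

In the full method, the front would be evolved over generations with selection, crossover, and mutation, and the surviving dropout-rate vectors would steer which neurons are switched off during each training batch.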
Pages: 10