Evolving Deep Convolutional Neural Networks for Image Classification

Cited by: 509
Authors
Sun, Yanan [1 ,2 ]
Xue, Bing [2 ]
Zhang, Mengjie [2 ]
Yen, Gary G. [3 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[2] Victoria Univ Wellington, Sch Engn & Comp Sci, Wellington 6140, New Zealand
[3] Oklahoma State Univ, Sch Elect & Comp Engn, Stillwater, OK 74078 USA
Funding
National Natural Science Foundation of China
Keywords
Computer architecture; Architecture; Optimization; Genetic algorithms; Encoding; Task analysis; Convolutional neural networks; Convolutional neural network (CNN); deep learning; genetic algorithms (GAs); image classification;
DOI
10.1109/TEVC.2019.2916183
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Evolutionary paradigms have been successfully applied to neural network design for two decades. Unfortunately, these methods do not scale well to modern deep neural networks because of their complicated architectures and large numbers of connection weights. In this paper, we propose a new genetic-algorithm-based method for evolving the architectures and connection weight initialization values of deep convolutional neural networks to address image classification problems. In the proposed algorithm, an efficient variable-length gene encoding strategy is designed to represent the different building blocks and the potentially optimal depth of a convolutional neural network. In addition, a new representation scheme is developed for effectively initializing the connection weights of deep convolutional neural networks, which is expected to prevent networks from getting trapped in local minima, a major issue in backward gradient-based optimization. Furthermore, a novel fitness evaluation method is proposed to speed up the heuristic search with substantially fewer computational resources. The proposed algorithm is examined and compared with 22 existing algorithms, including state-of-the-art methods, on nine widely used image classification tasks. The experimental results demonstrate the remarkable superiority of the proposed algorithm over the state-of-the-art designs in terms of classification error rate and number of parameters (weights).
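For illustration only, the sketch below shows the general idea of a variable-length genome of the kind the abstract describes: each gene stands for one building block (convolutional, pooling, or fully connected layer), the convolutional and fully connected genes carry Gaussian mean and standard-deviation values used to initialize that layer's weights, and a length-tolerant one-point crossover lets the evolved depth vary. All class names, value ranges, and operators here are illustrative assumptions, not the authors' exact encoding.

```python
import random
from dataclasses import dataclass
from typing import List, Union

# Hypothetical gene types; fields and ranges are assumptions for illustration.
@dataclass
class ConvGene:
    filters: int   # number of feature maps
    mean: float    # Gaussian mean for this layer's weight initialization
    std: float     # Gaussian standard deviation for the same purpose

@dataclass
class PoolGene:
    kind: str      # "max" or "mean"

@dataclass
class FullGene:
    neurons: int
    mean: float
    std: float

Gene = Union[ConvGene, PoolGene, FullGene]

def random_individual(max_conv_pool: int = 8, max_full: int = 3) -> List[Gene]:
    """Build one variable-length chromosome: a convolution/pooling part of
    random depth followed by a fully connected part of random depth."""
    genome: List[Gene] = []
    for _ in range(random.randint(2, max_conv_pool)):
        if random.random() < 0.5:
            genome.append(ConvGene(filters=random.choice([16, 32, 64, 128]),
                                   mean=random.uniform(-0.5, 0.5),
                                   std=random.uniform(0.0, 1.0)))
        else:
            genome.append(PoolGene(kind=random.choice(["max", "mean"])))
    for _ in range(random.randint(1, max_full)):
        genome.append(FullGene(neurons=random.choice([128, 256, 512]),
                               mean=random.uniform(-0.5, 0.5),
                               std=random.uniform(0.0, 1.0)))
    return genome

def crossover(p1: List[Gene], p2: List[Gene]) -> List[Gene]:
    """One-point crossover that tolerates parents of different lengths,
    so offspring depth can differ from both parents."""
    c1, c2 = random.randrange(1, len(p1)), random.randrange(1, len(p2))
    return p1[:c1] + p2[c2:]

if __name__ == "__main__":
    random.seed(0)
    parent_a, parent_b = random_individual(), random_individual()
    child = crossover(parent_a, parent_b)
    print(f"parent depths: {len(parent_a)}, {len(parent_b)}; child depth: {len(child)}")
```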
Pages: 394-407
Page count: 14