Evolutionary Neural Architecture Search Supporting Approximate Multipliers

Cited by: 6
Authors
Pinos, Michal [1 ]
Mrazek, Vojtech [1 ]
Sekanina, Lukas [1 ]
Affiliations
[1] Brno Univ Technol, Fac Informat Technol, IT4Innovat Ctr Excellence, Bozetechova 2, Brno 61266, Czech Republic
Source
GENETIC PROGRAMMING, EUROGP 2021 | 2021, Vol. 12691
Keywords
Approximate computing; Convolutional neural network; Cartesian genetic programming; Neuroevolution; Energy efficiency; NETWORKS;
DOI
10.1007/978-3-030-72812-0_6
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
There is growing interest in automated neural architecture search (NAS) methods, which routinely deliver high-quality neural network architectures for challenging data sets while reducing the designer's effort. NAS methods utilizing multi-objective evolutionary algorithms are especially useful when the goal is to minimize not only the network error but also the number of parameters (weights) or the power consumption of the inference phase. We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNNs). The method allows approximate operations to be used in CNNs to reduce the power consumption of a target hardware implementation. During the NAS process, a suitable CNN architecture is evolved together with approximate multipliers to deliver the best trade-offs between accuracy, network size, and power consumption. The most suitable approximate multipliers are automatically selected from a library of approximate multipliers. Evolved CNNs are compared with common human-created CNNs of similar complexity on the CIFAR-10 benchmark problem.
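The core of the multi-objective selection described in the abstract can be sketched as Pareto-dominance filtering over candidate (architecture, approximate multiplier) pairs. The sketch below is illustrative only, not the authors' implementation: the candidate names and objective values (classification error, parameter count, power) are hypothetical, and all objectives are minimized.

```python
# Illustrative sketch of Pareto-front selection for multi-objective NAS.
# Not the paper's code; candidates and objective values are hypothetical.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the candidates not dominated by any other candidate."""
    return [p for p in population
            if not any(dominates(q["obj"], p["obj"]) for q in population)]

# Each candidate pairs a CNN architecture with an approximate multiplier;
# objectives are (classification error, number of weights, power in mW).
population = [
    {"name": "cnn_a/mul_exact",  "obj": (0.08, 1.2e6, 120.0)},
    {"name": "cnn_b/mul_approx", "obj": (0.09, 0.9e6,  70.0)},
    {"name": "cnn_c/mul_approx", "obj": (0.10, 1.5e6, 130.0)},  # dominated by cnn_b
]

front = pareto_front(population)
print([p["name"] for p in front])  # cnn_c drops out; cnn_a and cnn_b trade off
```

In an NSGA-II-style loop, as used by many multi-objective NAS methods, this non-dominated filtering is applied repeatedly (with crowding-distance tie-breaking) to rank the evolved population each generation.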
Pages: 82-97 (16 pages)