Block Proposal Neural Architecture Search

Cited: 30
Authors
Liu, Jiaheng [1 ]
Zhou, Shunfeng [2 ]
Wu, Yichao [2 ]
Chen, Ken [2 ]
Ouyang, Wanli [3 ]
Xu, Dong [3 ]
Affiliations
[1] Beihang Univ, Sch Comp Sci & Engn, Beijing 100191, Peoples R China
[2] SenseTime Grp Ltd, Res Inst, Beijing 100080, Peoples R China
[3] Univ Sydney, Sch Elect & Informat Engn, Sydney, NSW 2006, Australia
Keywords
Proposals; Computer architecture; Task analysis; DNA; Convolution; Network architecture; Evolutionary computation; Neural architecture search; neural network design; image classification; MODEL;
DOI
10.1109/TIP.2020.3028288
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Existing neural architecture search (NAS) methods usually restrict the search space to pre-defined block types within a fixed macro-architecture. This strategy limits the search space and reduces architectural flexibility when block proposal search (BPS) is not considered, making block structure search the bottleneck in many previous NAS works. In this work, we propose a new evolutionary algorithm, referred to as latency EvoNAS (LEvoNAS), for block structure search, and incorporate it into the NAS framework through a novel two-stage framework referred to as Block Proposal NAS (BP-NAS). Comprehensive experimental results on two computer vision tasks demonstrate the superiority of our approach over state-of-the-art lightweight methods. For classification on the ImageNet dataset, our BPN-A outperforms 1.0-MobileNetV2 at similar latency, and our BPN-B reduces latency by 23.7% relative to 1.4-MobileNetV2 while achieving higher top-1 accuracy. Furthermore, for object detection on the COCO dataset, our method achieves a significant performance improvement over MobileNetV2, demonstrating the generalization capability of the proposed framework.
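The abstract describes a latency-constrained evolutionary search over block structures but gives no algorithmic detail. The following is a minimal, hypothetical sketch of such a loop, not the authors' LEvoNAS: block encodings, the latency proxy, the accuracy proxy, and all parameter names are illustrative assumptions.

```python
import random

def evolutionary_search(population_size=8, generations=5, latency_budget=10.0):
    """Toy latency-constrained evolutionary search over block encodings.

    Illustrative only: a candidate block is a list of (kernel_size,
    expansion) choices, and both cost functions are stand-in proxies.
    """
    choices = [(3, 3), (3, 6), (5, 3), (5, 6)]

    def random_candidate():
        return [random.choice(choices) for _ in range(4)]

    def latency(cand):
        # Toy latency proxy: larger kernels and expansions cost more.
        return sum(0.3 * k + 0.2 * e for k, e in cand)

    def accuracy_proxy(cand):
        # Toy fitness rewarding capacity; stands in for trained accuracy.
        return sum(k * e for k, e in cand)

    def fitness(cand):
        # Candidates over the latency budget are rejected outright.
        return accuracy_proxy(cand) if latency(cand) <= latency_budget else float("-inf")

    def mutate(cand):
        # Point mutation: resample one layer's choice.
        child = list(cand)
        child[random.randrange(len(child))] = random.choice(choices)
        return child

    population = [random_candidate() for _ in range(population_size)]
    for _ in range(generations):
        # Keep the fitter half as parents, refill with mutated offspring.
        population.sort(key=fitness, reverse=True)
        parents = population[: population_size // 2]
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    best = max(population, key=fitness)
    return best, latency(best)
```

In a real NAS setting the proxies would be replaced by measured on-device latency and validation accuracy of weight-shared subnetworks, which is where the bulk of the engineering effort lies.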
Pages: 15-25
Page count: 11