Progressive Neural Architecture Search

Cited by: 813
Authors
Liu, Chenxi [1]
Zoph, Barret [2]
Neumann, Maxim [2]
Shlens, Jonathon [2]
Hua, Wei [2]
Li, Li-Jia [2]
Fei-Fei, Li [2,3]
Yuille, Alan [1]
Huang, Jonathan [2]
Murphy, Kevin [2]
Affiliations
[1] Johns Hopkins Univ, Baltimore, MD 21205 USA
[2] Google AI, Mountain View, CA USA
[3] Stanford Univ, Stanford, CA 94305 USA
Source
COMPUTER VISION - ECCV 2018, PT I | 2018 / Vol. 11205
DOI
10.1007/978-3-030-01246-5_2
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms. Our approach uses a sequential model-based optimization (SMBO) strategy, in which we search for structures in order of increasing complexity, while simultaneously learning a surrogate model to guide the search through structure space. Direct comparison under the same search space shows that our method is up to 5 times more efficient than the RL method of Zoph et al. (2018) in terms of number of models evaluated, and 8 times faster in terms of total compute. The structures we discover in this way achieve state of the art classification accuracies on CIFAR-10 and ImageNet.
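The abstract describes the search strategy only at a high level: grow candidate cell structures one block at a time, and use a learned surrogate model to decide which candidates are worth training at each complexity level. The Python sketch below is a minimal, illustrative rendering of that loop under simplified assumptions; the toy operation set, the random surrogate, the stand-in training routine, and the beam size are hypothetical placeholders, not the paper's actual search space or predictor.

```python
"""Minimal sketch of a progressive, surrogate-guided (SMBO-style) search loop.
All components here are illustrative stand-ins, not the paper's implementation."""
import itertools
import random

# Toy search space: a cell is a tuple of blocks, each block an (input index, op) pair.
OPS = ["sep3x3", "sep5x5", "maxpool3x3", "identity"]


def expand(cell, num_inputs):
    """Return all one-block extensions of a cell (complexity b -> b + 1)."""
    choices = itertools.product(range(num_inputs + len(cell)), OPS)
    return [cell + ((inp, op),) for inp, op in choices]


def surrogate_score(cell, history):
    """Hypothetical surrogate: predicts accuracy from past (cell, accuracy) pairs.
    A real implementation would fit a learned predictor on `history`."""
    if not history:
        return random.random()
    # Naive stand-in: guess near the best accuracy observed so far.
    return max(acc for _, acc in history) * random.uniform(0.9, 1.0)


def train_and_evaluate(cell):
    """Stand-in for training the candidate CNN and measuring validation accuracy."""
    return random.random()


def progressive_search(max_blocks=5, beam_size=8, num_inputs=2):
    history = []   # (cell, measured accuracy) pairs used to refine the surrogate
    beam = [()]    # start from the empty cell (b = 0)
    for _ in range(max_blocks):
        # 1. Expand every surviving cell by one block.
        candidates = [c for cell in beam for c in expand(cell, num_inputs)]
        # 2. Rank candidates with the surrogate instead of training them all.
        candidates.sort(key=lambda c: surrogate_score(c, history), reverse=True)
        beam = candidates[:beam_size]
        # 3. Train only the top-K survivors; their results update the surrogate's data.
        for cell in beam:
            history.append((cell, train_and_evaluate(cell)))
    return max(history, key=lambda item: item[1])


if __name__ == "__main__":
    best_cell, best_acc = progressive_search()
    print(f"best cell ({len(best_cell)} blocks): {best_cell}, acc ~ {best_acc:.3f}")
```

The efficiency claim in the abstract corresponds to step 2 of this loop: because the surrogate filters the expanded candidate set, only `beam_size` models are trained per complexity level rather than every possible extension.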
Pages: 19-35
Number of pages: 17