PBIL for Optimizing Hyperparameters of Convolutional Neural Networks and STL Decomposition

Cited by: 6
Authors
Vasco-Carofilis, Roberto A. [1 ]
Gutierrez-Naranjo, Miguel A. [1 ]
Cardenas-Montes, Miguel [2 ]
Affiliations
[1] Univ Seville, Dept Comp Sci & Artificial Intelligence, Seville, Spain
[2] Ctr Invest Energet Medioambientales & Tecnol, Dept Fundamental Res, Madrid, Spain
Source
HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, HAIS 2020 | 2020 / Vol. 12344
Keywords
Hyperparameter optimization; Convolutional Neural Networks; STL decomposition; PBIL; Rn-222 measurements; Canfranc Underground Laboratory; Forecasting;
DOI
10.1007/978-3-030-61705-9_13
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The optimization of hyperparameters in deep neural networks is critical for final performance, yet it typically relies on many subjective decisions drawn from researchers' prior expertise. This paper presents an implementation of Population-Based Incremental Learning (PBIL) for the automatic optimization of hyperparameters in Deep Learning architectures. Specifically, the proposed architecture combines preprocessing of the input time series with Seasonal-Trend decomposition using Loess (STL), a classical method for decomposing time series, and forecasting with Convolutional Neural Networks. This combination has previously produced promising results, but at the cost of an increased number of hyperparameters. The proposed architecture is applied to predicting the Rn-222 level at the Canfranc Underground Laboratory (Spain). By predicting the low-level periods of Rn-222, the potential contamination during maintenance operations in the experiments hosted in the laboratory could be minimized. This paper shows that PBIL can be used to choose optimized hyperparameters for Deep Learning architectures at a reasonable computational cost.
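The PBIL scheme named in the abstract can be sketched as follows. This is a minimal illustration on a toy bitstring objective (OneMax), not the paper's implementation: the function names, the learning rate `lr`, and the population size are illustrative assumptions, and in the paper's setting the fitness function would instead train a network with the hyperparameters encoded by the bitstring.

```python
import numpy as np

def pbil(fitness, n_bits, pop_size=20, lr=0.1, generations=50, seed=0):
    """Minimal PBIL sketch: evolve a probability vector over bitstrings."""
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)            # probability of each bit being 1
    best, best_fit = None, -np.inf
    for _ in range(generations):
        # sample a population of bitstrings from the probability vector
        pop = (rng.random((pop_size, n_bits)) < p).astype(int)
        fits = np.array([fitness(ind) for ind in pop])
        elite = pop[fits.argmax()]
        if fits.max() > best_fit:
            best, best_fit = elite.copy(), float(fits.max())
        # shift the probability vector toward the best individual
        p = (1 - lr) * p + lr * elite
    return best, best_fit

# toy fitness: number of ones in the bitstring (OneMax)
best, fit = pbil(lambda b: b.sum(), n_bits=16)
```

On this toy problem the probability vector typically converges toward the all-ones string within a few dozen generations; for hyperparameter search, each bitstring would decode to a concrete configuration (e.g. number of filters, kernel size) before evaluation.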
Pages: 147-159
Page count: 13