Downscaling Temperature and Precipitation using Support Vector Regression with Evolutionary Strategy

Cited: 0
Authors
Lima, Aranildo R. [1 ]
Cannon, Alex J. [1 ]
Hsieh, William W. [1 ]
Affiliation
[1] Univ British Columbia, Dept Earth & Ocean Sci, Vancouver, BC V6T 1Z4, Canada
Source
2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2012
Keywords
Support Vector Regression; Evolutionary Strategy; Hyper-parameter optimization; Downscaling; Forecasting; FEATURE-SELECTION; OPTIMIZATION; PARAMETERS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this work, we propose a hybrid algorithm combining support vector regression with evolutionary strategy (SVR-ES) to build successful predictive models for downscaling problems. SVR-ES uses uncorrelated mutation with p step sizes to find the optimal SVR hyper-parameters. Two downscaling forecast problems used in the WCCI-2006 contest - surface air temperature and precipitation - were tested. We used multiple linear regression (MLR) as a benchmark, along with a variety of machine learning techniques including a bootstrap-aggregated ensemble of artificial neural networks (ANN), SVR with hyper-parameters given by the Cherkassky-Ma estimate, and random forest (RF). We also tested all techniques with stepwise linear regression (SLR) applied first to screen out irrelevant predictors. We conclude that SVR-ES is an attractive approach because it tends to outperform the other techniques and can be implemented in an almost automatic way. The Cherkassky-Ma estimate is a useful approach for minimizing the MAE and also saves the computational time associated with the hyper-parameter search. ANN and RF are also good options for outperforming MLR. Finally, using SLR for predictor selection can dramatically reduce computational time and often helps to enhance accuracy.
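The core of the SVR-ES idea described in the abstract is an evolutionary strategy with uncorrelated mutation and one self-adapted step size per hyper-parameter. The following is a minimal sketch of that mutation and selection loop, not the authors' implementation: the learning rates follow the standard ES formulas, the search runs in log-parameter space, and `toy_validation_error` is a hypothetical stand-in for the cross-validated MAE that a real SVR-ES would compute by fitting an SVR at each candidate setting.

```python
import math
import random

def mutate(x, sigma, tau, tau_prime, rng):
    """Uncorrelated mutation with one step size per parameter:
    sigma_i' = sigma_i * exp(tau' * N(0,1) + tau * N_i(0,1)),
    x_i'     = x_i + sigma_i' * N_i(0,1)."""
    g = rng.gauss(0.0, 1.0)  # lognormal factor shared by all step sizes
    sigma2 = [max(s * math.exp(tau_prime * g + tau * rng.gauss(0.0, 1.0)), 1e-8)
              for s in sigma]
    x2 = [xi + si * rng.gauss(0.0, 1.0) for xi, si in zip(x, sigma2)]
    return x2, sigma2

def es_search(objective, x0, sigma0=0.5, lam=10, n_gen=200, seed=0):
    """(1, lambda) evolutionary strategy with self-adapted step sizes."""
    rng = random.Random(seed)
    n = len(x0)
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))   # per-coordinate learning rate
    tau_prime = 1.0 / math.sqrt(2.0 * n)        # global learning rate
    x, sigma = list(x0), [sigma0] * n
    best_x, best_f = list(x), objective(x)
    for _ in range(n_gen):
        children = [mutate(x, sigma, tau, tau_prime, rng) for _ in range(lam)]
        # comma selection: the best offspring replaces the parent
        f, x, sigma = min(((objective(cx), cx, cs) for cx, cs in children),
                          key=lambda t: t[0])
        if f < best_f:
            best_f, best_x = f, list(x)
    return best_x, best_f

# Hypothetical stand-in for the validation MAE over (log C, log gamma,
# log epsilon); the optimum at (1, -2, -1) is invented for illustration.
def toy_validation_error(log_params):
    target = (1.0, -2.0, -1.0)
    return sum((p - t) ** 2 for p, t in zip(log_params, target))

best, err = es_search(toy_validation_error, [0.0, 0.0, 0.0])
print([round(b, 2) for b in best], round(err, 6))
```

In a real run, `toy_validation_error` would be replaced by a function that trains an SVR with the candidate `(C, gamma, epsilon)` (exponentiated from log space) and returns its validation MAE; the ES machinery itself is unchanged.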
Pages: 8
References
30 records
[1]  
[Anonymous], 1994, Neural networks: a comprehensive foundation
[2]  
Bishop C., 2006, Pattern Recognition and Machine Learning, DOI 10.1117/1.2819119
[3]   Random forests [J].
Breiman, L.
MACHINE LEARNING, 2001, 45 (01): 5-32
[5]  
Breiman L., 1984, Classification and regression trees, DOI 10.1201/9781315139470
[6]   Forecasting summertime surface-level ozone concentrations in the Lower Fraser Valley of British Columbia: An ensemble neural network approach [J].
Cannon, A.J.;
Lord, E.R.
JOURNAL OF THE AIR & WASTE MANAGEMENT ASSOCIATION, 2000, 50 (03): 322-339
[7]   Predictive uncertainty in environmental modelling [J].
Cawley, Gavin C.;
Janacek, Gareth J.;
Haylock, Malcolm R.;
Dorling, Stephen R.
NEURAL NETWORKS, 2007, 20 (04): 537-549
[8]  
Cawley G.C., 2003, Proc. European Symposium on Artificial Neural Networks, p. 167
[9]   Practical selection of SVM parameters and noise estimation for SVM regression [J].
Cherkassky, V.;
Ma, Y.Q.
NEURAL NETWORKS, 2004, 17 (01): 113-126
[10]  
Cybenko G., 1989, Mathematics of Control, Signals, and Systems, V2, P303, DOI 10.1007/BF02551274