Oblique random forest ensemble via Least Square Estimation for time series forecasting

Cited by: 76
Authors
Qiu, Xueheng [1 ]
Zhang, Le [1 ]
Suganthan, Ponnuthurai Nagaratnam [1 ]
Amaratunga, Gehan A. J. [2 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, 50 Nanyang Ave, Singapore 639798, Singapore
[2] Univ Cambridge, Ctr Adv Photon & Elect, Elect Engn Div, Engn Dept, Cambridge CB3 0FA, England
Funding
National Research Foundation, Singapore;
Keywords
Ensemble learning; Time series forecasting; Oblique random forest; Neural networks; Support vector regression; NEURAL-NETWORK; CLASSIFIERS; PREDICTION; DEEP; CLASSIFICATION; REGRESSION; ALGORITHM; MODEL;
DOI
10.1016/j.ins.2017.08.060
CLC number
TP [automation technology, computer technology];
Subject classification code
0812;
Abstract
Recent studies in machine learning indicate that random forests are among the classifiers most likely to perform best. As an ensemble classifier, a random forest combines multiple decision trees to significantly decrease the overall variance. A conventional random forest employs orthogonal decision trees, which select a single "optimal" feature to split the data instances within a non-leaf node according to an impurity criterion such as Gini impurity or information gain. However, orthogonal decision trees may fail to capture the geometrical structure of the data samples. Motivated by this, we make the first attempt to study the oblique random forest in the context of time series forecasting. In each node of the decision tree, instead of the single-"optimal"-feature orthogonal split used by the standard random forest, a least squares classifier is employed to perform the partition. The proposed method is advantageous with respect to both efficiency and accuracy. We empirically evaluate the proposed method on eight generic time series datasets and five electricity load demand time series datasets from the Australian Energy Market Operator, and compare it with several other benchmark methods. (C) 2017 Elsevier Inc. All rights reserved.
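The node-splitting idea described in the abstract can be illustrated with a minimal sketch: instead of thresholding one feature, fit a regularized least-squares classifier to provisional binary labels and split on the sign of the resulting hyperplane. The labeling scheme (above/below the median target) and the ridge term are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def oblique_split(X, y, ridge=1e-6):
    """Sketch of a least-squares oblique split at one tree node.

    Samples get provisional +/-1 labels (here: target above/below its
    median -- an assumed scheme for illustration), then a linear
    least-squares classifier is fit so the node splits on a hyperplane
    over all features rather than on a single "optimal" feature as in
    an orthogonal tree.
    """
    t = np.where(y >= np.median(y), 1.0, -1.0)       # provisional labels
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])    # append bias column
    # Ridge-regularized least-squares weights: (Xb'Xb + rI)^-1 Xb't
    w = np.linalg.solve(Xb.T @ Xb + ridge * np.eye(Xb.shape[1]), Xb.T @ t)
    left = Xb @ w >= 0                               # oblique partition mask
    return w, left

# Toy usage: a linear target over three features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
w, left = oblique_split(X, y)
```

An oblique forest would apply such a split recursively on bootstrap samples; the orthogonal counterpart would instead search each feature for one impurity-minimizing threshold.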
Pages: 249-262
Number of pages: 14
Related papers
49 references in total
[1]  
AEMO, 2016, AUSTR EN MARK OP
[2]   Fast decorrelated neural network ensembles with random weights [J].
Alhamdoosh, Monther ;
Wang, Dianhui .
INFORMATION SCIENCES, 2014, 264 :104-117
[3]  
[Anonymous], 2012, PREDICTION CANDIDATE
[4]   Learning Deep Architectures for AI [J].
Bengio, Yoshua .
FOUNDATIONS AND TRENDS IN MACHINE LEARNING, 2009, 2 (01) :1-127
[5]   SmcHD1, containing a structural-maintenance-of-chromosomes hinge domain, has a critical role in X inactivation [J].
Blewitt, Marnie E. ;
Gendrel, Anne-Valerie ;
Pang, Zhenyi ;
Sparrow, Duncan B. ;
Whitelaw, Nadia ;
Craig, Jeffrey M. ;
Apedaile, Anwyn ;
Hilton, Douglas J. ;
Dunwoodie, Sally L. ;
Brockdorff, Neil ;
Kay, Graham F. ;
Whitelaw, Emma .
NATURE GENETICS, 2008, 40 (05) :663-669
[6]  
Breiman L, 1996, MACH LEARN, V24, P49
[7]   Random forests [J].
Breiman, L .
MACHINE LEARNING, 2001, 45 (01) :5-32
[9]  
Busseti Enzo, 2012, Deep learning for time series modeling, P1
[10]   A Genetic Algorithm for Constructing Compact Binary Decision Trees [J].
Cha, Sung-Hyuk ;
Tappert, Charles .
JOURNAL OF PATTERN RECOGNITION RESEARCH, 2009, 4 (01) :1-13