Model selection for weakly dependent time series forecasting

Cited by: 36
Authors
Alquier, Pierre [1 ,2 ]
Wintenberger, Olivier [3 ]
Affiliations
[1] Univ Paris 07, Lab Probabilites & Modeles Aleatoires, F-75205 Paris 13, France
[2] CREST, Lab Stat, F-92240 Malakoff, France
[3] Univ Paris 09, CEREMADE, F-75775 Paris, France
Keywords
adaptive inference; aggregation of estimators; autoregression estimation; model selection; randomized estimators; statistical learning; time series prediction; weak dependence; PAC-Bayesian bounds; prediction
DOI
10.3150/11-BEJ359
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
Observing a stationary time series, we propose a two-step procedure for predicting its next value. The first step follows the machine learning paradigm: it determines a set of possible predictors, obtained as randomized estimators in (possibly numerous) different predictive models. The second step follows the model selection paradigm: it chooses, among all the predictors produced in the first step, one predictor with good properties. We study our procedure for two types of observations: causal Bernoulli shifts and bounded weakly dependent processes. In both cases, we give oracle inequalities: the risk of the chosen predictor is close to the best prediction risk over all predictive models considered. We apply our procedure to predictive models such as linear predictors, neural network predictors and nonparametric autoregressive predictors.
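The two-step structure described in the abstract can be illustrated with a toy sketch. This is not the authors' randomized/PAC-Bayesian aggregation procedure; it is a minimal stand-in in which step 1 fits a few candidate autoregressive predictors on a training block and step 2 selects the one with the smallest empirical one-step prediction risk on a validation block. All function names, the candidate models, and the AR(1) simulation are hypothetical choices for illustration.

```python
import random


def make_ar1(n, phi=0.8, seed=0):
    # Simulate a stationary AR(1) process: X_t = phi * X_{t-1} + eps_t.
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x


def fit_predictors(train):
    # Step 1 (toy version): fit three simple candidate predictors.
    mean = sum(train) / len(train)
    c = [xi - mean for xi in train]
    num = sum(c[t] * c[t - 1] for t in range(1, len(c)))
    den = sum(ci * ci for ci in c)
    phi_hat = num / den if den else 0.0  # lag-1 autocovariance / variance
    return {
        "mean": lambda past: mean,                            # constant predictor
        "last": lambda past: past[-1],                        # random-walk predictor
        "ar1": lambda past: mean + phi_hat * (past[-1] - mean),  # fitted AR(1)
    }


def select_and_predict(series, split=0.7):
    # Step 2: pick the predictor minimizing empirical squared
    # one-step prediction risk on the held-out validation block.
    k = int(len(series) * split)
    predictors = fit_predictors(series[:k])

    def risk(f):
        errs = [(f(series[:t]) - series[t]) ** 2 for t in range(k, len(series))]
        return sum(errs) / len(errs)

    best = min(predictors, key=lambda name: risk(predictors[name]))
    return best, predictors[best](series)


series = make_ar1(300)
name, forecast = select_and_predict(series)
```

The paper's actual first step draws randomized estimators via Gibbs posteriors and its oracle inequalities control the excess risk under weak-dependence conditions; the sketch above only mirrors the fit-then-select architecture.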
Pages: 883-913
Number of pages: 31
References
31 references in total
[1] Akaike, H. (1998). In Selected Papers of Hirotugu Akaike, p. 199. DOI: 10.1007/978-1-4612-1694-0_15.
[2] Alquier, P. (2008). PAC-Bayesian bounds for randomized empirical risk minimizers. Mathematical Methods of Statistics, 17(4), 279-304.
[3] Andrews, D.W.K. (1984). Non-strong mixing autoregressive processes. Journal of Applied Probability, 21(4), 930-934.
[4] Anonymous (2007). Institute of Mathematical Statistics Lecture Notes - Monograph Series.
[5] Audibert, J.-Y. (2004). Aggregated estimators and empirical complexity for least square regression. Annales de l'Institut Henri Poincaré - Probabilités et Statistiques, 40(6), 685-736.
[6] Baraud, Y. (2001). Annals of Statistics, 29, 839.
[7] Barron, A.R. (1994). Approximation and estimation bounds for artificial neural networks. Machine Learning, 14(1), 115-133.
[8] Catoni, O. (2003). A PAC-Bayesian approach to adaptive classification.
[9] Catoni, O. (2004). Lecture Notes in Mathematics, Vol. 1851.
[10] Cesa-Bianchi, N. (2006). Prediction, Learning, and Games.