A Hybrid Prediction Model Integrating GARCH Models With a Distribution Manipulation Strategy Based on LSTM Networks for Stock Market Volatility

Cited by: 23
Authors
Koo, Eunho [1 ]
Kim, Geonwoo [2 ]
Affiliations
[1] Korea Inst Adv Study, Ctr AI & Nat Sci, Seoul 02455, South Korea
[2] Seoul Natl Univ Sci & Technol, Sch Liberal Arts, Seoul 01811, South Korea
Funding
National Research Foundation of Singapore
Keywords
Predictive models; Data models; Stock markets; Stochastic processes; Licenses; Task analysis; Recurrent neural networks; Stock market volatility; long short-term memory; GARCH models; distribution manipulation; NEURAL-NETWORKS; FORECASTING VOLATILITY; INDEX; RETURNS; FUTURES; TERM
DOI
10.1109/ACCESS.2022.3163723
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Accurate prediction of volatility is one of the most important tasks in financial decision making. Recently, hybrid models integrating artificial neural networks with GARCH-type models have been developed, and their performance gains have been found to be substantial. However, few studies of hybrid models have considered the nature of the distribution of financial data. The distribution of a volatility time series is highly concentrated near zero, and this property can lead to poor prediction performance over the whole domain of the probability density function, because the network weights tend to be trained to produce accurate predictions only for the high-frequency region, that is, near zero. To overcome this challenge, we propose a new hybrid model with GARCH-type models based on a novel non-linear filtering method that mitigates the concentration property of volatility. For the filtering, we utilize root-type functions that transform the extremely left-biased and peaked distribution of the original volatility into a volume-upped (VU) distribution shifted to the right. Long short-term memory (LSTM) is employed as the basic implementation model, and the realized volatility of the S&P 500 is predicted using the proposed models. The proposed hybrid model (VU-GARCH-LSTM) achieves a 21.03% gain in root mean square error (RMSE) relative to the mean performance of existing hybrid models that integrate LSTM with GARCH-type models. Furthermore, the proposed model improves prediction performance in the right region of the label probability density by making the prediction distribution comparable to the label distribution.
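The core idea of the abstract, applying a root-type "volume-up" transform to the zero-concentrated volatility target before training and inverting it after prediction, can be illustrated with a minimal sketch. The function names vu_transform/vu_inverse, the exponent p, and the exact functional form are assumptions for illustration only; the paper's actual filtering function and GARCH-LSTM architecture may differ.

```python
import numpy as np

def vu_transform(vol, p=4):
    """Map non-negative volatility to a less zero-concentrated scale via a p-th root.
    This is an assumed form of the root-type filter described in the abstract."""
    return np.power(vol, 1.0 / p)

def vu_inverse(vol_vu, p=4):
    """Undo the root-type transform after the network has produced its prediction."""
    return np.power(vol_vu, p)

# Toy example: a heavily right-skewed, near-zero-concentrated "volatility" sample.
rng = np.random.default_rng(0)
vol = rng.lognormal(mean=-4.0, sigma=1.0, size=10_000)

# Train the LSTM on the transformed target (in the hybrid setup, GARCH-type
# forecasts would be supplied as additional input features); here a noisy copy
# stands in for the network output.
vol_vu = vu_transform(vol)
pred_vu = vol_vu + rng.normal(0.0, 0.01, size=vol_vu.shape)

# Map predictions back to the original volatility scale before computing RMSE.
pred = vu_inverse(pred_vu)

skew = lambda x: ((x - x.mean()) ** 3).mean() / x.std() ** 3
print(f"skewness before transform: {skew(vol):.2f}")
print(f"skewness after transform:  {skew(vol_vu):.2f}")
```

The transform spreads out the dense mass near zero (lower skewness), so training error is no longer dominated by that region; the inverse map restores predictions to the original volatility scale.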
Pages: 34743-34754
Page count: 12