The PSR-Transformer Nexus: A Deep Dive into Stock Time Series Forecasting

Cited by: 0
Authors
Phien, Nguyen Ngoc [1 ,2 ]
Platos, Jan [3 ]
Affiliations
[1] Ton Duc Thang Univ, Ctr Appl Informat Technol, Ho Chi Minh City, Vietnam
[2] Ton Duc Thang Univ, Fac Informat Technol, Ho Chi Minh City, Vietnam
[3] VSB Tech Univ Ostrava, Fac Elect Engn & Comp Sci, Dept Comp Sci, Ostrava, Czech Republic
Keywords
Stock market forecasting; deep learning; chaos theory; phase space reconstruction; transformer neural networks; time series analysis; information; prediction
DOI
10.14569/IJACSA.2023.0141292
Chinese Library Classification (CLC)
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
Accurate stock market forecasting has remained an elusive goal due to the inherent complexity of financial system dynamics. While deep neural networks have shown initial promise, concerns about robustness and long-term dependencies persist. This research combines nonlinear time series analysis with algorithmic advances in representation learning to improve predictive modeling. Phase space reconstruction (PSR) provides a principled way to recover a multidimensional phase space from single-variable measurements, elucidating the underlying dynamical evolution. Transformer networks with self-attention have recently driven state-of-the-art results in sequence modeling tasks. This paper introduces PSR-Transformer Networks tailored for stock forecasting, in which PSR-derived constructs are fed to transformer encoders. Extensive empirical evaluation on 20 years of historical equities data demonstrates significant accuracy improvements and enhanced robustness compared with LSTM, CNN-LSTM, and Transformer models. The proposed interdisciplinary fusion sets new performance benchmarks for modeling financial time series, validating the synergy between domain-specific reconstruction and cutting-edge deep learning.
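To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea under stated assumptions: a univariate price series is turned into delay vectors via Takens-style phase space reconstruction, and those vectors are treated as tokens for a standard PyTorch transformer encoder with a one-step-ahead forecasting head. The delay, embedding dimension, model width, and forecast head are illustrative placeholders, not the paper's actual hyperparameters or architecture.

# Minimal sketch (not the authors' implementation): phase space reconstruction
# of a univariate price series, whose delay vectors are fed as tokens to a
# PyTorch transformer encoder with a one-step-ahead forecast head.
# The delay, embedding dimension, model width, and head below are assumptions.
import numpy as np
import torch
import torch.nn as nn

def delay_embed(series: np.ndarray, dim: int = 3, tau: int = 2) -> np.ndarray:
    """Time-delay embedding: row t is [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.stack([series[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Toy univariate series standing in for daily closing prices.
prices = np.cumsum(np.random.randn(256)).astype(np.float32)
embedded = delay_embed(prices, dim=3, tau=2)               # shape (252, 3)

tokens = torch.from_numpy(embedded).unsqueeze(0)           # (batch=1, seq_len=252, dim=3)
proj = nn.Linear(3, 64)                                    # project delay vectors to model width
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(64, 1)                                    # one-step-ahead forecast head
forecast = head(encoder(proj(tokens))[:, -1, :])           # predict the next value
print(forecast.shape)                                      # torch.Size([1, 1])

In this sketch each embedded delay vector serves as one token, so self-attention operates over the reconstructed phase-space trajectory rather than over raw scalar prices.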
Pages: 917-924
Number of pages: 8