Tensor-Train Recurrent Neural Networks for Interpretable Multi-Way Financial Forecasting

Cited by: 7
Authors
Xu, Yao Lei [1 ]
Calvi, Giuseppe G. [1 ]
Mandic, Danilo P. [1 ]
Affiliations
[1] Imperial College London, Department of Electrical and Electronic Engineering, London, England
Source
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2021
Keywords
Tensor-Train Decomposition; Recurrent Neural Networks; Financial Forecasting; Interpretability; Regularization; DIMENSIONALITY; DECOMPOSITIONS;
DOI
10.1109/IJCNN52387.2021.9534120
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recurrent Neural Networks (RNNs) represent the de facto standard machine learning tool for sequence modelling, owing to their expressive power and memory. However, when dealing with high-dimensional data, the corresponding exponential increase in the number of parameters imposes a computational bottleneck. The need to equip RNNs with the ability to cope with the curse of dimensionality, such as through the parameter-compression ability inherent to tensors, has led to the development of the Tensor-Train RNN (TT-RNN). Despite achieving promising results in many applications, the full potential of the TT-RNN is yet to be explored in the context of interpretable financial modelling, a notoriously challenging task characterized by multi-modal data with a low signal-to-noise ratio. To address this issue, we investigate the potential of the TT-RNN in the task of financial forecasting of currencies. We show, through the analysis of TT-factors, that the physical meaning underlying tensor decomposition enables the TT-RNN model to aid the interpretability of results, thus mitigating the notorious "black-box" issue associated with neural networks. Furthermore, simulation results highlight the regularization power of the TT decomposition, demonstrating the superior performance of the TT-RNN over its uncompressed RNN counterpart and other tensor forecasting methods.
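The parameter-compression mechanism the abstract refers to can be illustrated with the standard TT-SVD procedure, which factorizes a high-dimensional tensor into a chain of small 3-way cores via sequential truncated SVDs. The sketch below is not the paper's implementation; the function names, tensor sizes, and the chosen TT-rank of 2 are illustrative assumptions.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a tensor into Tensor-Train (TT) cores via sequential
    truncated SVDs of its successive unfoldings (the TT-SVD algorithm)."""
    shape = tensor.shape
    cores, rank, mat = [], 1, tensor
    for n in shape[:-1]:
        mat = mat.reshape(rank * n, -1)           # unfold: (rank * n, rest)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, s.size)             # truncate to the TT-rank cap
        cores.append(u[:, :r_new].reshape(rank, n, r_new))
        mat = s[:r_new, None] * vt[:r_new]        # carry the remainder forward
        rank = r_new
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=(out.ndim - 1, 0))
    return out.reshape([c.shape[1] for c in cores])

# Build a 4x4x4x4 tensor (256 entries) that is exactly of TT-rank 2,
# then recover it from far fewer parameters.
rng = np.random.default_rng(0)
true_cores = [rng.standard_normal(s) for s in
              [(1, 4, 2), (2, 4, 2), (2, 4, 2), (2, 4, 1)]]
T = tt_reconstruct(true_cores)

cores = tt_svd(T, max_rank=2)
err = np.linalg.norm(tt_reconstruct(cores) - T) / np.linalg.norm(T)
full_params = T.size                              # 256 entries in the full tensor
tt_params = sum(c.size for c in cores)            # 8 + 16 + 16 + 8 = 48
print(f"relative error: {err:.2e}, params: {full_params} -> {tt_params}")
```

In a TT-RNN, the same idea is applied to the (reshaped) weight matrices of the recurrent layer: instead of storing the full matrix, only the small TT cores are stored and trained, which is both the source of the compression and, as the paper argues, of the regularization effect.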
Pages: 5