Deep Efficient Continuous Manifold Learning for Time Series Modeling

Cited by: 4
Authors
Jeong, Seungwoo [1 ]
Ko, Wonjun [2 ]
Mulyadi, Ahmad Wisnu [2 ]
Suk, Heung-Il [1 ]
Affiliations
[1] Korea Univ, Dept Artificial Intelligence, Seoul 02841, South Korea
[2] Korea Univ, Dept Brain & Cognit Engn, Seoul 02841, South Korea
Keywords
Cholesky space; deep learning; manifold learning; manifold ordinary differential equation; multivariate time series modeling; symmetric positive definite matrix; neural network
DOI
10.1109/TPAMI.2023.3320125
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Modeling non-Euclidean data has drawn extensive attention, along with the unprecedented successes of deep neural networks in diverse fields. In particular, the symmetric positive definite (SPD) matrix is being actively studied in computer vision, signal processing, and medical image analysis for its ability to capture beneficial statistical representations. However, its rigid constraints make optimization challenging and computation costly, especially when it is incorporated into a deep learning framework. In this paper, we propose a framework that exploits a diffeomorphism between Riemannian manifolds and a Cholesky space, through which it becomes feasible not only to solve optimization problems efficiently but also to greatly reduce computational costs. Furthermore, for dynamic modeling of time-series data, we devise a continuous manifold learning method by systematically integrating a manifold ordinary differential equation and a gated recurrent neural network. Notably, owing to the convenient parameterization of matrices in a Cholesky space, training our proposed network with Riemannian geometric metrics is straightforward. Through experiments on regular and irregular time-series datasets, we demonstrate that the proposed model can be trained efficiently and reliably, and that it outperforms existing manifold-based and state-of-the-art methods on various time-series tasks.
Pages: 171-184
Number of pages: 14
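The diffeomorphism described in the abstract can be illustrated with a short sketch. The following Python snippet is an editorial illustration, not code from the paper: NumPy is assumed, and the function names spd_to_cholesky_space and cholesky_space_to_spd are hypothetical. It shows one common way to map an SPD matrix into a flat Cholesky-style space and back: Cholesky-factorize the matrix, keep the strictly lower-triangular part, and take the logarithm of the diagonal so that the positivity constraint disappears and ordinary Euclidean operations apply.

    import numpy as np

    def spd_to_cholesky_space(P):
        # The Cholesky factor L is lower triangular with a strictly positive
        # diagonal, so taking log(diag) removes the positivity constraint and
        # yields an unconstrained lower-triangular representation.
        L = np.linalg.cholesky(P)  # P = L @ L.T
        return np.tril(L, k=-1) + np.diag(np.log(np.diag(L)))

    def cholesky_space_to_spd(X):
        # Inverse map: exponentiate the diagonal to recover a valid Cholesky
        # factor, then rebuild the SPD matrix.
        L = np.tril(X, k=-1) + np.diag(np.exp(np.diag(X)))
        return L @ L.T

    # Round-trip check on a random SPD matrix.
    A = np.random.randn(4, 4)
    P = A @ A.T + 4.0 * np.eye(4)  # guaranteed SPD
    assert np.allclose(cholesky_space_to_spd(spd_to_cholesky_space(P)), P)

Because the mapped representation lives in an unconstrained vector space, gradient-based training and interpolation (for example, inside an ODE solver or a recurrent cell) can proceed without repeatedly projecting back onto the SPD manifold, which is the efficiency argument made in the abstract.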