Sequence-to-sequence prediction of spatiotemporal systems

Cited by: 8
Authors:
Shen, Guorui [1 ]
Kurths, Juergen [2 ,3 ]
Yuan, Ye [1 ]
Affiliations:
[1] Huazhong Univ Sci & Technol, Sch Artificial Intelligence & Automat, Wuhan 430074, Peoples R China
[2] Potsdam Inst Climate Impact Res, D-14473 Potsdam, Germany
[3] Humboldt Univ, Dept Phys, D-12489 Berlin, Germany
DOI: 10.1063/1.5133405
CLC classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract
We propose a novel type of neural network known as an "attention-based sequence-to-sequence architecture" for model-free prediction of spatiotemporal systems. This architecture is composed of an encoder and a decoder: the encoder acts upon a given input sequence, and the decoder then yields an output sequence, producing a multistep prediction in a single pass. To demonstrate the potential of this approach, we train the neural network on data numerically sampled from the Korteweg-de Vries equation, which describes the interaction between solitary waves, and then predict its future evolution. Furthermore, we validate the applicability of the approach on datasets sampled from the chaotic Lorenz system and three other partial differential equations. The results show that the proposed method achieves good performance in predicting the evolutionary behavior of the studied spatiotemporal dynamics. To the best of our knowledge, this work is the first attempt at applying an attention-based sequence-to-sequence architecture to the prediction of solitary waves. Published under license by AIP Publishing.
Pages: 10
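The record above does not include the paper's implementation details, so the following is purely an illustrative sketch of the general idea the abstract describes: an encoder consumes a window of input snapshots, and an attention-equipped decoder emits several future snapshots in one pass. All function names, variable names, and dimensions here are invented for illustration and do not come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(query, keys, values):
    """Scaled dot-product attention: weight encoder states by relevance to the query."""
    scores = keys @ query / np.sqrt(query.size)   # one score per encoder step
    weights = softmax(scores)
    return weights @ values, weights              # context vector, attention weights

def encode(x_seq, W_enc):
    """Toy recurrent encoder: keep the hidden state at every input step."""
    h = np.zeros(W_enc.shape[0])
    states = []
    for x in x_seq:
        h = np.tanh(W_enc @ np.concatenate([x, h]))
        states.append(h)
    return np.stack(states)                       # shape (T_enc, d_h)

def decode(enc_states, W_dec, W_out, n_steps, d_in):
    """Toy attention decoder: emits n_steps outputs in one pass (multistep prediction)."""
    h = enc_states[-1]
    y = np.zeros(d_in)
    outputs = []
    for _ in range(n_steps):
        context, _ = attention(h, enc_states, enc_states)
        h = np.tanh(W_dec @ np.concatenate([y, h, context]))
        y = W_out @ h
        outputs.append(y)
    return np.stack(outputs)                      # shape (n_steps, d_in)

# Illustrative sizes: 4 spatial points, hidden size 8, 6 input steps, 3 predicted steps.
d_in, d_h, T_enc, T_dec = 4, 8, 6, 3
W_enc = rng.normal(scale=0.1, size=(d_h, d_in + d_h))
W_dec = rng.normal(scale=0.1, size=(d_h, d_in + 2 * d_h))
W_out = rng.normal(scale=0.1, size=(d_in, d_h))

x_seq = rng.normal(size=(T_enc, d_in))            # a window of spatial snapshots
enc_states = encode(x_seq, W_enc)
y_pred = decode(enc_states, W_dec, W_out, T_dec, d_in)
print(y_pred.shape)  # (3, 4): T_dec future snapshots predicted at once
```

In the paper's setting the weights would of course be trained on simulated trajectories (e.g., of the Korteweg-de Vries equation) rather than drawn at random; the sketch only shows how attention lets each decoding step re-weight the encoder's hidden states.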
Related papers
(50 records total)
  • [41] Semantic Matching for Sequence-to-Sequence Learning
    Zhang, Ruiyi
    Chen, Changyou
    Zhang, Xinyuan
    Bai, Ke
    Carin, Lawrence
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 212 - 222
  • [42] Assessing incrementality in sequence-to-sequence models
    Ulmer, Dennis
    Hupkes, Dieuwke
    Bruni, Elia
    4TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP (REPL4NLP-2019), 2019, : 209 - 217
  • [43] An Analysis of "Attention" in Sequence-to-Sequence Models
    Prabhavalkar, Rohit
    Sainath, Tara N.
    Li, Bo
    Rao, Kanishka
    Jaitly, Navdeep
    18TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2017), VOLS 1-6: SITUATED INTERACTION, 2017, : 3702 - 3706
  • [44] A deep sequence-to-sequence method for accurate long landing prediction based on flight data
    Kang, Zongwei
    Shang, Jiaxing
    Feng, Yong
    Zheng, Linjiang
    Wang, Qixing
    Sun, Hong
    Qiang, Baohua
    Liu, Zhen
    IET INTELLIGENT TRANSPORT SYSTEMS, 2021, 15 (08) : 1028 - 1042
  • [45] Remaining Useful Life Prediction Based on Normalizing Flow Embedded Sequence-to-Sequence Learning
    Yang, Haosen
    Ding, Keqin
    Qiu, Robert C.
    Mi, Tiebin
    IEEE TRANSACTIONS ON RELIABILITY, 2021, 70 (04) : 1342 - 1354
  • [46] Demo: Vessel Trajectory Prediction using Sequence-to-Sequence Models over Spatial Grid
    Duc-Duy Nguyen
    Van, Chan Le
    Ali, Muhammad Intizar
    DEBS'18: PROCEEDINGS OF THE 12TH ACM INTERNATIONAL CONFERENCE ON DISTRIBUTED AND EVENT-BASED SYSTEMS, 2018, : 258 - 261
  • [47] Prediction of discharge in a tidal river using the LSTM-based sequence-to-sequence models
    Zhigao Chen
    Yan Zong
    Zihao Wu
    Zhiyu Kuang
    Shengping Wang
    ACTA OCEANOLOGICA SINICA, 2024, 43 (07) : 40 - 51
  • [48] A Deep Sequence-to-Sequence Method for Aircraft Landing Speed Prediction Based on QAR Data
    Kang, Zongwei
    Shang, Jiaxing
    Feng, Yong
    Zheng, Linjiang
    Liu, Dajiang
    Qiang, Baohua
    Wei, Ran
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2020, PT II, 2020, 12343 : 516 - 530
  • [49] Anomaly Detection for Industrial Control Systems Using Sequence-to-Sequence Neural Networks
    Kim, Jonguk
    Yun, Jeong-Han
    Kim, Hyoung Chun
    COMPUTER SECURITY, ESORICS 2019, 2020, 11980 : 3 - 18
  • [50] Retrosynthetic and Synthetic Reaction Prediction Model Based on Sequence-to-Sequence with Attention for Polymer Designs
    Taniwaki, Hiroaki
    Kaneko, Hiromasa
    MACROMOLECULAR THEORY AND SIMULATIONS, 2023, 32 (04)