An attention-based LSTM network for large earthquake prediction

Cited by: 23
Authors
Berhich, Asmae [1 ,2 ]
Belouadha, Fatima-Zahra [1 ]
Kabbaj, Mohammed Issam [1 ]
Affiliations
[1] Mohammed V Univ Rabat, Ecole Mohammadia Ingn, E3S Res Ctr, AMIPS Res Team, Rabat, Morocco
[2] Ave Ibn Sina BP 765, Rabat 10090, Morocco
Keywords
Earthquake prediction; Attention mechanism; Time-series data; LSTM; Seismic dataset
DOI
10.1016/j.soildyn.2022.107663
Chinese Library Classification
P5 [Geology]
Subject Classification Codes
0709; 081803
Abstract
Due to the complexity of earthquakes, predicting their magnitude, timing and location is a challenging task: earthquakes do not follow a specific pattern, which can lead to inaccurate predictions. Artificial Intelligence-based models, however, have been able to provide promising results. Still, few mature studies deal with large earthquake prediction, especially as a regression problem. For these reasons, this paper investigates an attention-based LSTM network for predicting the time, magnitude, and location of an impending large earthquake. LSTMs are used to learn temporal relationships, and the attention mechanism extracts important patterns and information from the input features. The Japan earthquake dataset from 1900 to October 2021 was used because it covers a highly seismically active region known for its large earthquakes. The results are evaluated using MSE, RMSE, MAE, R-squared, and accuracy. The proposed model performs significantly better than other empirical scenarios and a selected baseline method, with its MSE better by approximately 60%.
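As an illustration of the kind of architecture the abstract describes, the following is a minimal sketch of an attention-based LSTM regressor in Keras. It is not the authors' published model: the window length, feature count, layer width, attention variant (dot-product attention over the per-step hidden states) and the four regression targets (magnitude, inter-event time, latitude, longitude) are all assumptions made for the example.

import numpy as np
from tensorflow.keras import layers, Model

def build_attention_lstm(n_steps=50, n_features=6, n_targets=4):
    """Toy attention-based LSTM: regress assumed targets (magnitude,
    inter-event time, latitude, longitude) from a window of past events."""
    inputs = layers.Input(shape=(n_steps, n_features))
    # Encode the event window; keep every per-step hidden state plus the final state.
    seq, state_h, _ = layers.LSTM(64, return_sequences=True, return_state=True)(inputs)
    # Use the final hidden state as the query for dot-product attention over the steps.
    query = layers.Reshape((1, 64))(state_h)
    context = layers.Attention()([query, seq])   # (batch, 1, 64) weighted summary
    context = layers.Flatten()(context)
    outputs = layers.Dense(n_targets)(context)   # linear regression head
    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Toy usage on random data shaped like sliding windows over a normalized catalog.
model = build_attention_lstm()
X = np.random.rand(128, 50, 6).astype("float32")
y = np.random.rand(128, 4).astype("float32")
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2], verbose=0).shape)     # (2, 4)

A working version would also need the preprocessing the abstract implies (catalog windowing, feature scaling, a large-magnitude threshold) and evaluation with MSE, RMSE, MAE and R-squared on held-out events.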
Pages: 12
Related Papers
50 records in total
  • [21] An Improved Attention-based Bidirectional LSTM Model for Cyanobacterial Bloom Prediction
    Ni, Jianjun
    Liu, Ruping
    Tang, Guangyi
    Xie, Yingjuan
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2022, 20 (10) : 3445 - 3455
  • [23] DNACoder: a CNN-LSTM attention-based network for genomic sequence data compression
    Sheena, K. S.
    Nair, Madhu S.
    NEURAL COMPUTING AND APPLICATIONS, 2024, 36 (29) : 18363 - 18376
  • [24] A dual-stage attention-based Bi-LSTM network for multivariate time series prediction
    Cheng, Qi
    Chen, Yixin
    Xiao, Yuteng
    Yin, Hongsheng
    Liu, Weidong
    JOURNAL OF SUPERCOMPUTING, 2022, 78 (14) : 16214 - 16235
  • [26] Step Counting with Attention-based LSTM
    Khan, Shehroz S.
    Abedi, Ali
    2022 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2022, : 559 - 566
  • [27] A new attention-based LSTM model for closing stock price prediction
    Lin, Yuyang
    Huang, Qi
    Zhong, Qiyin
    Li, Muyang
    Li, Yan
    Ma, Fei
    INTERNATIONAL JOURNAL OF FINANCIAL ENGINEERING, 2022, 09 (03)
  • [28] Speech Emotion Classification Using Attention-Based LSTM
    Xie, Yue
    Liang, Ruiyu
    Liang, Zhenlin
    Huang, Chengwei
    Zou, Cairong
    Schuller, Bjoern
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2019, 27 (11) : 1675 - 1685
  • [29] An Intelligent Fault Diagnosis Method Based on Attention-Based Bidirectional LSTM Network
    Wang T.
    Wang T.
    Wang P.
    Qiao H.
    Xu M.
    Tianjin Daxue Xuebao (Ziran Kexue yu Gongcheng Jishu Ban)/Journal of Tianjin University Science and Technology, 2020, 53 (06): : 601 - 608
  • [30] Attention-based LSTM with Semantic Consistency for Videos Captioning
    Guo, Zhao
    Gao, Lianli
    Song, Jingkuan
    Xu, Xing
    Shao, Jie
    Shen, Heng Tao
    MM'16: PROCEEDINGS OF THE 2016 ACM MULTIMEDIA CONFERENCE, 2016, : 357 - 361