An attention-based LSTM network for large earthquake prediction

Cited by: 23
Authors
Berhich, Asmae [1 ,2 ]
Belouadha, Fatima-Zahra [1 ]
Kabbaj, Mohammed Issam [1 ]
Affiliations
[1] Mohammed V Univ Rabat, Ecole Mohammadia Ingn, E3S Res Ctr, AMIPS Res Team, Rabat, Morocco
[2] Ave Ibn Sina BP 765, Rabat 10090, Morocco
Keywords
Earthquake prediction; Attention mechanism; Time-series data; LSTM; Seismic dataset
DOI
10.1016/j.soildyn.2022.107663
Chinese Library Classification
P5 [Geology]
Subject classification codes
0709; 081803
Abstract
Due to the complexity of earthquakes, predicting their magnitude, timing, and location is a challenging task: earthquakes do not follow a specific pattern, which can lead to inaccurate predictions. However, Artificial Intelligence-based models have been able to provide promising results. Still, few mature studies deal with large earthquake prediction, especially as a regression problem. For these reasons, this paper investigates an attention-based LSTM network for predicting the time, magnitude, and location of an impending large earthquake. LSTMs are used to learn temporal relationships, and the attention mechanism extracts important patterns and information from the input features. The Japan earthquake dataset from 1900 to October 2021 was used because it represents a highly seismically active region known for its large earthquakes. The results are evaluated using the metrics MSE, RMSE, MAE, R-squared, and accuracy. The proposed model performs significantly better than other empirical scenarios and a selected baseline method; in particular, its MSE is better by approximately 60%.
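The two components the abstract names — an LSTM that encodes the sequence of past seismic events and an attention layer that weights its hidden states before prediction — can be sketched as follows. This is a minimal NumPy illustration of the general technique, not the authors' implementation; the feature choice, dimensions, and dot-product attention scoring are assumptions made for the sketch.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class TinyLSTM:
    """Minimal single-layer LSTM cell, unrolled over a sequence."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        z = input_dim + hidden_dim
        s = 1.0 / np.sqrt(z)
        # One stacked weight matrix for the input/forget/cell/output gates.
        self.W = rng.uniform(-s, s, size=(z, 4 * hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def forward(self, x):
        # x: (T, input_dim) -> hidden states for every timestep, (T, hidden_dim)
        T = x.shape[0]
        h = np.zeros(self.hidden_dim)
        c = np.zeros(self.hidden_dim)
        hs = np.zeros((T, self.hidden_dim))
        for t in range(T):
            gates = np.concatenate([x[t], h]) @ self.W + self.b
            i, f, g, o = np.split(gates, 4)
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
            h = sigmoid(o) * np.tanh(c)
            hs[t] = h
        return hs


def attention_pool(hs, v):
    """Score each hidden state against a learnable vector v, softmax over
    time, and return the weighted sum (context) plus the weights."""
    scores = hs @ v
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ hs, w


# Toy sequence of 12 past events, each with 4 features (e.g. magnitude,
# depth, latitude, longitude -- a hypothetical feature choice).
rng = np.random.default_rng(1)
events = rng.normal(size=(12, 4))

lstm = TinyLSTM(input_dim=4, hidden_dim=8)
hs = lstm.forward(events)
v = rng.normal(size=8)  # scoring vector; learned in a real model
context, weights = attention_pool(hs, v)
# A trained regression head would map `context` to the predicted
# time, magnitude, and location of the next large event.
```

The attention weights make the model's focus inspectable: timesteps with large weights are the past events the pooled representation depends on most, which is the "important patterns" extraction the abstract refers to.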
Pages: 12
Related papers (50 total)
  • [31] Attention-based Densely Connected LSTM for Video Captioning
    Zhu, Yongqing
    Jiang, Shuqiang
    PROCEEDINGS OF THE 27TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA (MM'19), 2019, : 802 - 810
  • [32] Attention-based LSTM for Automatic Evaluation of Press Conferences
    Yi, Shengzhou
    Mochitomi, Koshiro
    Suzuki, Isao
    Wang, Xueting
    Yamasaki, Toshihiko
    THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020, : 191 - 196
  • [33] Attention-Based Dense LSTM for Speech Emotion Recognition
    Xie, Yue
    Liang, Ruiyu
    Liang, Zhenlin
    Zhao, Li
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2019, E102D (07): : 1426 - 1429
  • [34] Retweet Prediction with Attention-based Deep Neural Network
    Zhang, Qi
    Gong, Yeyun
    Wu, Jindou
    Huang, Haoran
    Huang, Xuanjing
    CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2016, : 75 - 84
  • [35] Short-term Wind Speed Prediction with a Two-layer Attention-based LSTM
    Qian, Jingcheng
    Zhu, Mingfang
    Zhao, Yingnan
    He, Xiangjian
    COMPUTER SYSTEMS SCIENCE AND ENGINEERING, 2021, 39 (02): : 197 - 209
  • [36] Attention-Based Bi-Directional Long-Short Term Memory Network for Earthquake Prediction
    Banna, Md. Hasan Al
    Ghosh, Tapotosh
    Nahian, Md. Jaber Al
    Taher, Kazi Abu
    Kaiser, M. Shamim
    Mahmud, Mufti
    Hossain, Mohammad Shahadat
    Andersson, Karl
    IEEE ACCESS, 2021, 9 : 56589 - 56603
  • [37] DECIDE THE NEXT PITCH: A PITCH PREDICTION MODEL USING ATTENTION-BASED LSTM
    Yu, Chih-Chang
    Chang, Chih-Ching
    Cheng, Hsu-Yung
    2022 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO WORKSHOPS (IEEE ICMEW 2022), 2022,
  • [38] Experimental prediction model for full life friction performance of wet clutch via attention-based LSTM network
    Feng, Yuqing
    Zheng, Changsong
    Yu, Liang
    Wei, Chengsi
    Ouyang, Xiangjun
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART D-JOURNAL OF AUTOMOBILE ENGINEERING, 2024,
  • [39] Modified Particle Swarm Optimization with Attention-Based LSTM for Wind Power Prediction
    Sun, Yiyang
    Wang, Xiangwen
    Yang, Junjie
    ENERGIES, 2022, 15 (12)
  • [40] An Attention-Based CNN-LSTM Method for Effluent Wastewater Quality Prediction
    Li, Yue
    Kong, Bin
    Yu, Weiwei
    Zhu, Xingliang
    APPLIED SCIENCES-BASEL, 2023, 13 (12):