Modelling monthly rainfall of India through transformer-based deep learning architecture

Cited by: 8
Authors
Nayak, G. H. Harish [1 ,2 ]
Alam, Wasi [2 ]
Singh, K. N. [2 ]
Avinash, G. [1 ,2 ]
Ray, Mrinmoy [2 ]
Kumar, Rajeev Ranjan [2 ]
Affiliations
[1] ICAR Indian Agr Res Inst, Grad Sch, New Delhi 110012, India
[2] ICAR Indian Agr Stat Res Inst, New Delhi 110012, India
Keywords
Attention mechanism; Encoder-decoder; Gated recurrent units (GRUs); Long short-term memory (LSTM); Recurrent neural network (RNN); Transformer-encoder; Prediction
DOI
10.1007/s40808-023-01944-7
Chinese Library Classification
X [Environmental Science, Safety Science]
Discipline Code
08; 0830
Abstract
Rainfall forecasting is of crucial significance in Earth systems modelling. Accurate prediction of monthly rainfall in India is particularly important given its pivotal role in the country's agricultural productivity. Because rainfall is a highly nonlinear dynamic phenomenon, linear models are inadequate, and parametric nonlinear models face limitations owing to their stringent assumptions. Machine learning approaches have therefore seen a notable surge in adoption in recent years, owing to their data-driven nature; however, classical machine learning algorithms lack automatic feature-extraction capabilities. This limitation has driven the popularity of deep learning models, particularly in rainfall forecasting. Nevertheless, conventional deep learning architectures process input data sequentially, which can be slow and challenging, especially for long sequences. To address this concern, the present article proposes a rainfall modelling algorithm founded on a transformer-based deep learning architecture. The distinguishing feature of this approach is its capacity to process sequential input data in parallel through an attention mechanism, enabling faster processing and training on larger datasets. The predictive performance of the transformer-based architecture was assessed using monthly rainfall data for India spanning 1980 to 2021, with comparative evaluations against conventional recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) architectures. Experimental findings reveal that the transformer architecture outperforms the other deep learning architectures in terms of root mean square error and mean absolute percentage error, and the accuracy of each architecture's predictions was further tested using the Diebold-Mariano test. The conclusive findings highlight discernible and noteworthy advantages of the transformer-based architecture over the sequential architectures.
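The abstract's central claim is that attention lets a transformer process all time steps of a sequence at once, rather than step by step as an RNN, LSTM, or GRU must. The record does not give the authors' implementation; the following is only an illustrative sketch of single-head scaled dot-product attention in NumPy, where the sequence length, embedding size, and random projections are all stand-in assumptions, showing that every month attends to every other month in a single matrix operation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attend over all time steps at once: one matmul per step of the
    computation, with no step-by-step recurrence over the sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # (T, T) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                 # (T, d_v) context vectors

# Toy stand-in for one year of embedded monthly rainfall: 12 steps, 4 dims
rng = np.random.default_rng(0)
T, d_model = 12, 4
x = rng.normal(size=(T, d_model))

# Single head with random query/key/value projection matrices
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)   # one (T, d_model) pass covers all pairwise interactions
```

By contrast, a recurrent architecture would need a Python-level loop of T dependent steps here, which is the sequential bottleneck the abstract refers to.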
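The abstract reports comparing forecast accuracy with the Diebold-Mariano test. The paper's exact settings (loss function, horizon) are not stated in this record; a minimal sketch of the standard test on squared-error loss differentials, with simulated errors in place of real model output, looks like this:

```python
import math
import numpy as np

def diebold_mariano(e1, e2, h=1):
    """Diebold-Mariano statistic on squared-error loss differentials.
    e1, e2: forecast-error series of two competing models; h: horizon.
    Returns the DM statistic and a two-sided normal p-value."""
    d = e1**2 - e2**2                       # loss differential series
    n = len(d)
    d_bar = d.mean()
    # Long-run variance via autocovariances up to lag h-1 (h=1 -> lag 0 only)
    gamma = [np.mean((d[k:] - d_bar) * (d[:n - k] - d_bar)) for k in range(h)]
    var_d = (gamma[0] + 2 * sum(gamma[1:])) / n
    dm = d_bar / math.sqrt(var_d)
    p = 1 - math.erf(abs(dm) / math.sqrt(2))  # 2 * (1 - Phi(|dm|))
    return dm, p

# Simulated errors: model A is deliberately more accurate than model B
rng = np.random.default_rng(1)
err_a = rng.normal(scale=0.5, size=200)
err_b = rng.normal(scale=1.0, size=200)
dm, p = diebold_mariano(err_a, err_b)
print(f"DM = {dm:.2f}, p = {p:.4f}")        # negative DM favours model A
```

A significantly negative statistic indicates the first model's losses are systematically smaller, which is the form of evidence the abstract cites in favour of the transformer.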
Pages: 3119-3136 (18 pages)