A Transformer-Based Framework for Parameter Learning of a Land Surface Hydrological Process Model

Times Cited: 1
Authors
Li, Klin [1 ]
Lu, Yutong [1 ]
Affiliation
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
parameter calibration; transformer; SMAP observation; soil moisture prediction; deep learning; MODIS evapotranspiration data; SOIL-MOISTURE; OPTIMIZATION; CALIBRATION; DECADES; WATER;
DOI
10.3390/rs15143536
Chinese Library Classification
X [Environmental Science, Safety Science];
Discipline Code
08; 0830;
Abstract
The effective representation of land surface hydrological models strongly relies on spatially varying parameters that require calibration. Well-calibrated physical models can effectively propagate observed information to unobserved variables, but traditional calibration methods often yield nonunique solutions. In this paper, we propose a hydrological parameter calibration training framework consisting of a transformer-based parameter learning model (ParaFormer) and an LSTM-based surrogate model. On the one hand, ParaFormer uses self-attention to learn a global mapping from observed data to the parameters to be calibrated, thereby capturing spatial correlations. On the other hand, the surrogate model takes the calibrated parameters as inputs and simulates observable variables such as soil moisture, overcoming the difficulty of directly coupling complex hydrological models with a deep learning (DL) platform in a hybrid training scheme. Using the variable infiltration capacity (VIC) model as the reference, we test the performance of ParaFormer on datasets of different resolutions. The results demonstrate that, in predicting soil moisture and in transferring calibrated parameters to the task of evapotranspiration prediction, ParaFormer learns more effective and robust parameter mappings than traditional and state-of-the-art DL-based calibration methods.
Pages: 18
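
To make the two-stage scheme described in the abstract concrete, below is a minimal sketch, assuming PyTorch: a transformer encoder maps gridded observation features to normalized parameters, and a pre-trained, frozen LSTM surrogate simulates soil moisture from those parameters plus forcing data, so the soil-moisture loss backpropagates into the parameter-learning model. All names (ParaFormerSketch, LSTMSurrogate), layer sizes, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a ParaFormer-style hybrid calibration scheme (assumptions
# throughout; not the paper's code). Requires only PyTorch.
import torch
import torch.nn as nn

class ParaFormerSketch(nn.Module):
    """Transformer encoder mapping per-grid-cell observation features to
    physical parameters in (0, 1); self-attention links grid cells, which is
    how the abstract's 'spatial correlations' could be captured."""
    def __init__(self, n_features: int, n_params: int, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_params)

    def forward(self, obs):                  # obs: (batch, n_grid, n_features)
        h = self.encoder(self.embed(obs))    # attention across grid cells
        return torch.sigmoid(self.head(h))   # normalized parameters per cell

class LSTMSurrogate(nn.Module):
    """Differentiable stand-in for the VIC model: parameters + forcing
    time series -> simulated soil moisture series."""
    def __init__(self, n_params: int, n_forcing: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_params + n_forcing, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, params, forcing):      # params: (B, P); forcing: (B, T, F)
        T = forcing.size(1)
        # Repeat the static parameters along the time axis and join with forcing.
        x = torch.cat([params.unsqueeze(1).expand(-1, T, -1), forcing], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h).squeeze(-1)       # (B, T) simulated soil moisture

# Hybrid training step (sketch): the surrogate is assumed pre-trained and is
# frozen; gradients from the soil-moisture loss flow through it into ParaFormer.
n_grid, n_feat, n_params, n_forcing, T = 16, 8, 5, 4, 30
paraformer = ParaFormerSketch(n_feat, n_params)
surrogate = LSTMSurrogate(n_params, n_forcing)
for p in surrogate.parameters():
    p.requires_grad_(False)                  # freeze surrogate weights only
opt = torch.optim.Adam(paraformer.parameters(), lr=1e-3)

obs = torch.randn(1, n_grid, n_feat)         # placeholder SMAP-derived features
forcing = torch.randn(n_grid, T, n_forcing)  # placeholder meteorological forcing
sm_obs = torch.rand(n_grid, T)               # placeholder observed soil moisture

params = paraformer(obs).squeeze(0)          # (n_grid, n_params)
loss = nn.functional.mse_loss(surrogate(params, forcing), sm_obs)
opt.zero_grad()
loss.backward()
opt.step()
```

Freezing the surrogate and optimizing only the parameter-learning model is one plausible reading of the abstract's hybrid scheme; it lets a cheap, differentiable emulator replace the physical model inside the training loop while the learned parameters remain physically interpretable inputs.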