A Transformer-Based Framework for Parameter Learning of a Land Surface Hydrological Process Model

Times Cited: 1
Authors
Li, Klin [1]
Lu, Yutong [1]
Affiliations
[1] Sun Yat-sen University, School of Computer Science and Engineering, Guangzhou 510006, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
parameter calibration; transformer; SMAP observation; soil moisture prediction; deep learning; MODIS evapotranspiration data; SOIL-MOISTURE; OPTIMIZATION; CALIBRATION; DECADES; WATER
DOI
10.3390/rs15143536
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science]
Discipline Classification Code
08; 0830
Abstract
The effective representation of land surface hydrological models strongly relies on spatially varying parameters that require calibration. Well-calibrated physical models can effectively propagate observed information to unobserved variables, but traditional calibration methods often result in nonunique solutions. In this paper, we propose a hydrological parameter calibration training framework consisting of a transformer-based parameter learning model (ParaFormer) and a surrogate model based on LSTM. On the one hand, ParaFormer utilizes self-attention mechanisms to learn a global mapping from observed data to the parameters to be calibrated, which captures spatial correlations. On the other hand, the surrogate model takes the calibrated parameters as inputs and simulates the observable variables, such as soil moisture, overcoming the challenges of directly combining complex hydrological models with a deep learning (DL) platform in a hybrid training scheme. Using the variable infiltration capacity model as the reference, we test the performance of ParaFormer on datasets of different resolutions. The results demonstrate that, in predicting soil moisture and transferring calibrated parameters in the task of evapotranspiration prediction, ParaFormer learns more effective and robust parameter mapping patterns compared to traditional and state-of-the-art DL-based parameter calibration methods.
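The abstract describes a two-stage architecture: a transformer encoder (ParaFormer) that maps observed sequences to the parameters being calibrated, and an LSTM surrogate of the hydrological model that keeps the whole pipeline differentiable. The following is a minimal sketch of that idea, assuming PyTorch; the module names, layer sizes, tensor shapes, and placeholder data are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (assumptions, not the paper's released code): a transformer
# encoder maps observed sequences at each grid cell to a parameter vector, and
# an LSTM surrogate of the VIC model maps (forcings, parameters) to simulated
# soil moisture. All dimensions below are illustrative placeholders.
import torch
import torch.nn as nn


class ParaFormer(nn.Module):
    """Transformer-based parameter learner: observations -> calibrated parameters."""

    def __init__(self, obs_dim=4, d_model=64, n_heads=4, n_layers=2, n_params=8):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Sigmoid keeps parameters in a normalized [0, 1] range for the surrogate.
        self.head = nn.Sequential(nn.Linear(d_model, n_params), nn.Sigmoid())

    def forward(self, obs):                    # obs: (batch, time, obs_dim)
        h = self.encoder(self.embed(obs))      # self-attention over the sequence
        return self.head(h.mean(dim=1))        # (batch, n_params)


class Surrogate(nn.Module):
    """LSTM surrogate of the physical model: (forcings, parameters) -> soil moisture."""

    def __init__(self, forcing_dim=5, n_params=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(forcing_dim + n_params, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, forcings, params):       # forcings: (batch, time, forcing_dim)
        p = params.unsqueeze(1).expand(-1, forcings.size(1), -1)
        h, _ = self.lstm(torch.cat([forcings, p], dim=-1))
        return self.out(h).squeeze(-1)         # simulated soil moisture (batch, time)


# Hybrid training step: the frozen surrogate makes the parameter learner
# trainable end-to-end against observed soil moisture (e.g., SMAP).
paraformer, surrogate = ParaFormer(), Surrogate()
for p in surrogate.parameters():
    p.requires_grad_(False)                    # surrogate assumed pretrained on VIC runs
optimizer = torch.optim.Adam(paraformer.parameters(), lr=1e-3)

obs = torch.randn(16, 30, 4)                   # placeholder observation sequences
forcings = torch.randn(16, 30, 5)              # placeholder meteorological forcings
sm_obs = torch.rand(16, 30)                    # placeholder observed soil moisture

params = paraformer(obs)
loss = nn.functional.mse_loss(surrogate(forcings, params), sm_obs)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this sketch the surrogate would be pretrained on simulations of the physical model and then frozen, so the soil moisture loss only updates the parameter learner; this stands in for the hybrid training scheme the abstract describes.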
Pages: 18
Related Papers (50 total)
  • [1] A Transformer-based Framework for Multivariate Time Series Representation Learning
    Zerveas, George
    Jayaraman, Srideepika
    Patel, Dhaval
    Bhamidipaty, Anuradha
    Eickhoff, Carsten
    KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 2114 - 2124
  • [2] Transformer-Based Parameter Estimation in Statistics
    Yin, Xiaoxin
    Yin, David S.
    MATHEMATICS, 2024, 12 (07)
  • [3] A transformer-based adversarial network framework for steganography
    Xiao, Chaoen
    Peng, Sirui
    Zhang, Lei
    Wang, Jianxin
    Ding, Ding
    Zhang, Jianyi
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 269
  • [4] A transformer-based deep learning framework to predict employee attrition
    Li, Wenhui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [5] Transformer-based contrastive learning framework for image anomaly detection
    Fan, Wentao
    Shangguan, Weimin
    Chen, Yewang
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (10) : 3413 - 3426
  • [6] A Transformer-Based Framework for Scene Text Recognition
    Selvam, Prabu
    Koilraj, Joseph Abraham Sundar
    Tavera Romero, Carlos Andres
    Alharbi, Meshal
    Mehbodniya, Abolfazl
    Webber, Julian L.
    Sengan, Sudhakar
    IEEE ACCESS, 2022, 10 : 100895 - 100910
  • [7] Transformer-based Encoder-Decoder Model for Surface Defect Detection
    Lu, Xiaofeng
    Fan, Wentao
    6TH INTERNATIONAL CONFERENCE ON INNOVATION IN ARTIFICIAL INTELLIGENCE, ICIAI2022, 2022, : 125 - 130
  • [8] TemproNet: A transformer-based deep learning model for seawater temperature prediction
    Chen, Qiaochuan
    Cai, Candong
    Chen, Yaoran
    Zhou, Xi
    Zhang, Dan
    Peng, Yan
    OCEAN ENGINEERING, 2024, 293
  • [9] Traffic Transformer: Transformer-based framework for temporal traffic accident prediction
    Al-Thani, Mansoor G.
    Sheng, Ziyu
    Cao, Yuting
    Yang, Yin
    AIMS MATHEMATICS, 2024, 9 (05): : 12610 - 12629