Layered Media Parameter Inversion Method Based on Deconvolution Autoencoder and Self-Attention Mechanism Using GPR Data

Cited by: 2
Authors
Yang, Xiaopeng [1 ,2 ,3 ]
Sun, Haoran [1 ,2 ,3 ]
Guo, Conglong [1 ,2 ,3 ]
Li, Yixuan [1 ,2 ,3 ]
Gong, Junbo [3 ]
Qu, Xiaodong [1 ,2 ,3 ]
Lan, Tian [1 ,2 ,3 ]
Affiliations
[1] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
[2] Minist Educ, Key Lab Elect & Informat Technol Satellite Nav, Beijing 100081, Peoples R China
[3] Beijing Inst Technol, Chongqing Innovat Ctr, Chongqing 401120, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2024, Vol. 62
Funding
National Natural Science Foundation of China
Keywords
Nonhomogeneous media; Reflection; Deconvolution; Reflection coefficient; Signal resolution; Estimation; Deep learning; Deconvolution autoencoder; ground-penetrating radar (GPR); layered media; parameter inversion; self-attention mechanism; GROUND-PENETRATING RADAR; QUALITY-CONTROL; THICKNESSES;
DOI
10.1109/TGRS.2024.3351894
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry]
Discipline Code
0708; 070902
Abstract
Layered medium parameter inversion is a crucial technique in ground-penetrating radar (GPR) data processing and is widely applied in civil engineering and geological exploration. To address the high computational complexity and low accuracy of existing methods, a novel layered medium parameter inversion approach is proposed, comprising a deconvolution autoencoder and a parameter inversion network. First, the deconvolution autoencoder is introduced to recover the pulse response of the layered medium system in an unsupervised manner, which improves the computational efficiency of deconvolution and decouples the data acquisition system from the supervised model. Subsequently, a parameter inversion network, consisting of a self-attention module and a residual multilayer perceptron (MLP), is proposed to address the challenge posed by the excessively sparse pulse response. The self-attention module computes the autocorrelation of the pulse sequence, providing time-delay information between pulses and reducing the sparsity of the pulse response to facilitate feature extraction. Meanwhile, the residual MLP, characterized by low information loss and adaptability to different output dimensions, is employed for model-based and pixel-based inversion in cases with and without prior knowledge of the layer number, respectively. Finally, simulated and measured datasets are constructed to comprehensively train and evaluate the proposed method. The results demonstrate that the proposed method achieves superior inversion accuracy, computational efficiency, robustness, generalization capability, and noise resistance. In addition, it remains applicable even when the layer number is not known a priori.
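To make the inversion stage described in the abstract more concrete, the sketch below wires a single-head self-attention block over the (deconvolved) sparse pulse response into a residual MLP regression head. This is a minimal sketch under assumed settings, not the authors' implementation: the sequence length (512), embedding width (64), hidden size (256), and an output of permittivity and thickness for three layers are illustrative choices, and torch.nn.MultiheadAttention stands in for the paper's self-attention module.

```python
# Hypothetical sketch of the parameter-inversion network: self-attention over the
# deconvolved pulse response, then a residual MLP that regresses layer parameters.
# All sizes and the (permittivity, thickness)-per-layer output are assumptions.
import torch
import torch.nn as nn


class SelfAttentionBlock(nn.Module):
    """Single-head self-attention over the pulse-response sequence; the attention
    map acts like an autocorrelation matrix, exposing time delays between pulses
    and densifying the sparse input before feature extraction."""

    def __init__(self, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(1, d_model)     # lift each time sample to d_model features
        self.attn = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len) sparse pulse response from the deconvolution stage
        h = self.embed(x.unsqueeze(-1))        # (batch, seq_len, d_model)
        a, _ = self.attn(h, h, h)              # query = key = value (self-attention)
        return self.norm(h + a)                # residual connection


class ResidualMLP(nn.Module):
    """Residual MLP head; out_dim is set by the inversion mode (model-based:
    per-layer parameters; pixel-based: a discretized permittivity profile)."""

    def __init__(self, in_dim: int, hidden: int = 256, out_dim: int = 6):
        super().__init__()
        self.inp = nn.Linear(in_dim, hidden)
        self.block = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                   nn.Linear(hidden, hidden))
        self.out = nn.Linear(hidden, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.inp(x))
        h = torch.relu(h + self.block(h))      # skip connection limits information loss
        return self.out(h)


class InversionNet(nn.Module):
    def __init__(self, seq_len: int = 512, d_model: int = 64, out_dim: int = 6):
        super().__init__()
        self.attn = SelfAttentionBlock(d_model)
        self.head = ResidualMLP(seq_len * d_model, out_dim=out_dim)

    def forward(self, pulse_response: torch.Tensor) -> torch.Tensor:
        feat = self.attn(pulse_response)       # (batch, seq_len, d_model)
        return self.head(feat.flatten(1))      # e.g. 3 layers x (permittivity, thickness)


if __name__ == "__main__":
    net = InversionNet(seq_len=512, out_dim=6)
    estimates = net(torch.randn(4, 512))       # batch of 4 deconvolved traces
    print(estimates.shape)                     # torch.Size([4, 6])
```

Flattening the attended features before the MLP is only one plausible aggregation; pooling over the time axis would be an equally reasonable choice, and the abstract does not specify either.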
Pages: 1-14