Layered Media Parameter Inversion Method Based on Deconvolution Autoencoder and Self-Attention Mechanism Using GPR Data

Cited: 2
Authors
Yang, Xiaopeng [1 ,2 ,3 ]
Sun, Haoran [1 ,2 ,3 ]
Guo, Conglong [1 ,2 ,3 ]
Li, Yixuan [1 ,2 ,3 ]
Gong, Junbo [3 ]
Qu, Xiaodong [1 ,2 ,3 ]
Lan, Tian [1 ,2 ,3 ]
Affiliations
[1] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
[2] Minist Educ, Key Lab Elect & Informat Technol Satellite Nav, Beijing 100081, Peoples R China
[3] Beijing Inst Technol, Chongqing Innovat Ctr, Chongqing 401120, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2024, Vol. 62
Funding
National Natural Science Foundation of China;
Keywords
Nonhomogeneous media; Reflection; Deconvolution; Reflection coefficient; Signal resolution; Estimation; Deep learning; Deconvolution autoencoder; ground-penetrating radar (GPR); layered media; parameters inversion; self-attention mechanism; GROUND-PENETRATING RADAR; QUALITY-CONTROL; THICKNESSES;
DOI
10.1109/TGRS.2024.3351894
CLC Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline Codes
0708; 070902;
Abstract
Layered medium parameter inversion is a crucial technique in ground-penetrating radar (GPR) data processing, with wide application in civil engineering and geological exploration. To address the high computational complexity and low accuracy of existing methods, a novel layered medium parameter inversion approach is proposed, comprising a deconvolution autoencoder and a parameter inversion network. First, the deconvolution autoencoder is introduced to solve for the pulse response of the layered medium system in an unsupervised manner, which improves the computational efficiency of deconvolution and decouples the data acquisition system from the supervised model. Subsequently, a parameter inversion network, consisting of a self-attention module and a residual multilayer perceptron (MLP), is proposed to address the challenge posed by excessively sparse pulse responses. The self-attention module computes the autocorrelation of the pulse sequence, providing temporal delay information between pulses and reducing the sparsity of the pulse response to facilitate feature extraction. Meanwhile, the residual MLP, which has low information loss and adapts readily to different output dimensions, is employed for model-based inversion when the number of layers is known a priori and for pixel-based inversion when it is not. Finally, simulated and measured datasets are constructed to comprehensively train and evaluate the proposed method. The results demonstrate that the proposed method achieves better inversion accuracy, computational efficiency, robustness, generalization capability, and noise resistance. In addition, it remains applicable even without prior knowledge of the number of layers.
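The abstract's self-attention plus residual-MLP pipeline admits a compact illustration. The PyTorch sketch below is a minimal, assumption-laden reconstruction, not the authors' implementation: single-head attention is assumed, and all class names (SelfAttention, ResidualMLPBlock, InversionNet), layer sizes, and output dimensions are hypothetical placeholders.

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Single-head self-attention over a sparse pulse-response sequence.

    The attention map (q @ k^T) encodes pairwise correlation between time
    samples, standing in for the pulse-sequence autocorrelation and
    inter-pulse delay information described in the abstract.
    """
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x):  # x: (batch, seq_len, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v    # densified features

class ResidualMLPBlock(nn.Module):
    """MLP block with a skip connection to limit information loss."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                                 nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.net(x)

class InversionNet(nn.Module):
    """Self-attention front end followed by a residual-MLP head.

    out_dim would be a fixed parameter vector for model-based inversion
    (layer count known) or a per-sample grid for pixel-based inversion
    (layer count unknown); both dimensions here are illustrative.
    """
    def __init__(self, seq_len, dim=64, depth=4, out_dim=6):
        super().__init__()
        self.embed = nn.Linear(1, dim)
        self.attn = SelfAttention(dim)
        self.blocks = nn.Sequential(*[ResidualMLPBlock(dim)
                                      for _ in range(depth)])
        self.head = nn.Linear(seq_len * dim, out_dim)

    def forward(self, x):  # x: (batch, seq_len) pulse response
        h = self.embed(x.unsqueeze(-1))
        h = self.attn(h)
        h = self.blocks(h)
        return self.head(h.flatten(1))

# Usage: invert a batch of 512-sample pulse responses into, e.g., three
# permittivities and three thicknesses (dimensions chosen for illustration).
net = InversionNet(seq_len=512, out_dim=6)
params = net(torch.randn(8, 512))  # -> (8, 6)
```

In this sketch, swapping the linear head for one whose output dimension matches a per-pixel grid would correspond to the pixel-based variant used when the layer number is unknown; the residual connections keep the mapping close to identity, which is one plausible reading of the "low information loss" property the abstract attributes to the residual MLP.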
Pages: 1 - 14 (14 pages)