Layered Media Parameter Inversion Method Based on Deconvolution Autoencoder and Self-Attention Mechanism Using GPR Data

Cited: 2
Authors
Yang, Xiaopeng [1 ,2 ,3 ]
Sun, Haoran [1 ,2 ,3 ]
Guo, Conglong [1 ,2 ,3 ]
Li, Yixuan [1 ,2 ,3 ]
Gong, Junbo [3 ]
Qu, Xiaodong [1 ,2 ,3 ]
Lan, Tian [1 ,2 ,3 ]
Affiliations
[1] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
[2] Minist Educ, Key Lab Elect & Informat Technol Satellite Nav, Beijing 100081, Peoples R China
[3] Beijing Inst Technol, Chongqing Innovat Ctr, Chongqing 401120, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2024, Vol. 62
Funding
National Natural Science Foundation of China;
Keywords
Nonhomogeneous media; Reflection; Deconvolution; Reflection coefficient; Signal resolution; Estimation; Deep learning; Deconvolution autoencoder; ground-penetrating radar (GPR); layered media; parameters inversion; self-attention mechanism; GROUND-PENETRATING RADAR; QUALITY-CONTROL; THICKNESSES;
DOI
10.1109/TGRS.2024.3351894
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Discipline Classification Code
0708; 070902;
Abstract
Layered medium parameter inversion is a crucial technique in ground-penetrating radar (GPR) data processing and has wide application in civil engineering and geological exploration. To address the high computational complexity and low accuracy of existing methods, a novel layered medium parameter inversion approach is proposed, comprising a deconvolution autoencoder and a parameter inversion network. First, the deconvolution autoencoder is introduced to solve for the pulse response of the layered medium system in an unsupervised manner, which improves the computational efficiency of deconvolution and decouples the data acquisition system from the supervised model. Subsequently, a parameter inversion network, consisting of a self-attention module and a residual multilayer perceptron (MLP), is proposed to address the challenge posed by excessively sparse pulse responses. The self-attention module calculates the autocorrelation of the pulse sequence, providing temporal delay information between pulses and reducing the sparsity of the pulse response to facilitate feature extraction. Meanwhile, the residual MLP, characterized by low information loss and adaptability to different output dimensions, is employed for model-based and pixel-based inversion in cases with and without prior knowledge of the layer number, respectively. Finally, simulated and measured datasets are constructed to comprehensively train and evaluate the proposed method. The results demonstrate that the proposed method achieves superior inversion accuracy, computational efficiency, robustness, generalization capability, and noise resistance. In addition, it remains applicable even when the number of layers is unknown a priori.
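
Below is a minimal, hypothetical sketch of the parameter inversion network outlined in the abstract, written assuming PyTorch. It is not the authors' released code: the layer widths (d_model = 64, hidden dimension 256), the pulse-response length (512 samples), the number of residual blocks, and the output dimension are all illustrative assumptions. The sketch only shows the stated structure: a self-attention module acting as a learned autocorrelation over the deconvolved pulse response, followed by a residual MLP head whose output dimension is chosen for model-based or pixel-based inversion.

import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Single-head self-attention over the pulse-response sequence.

    The attention matrix softmax(QK^T / sqrt(d)) plays the role of a learned
    autocorrelation of the sparse pulse train, exposing inter-pulse delays.
    """

    def __init__(self, seq_len: int, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(1, d_model)   # lift each time sample to d_model features
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5
        self.out = nn.Linear(seq_len * d_model, 256)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len) deconvolved pulse response
        h = self.embed(x.unsqueeze(-1))                                           # (B, T, d)
        attn = torch.softmax(self.q(h) @ self.k(h).transpose(1, 2) * self.scale, dim=-1)
        h = attn @ self.v(h)                                                      # (B, T, d)
        return self.out(h.flatten(1))                                             # (B, 256)


class ResidualMLPBlock(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.net(x)   # skip connection limits information loss


class InversionNet(nn.Module):
    """Self-attention front end + residual MLP head.

    For model-based inversion, out_dim could be 2 * n_layers (e.g., permittivity
    and thickness per layer); for pixel-based inversion, out_dim would equal the
    size of the depth grid. Both choices here are assumptions for illustration.
    """

    def __init__(self, seq_len: int = 512, out_dim: int = 6, n_blocks: int = 3):
        super().__init__()
        self.attn = SelfAttention(seq_len)
        self.blocks = nn.Sequential(*[ResidualMLPBlock() for _ in range(n_blocks)])
        self.head = nn.Linear(256, out_dim)

    def forward(self, x):
        return self.head(self.blocks(self.attn(x)))


if __name__ == "__main__":
    net = InversionNet(seq_len=512, out_dim=6)   # e.g., three layers, model-based output
    pulse = torch.randn(4, 512)                  # batch of deconvolved pulse responses
    print(net(pulse).shape)                      # torch.Size([4, 6])

In this sketch, switching between model-based and pixel-based inversion only changes out_dim in the final linear layer, which mirrors the adaptability to different output dimensions attributed to the residual MLP in the abstract.
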
Pages: 1-14
Page count: 14