Digital core image reconstruction based on residual self-attention generative adversarial networks

Cited by: 0
Authors
Lei He
Fuping Gui
Min Hu
Daolun Li
Wenshu Zha
Jieqing Tan
Affiliations
[1] Hefei University of Technology,
Source
Computational Geosciences | 2023, Vol. 27
Keywords
Reconstruction; Digital core image; Self-attention mechanism; Residual; Generative adversarial networks;
DOI
Not available
Abstract
Accurate physical analysis of digital cores requires high-quality digital core images, so reconstructing such images has become a pressing problem. In this paper, a digital core image reconstruction method based on residual self-attention generative adversarial networks is proposed. In the reconstruction process, traditional generative adversarial networks (GANs) generate high-resolution detail features only from spatially local points in the low-resolution feature maps, and long-range dependencies can be captured only by stacking multiple convolution operations. In view of this, a residual self-attention block is introduced into the traditional GAN, which strengthens the correlation learning between features and extracts more features. To assess the quality of the generated shale images, the Fréchet Inception Distance (FID) and Kernel Inception Distance (KID) are used to evaluate the consistency of the Gaussian feature distributions of reconstructed and original shale images, and the two-point covariance function is used to evaluate their structural similarity. Extensive experiments show that the shale images reconstructed by the proposed method are closer to the original images and achieve better results than those of state-of-the-art methods.
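The key architectural idea described in the abstract is a self-attention block wrapped in a residual connection, inserted into the GAN so that each spatial position can attend to all other positions instead of relying on stacked convolutions to reach distant pixels. The sketch below follows the common SAGAN-style formulation of such a block; the layer names, the channel-reduction factor of 8, and the zero-initialized residual scale `gamma` are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a residual self-attention block (SAGAN-style), assuming
# PyTorch; configuration details are illustrative, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualSelfAttention(nn.Module):
    """Self-attention over spatial positions with a residual connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # gamma starts at 0 so training begins from the plain convolutional path.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)   # (b, n, c//8)
        k = self.key(x).view(b, -1, n)                       # (b, c//8, n)
        attn = F.softmax(torch.bmm(q, k), dim=-1)            # (b, n, n)
        v = self.value(x).view(b, c, n)                      # (b, c, n)
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        # Residual connection: long-range context is added on top of the input.
        return x + self.gamma * out
```

For the structural evaluation mentioned in the abstract, the two-point covariance (two-point probability) function S2(r) of a segmented core image gives the probability that two pixels a distance r apart both lie in the pore phase; comparing the S2 curves of reconstructed and original images indicates how well the pore-space structure is preserved. A minimal estimate along the horizontal axis, assuming the convention that pixel value 1 marks pore, might look like:

```python
# Minimal sketch of the two-point probability function S2(r) for a binary
# image, averaged along the horizontal axis only (an assumed simplification).
import numpy as np


def two_point_probability(img: np.ndarray, max_lag: int) -> np.ndarray:
    """S2(r): probability that two pixels a distance r apart are both pore."""
    img = (img > 0).astype(np.float64)
    s2 = np.empty(max_lag + 1)
    for r in range(max_lag + 1):
        if r == 0:
            s2[r] = img.mean()                      # S2(0) equals the porosity
        else:
            s2[r] = (img[:, :-r] * img[:, r:]).mean()
    return s2
```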
Pages: 499-514
Number of pages: 15
Related papers
50 records
  • [1] Digital core image reconstruction based on residual self-attention generative adversarial networks
    He, Lei
    Gui, Fuping
    Hu, Min
    Li, Daolun
    Zha, Wenshu
    Tan, Jieqing
    COMPUTATIONAL GEOSCIENCES, 2023, 27 (03) : 499 - 514
  • [2] A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting
    Mao, Yuanxin
    Zhang, Tianzhuang
    Fu, Bo
    Thanh, Dang N. H.
    PATTERN RECOGNITION AND IMAGE ANALYSIS, 2022, 32 (03) : 591 - 599
  • [3] A Self-Attention Based Wasserstein Generative Adversarial Networks for Single Image Inpainting
    Yuanxin Mao
    Tianzhuang Zhang
    Bo Fu
    Dang N. H. Thanh
    Pattern Recognition and Image Analysis, 2022, 32 : 591 - 599
  • [4] Self-attention and generative adversarial networks for algae monitoring
    Nhut Hai Huynh
    Boer, Gordon
    Schramm, Hauke
    EUROPEAN JOURNAL OF REMOTE SENSING, 2022, 55 (01) : 10 - 22
  • [5] PEGANs: Phased Evolutionary Generative Adversarial Networks with Self-Attention Module
    Xue, Yu
    Tong, Weinan
    Neri, Ferrante
    Zhang, Yixia
    MATHEMATICS, 2022, 10 (15)
  • [6] Structural dynamic response reconstruction using self-attention enhanced generative adversarial networks
    Fan, Gao
    He, Zhengyan
    Li, Jun
    ENGINEERING STRUCTURES, 2023, 276
  • [7] Shale Digital Core Image Generation Based on Generative Adversarial Networks
    Zha, Wenshu
    Li, Xingbao
    Li, Daolun
    Xing, Yan
    He, Lei
    Tan, Jieqing
    JOURNAL OF ENERGY RESOURCES TECHNOLOGY-TRANSACTIONS OF THE ASME, 2021, 143 (03):
  • [8] Generative Adversarial Networks for Abnormal Event Detection in Videos Based on Self-Attention Mechanism
    Zhang, Weichao
    Wang, Guanjun
    Huang, Mengxing
    Wang, Hongyu
    Wen, Shaoping
    IEEE ACCESS, 2021, 9 : 124847 - 124860
  • [9] Self-attention generative adversarial networks applied to conditional music generation
    Pedro Lucas Tomaz Neves
    José Fornari
    João Batista Florindo
    Multimedia Tools and Applications, 2022, 81 : 24419 - 24430
  • [10] Self-attention generative adversarial networks applied to conditional music generation
    Tomaz Neves, Pedro Lucas
    Fornari, Jose
    Florindo, Joao Batista
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (17) : 24419 - 24430