Additional Self-Attention Transformer With Adapter for Thick Haze Removal

Cited by: 3
Authors
Cai, Zhenyang [1 ]
Ning, Jin [1 ]
Ding, Zhiheng [1 ]
Duo, Bin [1 ]
Affiliation
[1] Chengdu Univ Technol, Coll Comp Sci & Cyber Secur, Chengdu 610059, Peoples R China
Keywords
Image dehazing; remote sensing image (RSI); thick haze; transformer
DOI
10.1109/LGRS.2024.3368430
CLC classification
P3 (Geophysics); P59 (Geochemistry)
Discipline codes
0708; 070902
Abstract
Remote sensing images (RSIs) are widely used in geological resource monitoring, earthquake relief, and weather forecasting, but haze cover can easily render them unusable. Transformer-based image dehazing models can remove haze from RSIs and improve their clarity. However, because such models extract detailed information insufficiently, they perform poorly under thick haze. To solve this problem, this letter introduces an additional self-attention (AS) mechanism into an existing Transformer-based image dehazing model to help it acquire more detailed information, and an adapter module to improve the model's capacity to fit the newly added content. Experimental results on benchmark RSIs indicate that the proposed method yields an average improvement of 0.95 dB in peak signal-to-noise ratio (PSNR) and 0.6% in structural similarity index measure (SSIM) for light haze removal. Notably, the method achieves a gain of 1.34 dB in PSNR and 1.9% in SSIM for thick haze removal, underscoring its advantage in heavy haze conditions. The source code can be accessed via https://github.com/Eric3200C/ASTA.
Pages: 1-5
Page count: 5
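The abstract outlines two architectural additions: an extra self-attention branch for recovering fine detail and an adapter module for fitting the newly added content. The following is a minimal PyTorch sketch of how such components are commonly structured; all class names, dimensions, and the placement of the branches are assumptions for illustration, not the authors' implementation, which is available at https://github.com/Eric3200C/ASTA.

    import torch
    import torch.nn as nn

    class Adapter(nn.Module):
        # Hypothetical bottleneck adapter: down-project, nonlinearity,
        # up-project, residual connection.
        def __init__(self, dim, bottleneck=16):
            super().__init__()
            self.down = nn.Linear(dim, bottleneck)
            self.act = nn.GELU()
            self.up = nn.Linear(bottleneck, dim)

        def forward(self, x):
            return x + self.up(self.act(self.down(x)))

    class ASBlock(nn.Module):
        # Transformer block with an additional self-attention (AS) branch
        # fused into the usual attention path, followed by an adapter.
        def __init__(self, dim, heads=4):
            super().__init__()
            self.norm1 = nn.LayerNorm(dim)
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.extra_attn = nn.MultiheadAttention(dim, heads, batch_first=True)  # AS branch
            self.norm2 = nn.LayerNorm(dim)
            self.mlp = nn.Sequential(
                nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            self.adapter = Adapter(dim)

        def forward(self, x):                  # x: (batch, tokens, dim)
            h = self.norm1(x)
            a, _ = self.attn(h, h, h)          # standard self-attention
            e, _ = self.extra_attn(h, h, h)    # additional self-attention output
            x = x + a + e                      # fuse both attention branches
            x = x + self.mlp(self.norm2(x))    # feed-forward sub-layer
            return self.adapter(x)             # adapter refines the block output

    # Usage: 64 patch tokens of width 96, e.g. from a hazy RSI feature map.
    tokens = torch.randn(1, 64, 96)
    print(ASBlock(96)(tokens).shape)           # torch.Size([1, 64, 96])

The adapter's residual bottleneck design keeps the number of newly trained parameters small relative to the frozen backbone, which matches the abstract's framing of the adapter as a fitting aid for newly added content.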
Related papers
50 records in total (items [41]-[50] shown below)
  • [41] EEG-Transformer: Self-attention from Transformer Architecture for Decoding EEG of Imagined Speech
    Lee, Young-Eun
    Lee, Seo-Hyun
    10TH INTERNATIONAL WINTER CONFERENCE ON BRAIN-COMPUTER INTERFACE (BCI2022), 2022,
  • [42] Attention to Emotions: Body Emotion Recognition In-the-Wild Using Self-attention Transformer Network
    Paiva, Pedro V. V.
    Ramos, Josue J. G.
    Gavrilova, Marina
    Carvalho, Marco A. G.
COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS, VISIGRAPP 2023, 2024, 2103: 206-228
  • [43] SHYNESS AND SELF-ATTENTION
    CROZIER, WR
BULLETIN OF THE BRITISH PSYCHOLOGICAL SOCIETY, 1983, 36 (FEB): A5-A5
  • [44] Momentum Transformer: Closing the Performance Gap Between Self-attention and Its Linearization
    Tan Nguyen
    Baraniuk, Richard G.
    Kirby, Robert M.
    Osher, Stanley J.
    Wang, Bao
    MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, VOL 190, 2022, 190
  • [45] Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation
    Raganato, Alessandro
    Scherrer, Yves
Tiedemann, Jörg
FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 556-568
  • [46] X-Transformer: A Machine Translation Model Enhanced by the Self-Attention Mechanism
    Liu, Huey-Ing
    Chen, Wei-Lin
APPLIED SCIENCES-BASEL, 2022, 12 (09)
  • [47] Self-attention transformer model for pan evaporation prediction: a case study in Australia
    Abed, Mustafa
    Imteaz, Monzur Alam
    Huang, Yuk Feng
    Ahmed, Ali Najah
JOURNAL OF HYDROINFORMATICS, 2024, 26 (10): 2538-2556
  • [48] In-Memory Transformer Self-Attention Mechanism Using Passive Memristor Crossbar
    Cai, Jack
    Kaleem, Muhammad Ahsan
    Genov, Roman
    Azghadi, Mostafa Rahimi
    Amirsoleimani, Amirali
    2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,
  • [49] An Effective Video Transformer With Synchronized Spatiotemporal and Spatial Self-Attention for Action Recognition
    Alfasly, Saghir
    Chui, Charles K.
    Jiang, Qingtang
    Lu, Jian
    Xu, Chen
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02): 2496-2509
  • [50] Unveiling the Power of Self-Attention for Shipping Cost Prediction: The Rate Card Transformer
    Sreekar, P. Aditya
    Verma, Sahil
    Madhavan, Varun
    Persad, Abhishek
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 222, 2023, 222