EHAT: Enhanced Hybrid Attention Transformer for Remote Sensing Image Super-Resolution

Times Cited: 0
Authors
Wang, Jian [1 ]
Xie, Zexin [1 ]
Du, Yanlin [1 ]
Song, Wei [1 ]
Affiliations
[1] Shanghai Ocean Univ, Coll Informat Technol, Shanghai, Peoples R China
Source
PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT VIII | 2025 / Vol. 15038
Keywords
Vision Transformer; remote sensing; self-attention; super-resolution; nonlocal neural network;
DOI
10.1007/978-981-97-8685-5_16
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, deep learning (DL)-based super-resolution techniques for remote sensing images have made significant progress. However, existing models struggle to effectively capture long-range non-local information and reuse features, and they also suffer from vanishing and exploding gradients. To overcome these challenges, we propose the Enhanced Hybrid Attention Transformer (EHAT) framework, which builds on the Hybrid Attention Transformer (HAT) backbone and combines a region-level nonlocal neural network block with a skip fusion network (SFN) to form a new skip fusion attention group (SFAG). In addition, we form a Multi-Attention Block (MAB) by introducing a spatial-frequency block (SFB) based on fast Fourier convolution. We conducted extensive experiments on the UC Merced, CLRS, and RSSCN7 datasets. The results show that our method improves PSNR by about 0.2 dB on UC Merced at the ×4 scale.
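The abstract gives no implementation details for the spatial-frequency block (SFB); the following PyTorch sketch only illustrates the general idea it names, i.e., pairing a spatial convolution branch with a fast-Fourier-convolution branch. All module and parameter names (FourierUnit, SpatialFrequencyBlock, channels) are assumptions for illustration, not the authors' code.

# Minimal, hypothetical sketch of a spatial-frequency block built on fast
# Fourier convolution. Names and layer choices are assumptions, not EHAT's
# actual implementation.
import torch
import torch.nn as nn


class FourierUnit(nn.Module):
    """Convolve in the frequency domain: rFFT -> 1x1 conv on (real, imag) -> inverse rFFT."""

    def __init__(self, channels: int):
        super().__init__()
        # Real and imaginary parts are stacked along the channel axis.
        self.conv = nn.Conv2d(2 * channels, 2 * channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        spec = torch.fft.rfft2(x, norm="ortho")            # (b, c, h, w//2+1), complex
        spec = torch.cat([spec.real, spec.imag], dim=1)    # (b, 2c, h, w//2+1)
        spec = self.act(self.conv(spec))
        real, imag = spec.chunk(2, dim=1)
        spec = torch.complex(real, imag)
        return torch.fft.irfft2(spec, s=(h, w), norm="ortho")


class SpatialFrequencyBlock(nn.Module):
    """Parallel spatial (3x3 conv) and frequency (FourierUnit) branches, fused with a residual."""

    def __init__(self, channels: int):
        super().__init__()
        self.spatial = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.frequency = FourierUnit(channels)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([self.spatial(x), self.frequency(x)], dim=1)
        return x + self.fuse(out)


if __name__ == "__main__":
    x = torch.randn(1, 64, 48, 48)           # a low-resolution feature map
    y = SpatialFrequencyBlock(64)(x)
    print(y.shape)                           # torch.Size([1, 64, 48, 48])

Because the FFT mixes information from every spatial position, the frequency branch gives each output pixel a global receptive field, which is the usual motivation for fast Fourier convolution alongside local spatial convolutions.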
Pages: 225-237
Page count: 13
Related Papers
50 records in total
  • [11] Scale-Aware Backprojection Transformer for Single Remote Sensing Image Super-Resolution
    Hao, Jinglei
    Li, Wukai
    Lu, Yuting
    Jin, Yang
    Zhao, Yongqiang
    Wang, Shunzhou
    Wang, Binglu
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [12] Global sparse attention network for remote sensing image super-resolution
    Hu, Tao
    Chen, Zijie
    Wang, Mingyi
    Hou, Xintong
    Lu, Xiaoping
    Pan, Yuanyuan
    Li, Jianqing
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [13] Remote Sensing Image Super-Resolution With Residual Split Attention Mechanism
    Chen, Xitong
    Wu, Yuntao
    Lu, Tao
    Kong, Quan
    Wang, Jiaming
    Wang, Yu
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2023, 16 : 1 - 13
  • [14] Transferred Multi-Perception Attention Networks for Remote Sensing Image Super-Resolution
    Dong, Xiaoyu
    Xi, Zhihong
    Sun, Xu
    Gao, Lianru
    REMOTE SENSING, 2019, 11 (23)
  • [15] Remote Sensing Image Super-resolution: Challenges and Approaches
    Yang, Daiqin
    Li, Zimeng
    Xia, Yatong
    Chen, Zhenzhong
    2015 IEEE INTERNATIONAL CONFERENCE ON DIGITAL SIGNAL PROCESSING (DSP), 2015, : 196 - 200
  • [16] Parallel attention recursive generalization transformer for image super-resolution
    Wang, Jing
    Hao, Yuanyuan
    Bai, Hongxing
    Yan, Lingyu
    Scientific Reports, 15 (1)
  • [17] ESTUGAN: Enhanced Swin Transformer with U-Net Discriminator for Remote Sensing Image Super-Resolution
    Yu, Chunhe
    Hong, Lingyue
    Pan, Tianpeng
    Li, Yufeng
    Li, Tingting
    ELECTRONICS, 2023, 12 (20)
  • [18] Efficient Dual Attention Transformer for Image Super-Resolution
    Park, Soobin
    Jeong, Yuna
    Choi, Yong Suk
    39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024, : 963 - 970
  • [19] Remote Sensing Image Super-Resolution Based on Dense Channel Attention Network
    Ma, Yunchuan
    Lv, Pengyuan
    Liu, Hao
    Sun, Xuehong
    Zhong, Yanfei
    REMOTE SENSING, 2021, 13 (15)
  • [20] REMOTE SENSING IMAGE SUPER-RESOLUTION VIA ENHANCED BACK-PROJECTION NETWORKS
    Dong, Xiaoyu
    Xi, Zhihong
    Sun, Xu
    Yang, Lina
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 1480 - 1483