Kernel Attention Network for Single Image Super-Resolution

Cited by: 24
Authors
Zhang, Dongyang [1 ,2 ]
Shao, Jie [1 ,2 ]
Shen, Heng Tao [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Ctr Future Media, Chengdu 611731, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Image super-resolution; kernel attention; receptive field; multi-scale features; CONVOLUTIONAL NETWORK;
DOI
10.1145/3398685
CLC Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Recently, attention mechanisms have shown a developing tendency toward convolutional neural networks (CNNs), and some representative attention mechanisms, e.g., channel attention (CA) and spatial attention (SA), have been widely applied to single image super-resolution (SISR) tasks. However, existing architectures apply these attention mechanisms to SISR directly, with little consideration of the task's intrinsic characteristics, resulting in weaker representational power. In this article, we propose a novel kernel attention module (KAM) for SISR, which enables the network to adjust its receptive field size to match the scale of the input by dynamically selecting the appropriate kernel. Based on this, we stack multiple kernel attention modules with group and residual connections to constitute a novel architecture for SISR, which enables our network to learn more discriminative representations by filtering information under different receptive fields. As a result, our network is more sensitive to multi-scale features, so a single network can handle multi-scale SR tasks by predefining the upscaling modules. In addition, other attention mechanisms used in super-resolution are investigated and illustrated in detail in this article. Thanks to the kernel attention mechanism, extensive benchmark evaluations show that our method outperforms other state-of-the-art methods.
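The abstract does not include pseudocode, but the core idea of selecting among kernels of different receptive fields can be sketched as follows. This is a minimal NumPy illustration of the selection arithmetic only (in the spirit of selective-kernel attention); the function name `kernel_attention` and the use of plain global-average-pooled softmax weights, rather than the paper's learned layers, are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kernel_attention(branches):
    """Fuse feature maps computed with different kernel sizes.

    branches: list of arrays, each of shape (C, H, W), e.g. the outputs
    of 3x3 and 5x5 convolutions over the same input feature map.
    Returns the attention-weighted sum of the branches, shape (C, H, W).
    """
    stacked = np.stack(branches)              # (K, C, H, W)
    # Global average pooling summarizes each branch per channel.
    gap = stacked.mean(axis=(2, 3))           # (K, C)
    # Softmax across the K kernel branches yields per-channel selection
    # weights: the module effectively "chooses" a receptive field.
    weights = softmax(gap, axis=0)            # (K, C)
    return (weights[:, :, None, None] * stacked).sum(axis=0)
```

Because the weights form a convex combination over the branches for each channel, the fused output stays within the range spanned by the branch responses while letting the dominant receptive field contribute most.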
Pages: 15
Related Papers
50 records in total
  • [1] CANS: Combined Attention Network for Single Image Super-Resolution
    Muhammad, Wazir
    Aramvith, Supavadee
    Onoye, Takao
    IEEE ACCESS, 2024, 12 : 167498 - 167517
  • [2] Efficient residual attention network for single image super-resolution
    Hao, Fangwei
    Zhang, Taiping
    Zhao, Linchang
    Tang, Yuanyan
    APPLIED INTELLIGENCE, 2022, 52 (01) : 652 - 661
  • [3] DANS: Deep Attention Network for Single Image Super-Resolution
    Talreja, Jagrati
    Aramvith, Supavadee
    Onoye, Takao
    IEEE ACCESS, 2023, 11 : 84379 - 84397
  • [4] Kernel-attended residual network for single image super-resolution
    Dun, Yujie
    Da, Zongyang
    Yang, Shuai
    Xue, Yao
    Qian, Xueming
    KNOWLEDGE-BASED SYSTEMS, 2021, 213
  • [5] Residual Triplet Attention Network for Single-Image Super-Resolution
    Huang, Feng
    Wang, Zhifeng
    Wu, Jing
    Shen, Ying
    Chen, Liqiong
    ELECTRONICS, 2021, 10 (17)
  • [6] Single-image super-resolution with multilevel residual attention network
    Qin, Ding
    Gu, Xiaodong
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (19) : 15615 - 15628
  • [7] Single Image Super-Resolution Using Deep Hierarchical Attention Network
    Zhao, Fei
    Chen, Rui
    Li, Yuan
    PROCEEDINGS OF 2020 5TH INTERNATIONAL CONFERENCE ON MULTIMEDIA AND IMAGE PROCESSING (ICMIP 2020), 2020, : 80 - 85
  • [8] LKASR: Large kernel attention for lightweight image super-resolution
    Feng, Hao
    Wang, Liejun
    Li, Yongming
    Du, Anyu
    KNOWLEDGE-BASED SYSTEMS, 2022, 252