Kernel Attention Network for Single Image Super-Resolution

Cited by: 24
Authors
Zhang, Dongyang [1 ,2 ]
Shao, Jie [1 ,2 ]
Shen, Heng Tao [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Ctr Future Media, Chengdu 611731, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image super-resolution; kernel attention; receptive field; multi-scale features; convolutional network;
DOI
10.1145/3398685
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Recently, attention mechanisms have been increasingly incorporated into convolutional neural networks (CNNs), and representative mechanisms such as channel attention (CA) and spatial attention (SA) have been widely applied to single image super-resolution (SISR). However, existing architectures apply these attention mechanisms to SISR directly, with little consideration of the intrinsic characteristics of the task, which limits their representational power. In this article, we propose a novel kernel attention module (KAM) for SISR, which enables the network to adjust its receptive field size to the scale of the input by dynamically selecting the appropriate kernel. Building on this module, we stack multiple kernel attention modules with group and residual connections to form a novel SISR architecture, which learns more discriminative representations by filtering information under different receptive fields. The resulting network is therefore more sensitive to multi-scale features, allowing a single network to handle the multi-scale SR task by predefining the upscaling modules. In addition, other attention mechanisms used in super-resolution are investigated and illustrated in detail in this article. Thanks to the kernel attention mechanism, extensive benchmark evaluations show that our method outperforms other state-of-the-art methods.
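The record gives no implementation details, but the kernel attention idea described above (dynamically weighting parallel branches with different receptive fields using a globally pooled descriptor) is close in spirit to selective-kernel attention. The following is a minimal PyTorch-style sketch of such a module under that assumption; the two branches (a 3x3 convolution and a dilated 3x3 convolution acting as an effective 5x5 kernel), the reduction ratio, and all names are illustrative and not the authors' exact design.

import torch
import torch.nn as nn

class KernelAttention(nn.Module):
    """Sketch of a kernel-attention block: two parallel convolutions with
    different receptive fields, fused by per-branch attention weights
    predicted from a global descriptor (selective-kernel style)."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Parallel branches with different effective receptive fields.
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, 3, padding=2, dilation=2)
        self.pool = nn.AdaptiveAvgPool2d(1)
        hidden = max(channels // reduction, 4)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 2 * channels),  # one weight vector per branch
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        # Stack branch responses: (b, 2, c, h, w).
        feats = torch.stack([self.branch3(x), self.branch5(x)], dim=1)
        # Global descriptor from the fused response guides kernel selection.
        desc = self.pool(feats.sum(dim=1)).view(b, c)
        attn = self.fc(desc).view(b, 2, c, 1, 1)
        attn = torch.softmax(attn, dim=1)  # branches compete per channel
        out = (feats * attn).sum(dim=1)
        return out + x  # residual connection

if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    print(KernelAttention(64)(x).shape)  # torch.Size([1, 64, 32, 32])

The module preserves spatial resolution, so blocks of this kind can be stacked with group and residual connections and followed by predefined upscaling modules, as the abstract describes.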
Pages: 15