Kernel Attention Network for Single Image Super-Resolution

Cited by: 24
Authors
Zhang, Dongyang [1 ,2 ]
Shao, Jie [1 ,2 ]
Shen, Heng Tao [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Ctr Future Media, Chengdu 611731, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image super-resolution; kernel attention; receptive field; multi-scale features; CONVOLUTIONAL NETWORK;
DOI
10.1145/3398685
CLC number
TP [Automation & Computer Technology];
Discipline code
0812;
Abstract
Recently, attention mechanisms have been increasingly incorporated into convolutional neural networks (CNNs), and representative attention mechanisms, i.e., channel attention (CA) and spatial attention (SA), have been widely applied to single image super-resolution (SISR) tasks. However, existing architectures apply these attention mechanisms to SISR directly, without much consideration of the task's intrinsic characteristics, resulting in weaker representational power. In this article, we propose a novel kernel attention module (KAM) for SISR, which enables the network to adjust its receptive field size to match various scales of input by dynamically selecting the appropriate kernel. Based on this, we stack multiple kernel attention modules with group and residual connections to constitute a novel architecture for SISR, which enables our network to learn more discriminative representations by filtering information under different receptive fields. As a result, our network is more sensitive to multi-scale features, allowing a single network to handle the multi-scale SR task by predefining the upscaling modules. Besides, other attention mechanisms used in super-resolution are also investigated and illustrated in detail in this article. Thanks to the kernel attention mechanism, extensive benchmark evaluation shows that our method outperforms other state-of-the-art methods.
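The abstract describes the kernel attention idea as dynamically selecting among kernels of different sizes, so the effective receptive field adapts to the input. A minimal pure-Python 1D sketch of this selective-kernel fusion is shown below; the function names, the fixed averaging kernels, and the pooling-based gating are illustrative assumptions, since the paper's module operates on 2D feature maps with learned convolution weights:

```python
import math

def conv1d(signal, kernel):
    """'Same'-padded 1D convolution (correlation) of a list with a kernel."""
    k = len(kernel)
    pad = k // 2
    padded = [0.0] * pad + list(signal) + [0.0] * pad
    return [sum(padded[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal))]

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def kernel_attention(signal, kernel_sizes=(3, 5)):
    """Fuse branches with different kernel sizes via attention weights.

    Each branch filters the input at a different receptive field; a global
    descriptor (mean response, standing in for global average pooling)
    produces softmax weights that select among the branches.
    """
    branches = [conv1d(signal, [1.0 / k] * k) for k in kernel_sizes]
    descriptors = [sum(b) / len(b) for b in branches]
    weights = softmax(descriptors)
    return [sum(w * b[i] for w, b in zip(weights, branches))
            for i in range(len(signal))]
```

In the real module the gating would be computed by small learned layers rather than a raw mean, but the structure is the same: parallel kernels, a global descriptor, and a softmax that decides which receptive field dominates.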
Pages: 15
Related Papers
50 records in total
  • [41] Context Reasoning Attention Network for Image Super-Resolution
    Zhang, Yulun
    Wei, Donglai
    Qin, Can
    Wang, Huan
    Pfister, Hanspeter
    Fu, Yun
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 4258 - 4267
  • [42] Residual shuffle attention network for image super-resolution
    Li, Xuanyi
    Shao, Zhuhong
    Li, Bicao
    Shang, Yuanyuan
    Wu, Jiasong
    Duan, Yuping
    MACHINE VISION AND APPLICATIONS, 2023, 34 (05)
  • [43] Attention mechanism feedback network for image super-resolution
    Chen, Xiao
    Jing, Ruyun
    Sun, Chaowen
    JOURNAL OF ELECTRONIC IMAGING, 2022, 31 (04)
  • [44] Pyramid Attention Dense Network for Image Super-Resolution
    Chen, Si-Bao
    Hu, Chao
    Luo, Bin
    Ding, Chris H. Q.
    Huang, Shi-Lei
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [45] Augmented global attention network for image super-resolution
    Du, Xiaobiao
    Jiang, Saibiao
    Liu, Jie
    IET IMAGE PROCESSING, 2022, 16 (02) : 567 - 575
  • [46] Stratified attention dense network for image super-resolution
    Liu, Zhiwei
    Mao, Xiaofeng
    Huang, Ji
    Gan, Menghan
    Zhang, Yueyuan
    Signal, Image and Video Processing, 2022, 16 : 715 - 722
  • [47] Pixel attention convolutional network for image super-resolution
    Wang, Xin
    Zhang, Shufen
    Lin, Yuanyuan
    Lyu, Yanxia
    Zhang, Jiale
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (11): : 8589 - 8599
  • [48] Image super-resolution with parallel convolution attention network
    Zhang, Qiao
    Yang, Xiaomin
    Xiao, Long
    Yang, Feng
    Hussain, Farhan
    Won Kim, Pyoung
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2021, 33 (22):
  • [50] Edge Attention Network for Image Deblurring and Super-Resolution
    Han, Jong-Wook
    Choi, Jun-Ho
    Kim, Jun-Hyuk
    Lee, Jong-Seok
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 2401 - 2406