Kernel Attention Network for Single Image Super-Resolution

Cited: 24
Authors
Zhang, Dongyang [1 ,2 ]
Shao, Jie [1 ,2 ]
Shen, Heng Tao [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Ctr Future Media, Chengdu 611731, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image super-resolution; kernel attention; receptive field; multi-scale features; convolutional network
DOI
10.1145/3398685
Chinese Library Classification
TP [automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
Recently, attention mechanisms have been increasingly incorporated into convolutional neural networks (CNNs), and representative mechanisms such as channel attention (CA) and spatial attention (SA) have been widely applied to single image super-resolution (SISR). However, existing architectures apply these attention mechanisms to SISR directly, with little consideration of their inherent characteristics, which limits their representational power. In this article, we propose a novel kernel attention module (KAM) for SISR, which enables the network to adjust its receptive field size to various scales of input by dynamically selecting the appropriate kernel. Building on this module, we stack multiple kernel attention modules with group and residual connections to constitute a novel architecture for SISR, which allows the network to learn more discriminative representations by filtering information under different receptive fields. The resulting network is therefore more sensitive to multi-scale features, so a single network can handle the multi-scale SR task with predefined upscaling modules. In addition, other attention mechanisms used in super-resolution are investigated and illustrated in detail in this article. Thanks to the kernel attention mechanism, extensive benchmark evaluations show that our method outperforms other state-of-the-art methods.
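The abstract describes KAM as dynamically selecting among kernels of different receptive fields. The paper's code is not reproduced in this record, so below is a minimal PyTorch sketch of one plausible reading in the spirit of selective-kernel attention: parallel convolutions with different kernel sizes, fused by a softmax attention over the branch dimension. The class name, kernel sizes, and reduction ratio are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a kernel-attention-style block; names and
# hyper-parameters are assumptions, not the authors' released code.
import torch
import torch.nn as nn

class KernelAttentionSketch(nn.Module):
    """Selective fusion of parallel convolutions with different kernel sizes."""
    def __init__(self, channels: int, kernel_sizes=(3, 5, 7), reduction: int = 16):
        super().__init__()
        self.num_branches = len(kernel_sizes)
        # One branch per candidate kernel size, i.e., per receptive field.
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        hidden = max(channels // reduction, 4)
        # Squeeze the fused feature into a compact global descriptor.
        self.squeeze = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, hidden, 1),
            nn.ReLU(inplace=True),
        )
        # Produce one attention vector per branch; softmax selects among kernels.
        self.select = nn.Conv2d(hidden, channels * self.num_branches, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.stack([b(x) for b in self.branches], dim=1)  # (B, K, C, H, W)
        fused = feats.sum(dim=1)                   # aggregate all receptive fields
        logits = self.select(self.squeeze(fused))  # (B, K*C, 1, 1)
        attn = logits.view(x.size(0), self.num_branches, -1, 1, 1).softmax(dim=1)
        return (feats * attn).sum(dim=1) + x       # kernel-wise selection + residual

# Usage: shape-preserving, so modules can be stacked as the abstract describes.
# y = KernelAttentionSketch(64)(torch.randn(1, 64, 48, 48))  # -> (1, 64, 48, 48)
```

The softmax over the branch dimension is what realizes the "dynamic kernel selection" in the abstract: for each channel, the network weights the branch whose receptive field best matches the input scale. Stacking such blocks with group and residual connections would then form the full architecture the abstract outlines.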
Pages: 15
Related Papers
50 records in total
  • [1] Recurrent Large Kernel Attention Network for Efficient Single Infrared Image Super-Resolution
    Liu, Gangping
    Zhou, Shuaijun
    Chen, Xiaxu
    Yue, Wenjie
    Ke, Jun
    IEEE ACCESS, 2024, 12 : 923 - 935
  • [2] Region Attention Network For Single Image Super-resolution
    Du, Xiaobiao
    Liu, Chongjin
    Yang, Xiaoling
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [3] Upsampling Attention Network for Single Image Super-resolution
    Zheng, Zhijie
    Jiao, Yuhang
    Fang, Guangyou
    VISAPP: PROCEEDINGS OF THE 16TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL. 4: VISAPP, 2021, : 399 - 406
  • [4] Large coordinate kernel attention network for lightweight image super-resolution
    Hao, Fangwei
    Wu, Jiesheng
    Lu, Haotian
    Du, Ji
    Xu, Jing
    Xu, Xiaoxuan
arXiv
  • [5] SRGAT: Single Image Super-Resolution With Graph Attention Network
    Yan, Yanyang
    Ren, Wenqi
    Hu, Xiaobin
    Li, Kun
    Shen, Haifeng
    Cao, Xiaochun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 4905 - 4918
  • [6] Deep coordinate attention network for single image super-resolution
    Xie, Chao
    Zhu, Hongyu
    Fei, Yeqi
    IET IMAGE PROCESSING, 2022, 16 (01) : 273 - 284
  • [7] PYRAMID FUSION ATTENTION NETWORK FOR SINGLE IMAGE SUPER-RESOLUTION
    He, Hao
    Du, Zongcai
    Li, Wenfeng
    Tang, Jie
    Wu, Gangshan
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 2165 - 2169
  • [8] Fused pyramid attention network for single image super-resolution
    Chen, Shi
    Bi, Xiuping
    Zhang, Lefei
    IET IMAGE PROCESSING, 2023, 17 (06) : 1681 - 1693
  • [9] Efficient residual attention network for single image super-resolution
    Hao, Fangwei
    Zhang, Taiping
    Zhao, Linchang
    Tang, Yuanyan
    APPLIED INTELLIGENCE, 2022, 52 : 652 - 661
  • [10] Single image super-resolution via a ternary attention network
    Yang, Lianping
    Tang, Jian
    Niu, Ben
    Fu, Haoyue
    Zhu, Hegui
    Jiang, Wuming
    Wang, Xin
    APPLIED INTELLIGENCE, 2023, 53 : 13067 - 13081