Kernel Attention Network for Single Image Super-Resolution

Cited by: 24
Authors
Zhang, Dongyang [1 ,2 ]
Shao, Jie [1 ,2 ]
Shen, Heng Tao [1 ,2 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Ctr Future Media, Chengdu 611731, Peoples R China
[2] Sichuan Artificial Intelligence Res Inst, Yibin 644000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image super-resolution; kernel attention; receptive field; multi-scale features; CONVOLUTIONAL NETWORK;
DOI
10.1145/3398685
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Recently, attention mechanisms have been increasingly integrated into convolutional neural networks (CNNs), and representative attention mechanisms, i.e., channel attention (CA) and spatial attention (SA), have been widely applied to single image super-resolution (SISR). However, existing architectures apply these attention mechanisms to SISR directly, without fully considering the intrinsic characteristics of the task, which limits their representational power. In this article, we propose a novel kernel attention module (KAM) for SISR, which enables the network to adjust its receptive field size to match the scale of the input by dynamically selecting the appropriate kernel. Building on this module, we stack multiple kernel attention modules with group and residual connections to constitute a novel architecture for SISR, which learns more discriminative representations by filtering information under different receptive fields. As a result, our network is more sensitive to multi-scale features, allowing a single network to handle multi-scale SR by predefining the upscaling modules. In addition, other attention mechanisms used in super-resolution are investigated and illustrated in detail in this article. Thanks to the kernel attention mechanism, extensive benchmark evaluations show that our method outperforms other state-of-the-art methods.
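To make the kernel-selection idea concrete, the following is a minimal PyTorch sketch of a selective-kernel-style attention block in the spirit of the KAM described above: parallel convolution branches with different kernel sizes are aggregated, and branch-wise weights derived from a global descriptor decide how much each receptive field contributes. The class name KernelAttention, the branch kernel sizes (3 and 5), the reduction ratio, and the residual connection are illustrative assumptions, not the authors' released implementation.

# Illustrative sketch of kernel attention (selective-kernel style).
# Assumptions: two branches (3x3 and 5x5), reduction ratio 16, residual add.
import torch
import torch.nn as nn

class KernelAttention(nn.Module):
    def __init__(self, channels, kernel_sizes=(3, 5), reduction=16):
        super().__init__()
        # One convolution branch per candidate kernel size (receptive field).
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, k, padding=k // 2)
            for k in kernel_sizes
        ])
        hidden = max(channels // reduction, 4)
        # Global descriptor -> per-branch, per-channel attention logits.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, hidden, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels * len(kernel_sizes), 1),
        )
        self.num_branches = len(kernel_sizes)
        self.channels = channels

    def forward(self, x):
        feats = torch.stack([b(x) for b in self.branches], dim=1)  # (N, B, C, H, W)
        fused = feats.sum(dim=1)                                   # aggregate branches
        logits = self.fc(self.pool(fused))                         # (N, B*C, 1, 1)
        logits = logits.view(-1, self.num_branches, self.channels, 1, 1)
        weights = torch.softmax(logits, dim=1)                     # softmax over branches
        return (weights * feats).sum(dim=1) + x                    # weighted fusion + skip

# Usage example: one block on a 64-channel feature map.
y = KernelAttention(64)(torch.randn(1, 64, 32, 32))

In this sketch, the softmax over the branch dimension plays the role of the dynamic kernel selection: for each channel, the network emphasizes the branch whose receptive field best matches the input content.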
Pages: 15