Lightweight image super-resolution with multiscale residual attention network

Times Cited: 0
Authors
Xiao, Cunjun [1 ]
Dong, Hui [1 ]
Li, Haibin [1 ]
Li, Yaqian [1 ]
Zhang, Wenming [1 ]
Affiliations
[1] Yanshan Univ, Key Lab Ind Comp Control Engn Hebei Prov, Qinhuangdao, Hebei, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
single-image super-resolution; attention mechanism; multiscale features; residual learning; QUALITY ASSESSMENT;
DOI
10.1117/1.JEI.31.4.043028
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
In recent years, various convolutional neural networks have been successfully applied to the single-image super-resolution task. However, most existing models achieve their gains by going deeper or wider, and the resulting computation and memory costs restrict their use in practice. To address these issues, we propose a lightweight multiscale residual attention network that not only extracts richer details to improve image quality but also reduces the number of parameters. More specifically, a multiscale residual attention block (MRAB) serves as the basic unit and fully exploits image features using convolutional kernels of different sizes. Meanwhile, its attention mechanism adaptively recalibrates the channel and spatial information of the feature maps. Furthermore, a local feature integration module (LFIM) is built into the network architecture to maximize the use of local information; the LFIM consists of several MRABs and a local skip connection that compensates for information loss. Experimental results show that our method outperforms representative algorithms while requiring fewer parameters and less computational overhead.
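The abstract describes the architecture only at a high level. The PyTorch sketch below is an illustration of the named components, not the authors' implementation: the class names MRAB and LFIM follow the paper, but the kernel sizes (3x3 and 5x5 branches), channel widths, reduction ratio, and the exact channel/spatial attention layout are assumptions made for the example.

```python
# Illustrative sketch of the MRAB and LFIM ideas described in the abstract.
# Hyperparameters and attention details are assumptions, not the paper's design.
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Recalibrates features along the channel and spatial dimensions."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        # Channel attention: global average pooling + bottleneck MLP gate.
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        # Spatial attention: a single conv producing a per-pixel gate.
        self.spatial_att = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_att(x)
        return x * self.spatial_att(x)


class MRAB(nn.Module):
    """Multiscale residual attention block: parallel convolutions with
    different kernel sizes, attention recalibration, and a residual path."""

    def __init__(self, channels: int):
        super().__init__()
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.act = nn.ReLU(inplace=True)
        self.attention = ChannelSpatialAttention(channels)

    def forward(self, x):
        multiscale = torch.cat(
            [self.act(self.branch3(x)), self.act(self.branch5(x))], dim=1
        )
        out = self.attention(self.fuse(multiscale))
        return out + x  # residual learning


class LFIM(nn.Module):
    """Local feature integration module: several MRABs, a 1x1 fusion conv,
    and a local skip connection to compensate for information loss."""

    def __init__(self, channels: int, num_blocks: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(MRAB(channels) for _ in range(num_blocks))
        self.fuse = nn.Conv2d(num_blocks * channels, channels, 1)

    def forward(self, x):
        features, out = [], x
        for block in self.blocks:
            out = block(out)
            features.append(out)
        return self.fuse(torch.cat(features, dim=1)) + x  # local skip connection


if __name__ == "__main__":
    # Quick shape check on a dummy low-resolution feature map.
    feats = torch.randn(1, 32, 48, 48)
    print(LFIM(32)(feats).shape)  # torch.Size([1, 32, 48, 48])
```

In this reading, concatenating the multiscale branch outputs and fusing them with a 1x1 convolution keeps the parameter count low, which is consistent with the paper's lightweight goal; the actual fusion strategy in the published model may differ.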
Pages: 19
Related Papers
50 records in total
  • [21] LBCRN: lightweight bidirectional correction residual network for image super-resolution
    Huang, Shuying
    Wang, Jichao
    Yang, Yong
    Wan, Weiguo
    Li, Guoqiang
    MULTIDIMENSIONAL SYSTEMS AND SIGNAL PROCESSING, 2023, 34 (01) : 341 - 364
  • [22] Residual Adaptive Dense Weight Attention Network for Single Image Super-Resolution
    Chen, Jiacheng
    Wang, Wanliang
    Xing, Fangsen
    Qian, Yutong
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [23] Lightweight multi-scale residual networks with attention for image super-resolution
    Liu, Huan
    Cao, Feilong
    Wen, Chenglin
    Zhang, Qinghua
    KNOWLEDGE-BASED SYSTEMS, 2020, 203
  • [24] TARN: a lightweight two-branch adaptive residual network for image super-resolution
    Huang, Shuying
    Wang, Jichao
    Yang, Yong
    Wan, Weiguo
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (09) : 4119 - 4132
  • [25] PSMFNet: Lightweight Partial Separation and Multiscale Fusion Network for Image Super-Resolution
    Cao, Shuai
    Liang, Jianan
    Cao, Yongjun
    Huang, Jinglun
    Yang, Zhishu
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 81 (01)
  • [26] Deep Residual-Dense Attention Network for Image Super-Resolution
    Qin, Ding
    Gu, Xiaodong
    NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 3 - 10
  • [27] Residual Triplet Attention Network for Single-Image Super-Resolution
    Huang, Feng
    Wang, Zhifeng
    Wu, Jing
    Shen, Ying
    Chen, Liqiong
    ELECTRONICS, 2021, 10 (17)
  • [28] CASCADE ATTENTION BLEND RESIDUAL NETWORK FOR SINGLE IMAGE SUPER-RESOLUTION
    Chen, Tianyu
    Xiao, Guoqiang
    Tang, Xiaoqin
    Han, Xianfeng
    Ma, Wenzhuo
    Gou, Xinye
    2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 559 - 563
  • [29] Residual scale attention network for arbitrary scale image super-resolution
    Fu, Ying
    Chen, Jian
    Zhang, Tao
    Lin, Yonggang
    NEUROCOMPUTING, 2021, 427 : 201 - 211
  • [30] HRAN: Hybrid Residual Attention Network for Single Image Super-Resolution
    Muqeet, Abdul
    Bin Iqbal, Md Tauhid
    Bae, Sung-Ho
    IEEE ACCESS, 2019, 7 : 137020 - 137029