MSAR-Net: Multi-scale attention based light-weight image super-resolution

Cited by: 15
Authors
Mehta, Nancy [1 ]
Murala, Subrahmanyam [1 ]
Institutions
[1] Indian Inst Technol Ropar, Comp Vis & Pattern Recognit Lab, Rupnagar 140001, India
Keywords
Multi-scale attention residual block; Up- and down-sampling projection block; Image super-resolution
DOI
10.1016/j.patrec.2021.08.011
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Recently, single image super-resolution (SISR), which aims to recover the structural and textural information lost in a low-resolution input image, has witnessed huge demand from the video and graphics industries. The exceptional success of convolutional neural networks (CNNs) has revolutionized the field of SISR. However, for most CNN-based SISR methods, excessive memory consumption in terms of parameters and FLOPs hinders their application in low-computing-power devices. Moreover, different state-of-the-art SR methods collect different features while treating all pixels as contributing equally to the performance of the network. In this paper, we take into consideration both the performance and the reconstruction efficiency, and propose a lightweight multi-scale attention residual network (MSAR-Net) for SISR. The proposed MSAR-Net consists of a stack of multi-scale attention residual (MSAR) blocks for feature refinement, and an up- and down-sampling projection (UDP) block for edge refinement of the extracted multi-scale features. These blocks effectively exploit multi-scale edge information without increasing the number of parameters. Specifically, we design our network in a progressive fashion, substituting large scale factors (×4) with combinations of a small scale factor (×2), and thus gradually exploit the hierarchical information. In parallel, channel and spatial attention are used in the MSAR block to modulate multi-scale features in global and local manners. Visual results and the quantitative metrics PSNR and SSIM demonstrate the accuracy of the proposed approach on synthetic benchmark super-resolution datasets. The experimental analysis shows that the proposed approach outperforms other existing SISR methods in terms of memory footprint, inference time and visual quality. (C) 2021 Elsevier B.V. All rights reserved.
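The channel and spatial attention modulation described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' MSAR-Net implementation: the learned convolutions of the real MSAR block are omitted, and the gates here are fixed sigmoid squashings of pooled statistics, used only to show the global (per-channel) versus local (per-pixel) modulation pattern inside a residual block.

```python
import numpy as np

def channel_attention(feat):
    """Global modulation: average-pool each channel to a scalar,
    squash with a sigmoid, and rescale that whole channel."""
    # feat: (C, H, W)
    pooled = feat.mean(axis=(1, 2))              # (C,) per-channel statistic
    weights = 1.0 / (1.0 + np.exp(-pooled))      # sigmoid gate in (0, 1)
    return feat * weights[:, None, None]

def spatial_attention(feat):
    """Local modulation: average-pool across channels to one map,
    squash with a sigmoid, and rescale every spatial position."""
    pooled = feat.mean(axis=0)                   # (H, W) per-pixel statistic
    weights = 1.0 / (1.0 + np.exp(-pooled))
    return feat * weights[None, :, :]

def msar_like_block(feat):
    """Residual block: attention-modulated features added back to the input,
    so the output keeps the input's shape."""
    return feat + spatial_attention(channel_attention(feat))

x = np.random.randn(8, 16, 16)                   # 8 channels, 16x16 feature map
y = msar_like_block(x)
assert y.shape == x.shape
```

The residual connection is what lets such a block refine features without discarding the input signal; in the paper's design, learned convolutions produce the multi-scale features that these two gates then reweight.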
Pages: 215-221
Number of pages: 7
Related papers
50 records in total
  • [1] MSAR-Net: A multi-scale attention residual network for medical image segmentation
    Li, Xiaoheng
    Chen, Cheng
    Chen, Yunqing
    Yu, Ming-an
    Xiao, Ruoxiu
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 104
  • [2] Multi-scale attention network for image super-resolution
    Wang, Li
    Shen, Jie
    Tang, E.
    Zheng, Shengnan
    Xu, Lizhong
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2021, 80
  • [3] Image super-resolution network based on multi-scale adaptive attention
    Zhou, Y.
    Pei, S.
    Chen, H.
    Xu, S.
    Guangxue Jingmi Gongcheng/Optics and Precision Engineering, 2024, 32 (06): 843 - 856
  • [4] TBNet: Stereo Image Super-Resolution with Multi-Scale Attention
    Zhu, Jiyang
    Han, Xue
    JOURNAL OF CIRCUITS SYSTEMS AND COMPUTERS, 2023, 32 (18)
  • [5] Image super-resolution reconstruction with multi-scale attention fusion
    Chen, Chun-yi
    Wu, Xin-yi
    Hu, Xiao-juan
    Yu, Hai-yang
    CHINESE OPTICS, 2023, 16 (05) : 1034 - 1044
  • [6] Single image super-resolution based on multi-scale dense attention network
    Gao, Farong
    Wang, Yong
    Yang, Zhangyi
    Ma, Yuliang
    Zhang, Qizhong
    SOFT COMPUTING, 2023, 27 (06) : 2981 - 2992
  • [8] Image Super-Resolution Based on Residual Attention and Multi-Scale Feature Fusion
    Kou, Qiqi
    Zhao, Jiamin
    Cheng, Deqiang
    Su, Zhen
    Zhu, Xingguang
    IEEE ACCESS, 2023, 11 : 59530 - 59541
  • [9] Image super-resolution reconstruction based on multi-scale dual-attention
    Li, Hong-an
    Wang, Diao
    Zhang, Jing
    Li, Zhanli
    Ma, Tian
    CONNECTION SCIENCE, 2023, 35 (01)
  • [10] Attention augmented multi-scale network for single image super-resolution
    Xiong, Chengyi
    Shi, Xiaodi
    Gao, Zhirong
    Wang, Ge
    APPLIED INTELLIGENCE, 2021, 51 (02) : 935 - 951