Lightweight Single Image Super-Resolution With Multi-Scale Spatial Attention Networks

Cited by: 14
Authors
Soh, Jae Woong [1 ]
Cho, Nam Ik [1 ]
Affiliations
[1] Seoul Natl Univ, INMC, Dept Elect & Comp Engn, Seoul 08826, South Korea
Keywords
Feature extraction; Convolution; Spatial resolution; Computer architecture; Training; Convolutional neural networks; Convolutional neural network (CNN); lightweight; multi-scale spatial attention; single image super-resolution (SISR);
DOI
10.1109/ACCESS.2020.2974876
CLC classification number
TP [automation technology, computer technology]
Subject classification code
0812
Abstract
Convolutional neural networks (CNNs) generally yield higher performance for single image super-resolution (SISR) as the depth and number of parameters increase. However, simply stacking more layers in straightforward deep networks requires an impractically large number of parameters to reach state-of-the-art performance. Instead, some researchers have proposed lightweight networks, which are designed with more sophisticated structures to outperform straightforward networks under the same parameter budget. In this paper, we propose a new lightweight Multi-scale Spatial Attention Network (MSAN) for SISR, which aims to obtain better performance from a relatively small number of parameters. Specifically, we adopt dense connections with feature fusion layers to propagate rich features to every layer, and we propose a double residual structure that provides an additional skip connection. We also design a Multi-scale Spatial Attention Block (MSAB) to exploit multi-scale spatial contextual information. Furthermore, we introduce a spatial attention module that adaptively focuses on the most informative feature scale in a given region of the image. In the experiments, we validate that the proposed MSAN achieves significantly higher accuracy than recent lightweight models and performance comparable to state-of-the-art methods.
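The multi-scale spatial attention idea described in the abstract can be illustrated with a toy sketch: features are pooled at several window sizes (the "scales"), and a per-pixel softmax over the scales weights the pooled responses, so each region adaptively emphasizes its most informative scale. This is a minimal single-channel, pure-Python illustration of the general concept; the function names, the choice of average pooling, and the softmax weighting are assumptions for exposition, not the authors' actual MSAB implementation.

```python
import math

def avg_pool(feat, k):
    # Mean over a k x k neighborhood, shrinking the window at borders.
    h, w = len(feat), len(feat[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s, n = 0.0, 0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w:
                        s += feat[ii][jj]
                        n += 1
            out[i][j] = s / n
    return out

def multi_scale_spatial_attention(feat, scales=(1, 3, 5)):
    # Pool the map at each scale, then fuse the pooled responses per
    # pixel with softmax weights over scales (illustrative assumption).
    h, w = len(feat), len(feat[0])
    pooled = [avg_pool(feat, k) for k in scales]
    fused = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            logits = [p[i][j] for p in pooled]
            m = max(logits)                       # stabilize the softmax
            exps = [math.exp(x - m) for x in logits]
            z = sum(exps)
            fused[i][j] = sum(e / z * x for e, x in zip(exps, logits))
    return fused

# Toy 4x4 feature map with values in [0, 6].
feat = [[float((i * 4 + j) % 7) for j in range(4)] for i in range(4)]
out = multi_scale_spatial_attention(feat)
```

Since the output at each pixel is a convex combination of pooled averages, it stays within the value range of the input map; in a real network the scale logits would come from learned convolutions rather than the pooled values themselves.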
Pages: 35383-35391
Page count: 9
Related papers
50 in total
  • [1] Lightweight multi-scale residual networks with attention for image super-resolution
    Liu, Huan
    Cao, Feilong
    Wen, Chenglin
    Zhang, Qinghua
    KNOWLEDGE-BASED SYSTEMS, 2020, 203
  • [2] Lightweight multi-scale aggregated residual attention networks for image super-resolution
    Pang, Shurong
    Chen, Zhe
    Yin, Fuliang
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (04) : 4797 - 4819
  • [4] Single image super-resolution with lightweight multi-scale dilated attention network
    Song, Xiaogang
    Pang, Xinchao
    Zhang, Lei
    Lu, Xiaofeng
    Hei, Xinhong
    APPLIED SOFT COMPUTING, 2025, 169
  • [5] A lightweight multi-scale channel attention network for image super-resolution
    Li, Wenbin
    Li, Juefei
    Li, Jinxin
    Huang, Zhiyong
    Zhou, Dengwen
    NEUROCOMPUTING, 2021, 456 : 327 - 337
  • [6] Lightweight Multi-Scale Asymmetric Attention Network for Image Super-Resolution
    Zhang, Min
    Wang, Huibin
    Zhang, Zhen
    Chen, Zhe
    Shen, Jie
    MICROMACHINES, 2022, 13 (01)
  • [7] Lightweight multi-scale distillation attention network for image super-resolution
    Tang, Yinggan
    Hu, Quanwei
    Bu, Chunning
    KNOWLEDGE-BASED SYSTEMS, 2025, 309
  • [8] Multi-scale convolutional attention network for lightweight image super-resolution
    Xie, Feng
    Lu, Pei
    Liu, Xiaoyong
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2023, 95
  • [10] LMSN: a lightweight multi-scale network for single image super-resolution
    Zou, Yiye
    Yang, Xiaomin
    Albertini, Marcelo Keese
    Hussain, Farhan
    MULTIMEDIA SYSTEMS, 2021, 27 (04) : 845 - 856