MemNet: A Persistent Memory Network for Image Restoration

Cited by: 1503
Authors
Tai, Ying [1 ]
Yang, Jian [1 ]
Liu, Xiaoming [2 ]
Xu, Chunyan [1 ]
Affiliations
[1] Nanjing University of Science and Technology, Department of Computer Science and Engineering, Nanjing, Jiangsu, China
[2] Michigan State University, Department of Computer Science and Engineering, East Lansing, MI 48824, USA
Source
2017 IEEE International Conference on Computer Vision (ICCV) | 2017
DOI: 10.1109/ICCV.2017.486
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
Recently, very deep convolutional neural networks (CNNs) have attracted considerable attention in image restoration. However, as the depth grows, the long-term dependency problem is rarely addressed by these very deep models, so prior states/layers have little influence on subsequent ones. Motivated by the fact that human thoughts have persistency, we propose a very deep persistent memory network (MemNet) that introduces a memory block, consisting of a recursive unit and a gate unit, to explicitly mine persistent memory through an adaptive learning process. The recursive unit learns multi-level representations of the current state under different receptive fields. These representations and the outputs from the previous memory blocks are concatenated and sent to the gate unit, which adaptively controls how much of the previous states should be reserved and decides how much of the current state should be stored. We apply MemNet to three image restoration tasks: image denoising, super-resolution, and JPEG deblocking. Comprehensive experiments demonstrate the necessity of MemNet and its consistent superiority over the state of the art on all three tasks. Code is available at https://github.com/tyshiwo/MemNet.
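The memory block described in the abstract can be sketched in a few lines of PyTorch. This is a minimal illustration reconstructed from the abstract alone: the shared pre-activation residual sub-block, the recursion depth, the channel width, and the 1x1 gating convolution are assumptions rather than the authors' exact configuration; consult the official code at https://github.com/tyshiwo/MemNet for the real implementation.

```python
# Minimal sketch of one MemNet memory block (recursive unit + gate unit),
# based only on the abstract. All hyperparameters are illustrative.
import torch
import torch.nn as nn

class MemoryBlock(nn.Module):
    def __init__(self, channels=64, num_recursions=6, num_prev_blocks=0):
        super().__init__()
        # Recursive unit: one residual sub-block whose weights are shared
        # across recursions, producing multi-level representations of the
        # current state under progressively larger receptive fields.
        self.residual = nn.Sequential(
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.num_recursions = num_recursions
        # Gate unit: a 1x1 convolution over the concatenation of the
        # short-term memory (this block's recursions) and the long-term
        # memory (outputs of all previous memory blocks); its learned
        # weights decide how much of each state is kept.
        in_ch = channels * (num_recursions + num_prev_blocks)
        self.gate = nn.Conv2d(in_ch, channels, kernel_size=1)

    def forward(self, x, prev_outputs):
        # prev_outputs: list of feature maps from earlier memory blocks.
        short_term = []
        state = x
        for _ in range(self.num_recursions):
            state = state + self.residual(state)  # shared residual recursion
            short_term.append(state)
        # Concatenate short- and long-term memories, then gate adaptively.
        return self.gate(torch.cat(short_term + prev_outputs, dim=1))

# Usage: a block that also sees the outputs of two earlier blocks.
block = MemoryBlock(channels=64, num_recursions=6, num_prev_blocks=2)
x = torch.randn(1, 64, 32, 32)
prev = [torch.randn(1, 64, 32, 32) for _ in range(2)]
out = block(x, prev)  # -> shape (1, 64, 32, 32)
```

Note how the gate's input width grows with the number of previous blocks: this densely connected long-term memory is what lets early states influence much later layers despite the network's depth.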
Pages: 4549-4557 (9 pages)