CONVOLUTIONAL DICTIONARY LEARNING IN HIERARCHICAL NETWORKS

Cited by: 0
Authors
Zazo, Javier [1 ]
Tolooshams, Bahareh [1 ]
Ba, Demba [1 ]
Affiliations
[1] Harvard Univ, Harvard John A Paulson Sch Engn & Appl Sci, Cambridge, MA 02138 USA
Source
2019 IEEE 8TH INTERNATIONAL WORKSHOP ON COMPUTATIONAL ADVANCES IN MULTI-SENSOR ADAPTIVE PROCESSING (CAMSAP 2019) | 2019
Keywords
Convolutional dictionary learning; sparse coding; deep networks; hierarchical models;
DOI
10.1109/camsap45676.2019.9022440
Chinese Library Classification
TP3 [Computing technology; computer technology]
Subject classification code
0812
Abstract
Filter banks are a popular tool for the analysis of piecewise smooth signals such as natural images. Motivated by the empirically observed properties of scale and detail coefficients of images in the wavelet domain, we propose a hierarchical deep generative model of piecewise smooth signals that is a recursion across scales: the low-pass scale coefficients at one layer are obtained by filtering the scale coefficients at the next layer, and adding a high-pass detail innovation obtained by filtering a sparse vector. This recursion describes a linear dynamical system that is a non-Gaussian Markov process across scales and is closely related to the multi-layer convolutional sparse coding (ML-CSC) generative model for deep networks, except that our model allows for deeper architectures and combines sparse and non-sparse signal representations. We propose an alternating minimization algorithm for learning the filters in this hierarchical model given observations at layer zero, e.g., natural images. The algorithm alternates between a coefficient-estimation step and a filter-update step. The coefficient-estimation step performs sparse (detail) and smooth (scale) coding and, when unfolded, leads to a deep neural network. We use MNIST to demonstrate the representational capabilities of the model and the utility of its derived features (coefficients) for classification.
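The across-scale recursion described in the abstract can be sketched in a few lines. This is an illustrative synthesis pass only, not the authors' learning algorithm: the filter lengths, signal length, layer count, and random filters below are placeholder assumptions, whereas the paper learns the filters by alternating minimization.

```python
import numpy as np

rng = np.random.default_rng(0)


def synthesize(scale_top, filters_h, filters_g, details):
    """Run the hierarchical recursion from the top layer down to layer zero.

    At each layer, the scale coefficients from the layer above are low-pass
    filtered, and a high-pass detail innovation (a filtered sparse vector)
    is added, as in the generative model described in the abstract.
    """
    x = scale_top
    for h, g, z in zip(filters_h, filters_g, details):
        x = np.convolve(x, h, mode="same") + np.convolve(z, g, mode="same")
    return x


L, N = 3, 64                      # layers and signal length (illustrative)
filters_h = [rng.standard_normal(5) for _ in range(L)]  # low-pass (scale) filters
filters_g = [rng.standard_normal(5) for _ in range(L)]  # high-pass (detail) filters

details = []
for _ in range(L):
    z = np.zeros(N)
    idx = rng.choice(N, size=4, replace=False)          # sparse support
    z[idx] = rng.standard_normal(4)
    details.append(z)

x0 = synthesize(rng.standard_normal(N), filters_h, filters_g, details)
print(x0.shape)
```

Because each layer is a linear map of the previous scale coefficients plus an innovation term, the recursion is a linear dynamical system across scales; the non-Gaussianity comes from the sparse detail vectors.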
Pages: 131-135
Page count: 5