Minimax Reconstruction Risk of Convolutional Sparse Dictionary Learning

Cited by: 0
Authors
Singh, Shashank [1 ]
Poczos, Barnabas [1 ]
Ma, Jian [2 ]
Affiliations
[1] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
[2] Carnegie Mellon Univ, Computat Biol Dept, Pittsburgh, PA 15213 USA
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84 | 2018 / Vol. 84
Keywords
K-SVD; REPRESENTATIONS; ALGORITHMS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sparse dictionary learning (SDL) has become a popular method for learning parsimonious representations of data, a fundamental problem in machine learning and signal processing. While most work on SDL assumes a training dataset of independent and identically distributed (IID) samples, a variant known as convolutional sparse dictionary learning (CSDL) relaxes this assumption to allow dependent, non-stationary sequential data sources. Recent work has explored statistical properties of IID SDL; however, the statistical properties of CSDL remain largely unstudied. This paper identifies minimax rates of CSDL in terms of reconstruction risk, providing both lower and upper bounds in a variety of settings. Our results make minimal assumptions, allowing arbitrary dictionaries and showing that CSDL is robust to dependent noise. We compare our results to similar results for IID SDL and verify our theory with synthetic experiments.
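For context, the convolutional sparse coding model underlying CSDL is commonly written as below. This is a generic sketch of the standard formulation, not the paper's exact notation or risk definition: a long observed sequence is modeled as a sum of short dictionary filters convolved with sparse activation sequences, and reconstruction risk measures the expected error of re-synthesizing the signal from the learned dictionary and codes.

\[
x \;=\; \sum_{k=1}^{K} d_k * z_k \;+\; \varepsilon,
\qquad
R(\hat{D}, \hat{Z}) \;=\; \mathbb{E}\,\Bigl\| x - \sum_{k=1}^{K} \hat{d}_k * \hat{z}_k \Bigr\|_2^2,
\]

where \(x \in \mathbb{R}^N\) is the observed sequence, each filter \(d_k \in \mathbb{R}^m\) is short (\(m \ll N\)), each coefficient sequence \(z_k \in \mathbb{R}^{N-m+1}\) is sparse, \(*\) denotes discrete convolution, and \(\varepsilon\) is noise that need not be independent across positions.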
Pages: 10