A Decomposition Method for Large-Scale Sparse Coding in Representation Learning

Cited by: 0
Authors
Li, Yifeng [1 ]
Caron, Richard J. [2 ]
Ngom, Alioune [3 ]
Affiliations
[1] Univ British Columbia, CMMT, Child & Family Res Inst, Vancouver, BC V5Z 1M9, Canada
[2] Univ Windsor, Math & Stat, Windsor, ON N9B 3P4, Canada
[3] Univ Windsor, Sch Comp Sci, Windsor, ON N9B 3P4, Canada
Source
PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2014
Keywords
CLASSIFICATION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In representation learning, sparse representation is the parsimonious principle that a sample can be approximated by a sparse superposition of dictionary atoms. Sparse coding is the core of this technique. Since the dictionary is often redundant, the dictionary size can be very large. Many optimization methods have been proposed in the literature for sparse coding; however, their efficiency remains a bottleneck when the number of dictionary atoms is very large. In this paper, we propose a decomposition method for large-scale sparse coding models. Our experimental results show that our method is very efficient.
Pages: 3732-3738
Page count: 7