Block-Sparse Recovery via Convex Optimization

Cited by: 118
Authors
Elhamifar, Ehsan [1 ]
Vidal, Rene [2 ]
Affiliations
[1] Johns Hopkins Univ, Dept Elect & Comp Engn, Baltimore, MD 21218 USA
[2] Johns Hopkins Univ, Dept Biomed Engn, Ctr Imaging Sci, Baltimore, MD 21218 USA
Keywords
Block-sparse signals; convex optimization; face recognition; principal angles; subspaces; RESTRICTED ISOMETRY PROPERTY; FACE RECOGNITION; SIGNALS; REPRESENTATIONS; POLYTOPES;
DOI
10.1109/TSP.2012.2196694
CLC Classification Codes
TM [Electrical technology]; TN [Electronic technology, communication technology]
Discipline Codes
0808; 0809
Abstract
Given a dictionary that consists of multiple blocks and a signal that lives in the range space of only a few blocks, we study the problem of finding a block-sparse representation of the signal, i.e., a representation that uses the minimum number of blocks. Motivated by signal/image processing and computer vision applications, such as face recognition, we consider the block-sparse recovery problem in the case where the number of atoms in each block is arbitrary, possibly much larger than the dimension of the underlying subspace. To find a block-sparse representation of a signal, we propose two classes of nonconvex optimization programs, which aim to minimize the number of nonzero coefficient blocks and the number of nonzero reconstructed vectors from the blocks, respectively. Since both classes of problems are NP-hard, we propose convex relaxations and derive conditions under which each class of the convex programs is equivalent to the original nonconvex formulation. Our conditions depend on the notions of mutual and cumulative subspace coherence of a dictionary, which are natural generalizations of existing notions of mutual and cumulative coherence. We evaluate the performance of the proposed convex programs through simulations as well as real experiments on face recognition. We show that treating the face recognition problem as a block-sparse recovery problem improves the state-of-the-art results by 10% with only 25% of the training data.
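The abstract's first class of programs, relaxed convexly, amounts to minimizing the sum of the ℓ2 norms of the coefficient blocks (an ℓ1/ℓ2, group-lasso-style objective). The sketch below is an illustration of that relaxation, not the paper's exact formulation or solver: it builds a random block-structured dictionary, generates a signal from two blocks, and solves the penalized form with proximal gradient descent and block soft-thresholding. All sizes, the regularization weight `lam`, and the solver choice are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n_blocks blocks of m atoms each, ambient dimension d.
d, m, n_blocks = 30, 5, 10
B = rng.standard_normal((d, n_blocks * m))
B /= np.linalg.norm(B, axis=0)            # unit-norm atoms

# Signal living in the span of blocks 2 and 7 only.
c_true = np.zeros(n_blocks * m)
c_true[2 * m:3 * m] = rng.standard_normal(m)
c_true[7 * m:8 * m] = rng.standard_normal(m)
y = B @ c_true

# ell_1/ell_2 relaxation in penalized (group-lasso) form:
#   minimize 0.5 * ||y - B c||_2^2 + lam * sum_b ||c_b||_2
# solved here by proximal gradient (ISTA) with block soft-thresholding.
lam = 0.05
step = 1.0 / np.linalg.norm(B, 2) ** 2    # 1 / Lipschitz const. of the gradient
c = np.zeros_like(c_true)
for _ in range(5000):
    z = c - step * (B.T @ (B @ c - y))    # gradient step on the data term
    for b in range(n_blocks):             # prox: shrink each block toward zero
        blk = z[b * m:(b + 1) * m]
        nrm = np.linalg.norm(blk)
        blk *= max(0.0, 1.0 - step * lam / nrm) if nrm > 0 else 0.0
    c = z

active = [b for b in range(n_blocks)
          if np.linalg.norm(c[b * m:(b + 1) * m]) > 1e-3]
print(active)
```

Because the block soft-thresholding prox sets entire blocks exactly to zero, the solution is block-sparse by construction; with a well-conditioned dictionary the active set concentrates on the generating blocks.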
Pages: 4094-4107
Page count: 14
Related Papers (47 in total)
[1] Amaldi, E.; Kann, V. On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems [J]. Theoretical Computer Science, 1998, 209(1-2): 237-260.
[2] [Anonymous], 2003, Proc. IEEE Conf. on Computer Vision and Pattern Recognition.
[3] [Anonymous], 2009, Proc. IEEE Conf. on Computer Vision and Pattern Recognition.
[4] [Anonymous], Proc. IEEE Conf. on Computer Vision and Pattern Recognition.
[5] [Anonymous], IEEE Transactions on Pattern Analysis and Machine Intelligence.
[6] Baraniuk, R.; Davenport, M.; DeVore, R.; Wakin, M. A Simple Proof of the Restricted Isometry Property for Random Matrices [J]. Constructive Approximation, 2008, 28(3): 253-263.
[7] Basri, R.; Jacobs, D. W. Lambertian reflectance and linear subspaces [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003, 25(2): 218-233.
[8] Boufounos, P.; Kutyniok, G.; Rauhut, H. Sparse Recovery From Combined Fusion Frame Measurements [J]. IEEE Transactions on Information Theory, 2011, 57(6): 3864-3876.
[9] Boyd, S. Convex Optimization [M]. Cambridge University Press, 2004. DOI: 10.1017/CBO9780511804441.
[10] Candes, E. J.; Tao, T. Decoding by linear programming [J]. IEEE Transactions on Information Theory, 2005, 51(12): 4203-4215.