Dictionary-Based Low-Rank Approximations and the Mixed Sparse Coding Problem

Cited by: 0
Authors
Cohen, Jeremy E. [1]
Affiliations
[1] Univ Lyon, INSA Lyon, UJM St Etienne, CNRS, Inserm, CREATIS UMR 5220, UCBL, Villeurbanne, France
Keywords
dictionary; tensors; sparse; low-rank; non-convex optimization; LASSO; nonnegative matrix factorization; tensor decomposition; variable selection; component analysis; algorithms; uniqueness; identifiability; convergence; complexity; regression
DOI
10.3389/fams.2022.801650
Chinese Library Classification
O1 [Mathematics]
Subject Classification
0701; 070101
Abstract
Constrained tensor and matrix factorization models make it possible to extract interpretable patterns from multiway data, so crafting efficient algorithms for constrained low-rank approximations is an important research topic. This work considers low-rank approximations in which the columns of the factor matrices are sparse in a known, possibly overcomplete basis, a model coined Dictionary-based Low-Rank Approximation (DLRA). While earlier contributions focused on finding factor columns inside a dictionary of candidate columns, i.e., one-sparse approximations, this work is the first to tackle DLRA with sparsity larger than one. I propose to focus on the sparse-coding subproblem, coined Mixed Sparse-Coding (MSC), that emerges when solving DLRA with an alternating optimization strategy. Several algorithms based on sparse-coding heuristics (greedy methods, convex relaxations) are provided to solve MSC, and their performance is evaluated on simulated data. I then show how to adapt an efficient LASSO-based MSC solver to compute Dictionary-based Matrix Factorization and Canonical Polyadic Decomposition in the context of hyperspectral image processing and chemometrics. These experiments suggest that DLRA extends the modeling capabilities of low-rank approximations, helps reduce estimation variance, and enhances the identifiability and interpretability of the estimated factors.
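In an alternating scheme of the kind the abstract describes, each factor update reduces to a Mixed Sparse-Coding subproblem. The sketch below is an illustrative ISTA-type (proximal gradient) LASSO solver for one plausible form of that subproblem, min over B of ½‖Y − D B Xᵀ‖²_F + λ‖B‖₁; the function name `msc_ista`, the problem formulation, and all parameters are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(Z, tau):
    """Entrywise soft-thresholding, the proximal operator of tau*||.||_1."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def msc_ista(Y, D, X, lam=0.1, n_iter=500):
    """Illustrative ISTA solver for a Mixed Sparse-Coding subproblem:
        min_B  0.5 * ||Y - D @ B @ X.T||_F**2 + lam * ||B||_1
    where D is the known (possibly overcomplete) dictionary and X the
    other, fixed factor of the low-rank model. Returns the sparse code B.
    """
    B = np.zeros((D.shape[1], X.shape[1]))
    # Lipschitz constant of the gradient of the smooth term:
    # sigma_max(D)^2 * sigma_max(X)^2
    L = (np.linalg.norm(D, 2) * np.linalg.norm(X, 2)) ** 2
    for _ in range(n_iter):
        grad = D.T @ (D @ B @ X.T - Y) @ X  # gradient of the quadratic term
        B = soft_threshold(B - grad / L, lam / L)
    return B
```

Vectorizing B shows this is an ordinary LASSO with design matrix X ⊗ D, which is why off-the-shelf LASSO machinery applies; the matrix form above simply avoids building the Kronecker product.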
Pages: 19