Dictionary-Based Low-Rank Approximations and the Mixed Sparse Coding Problem

Cited: 0
Authors
Cohen, Jeremy E. [1]
Affiliations
[1] Univ Lyon, INSA Lyon, UJM St Etienne, CNRS, Inserm, CREATIS UMR 5220, UCBL, Villeurbanne, France
Keywords
dictionary; tensors; sparse; low-rank; non-convex optimization; LASSO; nonnegative matrix factorization; tensor decomposition; variable selection; component analysis; algorithms; uniqueness; identifiability; convergence; complexity; regression
DOI
10.3389/fams.2022.801650
Chinese Library Classification
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Constrained tensor and matrix factorization models make it possible to extract interpretable patterns from multiway data, so crafting efficient algorithms for constrained low-rank approximations is an important research topic. This work considers low-rank approximations in which the columns of the factor matrices are sparse in a known, possibly overcomplete basis, a model coined Dictionary-based Low-Rank Approximation (DLRA). While earlier contributions focused on finding factor columns inside a dictionary of candidate columns, i.e., one-sparse approximations, this work is the first to tackle DLRA with sparsity larger than one. I propose to focus on the sparse-coding subproblem, coined Mixed Sparse Coding (MSC), that emerges when DLRA is solved with an alternating optimization strategy. Several algorithms based on sparse-coding heuristics (greedy methods, convex relaxations) are provided to solve MSC, and their performance is evaluated on simulated data. I then show how to adapt an efficient LASSO-based MSC solver to compute dictionary-based matrix factorizations and canonical polyadic decompositions in the context of hyperspectral image processing and chemometrics. These experiments suggest that DLRA extends the modeling capabilities of low-rank approximations, helps reduce estimation variance, and enhances the identifiability and interpretability of the estimated factors.
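To make the abstract's model concrete, the matrix case of DLRA can plausibly be written as follows. This formalization is reconstructed from the abstract alone, and all symbols (data Y, dictionary D, sparse coefficients S, second factor B, sparsity level k, rank R) are my notation rather than the paper's:

\min_{S, B} \; \lVert Y - D S B^{\top} \rVert_F^2 \quad \text{s.t.} \quad \lVert S_{:,r} \rVert_0 \le k \ \text{for } r = 1, \dots, R.

With B fixed, vectorizing gives \operatorname{vec}(Y) \approx (B \otimes D)\operatorname{vec}(S): the Kronecker mixing matrix couples the columns of S, so the subproblem is not R independent sparse-coding problems, which is presumably what "mixed" refers to in Mixed Sparse Coding.

The short Python sketch below, written under the same assumptions, alternates a LASSO relaxation of the MSC step (the abstract mentions a LASSO-based MSC solver; the paper's actual algorithm may differ) with a least-squares update of B. Forming the Kronecker product explicitly is workable only for small problems and is used here purely for clarity.

import numpy as np
from sklearn.linear_model import Lasso

def dlra_matrix(Y, D, rank, lam=0.1, n_iter=50, seed=0):
    # Illustrative alternating scheme for Y ~ D @ S @ B.T with sparse S.
    m, n = Y.shape
    d = D.shape[1]                    # dictionary size, possibly d > m (overcomplete)
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((n, rank))
    for _ in range(n_iter):
        # MSC step, relaxed to a LASSO: vec(Y) = kron(B, D) vec(S).
        # The Kronecker design couples the columns of S ("mixed" coding).
        M = np.kron(B, D)             # shape (m*n, d*rank)
        lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
        lasso.fit(M, Y.flatten(order="F"))
        S = lasso.coef_.reshape((d, rank), order="F")
        # B step: ordinary least squares with S fixed.
        A = D @ S                     # factor in the data space, shape (m, rank)
        B = np.linalg.lstsq(A, Y, rcond=None)[0].T
    return S, B

A hard sparsity constraint (at most k nonzeros per column of S) would instead call for the greedy heuristics the abstract mentions; the LASSO variant is sketched here because it is the solver the abstract singles out as efficient.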
Pages: 19