Optimal denoising of rotationally invariant rectangular matrices

Cited by: 0
Authors
Troiani, Emanuele [1 ]
Erba, Vittorio [1 ]
Krzakala, Florent [2 ]
Maillard, Antoine [3 ,4 ]
Zdeborova, Lenka [1 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Stat Phys Computat Lab, Lausanne, Switzerland
[2] Ecole Polytech Fed Lausanne, Informat Learning & Phys Lab, Lausanne, Switzerland
[3] Swiss Fed Inst Technol, Dept Math, Zurich, Switzerland
[4] Swiss Fed Inst Technol, Inst Math Res FIM, Zurich, Switzerland
Source
MATHEMATICAL AND SCIENTIFIC MACHINE LEARNING, Vol. 190 | 2022
Funding
EU Horizon 2020;
Keywords
Matrix denoising; Bayes-optimality; Rotationally invariant estimator; High-dimensional statistics; Random matrix theory; Harish-Chandra-Itzykson-Zuber integral; Matrix factorization; LARGE DEVIATIONS;
DOI
Not available
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline classification code
081202 ;
Abstract
In this manuscript we consider the denoising of large rectangular matrices: given a noisy observation of a signal matrix, what is the best way to recover the signal matrix itself? For Gaussian noise and rotationally invariant signal priors, we completely characterize the optimal denoising estimator and its performance in the high-dimensional limit, in which the size of the signal matrix goes to infinity with fixed aspect ratio, in the Bayes-optimal setting, that is, when the statistician knows how the signal and the observations were generated. Our results generalise previous works, which considered only symmetric matrices, to the more general case of non-symmetric and rectangular ones. We explore analytically and numerically a particular choice of factorized signal prior that models cross-covariance matrices and the matrix factorization problem. As a byproduct of our analysis, we provide an explicit asymptotic evaluation of the rectangular Harish-Chandra-Itzykson-Zuber integral in a special case.
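The setting described in the abstract can be illustrated with a toy sketch: a rotationally invariant estimator keeps the singular vectors of the noisy observation and acts only on its singular values. The shrinkage function below (soft-thresholding at the median singular value) is a hypothetical placeholder for illustration, not the Bayes-optimal function derived in the paper; the dimensions, noise level, and rank are likewise arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 60, 40, 3  # rectangular n x m signal of rank r (illustrative sizes)

# Low-rank signal plus Gaussian noise: Y = S + Z
S = rng.standard_normal((n, r)) @ rng.standard_normal((r, m)) / np.sqrt(r)
Y = S + 0.5 * rng.standard_normal((n, m))

# Rotationally invariant estimator: reuse the singular vectors of Y,
# modify only the singular values.  Soft-thresholding at the median
# stands in for the optimal shrinkage function of the paper.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
s_denoised = np.maximum(s - np.median(s), 0.0)
S_hat = U @ np.diag(s_denoised) @ Vt

# Relative errors before and after denoising
err_obs = np.linalg.norm(Y - S) / np.linalg.norm(S)
err_hat = np.linalg.norm(S_hat - S) / np.linalg.norm(S)
print(err_hat < err_obs)
```

Because the estimator is constrained to share the observation's singular vectors, the whole design question reduces to choosing the scalar shrinkage applied to each singular value; the paper characterizes the optimal such choice in the high-dimensional limit.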
Pages: 23