Quantized Low-Rank Multivariate Regression With Random Dithering

Cited by: 3
Authors
Chen, Junren [1 ]
Wang, Yueqi [1 ]
Ng, Michael K. [2 ]
Affiliations
[1] Univ Hong Kong, Dept Math, Hong Kong, Peoples R China
[2] Hong Kong Baptist Univ, Dept Math, Hong Kong, Peoples R China
Keywords
Multiresponse regression; quantization; M-estimator; low-rankness; dithering; GENERALIZED LASSO; M-ESTIMATORS; SHRINKAGE; SELECTION; MATRICES
DOI
10.1109/TSP.2023.3322813
CLC Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline Codes
0808; 0809
Abstract
Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting in which the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To make possible a consistent estimator that achieves arbitrarily small error, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and covariates, respectively. Based on the quantized data, we propose the constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error rate. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate and demonstrate our theoretical results via simulations on synthetic data, image restoration, and a real data application.
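The dithering scheme named in the abstract (uniform dither for responses, triangular dither for covariates, each applied before a uniform quantizer) can be illustrated with a minimal sketch. This is not the paper's code; the quantizer is a generic round-to-grid uniform quantizer with resolution `delta`, and all names are illustrative. The key effect of uniform dither is that the quantization error becomes unbiased, which is what enables consistent estimation.

```python
import numpy as np

def dithered_uniform_quantize(x, delta, dither):
    """Quantize x + dither to the uniform grid delta * Z (round-to-nearest)."""
    return delta * np.round((x + dither) / delta)

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)   # signal to be quantized
delta = 0.5                     # quantization resolution

# Uniform dither U(-delta/2, delta/2): applied to responses in the paper's scheme.
tau_u = rng.uniform(-delta / 2, delta / 2, size=x.shape)
q_u = dithered_uniform_quantize(x, delta, tau_u)

# Triangular dither (sum of two independent uniform dithers): applied to covariates.
tau_t = (rng.uniform(-delta / 2, delta / 2, size=x.shape)
         + rng.uniform(-delta / 2, delta / 2, size=x.shape))
q_t = dithered_uniform_quantize(x, delta, tau_t)

# With uniform dither, the per-sample error q_u - x is bounded by delta
# and has zero mean, so the quantized data remain unbiased for x.
```

Note that `|q_u - x| <= delta` always holds (rounding error at most `delta/2` plus dither magnitude at most `delta/2`), while the empirical mean of `q_u - x` concentrates near zero; with triangular dither the worst-case error grows to `1.5 * delta`, but second-moment properties of the error improve, which is why the paper reserves it for the covariates.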
Pages: 3913-3928
Page count: 16
Related Papers
50 records
  • [31] Low-Rank Matrix Recovery From Noisy, Quantized, and Erroneous Measurements
    Gao, Pengzhi
    Wang, Ren
    Wang, Meng
    Chow, Joe H.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2018, 66 (11) : 2918 - 2932
  • [32] Detecting low-rank clusters via random sampling
    Rangan, Aaditya V.
    JOURNAL OF COMPUTATIONAL PHYSICS, 2012, 231 (01) : 215 - 222
  • [34] Accelerated Low-Rank Visual Recovery by Random Projection
    Mu, Yadong
    Dong, Jian
    Yuan, Xiaotong
    Yan, Shuicheng
    2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011
  • [35] Imputation and low-rank estimation with Missing Not At Random data
    Sportisse, Aude
    Boyer, Claire
    Josse, Julie
    STATISTICS AND COMPUTING, 2020, 30 (06) : 1629 - 1643
  • [36] Low-rank approximation of improper complex random vectors
    Schreier, PJ
    Scharf, LL
    CONFERENCE RECORD OF THE THIRTY-FIFTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS AND COMPUTERS, VOLS 1 AND 2, 2001, : 597 - 601
  • [37] On multivariate rank regression
    Chakraborty, B
    Chaudhuri, P
    L(1)-STATISTICAL PROCEDURES AND RELATED TOPICS, 1997, 31 : 399 - 414
  • [38] Testing heteroskedasticity in trace regression with low-rank matrix parameter
    Tan, Xiangyong
    Lu, Xuanliang
    Hu, Tianying
    Li, Hongmei
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2025,
  • [39] Sample Efficient Nonparametric Regression via Low-Rank Regularization
    Jiang, Jiakun
    Peng, Jiahao
    Lian, Heng
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2024,
  • [40] A framework of regularized low-rank matrix models for regression and classification
    Huang, Hsin-Hsiung
    Yu, Feng
    Fan, Xing
    Zhang, Teng
    STATISTICS AND COMPUTING, 2024, 34