Quantized Low-Rank Multivariate Regression With Random Dithering

Cited: 3
Authors
Chen, Junren [1 ]
Wang, Yueqi [1 ]
Ng, Michael K. [2 ]
Affiliations
[1] Univ Hong Kong, Dept Math, Hong Kong, Peoples R China
[2] Hong Kong Baptist Univ, Dept Math, Hong Kong, Peoples R China
Keywords
Multiresponse regression; quantization; M-estimator; low-rankness; dithering; GENERALIZED LASSO; M-ESTIMATORS; SHRINKAGE; SELECTION; MATRICES;
DOI
10.1109/TSP.2023.3322813
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting where the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To enable a consistent estimator whose error can be made arbitrarily small, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and the covariates, respectively. Based on the quantized data, we propose constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error rate. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate and demonstrate our theoretical results via simulations on synthetic data, image restoration, as well as a real data application.
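The dithered uniform quantization described in the abstract can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function names, the resolution `delta`, and the Monte-Carlo check are assumptions for demonstration. Uniform dither (used for responses) draws noise from U[-Δ/2, Δ/2]; triangular dither (used for covariates) adds two such independent draws. In both cases the dithered quantizer is unbiased in expectation, which is what makes arbitrarily small estimation error possible.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_quantize(x, delta):
    """Uniform quantizer with resolution delta: maps x to the
    midpoint of its cell, delta * (floor(x / delta) + 1/2)."""
    return delta * (np.floor(x / delta) + 0.5)

def dithered_quantize(x, delta, dither="uniform", rng=rng):
    """Quantize x + tau, where tau is random dither.

    'uniform'    : tau ~ U[-delta/2, delta/2]   (paper: responses)
    'triangular' : sum of two independent
                   U[-delta/2, delta/2] draws   (paper: covariates)
    """
    if dither == "uniform":
        tau = rng.uniform(-delta / 2, delta / 2, size=np.shape(x))
    elif dither == "triangular":
        tau = (rng.uniform(-delta / 2, delta / 2, size=np.shape(x))
               + rng.uniform(-delta / 2, delta / 2, size=np.shape(x)))
    else:
        raise ValueError(f"unknown dither: {dither}")
    return uniform_quantize(x + tau, delta)

# Unbiasedness check (illustrative): averaging many independent
# dithered quantizations of the same value recovers it to within
# Monte-Carlo error, even with a coarse resolution.
x, delta = 0.3137, 0.5
est = dithered_quantize(np.full(200_000, x), delta).mean()
```

Here `est` lands close to `x` despite each quantized sample living on a grid of spacing 0.5; without dithering, `uniform_quantize(x, delta)` would be stuck at a fixed grid point and no amount of averaging could remove that bias.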
Pages: 3913-3928
Page count: 16
Related Papers
50 in total
  • [1] Multivariate response regression with low-rank and generalized sparsity
    Cho, Youngjin
    Park, Seyoung
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2022, 51 (03) : 847 - 867
  • [3] MULTIVARIATE LINEAR REGRESSION WITH LOW-RANK AND ROW-SPARSITY
    Sun, Jun
    Shang, Pan
    Xu, Qiuyun
    Chen, Bingzhen
    PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18 (02): : 349 - 366
  • [4] Low rank multivariate regression
    Giraud, Christophe
    ELECTRONIC JOURNAL OF STATISTICS, 2011, 5 : 1 - 25
  • [5] Low-rank elastic-net regularized multivariate Huber regression model
    Chen, Bingzhen
    Zhai, Wenjuan
    Huang, Zhiyong
    APPLIED MATHEMATICAL MODELLING, 2020, 87 : 571 - 583
  • [6] Multivariate functional response low-rank regression with an application to brain imaging data
    Ding, Xiucai
    Yu, Dengdeng
    Zhang, Zhengwu
    Kong, Dehan
    CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 2021, 49 (01): : 150 - 181
  • [7] Low-Rank Regression with Tensor Responses
    Rabusseau, Guillaume
    Kadri, Hachem
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [8] LOW-RANK TENSOR HUBER REGRESSION
    Wei, Yangxin
    Luot, Ziyan
    Chen, Yang
    PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18 (02): : 439 - 458
  • [9] Quantized Corrupted Sensing with Random Dithering
    Sun, Zhongxing
    Cui, Wei
    Liu, Yulong
    2020 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2020, : 1397 - 1402
  • [10] Quantized Corrupted Sensing With Random Dithering
    Sun, Zhongxing
    Cui, Wei
    Liu, Yulong
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 600 - 615