Quantized Low-Rank Multivariate Regression With Random Dithering

Cited by: 3
Authors
Chen, Junren [1]
Wang, Yueqi [1]
Ng, Michael K. [2]
Affiliations
[1] Univ Hong Kong, Dept Math, Hong Kong, Peoples R China
[2] Hong Kong Baptist Univ, Dept Math, Hong Kong, Peoples R China
Keywords
Multiresponse regression; quantization; M-estimator; low-rankness; dithering; GENERALIZED LASSO; M-ESTIMATORS; SHRINKAGE; SELECTION; MATRICES
DOI
10.1109/TSP.2023.3322813
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Subject classification codes
0808; 0809
Abstract
Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting where the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To enable a consistent estimator that can achieve arbitrarily small error, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and covariates, respectively. Based on the quantized data, we propose constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error bound. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate our theoretical results via simulations on synthetic data, an image restoration task, and a real-data application.
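Since the abstract fully specifies the quantization scheme, a short sketch can make it concrete. The following Python snippet (a minimal illustration, not the authors' code) quantizes Gaussian data with a uniform quantizer, applying uniform dither to the responses and triangular dither to the covariates, and then computes a nuclear-norm regularized estimate via proximal gradient with singular value thresholding, one standard solver for such "regularized Lasso" problems. All names and tuning choices (delta_x, delta_y, lam, the iteration count) are illustrative assumptions, as is the Δ²/4 diagonal correction of the sample covariance.

```python
# A minimal sketch (NOT the authors' implementation) of the pipeline the
# abstract describes: dithered uniform quantization of covariates/responses,
# then a nuclear-norm regularized ("regularized Lasso") estimate of the
# low-rank coefficient matrix, computed by proximal gradient with singular
# value thresholding. Parameter choices are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def uniform_quantize(x, delta):
    """Uniform quantizer with resolution delta: maps x to the midpoint
    delta * (k + 1/2) of the cell containing it."""
    return delta * (np.floor(x / delta) + 0.5)

def uniform_dither(shape, delta, rng):
    """Uniform dither on [-delta/2, delta/2] (used for the responses)."""
    return rng.uniform(-delta / 2, delta / 2, size=shape)

def triangular_dither(shape, delta, rng):
    """Triangular dither on [-delta, delta]: sum of two independent uniform
    dithers (used for the covariates)."""
    return uniform_dither(shape, delta, rng) + uniform_dither(shape, delta, rng)

def svt(M, tau):
    """Singular value thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def quantized_lrmr(X_q, Y_q, delta_x, lam, n_iter=500):
    """Proximal gradient on the surrogate loss
        0.5 * <Theta, Sigma_hat @ Theta> - <Theta, b_hat> + lam * ||Theta||_*.
    Triangular dither makes the per-entry covariate quantization error
    zero-mean with variance delta_x**2 / 4, hence the diagonal correction
    below (our reading of the dithering literature, not quoted from the paper)."""
    n, p = X_q.shape
    Sigma_hat = X_q.T @ X_q / n - (delta_x ** 2 / 4) * np.eye(p)
    b_hat = X_q.T @ Y_q / n
    step = 1.0 / np.linalg.norm(Sigma_hat, 2)  # 1 / Lipschitz constant
    Theta = np.zeros((p, Y_q.shape[1]))
    for _ in range(n_iter):
        Theta = svt(Theta - step * (Sigma_hat @ Theta - b_hat), step * lam)
    return Theta

# Synthetic rank-r ground truth and Gaussian design.
n, p, m, r = 500, 30, 10, 3
Theta0 = rng.standard_normal((p, r)) @ rng.standard_normal((r, m)) / np.sqrt(r)
X = rng.standard_normal((n, p))
Y = X @ Theta0 + 0.1 * rng.standard_normal((n, m))

# Dither, then quantize: triangular dither for covariates, uniform for responses.
delta_x = delta_y = 0.5
X_q = uniform_quantize(X + triangular_dither(X.shape, delta_x, rng), delta_x)
Y_q = uniform_quantize(Y + uniform_dither(Y.shape, delta_y, rng), delta_y)

Theta_hat = quantized_lrmr(X_q, Y_q, delta_x, lam=0.1)
print("relative error:", np.linalg.norm(Theta_hat - Theta0) / np.linalg.norm(Theta0))
```

The division of labor between the two dithers follows classical dithered-quantization theory: uniform dither of one cell width makes the quantized response an unbiased surrogate for the true response, while triangular dither additionally renders the second moment of the covariate error signal-independent (Δ²/4 per entry), which is exactly what the covariance correction in the sketch relies on.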
Pages: 3913-3928
Page count: 16