The Rate-Distortion Risk in Estimation From Compressed Data

Cited by: 7
Authors
Kipnis, Alon [1 ]
Rini, Stefano [2 ]
Goldsmith, Andrea J. [3 ]
Affiliations
[1] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
[2] Natl Yang Ming Chiao Tung Univ, Dept Elect Engn, Hsinchu 30010, Taiwan
[3] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Keywords
Bit rate; Distortion measurement; Distortion; Maximum likelihood estimation; Loss measurement; Task analysis; Rate-distortion; Source coding; Transport theory; Estimation; Compression; Empirical distribution; Transportation cost; Inequalities
DOI
10.1109/TIT.2021.3068981
Chinese Library Classification (CLC) code
TP [automation technology; computer technology]
Discipline classification code
0812
Abstract
Consider the problem of estimating a latent signal from a lossy compressed version of the data when the compressor is agnostic to the relation between the signal and the data. This situation arises in a host of modern applications when data is transmitted or stored prior to determining the downstream inference task. Given a bitrate constraint and a distortion measure between the data and its compressed version, let us consider the joint distribution achieving Shannon's rate-distortion (RD) function. Given an estimator and a loss function associated with the downstream inference task, define the RD risk as the expected loss under the RD-achieving distribution. We provide general conditions under which the operational risk in estimating from the compressed data is asymptotically equivalent to the RD risk. The main theoretical tools to prove this equivalence are transportation-cost inequalities in conjunction with properties of compression codes achieving Shannon's RD function. Whenever such equivalence holds, a recipe for designing estimators from datasets undergoing lossy compression without specifying the actual compression technique emerges: design the estimator to minimize the RD risk. Our conditions are simplified in the special cases of discrete memoryless or multivariate normal data. For these scenarios, we derive explicit expressions for the RD risk of several estimators and compare them to the optimal source coding performance associated with full knowledge of the relation between the latent signal and the data.
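As a rough illustration of the quantities described in the abstract, the following is a minimal sketch of the definitions; the notation is assumed here rather than taken from the paper: a latent signal $\theta$ (treated as random for concreteness), data $X^n$ drawn from $P_{X^n \mid \theta}$, a distortion measure $d$ between the data and its compressed version $Y^n$, a loss $\ell$ for the inference task, and an estimator $\hat{\theta}$ acting on $Y^n$. Shannon's rate-distortion function in its usual block form is

$$ R(D) \;=\; \lim_{n\to\infty} \; \min_{P_{Y^n \mid X^n}\,:\; \mathbb{E}\, d(X^n, Y^n) \le D} \; \frac{1}{n}\, I(X^n; Y^n), $$

and, writing $P^{\star}_{Y^n \mid X^n}$ for a conditional distribution attaining this minimum, the RD risk of $\hat{\theta}$ is the expected loss when $Y^n$ is generated by this RD-achieving channel rather than by an actual compression code:

$$ r_{\mathrm{RD}}(\hat{\theta}) \;=\; \mathbb{E}\big[ \ell\big(\theta, \hat{\theta}(Y^n)\big) \big], \qquad (\theta, X^n, Y^n) \sim P_{\theta}\, P_{X^n \mid \theta}\, P^{\star}_{Y^n \mid X^n}. $$

The design recipe stated in the abstract then amounts to choosing $\hat{\theta}$ to minimize $r_{\mathrm{RD}}$; the paper's results give conditions under which this surrogate risk is asymptotically equivalent to the operational risk incurred when estimating from data compressed at the corresponding bitrate.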
Pages: 2910-2924
Number of pages: 15