Quantization of Random Distributions under KL Divergence

Cited by: 4
Authors
Adler, Aviv [1]
Tang, Jennifer [1]
Polyanskiy, Yury [1]
Affiliation
[1] MIT, EECS, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Source
2021 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT) | 2021
Keywords
Compression; rate distortion; quantization; Kullback-Leibler divergence; Shannon Lower Bound; INFORMATION; COMPUTATION; ENTROPY;
DOI
10.1109/ISIT45174.2021.9518081
Chinese Library Classification
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Consider the problem of representing a distribution pi on a large alphabet of size k up to fidelity epsilon in Kullback-Leibler (KL) divergence. Heuristically, arguing as for quadratic loss in high dimension, one expects that about (k/2) log(1/epsilon) bits would be required. We show this intuition is correct by proving explicit non-asymptotic bounds for the minimal average distortion when pi is randomly sampled from a symmetric Dirichlet prior on the simplex. Our method is to reduce the single-sample problem to the traditional setting of iid samples, but for a non-standard rate-distortion question with the novel distortion measure d(x, y) = x log(x/y), which we call divergence distortion. Practically, our results advocate using an x ↦ x^(2/3) compander (for small x) followed by a uniform scalar quantizer for storing large-alphabet distributions.
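As an illustrative companion to the abstract's practical recommendation, the sketch below quantizes a Dirichlet-sampled distribution with an x ↦ x^(2/3) compander followed by a uniform scalar quantizer and then measures the resulting KL divergence. It is a minimal sketch, not the paper's construction: the resolution parameter num_levels, the mid-cell reconstruction rule, and the final renormalization step are assumptions made for illustration.

```python
import numpy as np

def compander_quantize(p, num_levels=256):
    """Quantize a probability vector with an x -> x^(2/3) compander
    followed by a uniform scalar quantizer (illustrative sketch)."""
    y = p ** (2.0 / 3.0)                        # compand each probability mass
    # uniform quantizer on [0, 1]; mid-cell reconstruction keeps values positive
    idx = np.minimum(np.floor(y * num_levels), num_levels - 1)
    y_hat = (idx + 0.5) / num_levels
    q = y_hat ** 1.5                            # invert the compander
    return q / q.sum()                          # project back onto the simplex

def kl_divergence(p, q):
    """D(p || q) in nats; terms with p_i = 0 contribute zero."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k = 10_000
    p = rng.dirichlet(np.ones(k))               # symmetric Dirichlet prior on the simplex
    q = compander_quantize(p, num_levels=256)
    print(f"KL(p || q) = {kl_divergence(p, q):.6f} nats")
```

Increasing num_levels trades storage (roughly log2(num_levels) bits per symbol) against KL distortion, in the spirit of the (k/2) log(1/epsilon) heuristic stated in the abstract.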
Pages: 2762-2767
Number of pages: 6