Central limit theorem;
Entropy;
Fisher information;
Relative entropy;
Bernoulli part decomposition;
Lattice distribution;
Convolution inequality;
Monotonicity;
Information;
Inequality;
DOI: 10.1016/j.spa.2023.104294
Chinese Library Classification (CLC):
O21 [Probability Theory and Mathematical Statistics];
C8 [Statistics];
Discipline Classification Code:
020208 ;
070103 ;
0714 ;
Abstract:
A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of n independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as n -> infinity.
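The convergence stated in the abstract can be illustrated numerically in the simplest lattice case, Bernoulli summands: the standardised Binomial(n, p) sum lives on a lattice of spacing 1/sigma, and the comparison measure is a Gaussian discretised onto that same lattice (mass at each lattice point proportional to the standard normal density there). The sketch below is not taken from the paper; the function name and the choice p = 1/2 are illustrative assumptions.

```python
import math

def kl_binomial_vs_discretised_gaussian(n, p=0.5):
    """KL divergence D(P_n || Q_n) between the standardised Binomial(n, p)
    and a Gaussian discretised onto the same lattice (illustrative sketch)."""
    sigma = math.sqrt(n * p * (1 - p))
    # Standardised lattice points carrying the binomial mass
    xs = [(k - n * p) / sigma for k in range(n + 1)]
    ps = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    # Discretised Gaussian: weight at each lattice point proportional to the
    # standard normal density, normalised over a wide stretch of the lattice
    lattice = [(j - n * p) / sigma for j in range(-3 * n, 4 * n + 1)]
    z = sum(math.exp(-x * x / 2) for x in lattice)
    qs = [math.exp(-x * x / 2) / z for x in xs]
    # Relative entropy D(P || Q); P puts no mass outside k = 0..n
    return sum(pk * math.log(pk / qk) for pk, qk in zip(ps, qs))
```

Evaluating the divergence for increasing n (e.g. n = 16 versus n = 64) shows it shrinking towards zero, consistent with the theorem's conclusion for this particular lattice distribution.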