Rényi Divergence and Kullback-Leibler Divergence

Cited by: 906
Authors
van Erven, Tim [1 ]
Harremoës, Peter [2]
Affiliations
[1] Univ Paris 11, Dept Math, F-91405 Orsay, France
[2] Copenhagen Business Coll, DK-1358 Copenhagen, Denmark
Keywords
alpha-divergence; Bhattacharyya distance; information divergence; Kullback-Leibler divergence; Pythagorean inequality; Rényi divergence
DOI
10.1109/TIT.2014.2320500
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of sigma-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
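To make the role of the order concrete, the following is a minimal Python sketch (not part of the paper; the function name and example distributions are illustrative) of the standard definition for discrete distributions, D_alpha(P||Q) = (1/(alpha-1)) log sum_i p_i^alpha q_i^(1-alpha) in nats, with the Kullback-Leibler divergence handled as the alpha = 1 limiting case:

import numpy as np

def renyi_divergence(p, q, alpha):
    # Renyi divergence D_alpha(P||Q) of two discrete distributions, in nats.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if alpha == 1.0:
        # Limiting case alpha -> 1: the Kullback-Leibler divergence.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

p, q = [0.4, 0.6], [0.5, 0.5]
for a in (0.5, 0.999, 1.0, 2.0):
    print(f"alpha={a}: {renyi_divergence(p, q, a):.6f}")
# The value at alpha = 0.999 is close to the value at alpha = 1,
# illustrating continuity in the order near the Kullback-Leibler case.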
Pages: 3797-3820
Page count: 24