Some upper bounds for relative entropy and applications

Cited by: 14
Authors
Dragomir, SS
Scholz, ML
Sunde, J
Affiliations
[1] Victoria Univ Technol, Dept Comp & Math Sci, Melbourne, Vic 8001, Australia
[2] DSTO, Commun Div, Salisbury, SA 5108, Australia
Keywords
relative entropy; mutual information; log-mapping; Kantorovic inequality; Diaz-Metcalf inequality;
DOI
10.1016/S0898-1221(00)00089-4
Chinese Library Classification
O29 [Applied Mathematics];
Discipline classification code
070104;
Abstract
In this paper, we derive some upper bounds for the relative entropy D(p‖q) of two probability distributions and apply them to mutual information and entropy mappings. To achieve this, we use an inequality for the logarithm function, (2.3) below, and some classical inequalities such as the Kantorovic inequality and the Diaz-Metcalf inequality. (C) 2000 Elsevier Science Ltd. All rights reserved.
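The abstract's bounds rest on elementary inequalities for the logarithm. As a minimal illustrative sketch (not the paper's specific inequality (2.3)), applying ln x ≤ x − 1 to D(p‖q) = Σ pᵢ ln(pᵢ/qᵢ) yields the classical χ²-type upper bound D(p‖q) ≤ Σ pᵢ²/qᵢ − 1, which can be checked numerically:

```python
import math

def relative_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p||q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi_square_upper_bound(p, q):
    """Classical bound D(p||q) <= sum(p_i^2 / q_i) - 1, from ln x <= x - 1."""
    return sum(pi * pi / qi for pi, qi in zip(p, q)) - 1.0

# Two example distributions on the same 3-point support.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = relative_entropy(p, q)
bound = chi_square_upper_bound(p, q)
assert 0.0 <= d <= bound  # D(p||q) is nonnegative and below the bound
```

The distributions `p` and `q` here are arbitrary illustrative values; the bound holds for any pair with matching support.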
Pages: 91-100
Page count: 10
References
9 entries
[1] Cover, T.M. Elements of Information Theory. 2005. DOI 10.1002/047174882X
[2] Dragomir, S.S.; Goh, C.J. Some bounds on entropy measures in information theory. Applied Mathematics Letters, 1997, 10(3): 23-28.
[3] Dragomir, S.S.; Goh, C.J. A counterpart of Jensen's discrete inequality for differentiable convex mappings and applications in information theory. Mathematical and Computer Modelling, 1996, 24(2): 1-11.
[4] Dragomir, S.S. In press, Indian J. Ma
[5] Matić, M. Mathematical Inequalities & Applications, 1998, 1: 295.
[6] Matić, M. In press, J. Austr. M. B
[7] Matić, M. Thesis, University of Zagreb, Croatia.
[8] McEliece, R.J. The Theory of Information and Coding. 1977.
[9] Mitrinović, D.S. Analytic Inequalities. 1970.