On Normalized Mutual Information: Measure Derivations and Properties

Cited by: 93
Authors
Kvalseth, Tarald O. [1,2]
Affiliations
[1] Univ Minnesota, Dept Mech Engn, 111 Church St SE, Minneapolis, MN 55455 USA
[2] Univ Minnesota, Dept Ind & Syst Engn, Minneapolis, MN 55455 USA
Keywords
mutual information; normalized mutual information; association measures; similarity measures; value validity; QUANTITATIVE-QUALITATIVE MEASURE; ENTROPY;
DOI
10.3390/e19110631
CLC classification
O4 [Physics];
Subject classification code
0702;
Abstract
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are then obtained from those bounds, emphasizing the use of least upper bounds. Conditional NMI measures are also derived for three different events and three different random variables. Since the MI formulation for a pair of events is always nonnegative, it can properly be extended to include weighted MI and NMI measures for pairs of events or for random variables that are analogous to the well-known weighted entropy. This weighted MI is generalized to the case of continuous random variables. Such weighted measures have the advantage over previously proposed measures of always being nonnegative. A simple transformation is derived for the NMI, such that the transformed measures have the value-validity property necessary for making various appropriate comparisons between values of those measures. A numerical example is provided.
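As a rough illustration of the kind of normalization the abstract describes (not the paper's specific derivations), the sketch below computes the mutual information of a discrete joint distribution and divides it by min(H(X), H(Y)), one commonly used least upper bound on I(X;Y); the function names and the example distribution are illustrative assumptions, and the paper itself derives and compares several alternative bounds and weighted variants.

import numpy as np

def entropy(p):
    # Shannon entropy (in nats) of a probability vector, ignoring zero cells
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def normalized_mutual_information(pxy):
    # NMI of a joint distribution pxy (2-D array of cell probabilities),
    # normalized by min(H(X), H(Y)) -- a tight upper bound on I(X;Y);
    # this choice of bound is an assumption for illustration only
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    hx, hy = entropy(px), entropy(py)
    mi = hx + hy - entropy(pxy.ravel())  # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return mi / min(hx, hy)

# Example: a weakly dependent 2x2 joint distribution
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(normalized_mutual_information(pxy))  # about 0.278, bounded in [0, 1]

Because I(X;Y) never exceeds min(H(X), H(Y)), the resulting index stays in [0, 1], which is the basic property that motivates normalizing MI by a least upper bound.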
Pages: 14