Jensen-information generating function and its connections to some well-known information measures

Cited: 22
Authors
Kharazmi, Omid [1 ]
Balakrishnan, Narayanaswamy [2 ]
Affiliations
[1] Vali e Asr Univ Rafsanjan, Dept Stat, Fac Math Sci, Rafsanjan, Iran
[2] McMaster Univ, Dept Math & Stat, Hamilton, ON, Canada
Keywords
Information generating function; Shannon entropy; Jensen-Shannon entropy; Jensen-extropy; Kullback-Leibler divergence; FISHER INFORMATION; SHANNON; DIVERGENCE; GINI;
D O I
10.1016/j.spl.2020.108995
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification
020208 ; 070103 ; 0714 ;
Abstract
In this work, we consider the information generating function measure and develop some new results associated with it. We specifically propose two new divergence measures and show that some well-known information divergences, such as the Jensen-Shannon, Jensen-extropy and Jensen-Taneja divergence measures, are all special cases of them. Finally, we also discuss the information generating function for residual lifetime variables. (C) 2020 Elsevier B.V. All rights reserved.
Pages: 10