Properties of classical and quantum Jensen-Shannon divergence

Cited by: 143
Authors:
Briet, Jop [1]
Harremoës, Peter [1]
Affiliation:
[1] Ctr Wiskunde & Informat, NL-1098 XG Amsterdam, Netherlands
Source:
PHYSICAL REVIEW A | 2009, Vol. 79, No. 05
Keywords:
entropy; Hilbert spaces; probability; quantum theory; statistical distance; information
DOI:
10.1103/PhysRevA.79.052311
Chinese Library Classification:
O43 [Optics]
Subject classification codes:
070207; 0803
Abstract:
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of Kullback divergence, the most important divergence measure of information theory. Unlike Kullback divergence, JD directly determines a metric: it is the square of a metric. We consider a family of divergence measures (JD_α for α > 0), the Jensen divergences of order α, which generalize JD as JD_1 = JD. Using a result of Schoenberg, we prove that JD_α is the square of a metric for α ∈ (0,2], and that the resulting metric space of probability distributions can be isometrically embedded in a real Hilbert space. Quantum Jensen-Shannon divergence (QJD) is a symmetrized and smoothed version of quantum relative entropy and can be extended to a family of quantum Jensen divergences of order α (QJD_α). We strengthen results by Lamberti and co-workers by proving that, for qubits and pure states, (QJD_α)^(1/2) is a metric for α ∈ (0,2], and that the resulting metric space can be isometrically embedded in a real Hilbert space. In analogy with Burbea and Rao's generalization of JD, we also define a general QJD by associating a Jensen-type quantity to any weighted family of states. Appropriate interpretations of the quantities introduced are discussed, and bounds are derived in terms of the total variation and trace distance.
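The metric property claimed in the abstract can be spot-checked numerically. The following is a minimal sketch (not from the paper; the helper names `entropy` and `jsd` are my own) that computes the classical Jensen-Shannon divergence of two discrete distributions and verifies the triangle inequality for its square root on one example:

```python
import math

def entropy(p):
    # Shannon entropy in bits; terms with p_i = 0 contribute 0
    return -sum(x * math.log2(x) for x in p if x > 0)

def jsd(p, q):
    # Jensen-Shannon divergence:
    # entropy of the midpoint minus the mean of the entropies
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

# sqrt(JSD) is a metric; check the triangle inequality on one triple
p, q, r = [0.9, 0.1], [0.5, 0.5], [0.1, 0.9]
d = lambda a, b: math.sqrt(jsd(a, b))
assert d(p, r) <= d(p, q) + d(q, r)
```

With base-2 logarithms, JD is bounded by 1 and attains that bound for distributions with disjoint support, e.g. `jsd([1, 0], [0, 1]) == 1.0`.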
Pages: 11
Related papers (50 total):
  • [41] Image Object Background Classification with Jensen-Shannon Divergence
    Nie, Fangyan
    Li, Jianqi
    IAENG International Journal of Applied Mathematics, 2022, 52 (04)
  • [42] An analysis of edge detection by using the Jensen-Shannon divergence
    Gómez-Lopera, JF
    Martínez-Aroza, J
    Robles-Pérez, AM
    Román-Roldán, R
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2000, 13 (01) : 35 - 56
  • [43] Permutation Jensen-Shannon divergence for Random Permutation Set
    Chen, Luyuan
    Deng, Yong
    Cheong, Kang Hao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 119
  • [44] Extrinsic Jensen-Shannon Divergence with Application in Active Hypothesis Testing
    Naghshvar, Mohammad
    Javidi, Tara
    2012 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2012,
  • [45] Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels
    Englesson, Erik
    Azizpour, Hossein
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [46] Anomaly detection in network traffic using Jensen-Shannon divergence
    LIPADE Laboratory, University Paris Descartes, France (authors not listed)
    IEEE International Conference on Communications, 2012: 5200 - 5204
  • [47] Active learning for probability estimation using Jensen-Shannon divergence
    Melville, P
    Yang, SM
    Saar-Tsechansky, M
    Mooney, R
    MACHINE LEARNING: ECML 2005, PROCEEDINGS, 2005, 3720 : 268 - 279
  • [48] Detection of Neural Activities in FMRI Using Jensen-Shannon Divergence
    Basak, Jayanta
    ICAPR 2009: SEVENTH INTERNATIONAL CONFERENCE ON ADVANCES IN PATTERN RECOGNITION, PROCEEDINGS, 2009, : 39 - 42
  • [49] Feature Selection Stability Assessment Based on the Jensen-Shannon Divergence
    Guzman-Martinez, Roberto
    Alaiz-Rodriguez, Rocio
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT I, 2011, 6911 : 597 - 612
  • [50] Feature Selection for Clustering with Constraints Using Jensen-Shannon Divergence
    Li, Yuanhong
    Dong, Ming
    Ma, Yunqian
    19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 2424 - 2427