Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Cited: 0
Authors
Noh, Yung-Kyun [1 ]
Sugiyama, Masashi [2 ]
Liu, Song [2 ]
du Plessis, Marthinus C. [2 ]
Park, Frank Chongwoo [3 ]
Lee, Daniel D. [4 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Daejeon, South Korea
[2] Tokyo Inst Technol, Tokyo, Japan
[3] Seoul Natl Univ, Seoul, South Korea
[4] Univ Penn, Philadelphia, PA 19104 USA
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS, 2014, Vol. 33
Keywords
FEATURE-SELECTION; GENE-EXPRESSION; INFORMATION; RELEVANCE
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Asymptotically unbiased nearest-neighbor estimators for KL divergence have recently been proposed and demonstrated in a number of applications. With small sample sizes, however, these nonparametric methods typically suffer from high estimation bias due to the non-local statistics of empirical nearest-neighbor information. In this paper, we show that this non-local bias can be mitigated by changing the distance metric, and we propose a method for learning an optimal Mahalanobis-type metric based on global information provided by approximate parametric models of the underlying densities. In both simulations and experiments, we demonstrate that this interplay between parametric models and nonparametric estimation methods significantly improves the accuracy of the nearest-neighbor KL divergence estimator.
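For concreteness, below is a minimal Python sketch of the kind of k-nearest-neighbor KL estimator the abstract refers to, in the style of the standard estimator D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)) due to Wang, Kulkarni, and Verdu (2009), extended with an optional Mahalanobis metric A (distances sqrt((u - v)^T A (u - v))) in the spirit of the paper's proposal. This is an illustration under stated assumptions, not the authors' implementation: the function name, the metric_matrix argument, and the use of SciPy's cKDTree are all choices made here, and the paper's actual contribution (learning A from approximate parametric density models) is not reproduced; A is simply taken as given.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1, metric_matrix=None):
    # x: (n, d) samples from p; y: (m, d) samples from q.
    # Returns the k-NN estimate of KL(p || q):
    #   (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    # where rho_k(i) is the distance from x_i to its k-th nearest neighbor
    # within x (excluding itself), and nu_k(i) is the distance from x_i to
    # its k-th nearest neighbor within y.
    n, d = x.shape
    m = y.shape[0]
    if metric_matrix is not None:
        # Mahalanobis distances under a positive-definite A become plain
        # Euclidean distances after the linear map u -> L^T u, A = L L^T.
        L = np.linalg.cholesky(metric_matrix)
        x = x @ L
        y = y @ L
    # Query k+1 neighbors within x itself: the nearest is the point itself
    # at distance zero, so the k-th genuine neighbor is the last column.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]  # keep only the k-th neighbor distance
    return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))

# Example: two 2-D Gaussians N(0, I) and N((1,1), I); the true divergence
# is ||mu||^2 / 2 = 1.0, and the estimate should land in that ballpark.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 2))
y = rng.normal(1.0, 1.0, size=(2000, 2))
print(knn_kl_divergence(x, y, k=5))  # approximately 1.0

With the identity matrix for metric_matrix the estimator reduces to the plain Euclidean k-NN estimator; the paper's point is that a well-chosen A can substantially reduce the small-sample bias this plain version exhibits.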
Pages: 669-677
Page count: 9
Related Papers
50 in total (first 10 listed)
  • [1] Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
    Noh, Yung-Kyun
    Sugiyama, Masashi
    Liu, Song
    du Plessis, Marthinus C.
    Park, Frank Chongwoo
    Lee, Daniel D.
    NEURAL COMPUTATION, 2018, 30 (07) : 1930 - 1960
  • [2] Steganographic Applications of the Nearest-Neighbor Approach to Kullback-Leibler Divergence Estimation
    Korzhik, Valery
    Fedyanin, Ivan
    2015 THIRD INTERNATIONAL CONFERENCE ON DIGITAL INFORMATION, NETWORKING, AND WIRELESS COMMUNICATIONS (DINWC), 2015, : 133 - 138
  • [3] Kullback-Leibler Divergence Metric Learning
    Ji, Shuyi
    Zhang, Zizhao
    Ying, Shihui
    Wang, Liejun
    Zhao, Xibin
    Gao, Yue
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (04) : 2047 - 2058
  • [4] Nonparametric Estimation of Kullback-Leibler Divergence
    Zhang, Zhiyi
    Grabchak, Michael
    NEURAL COMPUTATION, 2014, 26 (11) : 2570 - 2593
  • [5] Statistical Estimation of the Kullback-Leibler Divergence
    Bulinski, Alexander
    Dimitrov, Denis
    MATHEMATICS, 2021, 9 (05) : 1 - 36
  • [6] Kullback-Leibler Divergence Estimation of Continuous Distributions
    Perez-Cruz, Fernando
    2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008, : 1666 - 1670
  • [7] Minimization of the Kullback-Leibler Divergence for Nonlinear Estimation
    Darling, Jacob E.
    DeMars, Kyle J.
    JOURNAL OF GUIDANCE CONTROL AND DYNAMICS, 2017, 40 (07) : 1739 - 1748
  • [8] Minimization of the Kullback-Leibler Divergence for Nonlinear Estimation
    Darling, Jacob E.
    DeMars, Kyle J.
    ASTRODYNAMICS 2015, 2016, 156 : 213 - 232
  • [9] Anomaly Detection Using the Kullback-Leibler Divergence Metric
    Afgani, Mostafa
    Sinanovic, Sinan
    Haas, Harald
    ISABEL: 2008 FIRST INTERNATIONAL SYMPOSIUM ON APPLIED SCIENCES IN BIOMEDICAL AND COMMUNICATION TECHNOLOGIES, 2008, : 197 - 201
  • [10] Automatic Classification of Electroencephalograms: Kullback-Leibler Nearest Neighbor Rules
    Gersch, W.
    Martinelli, F.
    Yonemoto, J.
    Low, M. D.
    McEwan, J. A.
    SCIENCE, 1979, 205 (4402) : 193 - 195