Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation

Cited by: 0
Authors
Sasaki, Hiroaki [1 ]
Noh, Yung-Kyun [2 ]
Sugiyama, Masashi [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Frontier Sci, Tokyo, Japan
[2] Seoul Natl Univ, Dept Mech & Aeros Engn, Seoul, South Korea
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015
Keywords
MEAN SHIFT; RATIO;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily yield a good density-derivative estimator. In this paper, we give a direct method to approximate the density derivative without estimating the density itself. The proposed estimator provides an analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, and all of its hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful for improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
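The direct approach described in the abstract can be illustrated with a least-squares fit: model the derivative g(x) ≈ ∂p(x)/∂x_j with a Gaussian basis, and note that integration by parts turns the cross term ∫ g(x) ∂p(x) dx into −E[∂g(x)], which is estimated by a sample average, so the density itself is never estimated. The sketch below is a minimal one-derivative illustration under these assumptions; the function and parameter names (`direct_density_derivative`, `sigma`, `lam`) are our own, not from the paper.

```python
import numpy as np

def direct_density_derivative(X, centers, sigma, lam=1e-3, dim=0):
    """Directly fit g(x) ~ d p(x) / d x_dim with a Gaussian basis.

    Minimizes the regularized least-squares objective
        theta' G theta + 2 theta' h + lam * ||theta||^2,
    where G is the analytic Gram matrix of the basis and
    h_i = mean_k d phi_i(x_k) / d x_dim replaces the intractable
    term involving the unknown density (integration by parts).
    """
    n, d = X.shape
    b = len(centers)
    # Analytic Gram matrix: G[i, l] = integral of phi_i(x) phi_l(x) dx
    cdiff = centers[:, None, :] - centers[None, :, :]
    G = (np.pi * sigma**2) ** (d / 2) * np.exp(
        -np.sum(cdiff**2, axis=2) / (4 * sigma**2))
    # Sample-based term: h_i = (1/n) sum_k d phi_i(x_k) / d x_dim
    dX = X[:, None, :] - centers[None, :, :]                # (n, b, d)
    phi = np.exp(-np.sum(dX**2, axis=2) / (2 * sigma**2))   # (n, b)
    dphi = -(dX[:, :, dim] / sigma**2) * phi
    h = dphi.mean(axis=0)
    # Closed-form solution of the ridge-regularized objective
    theta = np.linalg.solve(G + lam * np.eye(b), -h)

    def g(x):
        dz = x[None, :] - centers
        return theta @ np.exp(-np.sum(dz**2, axis=1) / (2 * sigma**2))
    return g
```

For standard normal data the true derivative is p'(x) = −x·N(x; 0, 1), so the fitted g should be near 0 at the origin and negative at x = 1; hyper-parameters such as `sigma` and `lam` would, as the abstract notes, be selected by cross-validation rather than fixed by hand.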
Pages: 809-818 (10 pages)
Related Papers (50 in total; first 10 shown)
  • [1] Expected Logarithm of Central Quadratic Form and Its Use in KL-Divergence of Some Distributions
    Zadeh, Pourya Habib
    Hosseini, Reshad
    ENTROPY, 2016, 18 (08)
  • [2] Unified Perspective on Probability Divergence via the Density-Ratio Likelihood: Bridging KL-Divergence and Integral Probability Metrics
    Kato, Masahiro
    Imaizumi, Masaaki
    Minami, Kentaro
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206, 2023, 206
  • [3] CHANGE DETECTION ON SAR IMAGES BY A PARAMETRIC ESTIMATION OF THE KL-DIVERGENCE BETWEEN GAUSSIAN MIXTURE MODELS
    Xu, Qian
    Karam, Lina J.
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 2109 - 2113
  • [4] The KL-Divergence Between a Graph Model and its Fair I-Projection as a Fairness Regularizer
    Buyl, Maarten
    De Bie, Tijl
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT II, 2021, 12976 : 351 - 366
  • [5] Direct Density Derivative Estimation
    Sasaki, Hiroaki
    Noh, Yung-Kyun
    Niu, Gang
    Sugiyama, Masashi
    NEURAL COMPUTATION, 2016, 28 (06) : 1101 - 1140
  • [6] LiTAMIN2: Ultra Light LiDAR-based SLAM using Geometric Approximation applied with KL-Divergence
    Yokozuka, Masashi
    Koide, Kenji
    Oishi, Shuji
    Banno, Atsuhiko
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 11619 - 11625
  • [7] ANALYTICAL MODEL OF THE KL DIVERGENCE FOR GAMMA DISTRIBUTED DATA: APPLICATION TO FAULT ESTIMATION
    Youssef, Abdulrahman
    Delpha, Claude
    Diallo, Demba
    2015 23RD EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2015, : 2266 - 2270
  • [8] Estimating Density Ridges by Direct Estimation of Density-Derivative-Ratios
    Sasaki, Hiroaki
    Kanamori, Takafumi
    Sugiyama, Masashi
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 204 - 212
  • [9] AN APPLICATION OF SPLINE APPROXIMATION WITH VARIABLE KNOTS TO OPTIMAL ESTIMATION OF THE DERIVATIVE
    CHUI, CK
    SMITH, PW
    SIAM JOURNAL ON MATHEMATICAL ANALYSIS, 1980, 11 (04) : 724 - 736
  • [10] An estimation of Phi divergence and its application in testing normality
    Tavakoli, Mahsa
    Noughabi, Hadi Alizadeh
    Borzadaran, Gholam Reza Mohtashami
    HACETTEPE JOURNAL OF MATHEMATICS AND STATISTICS, 2020, 49 (06): 2104 - 2118