Direct Density-Derivative Estimation and Its Application in KL-Divergence Approximation

Cited by: 0
Authors
Sasaki, Hiroaki [1 ]
Noh, Yung-Kyun [2 ]
Sugiyama, Masashi [1 ]
Affiliations
[1] Univ Tokyo, Grad Sch Frontier Sci, Tokyo, Japan
[2] Seoul Natl Univ, Dept Mech & Aeros Engn, Seoul, South Korea
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38 | 2015 / Vol. 38
Keywords
MEAN SHIFT; RATIO
DOI
Not available
Chinese Library Classification (CLC) Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Estimation of density derivatives is a versatile tool in statistical data analysis. A naive approach is to first estimate the density and then compute its derivative. However, such a two-step approach does not work well because a good density estimator does not necessarily yield a good density-derivative estimator. In this paper, we give a direct method for approximating the density derivative without estimating the density itself. The proposed estimator provides an analytic and computationally efficient approximation of multi-dimensional high-order density derivatives, and all of its hyper-parameters can be chosen objectively by cross-validation. We further show that the proposed density-derivative estimator is useful for improving the accuracy of non-parametric KL-divergence estimation via metric learning. The practical superiority of the proposed method is experimentally demonstrated in change detection and feature selection.
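As a point of reference for the abstract's discussion, the sketch below implements the naive two-step baseline it mentions: fit a Gaussian kernel density estimate and differentiate it analytically. This is not the direct estimator proposed in the paper; the bandwidth value and the query point are illustrative assumptions, and in practice the bandwidth would still have to be selected (e.g., by cross-validation).

```python
import numpy as np

def kde_density_and_gradient(x, samples, bandwidth):
    """Naive two-step baseline: Gaussian KDE, then its analytic gradient.

    x         : (d,) query point
    samples   : (n, d) data matrix
    bandwidth : scalar Gaussian bandwidth h (assumed given here)
    Returns the estimated density and its gradient at x.
    """
    n, d = samples.shape
    diffs = samples - x                                  # x_i - x, shape (n, d)
    sq_dists = np.sum(diffs ** 2, axis=1)                # squared distances
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)   # Gaussian normalizer
    kernels = np.exp(-sq_dists / (2.0 * bandwidth ** 2)) / norm
    density = kernels.mean()
    # The gradient of each Gaussian term w.r.t. x is kernel * (x_i - x) / h^2.
    gradient = (kernels[:, None] * diffs).mean(axis=0) / bandwidth ** 2
    return density, gradient

# Illustrative usage: for a 2-D standard normal, the density gradient at the
# origin should be near zero, and at (1, 0) it should point back toward the origin.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 2))
print(kde_density_and_gradient(np.array([1.0, 0.0]), X, bandwidth=0.3))
```

The abstract's point is that even when such a kernel density estimate approximates the density well, its derivative can be a poor estimate of the true density derivative, which is what motivates estimating the derivative directly.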
Pages: 809-818
Page count: 10