Estimating entropy on m bins given fewer than m samples

Cited by: 83
Author
Paninski, L. [1]
Affiliation
[1] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
approximation theory; bias; consistency; distribution-free bounds; entropy; estimation
DOI
10.1109/TIT.2004.833360
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Consider a sequence p_N of discrete probability measures, supported on m_N points, and assume that we observe N independent and identically distributed (i.i.d.) samples from each p_N. We demonstrate the existence of an estimator of the entropy H(p_N) that is consistent even if the ratio N/m_N is bounded (and, as a corollary, even if this ratio tends to zero, albeit at a sufficiently slow rate).
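The record gives only the existence claim; the consistent estimator itself comes from the approximation-theoretic machinery flagged in the keywords and is not reproduced here. To make the difficulty of the N < m_N regime concrete, here is a minimal NumPy sketch (function names are illustrative, not from the paper) comparing the naive plug-in (maximum-likelihood) entropy estimator with the standard Miller-Madow bias correction on a uniform distribution over m bins sampled with N < m:

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) entropy estimate in nats.

    Biased downward: bins that were never sampled contribute
    nothing to the sum, which matters badly when N < m.
    """
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow_entropy(counts):
    """Miller-Madow correction: plug-in + (k_hat - 1) / (2N),
    where k_hat is the number of bins seen at least once."""
    n = counts.sum()
    k_hat = np.count_nonzero(counts)
    return plugin_entropy(counts) + (k_hat - 1) / (2 * n)

# The regime of the paper: m bins, N < m i.i.d. samples.
rng = np.random.default_rng(0)
m, N = 10_000, 2_000
counts = np.bincount(rng.integers(0, m, size=N), minlength=m)

print(f"true entropy  : {np.log(m):.3f} nats")  # ln(10000), uniform case
print(f"plug-in       : {plugin_entropy(counts):.3f} nats")
print(f"Miller-Madow  : {miller_madow_entropy(counts):.3f} nats")
```

On a typical run the plug-in estimate lands well below the true value ln(10000) ≈ 9.21 nats, and the Miller-Madow term recovers only part of the deficit; this persistent bias is what makes consistency with N/m_N merely bounded a nontrivial result.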
Pages: 2200-2203
Number of pages: 4