Estimating entropy on m bins given fewer than m samples

Cited by: 79
Authors
Paninski, L [1]
Affiliation
[1] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
approximation theory; bias; consistency; distribution-free bounds; entropy; estimation
DOI
10.1109/TIT.2004.833360
CLC classification
TP [automation technology, computer technology]
Subject classification
0812
Abstract
Consider a sequence p(N) of discrete probability measures, supported on m(N) points, and assume that we observe N independent and identically distributed (i.i.d.) samples from each p(N). We demonstrate the existence of an estimator of the entropy, H(p(N)), which is consistent even if the ratio N/m(N) is bounded (and, as a corollary, even if this ratio tends to zero, albeit at a sufficiently slow rate).
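To see why the undersampled regime N < m is hard, consider the naive plug-in (maximum-likelihood) entropy estimator. The sketch below is purely illustrative and is not the consistent estimator constructed in the paper; it shows the severe downward bias of the plug-in estimate when the number of samples is smaller than the number of bins (the distribution, seed, and bin counts are arbitrary choices for the demonstration).

```python
import math
import random

def plugin_entropy(samples):
    """Naive plug-in (MLE) entropy estimate, in nats, from i.i.d. samples."""
    n = len(samples)
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    # H_hat = -sum over observed bins of (c/n) * log(c/n)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Uniform distribution on m bins, observed through only N < m samples:
# the true entropy is log(m), but the plug-in estimate cannot exceed
# log(N) and so is biased far below log(m).
random.seed(0)
m, N = 1000, 200
samples = [random.randrange(m) for _ in range(N)]
print(math.log(m))              # true entropy, log(1000)
print(plugin_entropy(samples))  # plug-in estimate, well below log(1000)
```

The bias arises because at most N of the m bins can receive any mass in the empirical distribution; the paper's result is that a suitably constructed estimator nevertheless remains consistent when N/m(N) stays bounded.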
Pages: 2200-2203
Page count: 4