Bias/variance decompositions for likelihood-based estimators

Cited by: 70
Authors: Heskes, T. [1]
Affiliation: [1] Catholic Univ Nijmegen, Fdn Neural Networks, NL-6525 EZ Nijmegen, Netherlands
DOI: 10.1162/089976698300017232
CLC number: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood.
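The "well understood" mean-squared-error decomposition the abstract builds on can be checked numerically. The sketch below uses synthetic predictions with a hypothetical bias and spread (the values 0.2 and 0.4 are illustrative, not from the paper) and verifies that the sample MSE splits exactly into squared bias plus variance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: predictions of a fixed target y, as if produced
# by models trained on different resamplings of the data.
y = 1.5                                    # true target value
preds = y + rng.normal(0.2, 0.4, 10_000)   # bias ~0.2, spread ~0.4 (illustrative)

mse = np.mean((preds - y) ** 2)
bias_sq = (np.mean(preds) - y) ** 2        # squared bias of the mean prediction
variance = np.var(preds)                   # spread of predictions around their mean

# The identity MSE = bias^2 + variance holds exactly for sample moments.
assert np.isclose(mse, bias_sq + variance)
```

The identity is algebraic, so it holds to machine precision regardless of the distribution of the predictions; the paper's contribution is the analogous split for Kullback-Leibler/log-likelihood error measures.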
Pages: 1425-1433 (9 pages)