Fast learning rates for plug-in classifiers

Cited by: 245
Authors
Audibert, Jean-Yves [1]
Tsybakov, Alexandre B. [2]
Affiliations
[1] Ecole Natl Ponts & Chaussees, Paris, France
[2] Univ Paris 06, F-75252 Paris 05, France
Keywords
classification; statistical learning; fast rates of convergence; excess risk; plug-in classifiers; minimax lower bounds;
DOI
10.1214/009053606000001217
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, that is, rates faster than n^{-1/2}. The work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of the order n^{-1}, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that can achieve not only fast, but also super-fast rates, that is, rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
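For context, a plug-in classifier first estimates the regression function η(x) = P(Y = 1 | X = x) and then predicts the label by thresholding the estimate at 1/2; the margin assumption referenced above controls how often η(X) falls near that threshold (roughly, P(0 < |η(X) − 1/2| ≤ t) ≤ C t^α for all t > 0). The sketch below is a minimal illustration of the plug-in principle using a k-nearest-neighbor estimate of η; the estimator choice and all names here are illustrative assumptions, not the authors' construction (the paper analyzes local polynomial estimators).

```python
import numpy as np

def knn_plugin_classifier(X_train, y_train, X_test, k=15):
    """Plug-in rule: estimate eta(x) = P(Y=1 | X=x) by averaging the
    labels of the k nearest training points, then threshold at 1/2.
    Illustrative sketch only; not the estimator analyzed in the paper."""
    preds = np.empty(len(X_test), dtype=int)
    for i, x in enumerate(X_test):
        # squared Euclidean distances from x to every training point
        d2 = np.sum((X_train - x) ** 2, axis=1)
        # k-NN regression estimate of eta(x): mean label of the k nearest
        eta_hat = y_train[np.argsort(d2)[:k]].mean()
        # plug-in classification: predict 1 iff estimated eta(x) >= 1/2
        preds[i] = int(eta_hat >= 0.5)
    return preds

# Toy check: labels drawn with eta(x) = x, so the Bayes rule is 1{x >= 1/2}.
rng = np.random.default_rng(0)
X_tr = rng.uniform(size=(500, 1))
y_tr = (rng.uniform(size=500) < X_tr[:, 0]).astype(int)
print(knn_plugin_classifier(X_tr, y_tr, np.array([[0.1], [0.9]])))  # [0 1]
```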
Pages: 608-633
Page count: 26