MEASURING THE VC-DIMENSION OF A LEARNING-MACHINE

Cited by: 369
Authors
VAPNIK, V
LEVIN, E
LECUN, Y
DOI
10.1162/neco.1994.6.5.851
CLC (Chinese Library Classification): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
A method for measuring the capacity of learning machines is described. The method is based on fitting a theoretically derived function to empirical measurements of the maximal difference between the error rates on two separate data sets of varying sizes. Experimental measurements of the capacity of various types of linear classifiers are presented.
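The empirical half of this procedure can be illustrated with a small simulation. The sketch below is an illustrative reconstruction, not the paper's exact protocol: the least-squares linear classifier, the label-flip construction of the maximal deviation, and all sample sizes and trial counts are assumptions chosen for brevity. It estimates the maximal difference between error rates on two sets of size n for a linear classifier in d dimensions; the paper's method then fits a theoretically derived function of n/h to such measurements to read off the capacity h.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_deviation(n, d, trials=20):
    """Estimate the maximal error-rate deviation between two sets of size n.

    Illustrative label-flip trick: draw 2n random points with random +/-1
    labels, flip the labels of the second half, and train a classifier on
    the flipped labels. Driving the training error down on the flipped
    problem drives the error on the first set down and the error on the
    second set (under its original labels) up, so the achieved gap
    err2 - err1 measures how far apart the two error rates can be pushed.
    """
    devs = []
    for _ in range(trials):
        X = rng.standard_normal((2 * n, d))
        y = rng.choice([-1.0, 1.0], size=2 * n)
        y_flipped = y.copy()
        y_flipped[n:] *= -1.0

        # Least-squares linear fit with a bias term, as a cheap stand-in
        # for the trained linear classifier (an assumption of this sketch).
        A = np.c_[X, np.ones(2 * n)]
        w, *_ = np.linalg.lstsq(A, y_flipped, rcond=None)
        pred = np.sign(A @ w)

        err1 = np.mean(pred[:n] != y[:n])    # error on first set, true labels
        err2 = np.mean(pred[n:] != y[n:])    # error on second set, true labels
        devs.append(err2 - err1)
    return float(np.mean(devs))

# Small n relative to the d+1 free parameters: the classifier separates the
# flipped problem almost perfectly, so the deviation saturates near 1.
# Large n: the deviation shrinks, which is the regime the fitted
# theoretical curve uses to estimate the capacity.
print(max_deviation(5, 10), max_deviation(200, 10))
```

In the paper, measurements like these at a range of sample sizes are fitted (in the least-squares sense) to the theoretically derived envelope of the deviation as a function of n/h, with h as the free parameter.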
Pages: 851-876
Page count: 26
References (10 items)
[1]   HINTS AND THE VC DIMENSION [J].
ABUMOSTAFA, YS .
NEURAL COMPUTATION, 1993, 5 (02) :278-288
[2]   What Size Net Gives Valid Generalization? [J].
Baum, Eric B. ;
Haussler, David .
NEURAL COMPUTATION, 1989, 1 (01) :151-160
[3]  
CORTES C, 1993, COMMUNICATION
[4]   BOUNDS FOR THE UNIFORM DEVIATION OF EMPIRICAL MEASURES [J].
DEVROYE, L .
JOURNAL OF MULTIVARIATE ANALYSIS, 1982, 12 (01) :72-79
[5]  
GUYON I, 1992, ADV NEUR IN, V4, P471
[6]  
LECUN Y, 1990, ADV NEURAL INFORMATI, V2
[7]  
VAPNIK V, 1989, PATTERN REC IMAGE AN, V1, P283
[8]   UNIFORM CONVERGENCE OF RELATIVE FREQUENCIES OF EVENTS TO THEIR PROBABILITIES [J].
VAPNIK, VN ;
CHERVONENKIS, AY .
THEORY OF PROBABILITY AND ITS APPLICATIONS, USSR, 1971, 16 (02) :264-+
[9]  
Vapnik VN, 1982, ESTIMATION DEPENDENC
[10]  
WEIGEND AS, 1991, ADV NEURAL INFORMATI, V3