PAC-MDL bounds

Cited by: 21
Authors
Blum, A. [1]
Langford, J. [2]
Affiliations
[1] Carnegie Mellon Univ, Dept Comp Sci, Pittsburgh, PA 15213 USA
[2] IBM Corp, Watson Res Ctr, Yorktown Hts, NY USA
Source
LEARNING THEORY AND KERNEL MACHINES | 2003 / Vol. 2777
DOI
10.1007/978-3-540-45167-9_26
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We point out that a number of standard sample complexity bounds (VC-dimension, PAC-Bayes, and others) are all related to the number of bits required to communicate the labels given the unlabeled data for a natural communication game. Motivated by this observation, we give a general sample complexity bound based on this game that allows us to unify these different bounds in one common framework.
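The abstract's idea is that shorter label descriptions yield better generalization guarantees. As a rough illustration (not the paper's exact PAC-MDL statement), the classic Occam's-razor bound makes the same trade-off concrete: a hypothesis consistent with the data that can be described in few bits comes with a tighter error guarantee. The function name below is my own choice for this sketch.

```python
import math

def occam_error_bound(description_bits: float, m: int, delta: float) -> float:
    """Classic Occam's-razor bound, an MDL-flavored relative of the
    PAC-MDL bounds discussed in the paper (illustrative only): if a
    hypothesis consistent with m i.i.d. examples can be described in
    `description_bits` bits, then with probability at least 1 - delta
    its true error is at most this value."""
    return (description_bits * math.log(2) + math.log(1.0 / delta)) / m

# Fewer bits needed to "communicate the labels" => tighter guarantee
# from the same sample size.
short = occam_error_bound(description_bits=10, m=1000, delta=0.05)
long_ = occam_error_bound(description_bits=100, m=1000, delta=0.05)
```

Note how the bound scales linearly in the description length and only logarithmically in the confidence parameter 1/delta, which is the basic shape shared by the VC-dimension and PAC-Bayes bounds the paper unifies.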
Pages: 344-357
Page count: 14
References
9 items
  • [1] BLUMER A, EHRENFEUCHT A, HAUSSLER D, WARMUTH MK. OCCAM'S RAZOR [J]. INFORMATION PROCESSING LETTERS, 1987, 24 (06): 377-380
  • [2] Cover T. M., 2005, ELEM INF THEORY, DOI 10.1002/047174882X
  • [3] Floyd S, 1995, MACH LEARN, V21, P269
  • [4] Hinton GE, 1993, KEEPING NEURAL NETWORKS SIMPLE BY MINIMIZING THE DESCRIPTION LENGTH OF THE WEIGHTS
  • [5] Langford J., 2002, THESIS, P14
  • [6] LITTLESTONE N, 1986, UNPUB RELATING DATA COMPRESSION AND LEARNABILITY
  • [7] MCALLESTER D, 1999, PAC-BAYESIAN MODEL AVERAGING
  • [8] VAPNIK VN, CHERVONENKIS AY. UNIFORM CONVERGENCE OF RELATIVE FREQUENCIES OF EVENTS TO THEIR PROBABILITIES [J]. THEORY OF PROBABILITY AND ITS APPLICATIONS (USSR), 1971, 16 (02): 264+
  • [9] VAPNIK VN, 1999, STAT LEARNING THEORY