Formal limitations of sample-wise information-theoretic generalization bounds

Cited by: 2
Authors
Harutyunyan, Hrayr [1 ]
Steeg, Greg Ver [1 ]
Galstyan, Aram [1 ]
Affiliation
[1] USC Information Sciences Institute, Marina Del Rey, CA 90292, USA
Source
2022 IEEE INFORMATION THEORY WORKSHOP (ITW) | 2022
Keywords
learning theory; generalization bounds; information theory; GENERALIZATION ERROR;
DOI
10.1109/ITW54588.2022.9965850
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812
Abstract
Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap, no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
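For context, the sample-wise bounds referred to above typically take the following form, a sketch under the assumption that the loss is sigma-sub-Gaussian under the data distribution (the symbols W for the learned hypothesis, Z_i for the i-th training example, and gen(W, S) for the generalization gap on training set S are introduced here for illustration):

\[
  \bigl|\mathbb{E}[\operatorname{gen}(W, S)]\bigr|
  \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)},
\]

where I(W; Z_i) is the mutual information between the hypothesis and a single training example. The paper's negative result is that no analogous bound in terms of the individual I(W; Z_i) can hold for the expected squared generalization gap, nor in the PAC-Bayes and single-draw settings.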
Pages: 440-445
Page count: 6
References
28 records in total
[1]  
Alquier P., 2021, arXiv:2110.11216
[2]  
Aminian G., 2020, IEEE Information Theory Workshop
[3]
Aminian G., Toni L., Rodrigues M. R. D., 2021, Information-Theoretic Bounds on the Moments of the Generalization Error of Learning Algorithms, 2021 IEEE International Symposium on Information Theory (ISIT), pp. 682-687
[4]  
Bassily R., 2018, Algorithmic Learning Theory, p. 25
[5]  
Catoni O., 2007, Lecture Notes-Monograph Series, Vol. 56
[6]  
Dziugaite G. K., 2020, Advances in Neural Information Processing Systems, Vol. 33, p. 11723
[7]  
Dziugaite G. K., 2017, Conference on Uncertainty in Artificial Intelligence (UAI 2017)
[8]
Esposito A. R., Gastpar M., Issa I., 2021, Generalization Error Bounds via Rényi-, f-Divergences and Maximal Leakage, IEEE Transactions on Information Theory, Vol. 67, No. 8, pp. 4986-5004
[9]  
Rodriguez Galvez B., 2021, Advances in Neural Information Processing Systems, Vol. 34
[10]  
Germain P., 2009, Proceedings of the International Conference on Machine Learning, p. 353, DOI 10.1145/1553374.1553419