On Lower Bounds for Statistical Learning Theory

Cited by: 8
Author
Loh, Po-Ling [1]
Affiliation
[1] University of Wisconsin, Department of Electrical and Computer Engineering, 1415 Engineering Dr., Madison, WI 53706, USA
Keywords
machine learning; minimax estimation; community recovery; online learning; multi-armed bandits; channel decoding; threshold phenomena; minimax rates; entropy; convergence; information; selection
DOI
10.3390/e19110617
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are "optimal" in the sense that no other algorithm can lead to smaller statistical error. This paper provides a survey of various techniques used to derive information-theoretic lower bounds for estimation and learning. We focus on the settings of parameter and function estimation, community recovery, and online learning for multi-armed bandits. A common theme is that lower bounds are established by relating the statistical learning problem to a channel decoding problem, for which lower bounds may be derived involving information-theoretic quantities such as the mutual information, total variation distance, and Kullback-Leibler divergence. We close by discussing the use of information-theoretic quantities to measure independence in machine learning applications ranging from causality to medical imaging, and mention techniques for estimating these quantities efficiently in a data-driven manner.
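As an illustration of the channel-decoding reduction mentioned in the abstract (a generic sketch using standard notation from the minimax literature, not a formula reproduced from the paper itself): let V be uniformly distributed over a 2δ-separated packing {θ_1, ..., θ_M} of the parameter set in a semimetric ρ, and let X denote the data observed after θ_V is chosen. The local Fano method then yields the minimax lower bound

\[
  \inf_{\hat{\theta}} \sup_{\theta \in \Theta} \mathbb{E}\bigl[\rho(\hat{\theta}, \theta)\bigr]
  \;\ge\; \delta \left( 1 - \frac{I(V; X) + \log 2}{\log M} \right),
\]

where I(V; X) is the mutual information between the random index and the observations. In other words, any estimator that achieves small error would allow V to be decoded reliably from X over the induced channel, and Fano's inequality bounds how reliable that decoding can be.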
Pages: 17