On Lower Bounds for Statistical Learning Theory

Cited by: 7
Authors
Loh, Po-Ling [1]
Affiliations
[1] Univ Wisconsin, Dept Elect & Comp Engn, 1415 Engn Dr, Madison, WI 53706 USA
Keywords
machine learning; minimax estimation; community recovery; online learning; multi-armed bandits; channel decoding; threshold phenomena; MINIMAX RATES; ENTROPY; CONVERGENCE; INFORMATION; SELECTION
DOI
10.3390/e19110617
Chinese Library Classification (CLC)
O4 [Physics]
Subject Classification Code
0702
Abstract
In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are "optimal" in the sense that no other algorithm can lead to smaller statistical error. This paper provides a survey of various techniques used to derive information-theoretic lower bounds for estimation and learning. We focus on the settings of parameter and function estimation, community recovery, and online learning for multi-armed bandits. A common theme is that lower bounds are established by relating the statistical learning problem to a channel decoding problem, for which lower bounds may be derived involving information-theoretic quantities such as the mutual information, total variation distance, and Kullback-Leibler divergence. We close by discussing the use of information-theoretic quantities to measure independence in machine learning applications ranging from causality to medical imaging, and mention techniques for estimating these quantities efficiently in a data-driven manner.
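To make the channel-decoding reduction in the abstract concrete, here is a minimal sketch of the standard Fano-type argument (generic textbook notation assumed here, not reproduced from the paper itself). Given a $2\delta$-separated packing $\{\theta_1, \ldots, \theta_M\}$ of the parameter space under a metric $\rho$, draw an index $V$ uniformly at random and observe data $X \sim P_{\theta_V}$; any estimator $\hat{\theta}$ then induces a decoder for $V$, and Fano's inequality gives

$$
\inf_{\hat{\theta}} \sup_{\theta} \, \mathbb{E}_{\theta}\big[\rho(\hat{\theta}, \theta)\big] \;\ge\; \delta \left( 1 - \frac{I(V; X) + \log 2}{\log M} \right),
$$

where $I(V; X)$ is the mutual information between the index and the observation. Upper-bounding $I(V; X)$ by the pairwise Kullback-Leibler divergences $D(P_{\theta_j} \,\|\, P_{\theta_k})$, or replacing Fano with a two-point total-variation (Le Cam) argument, recovers the kinds of bounds surveyed in the paper.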
Pages: 17
Related Papers
50 items in total (first 10 shown)
  • [1] New lower bounds for statistical query learning
    Yang, K
    JOURNAL OF COMPUTER AND SYSTEM SCIENCES, 2005, 70 (04) : 485 - 509
  • [2] Restricted Lower Bounds in the Statistical Theory of Electronic Energies
    Chang, Daniel T.
    Golden, Sidney
    PHYSICAL REVIEW A-GENERAL PHYSICS, 1970, 2 (06) : 2370 - 2376
  • [3] Lower Bounds on Learning Random Structures with Statistical Queries
    Angluin, Dana
    Eisenstat, David
    Kontorovich, Leonid
    Reyzin, Lev
    ALGORITHMIC LEARNING THEORY, ALT 2010, 2010, 6331 : 194 - 208
  • [4] A Note on Stability of Error Bounds in Statistical Learning Theory
    Li, Ming
    Caponnetto, Andrea
    ANALYSIS AND APPLICATIONS, 2011, 9 (04) : 369 - 382
  • [5] Optimal Lower Bounds for Quantum Learning via Information Theory
    Hadiashar, Shima Bab
    Nayak, Ashwin
    Sinha, Pulkit
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (03) : 1876 - 1896
  • [6] Lower bounds for computing statistical depth
    Aloupis, G
    Cortés, C
    Gómez, F
    Soss, M
    Toussaint, G
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2002, 40 (02) : 223 - 229
  • [7] Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
    Apolloni, B
    Gentile, C
    THEORETICAL COMPUTER SCIENCE, 1998, 209 (1-2) : 141 - 162
  • [8] Risk bounds for statistical learning
    Massart, Pascal
    Nedelec, Elodie
    ANNALS OF STATISTICS, 2006, 34 (05) : 2326 - 2366
  • [9] Towards Optimal Problem Dependent Generalization Error Bounds in Statistical Learning Theory
    Xu, Yunbei
    Zeevi, Assaf
    MATHEMATICS OF OPERATIONS RESEARCH, 2025, 50 (01)
  • [10] Statistical Query Lower Bounds for Tensor PCA
    Dudeja, Rishabh
    Hsu, Daniel
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22