On Lower Bounds for Statistical Learning Theory

Cited by: 7
Authors
Loh, Po-Ling [1 ]
Affiliations
[1] Univ Wisconsin, Dept Elect & Comp Engn, 1415 Engn Dr, Madison, WI 53706 USA
Keywords
machine learning; minimax estimation; community recovery; online learning; multi-armed bandits; channel decoding; threshold phenomena; MINIMAX RATES; ENTROPY; CONVERGENCE; INFORMATION; SELECTION
DOI
10.3390/e19110617
Chinese Library Classification
O4 [Physics]
Subject Classification Code
0702
Abstract
In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are "optimal" in the sense that no other algorithm can lead to smaller statistical error. This paper provides a survey of various techniques used to derive information-theoretic lower bounds for estimation and learning. We focus on the settings of parameter and function estimation, community recovery, and online learning for multi-armed bandits. A common theme is that lower bounds are established by relating the statistical learning problem to a channel decoding problem, for which lower bounds may be derived involving information-theoretic quantities such as the mutual information, total variation distance, and Kullback-Leibler divergence. We close by discussing the use of information-theoretic quantities to measure independence in machine learning applications ranging from causality to medical imaging, and mention techniques for estimating these quantities efficiently in a data-driven manner.
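A worked formula may help make the channel-decoding theme concrete. The following is a standard Fano-type reduction from the minimax lower-bound literature (a general sketch under the usual separation assumptions, not a display quoted from this paper):

% Let \theta_1, \dots, \theta_M \in \Theta be 2\delta-separated in a
% semimetric d, let J be uniform on \{1, \dots, M\}, and let the data X
% be drawn from P_{\theta_J}. Estimating \theta then requires decoding
% the "message" J from the channel output X, and Fano's inequality gives
\[
  \inf_{\widehat{\theta}} \sup_{\theta \in \Theta}
    \mathbb{E}_{\theta}\bigl[ d(\widehat{\theta}, \theta) \bigr]
  \;\ge\;
  \delta \left( 1 - \frac{I(J; X) + \log 2}{\log M} \right),
\]
% where the mutual information I(J; X) is in turn often controlled by
% pairwise Kullback-Leibler divergences via convexity:
\[
  I(J; X) \;\le\; \frac{1}{M^2} \sum_{j,k=1}^{M}
    D_{\mathrm{KL}}\bigl( P_{\theta_j} \,\|\, P_{\theta_k} \bigr).
\]

Choosing a large, well-separated packing while keeping I(J; X) small is what drives the lower bounds for estimation, community recovery, and bandit regret that the paper surveys.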
Pages: 17
Related Papers
50 records in total
  • [21] Regret Lower Bounds for Unbiased Adaptive Control of Linear Quadratic Regulators
    Ziemann, Ingvar
    Sandberg, Henrik
    IEEE CONTROL SYSTEMS LETTERS, 2020, 4 (03): 785-790
  • [22] Lower bounds on individual sequence regret
    Gofer, Eyal
    Mansour, Yishay
    MACHINE LEARNING, 2016, 103 (01): 1-26
  • [23] Relative Loss Bounds for Temporal-Difference Learning
    Forster, Jürgen
    Warmuth, Manfred K.
    MACHINE LEARNING, 2003, 51 (01): 23-50
  • [25] Sharp entropy bounds for discrete statistical simulation
    Romik, D
    STATISTICS & PROBABILITY LETTERS, 1999, 42 (03): 219-227
  • [26] Further improvements of some bounds on entropy measures in information theory
    Matic, M
    Pearce, CEM
    Pecaric, J
    MATHEMATICAL INEQUALITIES & APPLICATIONS, 1999, 2 (04): 599-611
  • [27] Machine learning, statistical learning and the future of biological research in psychiatry
    Iniesta, R.
    Stahl, D.
    McGuffin, P.
    PSYCHOLOGICAL MEDICINE, 2016, 46 (12): 2455-2465
  • [28] New Lower Bounds for Privacy in Communication Protocols
    Kerenidis, Iordanis
    Lauriere, Mathieu
    Xiao, David
    INFORMATION THEORETIC SECURITY, ICITS 2013, 2014, 8317: 69-89
  • [29] Lower Bounds on the Capacities of Quantum Relay Channels
    Shi Jin-Jing
    Shi Rong-Hua
    Peng Xiao-Qi
    Guo Ying
    Yi Liu-Yang
    Lee Moon-Ho
    COMMUNICATIONS IN THEORETICAL PHYSICS, 2012, 58 (04): 487-492
  • [30] Lower bounds for Dirichlet Laplacians and uncertainty principles
    Stollmann, Peter
    Stolz, Guenter
    JOURNAL OF THE EUROPEAN MATHEMATICAL SOCIETY, 2021, 23 (07): 2337-2360