Individually Conditional Individual Mutual Information Bound on Generalization Error

Cited by: 8
Authors
Zhou, Ruida [1 ]
Tian, Chao [1 ]
Liu, Tie [1 ]
Affiliations
[1] Texas A&M Univ, Dept Elect & Comp Engn, College Stn, TX 77843 USA
Funding
US National Science Foundation;
Keywords
Mutual information; Training; Random variables; Heuristic algorithms; Training data; Noise measurement; Upper bound; Information-theoretic bounds; generalization error; stochastic gradient Langevin dynamics;
DOI
10.1109/TIT.2022.3144615
Chinese Library Classification
TP [Automation technology; computer technology];
Subject classification code
0812;
Abstract
We propose an information-theoretic bound on the generalization error based on a combination of the error decomposition technique of Bu et al. and the conditional mutual information (CMI) construction of Steinke and Zakynthinou. In a previous work, Haghifam et al. proposed a different bound combining the two aforementioned techniques, which we refer to as the conditional individual mutual information (CIMI) bound. However, in a simple Gaussian setting, both the CMI and the CIMI bounds are order-wise worse than the bound of Bu et al. This observation motivated us to propose the new bound, which overcomes this issue by reducing the conditioning terms in the conditional mutual information. In the process of establishing this bound, a conditional decoupling lemma is also established, which leads to a meaningful dichotomy among, and comparison of, these information-theoretic bounds. As an application of the proposed bound, we analyze the noisy and iterative stochastic gradient Langevin dynamics algorithm and provide an upper bound on its generalization error.
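For readers unfamiliar with the algorithm analyzed in the application section, stochastic gradient Langevin dynamics (SGLD) perturbs each gradient step with isotropic Gaussian noise. The following is a minimal sketch only; the step size, inverse temperature, toy quadratic loss, and the function name `sgld` are illustrative assumptions, not the paper's setting.

```python
import numpy as np

def sgld(grad_fn, w0, lr=0.01, beta=100.0, n_steps=1000, rng=None):
    """Minimal SGLD sketch: w <- w - lr * grad + sqrt(2*lr/beta) * N(0, I).

    grad_fn: gradient of the (empirical) loss at w
    lr:      step size (assumed constant here for simplicity)
    beta:    inverse temperature controlling the injected noise
    """
    rng = np.random.default_rng(0) if rng is None else rng
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(w)
        noise = rng.standard_normal(w.shape)
        w = w - lr * g + np.sqrt(2.0 * lr / beta) * noise
    return w

# Toy example: the loss 0.5 * ||w - data_mean||^2 has gradient (w - data_mean),
# so the iterates hover near data_mean, with fluctuations set by beta.
data_mean = np.array([1.0, -2.0])
w_final = sgld(lambda w: w - data_mean, w0=np.zeros(2))
```

Because the noise variance scales as 1/beta, large beta recovers plain gradient descent, while small beta keeps the iterates diffuse; information-theoretic analyses such as the one in this paper bound the generalization error in terms of this injected-noise level and the gradient increments.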
Pages: 3304-3316
Page count: 13