Information Bottleneck: Theory and Applications in Deep Learning

Cited by: 9
Authors
Geiger, Bernhard C. [1]
Kubin, Gernot [2]
Affiliations
[1] Know-Center GmbH, Inffeldgasse 13-6, A-8010 Graz, Austria
[2] Graz University of Technology, Signal Processing and Speech Communication Laboratory, Inffeldgasse 16c, A-8010 Graz, Austria
Funding
European Union Horizon 2020;
Keywords
information bottleneck; deep learning; neural networks;
DOI
10.3390/e22121408
CLC Number
O4 [Physics];
Subject Classification Number
0702;
Abstract
Pages: 4
References
22 entries in total
  • [11] Goldfeld, Z.; Polyanskiy, Y. The information bottleneck problem and its applications in machine learning. IEEE Journal on Selected Areas in Information Theory, 2020, 1(1): 19-38.
  • [12] Hendrycks, D. International Conference on Learning Representations (ICLR), 2019.
  • [13] Jonsson, H.; Cherubini, G.; Eleftheriou, E. Convergence Behavior of DNNs with Mutual-Information-Based Regularization. Entropy, 2020, 22(7).
  • [14] Kolchinsky, A. Proceedings of the International Conference on Learning Representations (ICLR), 2019.
  • [15] Kolchinsky, A.; Tracey, B.D.; Wolpert, D.H. Nonlinear Information Bottleneck. Entropy, 2019, 21(12).
  • [16] Kunze, J.; Kirsch, L.; Ritter, H.; Barber, D. Gaussian Mean Field Regularizes by Limiting Learned Information. Entropy, 2019, 21(8).
  • [17] Shwartz-Ziv, R. Opening the Black Box of Deep Neural Networks via Information, 2017.
  • [18] Tegmark, M.; Wu, T. Pareto-Optimal Data Compression for Binary Classification Tasks. Entropy, 2020, 22(1): 7.
  • [19] Nguyen, T.T.; Choi, J. Markov Information Bottleneck to Improve Information Flow in Stochastic Neural Networks. Entropy, 2019, 21(10).
  • [20] Tishby, N. Proceedings of the 37th Annual Allerton Conference on Communication, Control, and Computing, 1999, p. 368. DOI: 10.48550/arXiv.physics/0004057.