Information Bottleneck: Theory and Applications in Deep Learning

Cited by: 9
Authors
Geiger, Bernhard C. [1 ]
Kubin, Gernot [2 ]
Affiliations
[1] Know Ctr GmbH, Inffeldgasse 13-6, A-8010 Graz, Austria
[2] Graz Univ Technol, Signal Proc & Speech Commun Lab, Inffeldgasse 16c, A-8010 Graz, Austria
Funding
EU Horizon 2020
Keywords
information bottleneck; deep learning; neural networks;
DOI
10.3390/e22121408
Chinese Library Classification (CLC)
O4 [Physics]
Subject classification code
0702
Abstract
Pages: 4
Related papers (22 in total)
  • [1] Achille, Alessandro; Soatto, Stefano. Information Dropout: Learning Optimal Representations Through Noisy Computation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(12): 2897-2905.
  • [2] Alemi, A. A., 2017. Proceedings of the International Conference on Learning Representations (ICLR).
  • [3] [Anonymous], 2020. Entropy, DOI 10.3390/e22020151.
  • [4] [Anonymous], 2020. Entropy, DOI 10.3390/e22090999.
  • [5] Belghazi, M. I., 2018. Proceedings of Machine Learning Research, Vol. 80.
  • [6] Fischer, I., 2020. Entropy, Vol. 22, DOI 10.3390/e22101081.
  • [7] Franzese, Giulio; Visintin, Monica. Probabilistic Ensemble of Deep Information Networks. Entropy, 2020, 22(1): 100.
  • [8] Galvez, Borja Rodriguez; Thobaben, Ragnar; Skoglund, Mikael. The Convex Information Bottleneck Lagrangian. Entropy, 2020, 22(1): 98.
  • [9] Geiger, B. C., 2020. arXiv:2003.09671.
  • [10] Geiger, B. C., 2020. Entropy, Vol. 22, DOI 10.3390/e22111229.