Learning Narrow One-Hidden-Layer ReLU Networks

Cited by: 0
Authors
Chen, Sitan [1 ]
Dou, Zehao [2 ]
Goel, Surbhi [3 ]
Klivans, Adam [4 ]
Meka, Raghu [5 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] Yale, New Haven, CT USA
[3] Univ Penn, Philadelphia, PA 19104 USA
[4] Univ Texas Austin, Austin, TX 78712 USA
[5] Univ Calif Los Angeles, Los Angeles, CA 90024 USA
Source
THIRTY SIXTH ANNUAL CONFERENCE ON LEARNING THEORY | 2023, Vol. 195
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We consider the well-studied problem of learning a linear combination of k ReLU activations with respect to a Gaussian distribution on inputs in d dimensions. We give the first polynomial-time algorithm that succeeds whenever k is a constant. All prior polynomial-time learners require additional assumptions on the network, such as positive combining coefficients or the matrix of hidden weight vectors being well-conditioned. Our approach is based on analyzing random contractions of higher-order moment tensors. We use a multi-scale analysis to argue that sufficiently close neurons can be collapsed together, sidestepping the conditioning issues present in prior work. This allows us to design an iterative procedure to discover individual neurons.
Pages: 35
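
To make the abstract's "random contractions of higher-order moment tensors" concrete, here is a minimal NumPy sketch; it is an illustration under simplifying assumptions, not the paper's algorithm. For Gaussian inputs and unit-norm hidden weights w_i, the order-4 Hermite moment tensor of f(x) = sum_i lam_i * ReLU(<w_i, x>) satisfies E[f(x) * H_4(x)] = c_4 * sum_i lam_i * (w_i tensored 4 times), where H_4 is the multivariate Hermite tensor and c_4 = E[ReLU(z) * (z^4 - 6z^2 + 3)] = -1/sqrt(2*pi). Contracting this tensor in two modes with a random vector g yields the d-by-d matrix c_4 * sum_i lam_i * <w_i, g>^2 * w_i w_i^T, whose largest-magnitude eigenvectors expose neuron directions. The sketch takes the hidden weights orthonormal so eigenvectors align with individual neurons; removing exactly this kind of conditioning assumption, via the multi-scale collapsing of nearby neurons, is the paper's contribution and is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Toy instance: k = 3 signed neurons in d = 6 dimensions, Gaussian inputs.
# Orthonormal rows of W are a simplifying assumption for this demo only.
d, k, n = 6, 3, 1_000_000
W = np.linalg.qr(rng.standard_normal((d, k)))[0].T   # rows = unit neuron weights w_i
lam = np.array([1.0, -0.8, 0.6])                     # signed combining coefficients

X = rng.standard_normal((n, d))                      # x ~ N(0, I_d)
y = np.maximum(X @ W.T, 0.0) @ lam                   # f(x) = sum_i lam_i ReLU(<w_i, x>)

# Random contraction of the order-4 Hermite moment tensor:
# E[y * H_4(x)(g, g, ., .)] = c_4 * sum_i lam_i <w_i, g>^2 w_i w_i^T,
# with c_4 = E[ReLU(z) He_4(z)] = -1/sqrt(2*pi). The contraction
# H_4(x)(g, g, ., .) expands into the closed-form terms assembled below.
g = rng.standard_normal(d)
gx = X @ g                                           # <g, x_j> per sample
g2 = g @ g                                           # |g|^2

c = y * (gx**2 - g2)
M = (X * c[:, None]).T @ X / n                       # E[y (<g,x>^2 - |g|^2) x x^T]
v = X.T @ (y * gx) / n                               # E[y <g,x> x]
M -= 2.0 * (np.outer(g, v) + np.outer(v, g))         # -2 <g,x> (g x^T + x g^T)
M -= np.mean(y * gx**2) * np.eye(d)                  # -<g,x>^2 I
M += g2 * np.mean(y) * np.eye(d)                     # +|g|^2 I
M += 2.0 * np.mean(y) * np.outer(g, g)               # +2 g g^T

# With orthonormal w_i, the k largest-magnitude eigenvalues of M are
# c_4 * lam_i * <w_i, g>^2 and their eigenvectors recover the w_i up to sign.
evals, evecs = np.linalg.eigh(M)
top = np.argsort(-np.abs(evals))[:k]
W_hat = evecs[:, top].T

# Overlap matrix should be close to a permutation matrix; finite-sample noise
# (or an unlucky g with <w_i, g> near 0) can blur it -- redraw g or raise n.
print(np.round(np.abs(W_hat @ W.T), 3))

A single random g generically gives distinct values of <w_i, g>^2, which is what lets one matrix contraction separate the k neurons. When two neurons are nearly parallel, the corresponding eigenvalues nearly merge; that degeneracy is what the paper's multi-scale analysis handles by collapsing sufficiently close neurons together.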