Training Fully Connected Neural Networks is ∃R-Complete

Cited by: 0
Authors
Bertschinger, Daniel [1 ]
Hertrich, Christoph [2 ,5 ,6 ]
Jungeblut, Paul [3 ]
Miltzow, Tillmann [4 ]
Weber, Simon [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
[2] London Sch Econ & Polit Sci, Dept Math, London, England
[3] Karlsruhe Inst Technol, Inst Theoret Informat, Karlsruhe, Germany
[4] Univ Utrecht, Dept Informat & Comp Sci, Utrecht, Netherlands
[5] Univ Libre, Brussels, Belgium
[6] Goethe Univ, Frankfurt, Germany
Funding
Swiss National Science Foundation; European Research Council;
Keywords
BOUNDS;
DOI
None available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully connected neural network to fit a given set of data points, also known as empirical risk minimization. We show that the problem is ∃R-complete. This complexity class can be defined as the set of algorithmic problems that are polynomial-time equivalent to deciding whether a multivariate polynomial with integer coefficients has a real root. Furthermore, we show that arbitrary algebraic numbers are required as weights to be able to train some instances to optimality, even if all data points are rational. Our result already applies to fully connected instances with two inputs, two outputs, and one hidden layer of ReLU neurons. Thereby, we strengthen a result by Abrahamsen, Kleist and Miltzow [NeurIPS 2021]. A consequence of this is that a combinatorial search algorithm like the one by Arora, Basu, Mianjy and Mukherjee [ICLR 2018] is impossible for networks with more than one output dimension, unless NP = ∃R.
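The empirical risk minimization problem from the abstract can be sketched concretely. The abstract fixes the architecture (two inputs, one hidden ReLU layer, two outputs) but not the loss or the data; the squared loss, the hidden width of two, and the toy data points below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def relu(x):
    """Rectified linear unit, applied elementwise."""
    return np.maximum(x, 0.0)

def empirical_risk(params, X, Y):
    """Squared-error empirical risk of a two-layer fully connected
    ReLU network: two inputs, one hidden ReLU layer, two outputs.
    `params` bundles the weights and biases that the training
    problem asks to optimize (optimal values may be irrational
    algebraic numbers even for rational data, per the abstract)."""
    W1, b1, W2, b2 = params
    H = relu(X @ W1 + b1)   # hidden layer with ReLU activation
    P = H @ W2 + b2         # linear output layer
    return float(np.mean((P - Y) ** 2))

# Toy instance: four rational data points in R^2 -> R^2 (illustrative).
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[0.0, 0.0], [1.0, 1.0], [1.0, 1.0], [0.0, 2.0]])

# Randomly initialized parameters; hidden width 2 is a hypothetical choice.
rng = np.random.default_rng(0)
params = (rng.standard_normal((2, 2)), np.zeros(2),
          rng.standard_normal((2, 2)), np.zeros(2))
print(empirical_risk(params, X, Y))
```

The ∃R-hardness result concerns the decision version of this problem: given the data, does some assignment of weights and biases achieve risk at most a given target?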
Pages: 16
Related Papers
50 records in total
  • [1] Training Neural Networks is ∃R-complete
    Abrahamsen, Mikkel
    Kleist, Linda
    Miltzow, Tillmann
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [2] A homotopy training algorithm for fully connected neural networks
    Chen, Qipin
    Hao, Wenrui
    PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, 2019, 475 (2231)
  • [3] The r-complete partitions
    Park, SK
    DISCRETE MATHEMATICS, 1998, 183 (1-3) : 293 - 297
  • [4] The Art Gallery Problem Is ∃R-Complete
    Abrahamsen, Mikkel
    Adamaszek, Anna
    Miltzow, Tillmann
    STOC'18: PROCEEDINGS OF THE 50TH ANNUAL ACM SIGACT SYMPOSIUM ON THEORY OF COMPUTING, 2018, : 65 - 73
  • [5] Geometric Thickness of Multigraphs is ∃R-Complete
    Forster, Henry
    Kindermann, Philipp
    Miltzow, Tillmann
    Parada, Irene
    Terziadis, Soeren
    Vogtenhuber, Birgit
    LATIN 2024: THEORETICAL INFORMATICS, PT I, 2024, 14578 : 336 - 349
  • [6] Differentiable homotopy methods for gradually reinforcing the training of fully connected neural networks
    Li, Peixuan
    Li, Yuanbo
    NEUROCOMPUTING, 2024, 605
  • [7] Distributed Learning of Fully Connected Neural Networks using Independent Subnet Training
    Yuan, Binhang
    Wolfe, Cameron R.
    Dun, Chen
    Tang, Yuxin
    Kyrillidis, Anastasios
    Jermaine, Chris
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (08): 1581 - 1590
  • [8] A Kind of R-Complete Category for R-Posets
    Wu, Li-gang
    Fan, Lei
    Gao, Yang
    INTERNATIONAL JOINT CONFERENCE ON COMPUTATIONAL SCIENCES AND OPTIMIZATION, VOL 1, PROCEEDINGS, 2009: 639+
  • [9] Representing Matroids over the Reals is ∃R-complete
    Kim, Eun Jung
    de Mesmay, Arnaud
    Miltzow, Tillmann
    DISCRETE MATHEMATICS AND THEORETICAL COMPUTER SCIENCE, 2024, 26 (02)
  • [10] Spectrum Analysis for Fully Connected Neural Networks
    Jia, Bojun
    Zhang, Yanjun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (12) : 10091 - 10104