Back-propagation is not efficient

Cited by: 22
Author
Sima, J
Affiliations
[1] Acad. of Sci. of the Czech Republic
[2] Dept. of Theoretical Informatics, Institute of Computer Science, Acad. of Sci. of the Czech Republic, 182 07 Prague 8
Keywords
learning theory; loading problem; NP-hardness; standard sigmoid function; back-propagation; nonlinear programming;
DOI
10.1016/0893-6080(95)00135-2
CLC classification
TP18 [Artificial intelligence theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The back-propagation learning algorithm for multi-layered neural networks, which is often successfully used in practice, appears very time consuming even for small network architectures or training tasks. However, no results are yet known concerning the complexity of this algorithm. Blum and Rivest proved that training even a three-node network is NP-complete for the case when a neuron computes the discrete linear threshold function. We generalize the technique from their NP-hardness proof for a continuous sigmoidal function used in back-propagation. We show that training a three-node sigmoid network with an additional constraint on the output neuron function (e.g., zero threshold) is NP-hard. As a consequence of this, we find training sigmoid feedforward networks, with a single hidden layer and with zero threshold of output neuron, to be intractable. This implies that back-propagation is generally not an efficient algorithm, unless at least P = NP. We take advantage of these results by showing the NP-hardness of a special nonlinear programming problem. Copyright (C) 1996 Elsevier Science Ltd.
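For orientation, the object of the hardness result can be made concrete. Below is a minimal sketch (not taken from the paper) of gradient-descent back-propagation on a three-node architecture of the kind the abstract discusses: two sigmoid hidden units and one sigmoid output unit whose threshold is held at zero, matching the paper's constraint on the output neuron. The task (XOR), learning rate, epoch count, and all function names are illustrative assumptions; the NP-hardness result concerns the worst-case loading problem, not this toy run.

```python
import math
import random

def sigmoid(z):
    """Standard sigmoid 1 / (1 + e^(-z)), the activation treated in the paper."""
    return 1.0 / (1.0 + math.exp(-z))

def train_xor(epochs=20000, lr=1.0, seed=1):
    """Back-propagation on XOR for a three-node sigmoid network.

    Two hidden units carry adjustable biases; the output unit has no bias
    term, i.e. zero threshold, mirroring the constraint in the paper's
    NP-hardness construction. All hyper-parameters are illustrative.
    """
    rng = random.Random(seed)
    w_h = [[rng.uniform(-1.0, 1.0) for _ in range(3)] for _ in range(2)]  # [w1, w2, bias]
    w_o = [rng.uniform(-1.0, 1.0) for _ in range(2)]  # output weights, zero threshold
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(epochs):
        for (x1, x2), t in data:
            # forward pass
            h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w_h]
            y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1])
            # backward pass for squared error E = (y - t)^2 / 2
            d_y = (y - t) * y * (1.0 - y)  # dE/dnet at the output unit
            for j in range(2):
                d_h = d_y * w_o[j] * h[j] * (1.0 - h[j])  # error at hidden unit j
                w_o[j] -= lr * d_y * h[j]
                w_h[j][0] -= lr * d_h * x1
                w_h[j][1] -= lr * d_h * x2
                w_h[j][2] -= lr * d_h
    return w_h, w_o

def predict(w_h, w_o, x1, x2):
    """Forward pass through the trained three-node network."""
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w_h]
    return sigmoid(w_o[0] * h[0] + w_o[1] * h[1])
```

On a fixed small task like this, gradient descent may or may not reach a good fit depending on the random initialization; the paper's point is that no algorithm, back-propagation included, can solve the general loading problem for such architectures efficiently unless P = NP.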
Pages: 1017 - 1023
Page count: 7
Related papers
50 records total
  • [1] An Interpretation of Forward-Propagation and Back-Propagation of DNN
    Xie, Guotian
    Lai, Jianhuang
    PATTERN RECOGNITION AND COMPUTER VISION, PT II, 2018, 11257 : 3 - 15
  • [2] WS-BP: An Efficient Wolf Search Based Back-Propagation Algorithm
    Nawi, Nazri Mohd
    Rehman, M. Z.
    Khan, Abdullah
    INTERNATIONAL CONFERENCE ON MATHEMATICS, ENGINEERING AND INDUSTRIAL APPLICATIONS 2014 (ICOMEIA 2014), 2015, 1660
  • [3] Chicken S-BP: An Efficient Chicken Swarm Based Back-Propagation Algorithm
    Khan, Abdullah
    Nawi, Nazri Mohd
    Shah, Rahmat
    Akhter, Nasreen
    Ullah, Atta
    Rehman, M. Z.
    AbdulHamid, Norhamreeza
    Chiroma, Haruna
    RECENT ADVANCES ON SOFT COMPUTING AND DATA MINING, 2017, 549 : 122 - 129
  • [4] An extension of the back-propagation algorithm to complex numbers
    Nitta, T
    NEURAL NETWORKS, 1997, 10 (08) : 1391 - 1415
  • [5] Geometric Back-Propagation in Morphological Neural Networks
    Groenendijk, Rick
    Dorst, Leo
    Gevers, Theo
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 14045 - 14051
  • [6] Hybrid back-propagation training with evolutionary strategies
    José Parra
    Leonardo Trujillo
    Patricia Melin
    Soft Computing, 2014, 18 : 1603 - 1614
  • [7] BEAM-BASED BACK-PROPAGATION IMAGING
    Heilpern, T.
    Shlivinski, A.
    Heyman, E.
    ICEAA: 2009 INTERNATIONAL CONFERENCE ON ELECTROMAGNETICS IN ADVANCED APPLICATIONS, VOLS 1 AND 2, 2009, : 156 - +
  • [8] DESIGN OPTIMIZATION WITH BACK-PROPAGATION NEURAL NETWORKS
    LEE, SJ
    HENZER, C
    JOURNAL OF INTELLIGENT MANUFACTURING, 1991, 2 (05) : 293 - 303
  • [9] Multi-output incremental back-propagation
    Rachana Chaudhari
    Dhwani Agarwal
    Kritika Ravishankar
    Nikita Masand
    Vijay K. Sambhe
    Sandeep S. Udmale
    Neural Computing and Applications, 2023, 35 : 14897 - 14910
  • [10] Awesome back-propagation machine learning paradigm
    Assem Badr
    Neural Computing and Applications, 2021, 33 : 13225 - 13249