Training the Feedforward Neural Network Using Unconscious Search

Cited by: 0
Authors
Amin-Naseri, M. R. [1]
Ardjmand, E. [2]
Weckman, G. [2]
Affiliations
[1] Tarbiat Modares Univ, Tehran, Iran
[2] Ohio Univ, Athens, OH 45701 USA
Source
2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013
Keywords
GENETIC ALGORITHM; BACKPROPAGATION; OPTIMIZATION; PERFORMANCE
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
One of the most widely used neural networks (NN) is the feedforward neural network (FNN). FNNs are most frequently applied to recognizing nonlinear patterns and, as a nonparametric method, to function estimation, especially in forecasting. In this study we illustrate how a new metaheuristic algorithm known as Unconscious Search (US) can be used to train a feedforward neural network. US is a multi-start, memory-based, structured search algorithm that simulates the process of psychoanalytic psychotherapy. The theory of psychoanalysis, propounded by Sigmund Freud, is generally regarded as a descriptive and highly objective account of the mechanisms involved in psychological processes. This paper describes an analogy between the practice of psychoanalysis and the treatment of optimization problems, and applies US to the problem of training a neural network. To this end, we first introduce US briefly, then propose an application of US to training an FNN, solve two benchmark problems, and compare the results of US with those of other metaheuristic algorithms.
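
To make the setup concrete, the sketch below shows one way a multi-start, memory-based metaheuristic can be wired to FNN training: the network's weights and biases are flattened into a single parameter vector, mean squared error on a training set serves as the cost, and the search perturbs candidate vectors while keeping a small memory of good solutions for recombination. This is only an illustrative Python sketch; the toy XOR data, the 2-3-1 network shape, and the multi_start_search routine are assumptions made here for demonstration, and the code does not reproduce the actual Unconscious Search operators or the benchmarks used in the paper.

# Illustrative sketch only: a generic multi-start, memory-based search over the
# flattened weights of a small feedforward network. It is NOT the Unconscious
# Search algorithm itself; the network shape, data, and parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data, used here only as a stand-in training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Assumed 2-3-1 network: shapes of W1, b1, W2, b2.
sizes = [(2, 3), (3,), (3, 1), (1,)]
n_params = sum(int(np.prod(s)) for s in sizes)

def unpack(theta):
    """Reshape a flat parameter vector into the network's weights and biases."""
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def mse(theta):
    """Forward pass followed by mean squared error, the cost being minimized."""
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out - y) ** 2))

def multi_start_search(n_starts=20, n_iters=500, memory_size=5):
    """Generic multi-start search keeping a small memory of elite solutions."""
    memory = []  # list of (cost, theta), best first
    for _ in range(n_starts):
        theta = rng.normal(scale=1.0, size=n_params)
        cost, step = mse(theta), 0.5
        for _ in range(n_iters):
            # Local perturbation; occasionally recombine with a remembered elite.
            if memory and rng.random() < 0.2:
                elite = memory[rng.integers(len(memory))][1]
                cand = 0.5 * (theta + elite) + rng.normal(scale=step, size=n_params)
            else:
                cand = theta + rng.normal(scale=step, size=n_params)
            cand_cost = mse(cand)
            if cand_cost < cost:
                theta, cost = cand, cand_cost
            else:
                step *= 0.999  # slowly narrow the search neighbourhood
        memory.append((cost, theta))
        memory.sort(key=lambda t: t[0])
        memory = memory[:memory_size]
    return memory[0]

best_cost, best_theta = multi_start_search()
print(f"best training MSE: {best_cost:.4f}")

The point of this formulation is that the network exposes only a scalar cost over the flat weight vector, so any derivative-free metaheuristic, Unconscious Search included, can be dropped in place of the search routine above.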
Pages: 7
Related Papers (50 in total)
  • [1] Feedforward neural network training using intelligent global harmony search
    Tavakoli, Saeed
    Valian, Ehsan
    Mohanna, Shahram
    Evolving Systems, 2012, 3 (02) : 125 - 131
  • [2] Optimizing the Learning Process of Feedforward Neural Networks Using Lightning Search Algorithm
    Faris, Hossam
    Aljarah, Ibrahim
    Al-Madi, Nailah
    Mirjalili, Seyedali
    INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2016, 25 (06)
  • [3] Training Optimization of Feedforward Neural Network for Binary Classification
    Thawakar, Omkar
    Gajjewar, Pranav
    2019 INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND INFORMATICS (ICCCI - 2019), 2019,
  • [4] Optimizing Deep Feedforward Neural Network Architecture: A Tabu Search Based Approach
    Gupta, Tarun Kumar
    Raza, Khalid
    NEURAL PROCESSING LETTERS, 2020, 51 (03) : 2855 - 2870
  • [5] An improved training algorithm for feedforward neural network learning based on terminal attractors
    Yu, Xinghuo
    Wang, Bin
    Batbayar, Batsukh
    Wang, Liuping
    Man, Zhihong
    JOURNAL OF GLOBAL OPTIMIZATION, 2011, 51 (02) : 271 - 284
  • [6] Training Feedforward Neural Networks using Hybrid Flower Pollination-Gravitational Search Algorithm
    Chakraborty, Dwaipayan
    Saha, Sankhadip
    Maity, Samaresh
    2015 1ST INTERNATIONAL CONFERENCE ON FUTURISTIC TRENDS ON COMPUTATIONAL ANALYSIS AND KNOWLEDGE MANAGEMENT (ABLAZE), 2015, : 292 - 297
  • [7] Improved monarch butterfly optimization for unconstrained global search and neural network training
    Faris, Hossam
    Aljarah, Ibrahim
    Mirjalili, Seyedali
    APPLIED INTELLIGENCE, 2018, 48 (02) : 445 - 464
  • [8] Training feedforward neural networks using neural networks and genetic algorithms
    Tellez, P
    Tang, Y
    INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATIONS AND CONTROL TECHNOLOGIES, VOL 1, PROCEEDINGS, 2004, : 308 - 311
  • [9] A weight initialization method for improving training speed in feedforward neural network
    Yam, JYF
    Chow, TWS
    NEUROCOMPUTING, 2000, 30 (1-4) : 219 - 232