A convergence instability analysis of neural networks applications in financial data sets

Cited: 0
Author
John, S. [1]
Affiliation
[1] RMIT Univ, Dept Mech & Mfg Engn, Bundoora, Vic 3083, Australia
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL I AND II | 1999
Keywords
neural networks; convergence; artificial intelligence; financial performance; prediction; share price;
DOI
Not available
Chinese Library Classification (CLC) code
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The investigative route taken here is that of a well-proven technique of 'learning', albeit by a machine, using Artificial Neural Networks (ANN). A proprietary software package was used in the study to chart the Share Price (SPRCE) and Earnings per Share (EpS) of two Australian banks. Key performance data were obtained for several companies over a period of five years, and this information was used to 'train' the network to reproduce the listed SPRCE and EpS of these companies over the stated time period. The intention is that, given anticipated key financial and performance information, some indication of the SPRCE or EpS could be inferred and investment decisions made accordingly. With certain settings of the computational process, convergence problems were experienced during the data set training exercise; these unstable runs naturally resulted in less than satisfactory predictions. The use of a limited version of this proprietary package, however, resulted in some success in 'learning'. The question remains, however, of the credibility or robustness of such decision-making aids. It is argued here that, while some credibility can be given to results within certain market types, such as bear, steady, or bull markets, it is virtually impossible to generate near-accurate predictive trends for markets as a whole. Some solutions to this dilemma are presented in this paper.
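The abstract's central observation, that the same data set may train successfully or fail to converge depending on the settings of the computational process, can be reproduced with a minimal sketch. The proprietary package used in the paper is not described here, so the example below is only an assumption-laden illustration: plain NumPy, synthetic stand-in "key performance" features, a one-hidden-layer tanh network, and two arbitrary learning rates, none of which come from the paper itself.

```python
# Minimal sketch (not the paper's proprietary package): a one-hidden-layer
# network trained by plain gradient descent on synthetic "key performance"
# features to predict a share price.  Feature count, network size, and the
# two learning rates are illustrative assumptions, chosen only to show how
# one training setting converges while another becomes unstable.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for five years of key performance data; the target is
# a noisy function of the features standing in for the listed share price.
X = rng.normal(size=(60, 4))
y = (1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * np.tanh(X[:, 2]) +
     0.1 * rng.normal(size=60)).reshape(-1, 1)

def train(lr, epochs=500, hidden=8):
    """Train a tanh MLP with squared-error loss; return the loss history."""
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = np.zeros(1)
    losses = []
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # hidden activations
        pred = h @ W2 + b2                # predicted share price
        err = pred - y
        losses.append(float(np.mean(err ** 2)))
        # Backpropagation of the mean squared error.
        grad_pred = 2 * err / len(X)
        grad_W2 = h.T @ grad_pred
        grad_b2 = grad_pred.sum(axis=0)
        grad_h = grad_pred @ W2.T * (1 - h ** 2)
        grad_W1 = X.T @ grad_h
        grad_b1 = grad_h.sum(axis=0)
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        W2 -= lr * grad_W2; b2 -= lr * grad_b2
    return losses

# One stable and one unstable setting; the latter typically overflows,
# mirroring the "less than satisfactory" runs described in the abstract.
for lr in (0.01, 2.0):
    losses = train(lr)
    print(f"lr={lr}: first loss {losses[0]:.3f}, last loss {losses[-1]:.3e}")
```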
Pages: 451-457
Number of pages: 7
Related papers
50 records in total
  • [21] Neural networks and their economic applications
    Morajda, J
    ARTIFICIAL INTELLIGENCE AND SECURITY IN COMPUTING SYSTEMS, 2003, 752 : 53 - 62
  • [22] Convergence of multiple deep neural networks for classification with fewer labeled data
    Chuho Yi
    Jungwon Cho
    Personal and Ubiquitous Computing, 2023, 27 : 1055 - 1064
  • [23] Convergence of multiple deep neural networks for classification with fewer labeled data
    Yi, Chuho
    Cho, Jungwon
    PERSONAL AND UBIQUITOUS COMPUTING, 2020, 27 (3) : 1055 - 1064
  • [24] Genetic algorithms to create training data sets for artificial neural networks
    Schwaiger, R
    Mayer, HA
    PROCEEDINGS OF THE THIRD NORDIC WORKSHOP ON GENETIC ALGORITHMS AND THEIR APPLICATIONS (3NWGA), 1997, : 153 - 161
  • [25] Convergence Analysis of Novel Fractional-Order Backpropagation Neural Networks With Regularization Terms
    Ma, Mingjie
    Yang, Jianhui
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (05) : 3039 - 3050
  • [26] Prescribed convergence analysis of recurrent neural networks with parameter variations
    Bao, Gang
    Zeng, Zhigang
    MATHEMATICS AND COMPUTERS IN SIMULATION, 2021, 182 : 858 - 870
  • [27] Convergence analysis of online gradient method for BP neural networks
    Wu, Wei
    Wang, Jian
    Cheng, Mingsong
    Li, Zhengxue
    NEURAL NETWORKS, 2011, 24 (01) : 91 - 98
  • [28] Efficient construction and convergence analysis of sparse convolutional neural networks
    Zhao, Shuai
    Fan, Qinwei
    Dong, Qingmei
    Xing, Zhiwei
    Yang, Xiaofei
    He, Xingshi
    NEUROCOMPUTING, 2024, 597
  • [29] Practical applications of neural networks in texture analysis
    Biebelmann, E
    Koppen, M
    Nickolay, B
    NEUROCOMPUTING, 1996, 13 (2-4) : 261 - 279
  • [30] Convergence of dynamical cell structures neural networks with applications to time series prediction
    Luxemburg, L
    8TH WORLD MULTI-CONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL IX, PROCEEDINGS: COMPUTER SCIENCE AND ENGINEERING: I, 2004, : 252 - 257