BACKPROPAGATION USES PRIOR INFORMATION EFFICIENTLY

Cited by: 20
Authors:
BARNARD, E [1]
BOTHA, EC [1]
Affiliation:
[1] UNIV PRETORIA, DEPT ELECT & ELECTR ENGN, PRETORIA 0002, SOUTH AFRICA
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1993 / Vol. 4 / No. 5
DOI:
10.1109/72.248457
Chinese Library Classification:
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
The ability of neural net classifiers to deal with a priori information is investigated. For this purpose, back-propagation classifiers are trained with data from known distributions with variable a priori probabilities, and their performance on separate test sets is evaluated. It is found that back-propagation employs a priori information in a slightly suboptimal fashion, but that this does not have serious consequences for the performance of this classifier. Furthermore, it is found that the inferior generalization that results when an excessive number of network parameters are used can (partially) be ascribed to this suboptimality.
Pages: 794-802
Page count: 9
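
As a rough illustration of the experiment described in the abstract (not the authors' own setup), the sketch below trains a back-propagation classifier on data drawn from two known Gaussian class-conditional distributions with unequal a priori probabilities and compares its test error with the Bayes-optimal error computed from the true densities and priors. The specific distributions, prior values, and network size are illustrative assumptions.

```python
# Minimal sketch of the kind of experiment the abstract describes:
# train a back-propagation classifier on data from known distributions
# with unequal priors and compare it to the Bayes-optimal rule.
# All distributions, priors, and the network size are assumed values.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Two Gaussian classes with unequal a priori probabilities (assumed).
priors = np.array([0.8, 0.2])
means = [np.array([0.0, 0.0]), np.array([1.5, 1.5])]
cov = np.eye(2)

def sample(n):
    """Draw n labelled points, with labels distributed according to the priors."""
    y = rng.choice(2, size=n, p=priors)
    x = np.stack([rng.multivariate_normal(means[c], cov) for c in y])
    return x, y

x_train, y_train = sample(2000)
x_test, y_test = sample(10000)

# Back-propagation classifier (one hidden layer; size is an assumption).
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(x_train, y_train)
net_err = 1.0 - net.score(x_test, y_test)

# Bayes-optimal rule: choose the class maximising prior * class-conditional density.
post = np.stack([p * multivariate_normal(m, cov).pdf(x_test)
                 for p, m in zip(priors, means)], axis=1)
bayes_err = np.mean(post.argmax(axis=1) != y_test)

print(f"back-propagation test error: {net_err:.3f}")
print(f"Bayes-optimal test error:    {bayes_err:.3f}")
```

Increasing hidden_layer_sizes in this sketch makes it easy to observe the kind of generalization gap relative to the Bayes-optimal error that the abstract attributes, in part, to suboptimal use of the a priori probabilities.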