AN IMPROVED ALGORITHM FOR NEURAL-NETWORK CLASSIFICATION OF IMBALANCED TRAINING SETS

Cited by: 177
Authors
ANAND, R [1 ]
MEHROTRA, KG [1 ]
MOHAN, CK [1 ]
RANKA, S [1 ]
Affiliation
[1] SYRACUSE UNIV,SCH COMP & INFORMAT SCI,SYRACUSE,NY 13244
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 1993 / Vol. 4 / No. 6
Funding
U.S. National Science Foundation
DOI
10.1109/72.286891
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The backpropagation algorithm converges very slowly for two-class problems in which most of the exemplars belong to one dominant class. Our analysis shows that this occurs because the computed net-error gradient vector is so dominated by the larger class that the net error for exemplars of the smaller class increases significantly in the initial iterations; the subsequent rate of convergence of the net error is then very low. We present a modified technique for calculating a direction in weight space that decreases the error for each class. Using this algorithm, we have been able to accelerate the rate of learning for two-class classification problems by an order of magnitude.
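The abstract's core idea, computing a weight-space direction that decreases the error of each class rather than one gradient dominated by the majority class, can be illustrated with a minimal sketch. The logistic model, the normalized-sum combination of per-class gradients, and all names below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def class_loss(w, X, y, c):
    """Mean cross-entropy over the exemplars of class c only."""
    Xc, yc = X[y == c], y[y == c]
    p = sigmoid(Xc @ w)
    eps = 1e-12
    return -np.mean(yc * np.log(p + eps) + (1 - yc) * np.log(1 - p + eps))

def per_class_direction(w, X, y):
    """Combine normalized per-class gradients, so each class contributes
    equally to the update regardless of how many exemplars it has.
    (Illustrative stand-in for the paper's per-class descent direction.)"""
    d = np.zeros_like(w)
    for c in (0, 1):
        Xc, yc = X[y == c], y[y == c]
        p = sigmoid(Xc @ w)
        g = Xc.T @ (p - yc) / len(yc)   # cross-entropy gradient on class c
        n = np.linalg.norm(g)
        if n > 0:
            d += g / n                  # unit weight per class, not per exemplar
    return d

# Toy imbalanced two-class set: 95 majority exemplars vs. 5 minority exemplars.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, size=(95, 2)),
               rng.normal(+1.0, 1.0, size=(5, 2))])
y = np.array([0] * 95 + [1] * 5)

w = np.zeros(2)
for _ in range(200):
    w -= 0.1 * per_class_direction(w, X, y)
```

A plain batch gradient here would be about 19x more influenced by the majority class; normalizing per class is one simple way to keep the minority class's error from growing during the first updates.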
Pages: 962-969 (8 pages)