Classification for high-dimension small-sample data sets based on Kullback-Leibler information measure

Cited by: 0
Authors
Guo, P [1 ]
Lyu, MR [1 ]
Affiliation
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Shatin, Hong Kong, Peoples R China
Source
IC-AI'2000: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOLS I-III | 2000
Keywords
classification; covariance matrix estimation; small sample set with high dimension; smoothing parameter selection; Kullback-Leibler information measure
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
When classifying samples with a Gaussian classifier, the covariance matrix estimated from a small sample set becomes unstable, which degrades classification accuracy. In this paper, we discuss the covariance matrix estimation problem for small sample sets in high-dimensional settings based on the Kullback-Leibler information measure. A new covariance matrix estimator is developed, and a fast, rough formula for estimating the regularization parameter is derived. Experiments investigating classification accuracy with the developed covariance matrix estimator show improved classification accuracy.
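The estimator itself is not reproduced in this record. As a rough illustration only, the following is a minimal sketch, assuming a shrinkage-style regularization toward a scaled identity (in the spirit of regularized discriminant analysis), of how a regularized covariance estimate can stabilize a Gaussian classifier on a small, high-dimensional sample set; the mixing weight lam is a hypothetical stand-in for the paper's KL-based regularization parameter.

# A minimal sketch (an assumption, not the authors' exact estimator): a
# shrinkage-regularized covariance used in a Gaussian classifier.
# `lam` is a hypothetical stand-in for the KL-based regularization parameter.
import numpy as np

def regularized_cov(X, lam):
    # Blend the maximum-likelihood covariance with a scaled identity so the
    # estimate stays well-conditioned when samples are few and dimension is high.
    d = X.shape[1]
    sigma_ml = np.cov(X, rowvar=False, bias=True)
    return (1.0 - lam) * sigma_ml + lam * (np.trace(sigma_ml) / d) * np.eye(d)

def gaussian_score(x, mean, cov):
    # Log Gaussian density up to an additive constant (quadratic discriminant score).
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + diff @ np.linalg.solve(cov, diff))

# Toy usage: two classes, 8 samples each in 20 dimensions, where the plain ML
# covariance would be singular but the regularized one is invertible.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(8, 20))
X1 = rng.normal(0.5, 1.0, size=(8, 20))
lam = 0.3  # hypothetical value; the paper derives a fast rough formula for it
x_new = rng.normal(0.5, 1.0, size=20)
scores = [gaussian_score(x_new, X.mean(axis=0), regularized_cov(X, lam)) for X in (X0, X1)]
print("predicted class:", int(np.argmax(scores)))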
Pages: 1187-1193
Number of pages: 7