Optimization of K-NN by feature weight learning

Times Cited: 0
Authors
Shi, Q [1 ]
Lv, L [1 ]
Chen, H [1 ]
Affiliation
[1] Hebei Univ, Fac Math & Comp Sci, Baoding, Hebei, Peoples R China
Source
PROCEEDINGS OF 2005 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-9 | 2005
Keywords
similarity metrics; feature weight; K-NN;
DOI
None
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In conventional similarity metrics, the Euclidean distance, which involves all attributes, is usually chosen as the similarity measure: the smaller the distance, the greater the similarity. The features of a vector, however, play different roles in describing a sample. Their relative importance can be determined by feature weight learning, that is, by introducing feature weight parameters into the distance formula. Feature weight learning can be viewed as a linear transformation of a set of points in Euclidean space. Numerical experiments with the K-NN algorithm demonstrate the validity of this learning algorithm.
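
To make the idea concrete, below is a minimal Python sketch of K-NN under a weighted Euclidean distance, assuming a fixed weight vector w (the paper's procedure for learning w is not reproduced here, and the example weights are hypothetical). Setting every weight to 1 recovers the ordinary Euclidean distance, and scaling feature i by sqrt(w_i) is the linear transformation the abstract mentions.

import numpy as np

def weighted_distance(x, y, w):
    # Weighted Euclidean distance: d_w(x, y) = sqrt(sum_i w_i * (x_i - y_i)^2).
    return np.sqrt(np.sum(w * (x - y) ** 2))

def knn_classify(X_train, y_train, x, w, k=3):
    # Majority vote among the k training points nearest to x under d_w.
    dists = np.sqrt(np.sum(w * (X_train - x) ** 2, axis=1))
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: the second feature is noise, so a learned weight vector
# would down-weight it (here the weights are set by hand).
X_train = np.array([[0.0, 5.0], [0.1, -3.0], [1.0, 4.0], [1.1, -4.0]])
y_train = np.array([0, 0, 1, 1])
w = np.array([1.0, 0.01])  # hypothetical weights standing in for learned ones
print(knn_classify(X_train, y_train, np.array([0.9, 5.0]), w, k=3))  # -> 1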
Pages: 2828-2831
Number of pages: 4
References
7 items
[1] [Anonymous]. P 19 ANN INT ACM SIG.
[2] [Anonymous]. UCI Repository of Machine Learning Databases [Online]. Available.
[3] Basak J, De RK, Pal SK. Unsupervised feature selection using a neuro-fuzzy approach. PATTERN RECOGNITION LETTERS, 1998, 19(11): 997-1006.
[4] He JY. FDN OPERATIONAL RES, 2000, p. 301.
[5] Mitchell HB, Schaefer PA. A "soft" K-Nearest Neighbor voting scheme. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2001, 16(04): 459-468.
[6] Mitchell TM. MACH LEARN, 1997, p. 230.
[7] Yang Y. P 17 ANN INT ACM SIG, 1994, p. 13.