STUDY ON UNSUPERVISED FEATURE SELECTION METHOD BASED ON EXTENDED ENTROPY

Cited by: 1
Authors
Sun, Zhanquan [1 ]
Li, Feng [2 ]
Huang, Huifen [3 ]
Affiliations
[1] Univ Shanghai Sci & Technol, Minist Educ, Shanghai Key Lab Modern Opt Syst, Engn Res Ctr Opt Instrument & Syst, Shanghai 200093, Peoples R China
[2] Shanghai Univ, Coll Liberal Arts, Dept Hist, Shanghai 200436, Peoples R China
[3] Shandong Yingcai Univ, Jinan, Shandong, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Unsupervised feature selection; extended entropy; information loss; correlation value; INFORMATION;
DOI
10.31577/cai_2019_1_223
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Feature selection techniques aim to find the subset of the original features that is most relevant for clustering, classification, and retrieval, and are an important research topic in pattern recognition and machine learning. Feature selection methods fall into two main classes, supervised and unsupervised; current research concentrates mostly on the supervised ones. Few efficient unsupervised feature selection methods have been developed, because no label information is available and it is therefore difficult to evaluate the selected features. An unsupervised feature selection method based on extended entropy is proposed here. The information loss based on extended entropy is used to measure the correlation between features. The method ensures that each selected feature carries much individual information while sharing little redundant information with the features already selected. Finally, the efficiency of the proposed method is illustrated on several practical datasets.
Pages: 223-239
Page count: 17
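The abstract describes a greedy unsupervised scheme: keep features with large individual information content and little redundancy with the features already chosen. The paper's extended-entropy information-loss measure is not reproduced in this record, so the sketch below is only an illustrative stand-in (an assumption, not the authors' actual criterion), using histogram-based Shannon entropy for individual information and mutual information for redundancy:

```python
import numpy as np

def entropy(x, bins=8):
    """Shannon entropy of one feature, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """Redundancy between two features via a 2-D histogram estimate."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)     # marginal of x, shape (bins, 1)
    py = p.sum(axis=0, keepdims=True)     # marginal of y, shape (1, bins)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz]))

def select_features(X, k, bins=8):
    """Greedy selection: maximize individual entropy minus average
    redundancy with the features selected so far (mRMR-style criterion,
    used here only to illustrate the entropy/redundancy trade-off)."""
    n_features = X.shape[1]
    H = [entropy(X[:, j], bins) for j in range(n_features)]
    selected = [int(np.argmax(H))]        # start with the most informative
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            red = np.mean([mutual_information(X[:, j], X[:, s], bins)
                           for s in selected])
            score = H[j] - red            # information vs. redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

With a near-duplicate pair of features and one independent feature, such a criterion avoids picking both members of the redundant pair; the actual numbers in the paper depend on its extended-entropy definition.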