Unsupervised and Supervised Feature Selection for Incomplete Data via L2,1-Norm and Reconstruction Error Minimization

Times Cited: 0
Authors
Cai, Jun [1 ]
Fan, Linge [1 ]
Xu, Xin [1 ]
Wu, Xinrong [1 ]
Affiliations
[1] Army Engn Univ PLA, Coll Commun Engn, Nanjing 210007, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 17
Funding
National Natural Science Foundation of China;
Keywords
unsupervised feature selection; supervised feature selection; incomplete data; L2,1-norm; reconstruction error; missing data; imputation
DOI
10.3390/app12178752
CLC Number
O6 [Chemistry];
Discipline Code
0703 ;
Abstract
Feature selection is widely used in machine learning and data mining because it alleviates the so-called curse of dimensionality in high-dimensional data. However, previous feature selection methods have been designed under the assumption that all the information in a data set can be observed. In this paper, we propose unsupervised and supervised feature selection methods for incomplete data, further introducing an L2,1-norm and a reconstruction error minimization method. Specifically, the proposed feature selection objective functions exploit an indicator matrix that marks the unobserved entries of an incomplete data set, and we impose pairwise constraints while simultaneously minimizing a robust L2,1-norm loss function and performing error reconstruction. Furthermore, we derive two alternating iterative algorithms to optimize the proposed objective functions efficiently, and the convergence of the proposed algorithms is proven theoretically. Extensive experiments on both real and synthetic incomplete data sets demonstrate the performance of the proposed methods.
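The abstract describes an objective that combines an L2,1-norm loss with an indicator matrix masking unobserved entries. The exact formulation is not given in this record; the sketch below is only an illustrative (hypothetical) version of those two ingredients: the L2,1-norm as the sum of row-wise Euclidean norms, and a reconstruction residual restricted to observed entries via a binary indicator matrix `S`. The projection matrix `W` is likewise an assumed placeholder, not the authors' variable.

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm: sum of the Euclidean (L2) norms of the rows of W.

    Minimizing this over a projection matrix encourages whole rows to
    shrink to zero, which is what makes it useful for feature selection.
    """
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def masked_reconstruction_error(X, W, S):
    """L2,1-norm of the reconstruction residual X - XW on observed entries.

    S is a binary indicator matrix (1 = observed, 0 = missing), mirroring
    the record's mention of an indicator matrix for incomplete data.
    This is an illustrative objective, not the paper's exact formulation.
    """
    R = (X - X @ W) * S  # zero out residuals at unobserved positions
    return l21_norm(R)

# Toy usage: with W = I the data reconstruct themselves exactly,
# so the masked residual vanishes for any observation pattern S.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
S = np.array([[1.0, 0.0], [1.0, 1.0]])  # one entry marked as missing
print(masked_reconstruction_error(X, np.eye(2), S))
```

In a full method, terms like these would be weighted and minimized jointly over `W` with an alternating scheme, since the row-wise L2,1-norm is convex but not smooth at zero rows.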
Pages: 21
Related Papers (50 records)
  • [1] Yang, Shizhun; Hou, Chenping; Nie, Feiping; Wu, Yi. Unsupervised maximum margin feature selection via L2,1-norm minimization. Neural Computing and Applications, 2012, 21(07): 1791-1799
  • [2] Liu, Yang; Wang, Yizhou. Unsupervised Discriminative Feature Selection in a Kernel Space via L2,1-Norm Minimization. 21st International Conference on Pattern Recognition (ICPR 2012), 2012: 1205-1208
  • [3] Yang, Zhangjing; Ye, Qiaolin; Chen, Qiao; Ma, Xu; Fu, Liyong; Yang, Guowei; Yan, He; Liu, Fan. Robust discriminant feature selection via joint L2,1-norm distance minimization and maximization. Knowledge-Based Systems, 2020, 207
  • [4] Hu, Haojie; Wang, Rong; Nie, Feiping; Yang, Xiaojun; Yu, Weizhong. Fast unsupervised feature selection with anchor graph and L2,1-norm regularization. Multimedia Tools and Applications, 2018, 77(17): 22099-22113
  • [5] Zhao, Hong; Yu, Shenglong. Cost-sensitive feature selection via the L2,1-norm. International Journal of Approximate Reasoning, 2019, 104: 25-37
  • [6] Li, Xiangrui; Zhu, Dongxiao. Robust feature selection via L2,1-norm in finite mixture of regression. Pattern Recognition Letters, 2018, 108: 15-22
  • [7] Peng, Yali; Sehdev, Paramjit; Liu, Shigang; Lie, Jun; Wang, Xili. L2,1-norm minimization based negative label relaxation linear regression for feature selection. Pattern Recognition Letters, 2018, 116: 170-178
  • [8] Zhou, Youpeng; Ding, Yulin; Luo, Yifu; Ren, Haoliang. Sparse Neighborhood Preserving Embedding via L2,1-Norm Minimization. Proceedings of the 9th International Symposium on Computational Intelligence and Design (ISCID), Vol. 2, 2016: 378-382
  • [9] Lan, Gongmin; Hou, Chenping; Yi, Dongyun. Robust Feature Selection via Simultaneous Capped L2-Norm and L2,1-Norm Minimization. Proceedings of the 2016 IEEE International Conference on Big Data Analysis (ICBDA), 2016: 147-151