Feature Selection with L1 Regularization in Formal Neurons

Citations: 0
Author
Bobrowski, Leon [1 ,2 ]
Affiliations
[1] Bialystok Tech Univ, Fac Comp Sci, Wiejska 45A, Bialystok, Poland
[2] Inst Biocybernet & Biomed Engn, PAS, Warsaw, Poland
Source
ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2024 | 2024, Vol. 2141
Keywords
high-dimensional data sets; formal neurons with a margin; feature selection; CPL criterion functions; L1 regularization
DOI
10.1007/978-3-031-62495-7_26
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Designing classifiers on high-dimensional learning data sets is an important task in artificial intelligence applications. Designing classifiers for high-dimensional data involves learning hierarchical neural networks combined with feature selection. Feature selection aims to omit features that are unnecessary for a given problem. Feature selection in formal neurons can be achieved by minimizing convex and piecewise linear (CPL) criterion functions with L1 regularization. Minimizing CPL criterion functions can be associated with computations on a finite number of vertices in the parameter space.
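The idea summarized in the abstract can be illustrated with a minimal sketch. The paper minimizes CPL criterion functions over vertices in parameter space; the sketch below instead uses a hinge-type CPL penalty with unit margin and a proximal subgradient step (soft-thresholding for the L1 term), which is an assumption made here for a self-contained example, not the paper's algorithm. All function names and the toy data are hypothetical.

```python
# Hypothetical sketch (NOT the paper's vertex-based method): minimize a
# CPL-style criterion  Phi(w) = sum_j max(0, 1 - y_j * <w, x_j>) + lam * ||w||_1
# by subgradient steps on the hinge part followed by soft-thresholding for
# the L1 term; L1 regularization drives weights of unnecessary features to zero.

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def soft_threshold(v, t):
    # proximal operator of t * |.|
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def minimize_cpl_l1(X, y, lam, lr=0.05, steps=2000):
    n_feat = len(X[0])
    w = [0.0] * n_feat
    for _ in range(steps):
        # subgradient of the convex piecewise-linear (hinge) part
        g = [0.0] * n_feat
        for xj, yj in zip(X, y):
            if yj * dot(w, xj) < 1.0:  # margin violated by sample j
                for i in range(n_feat):
                    g[i] -= yj * xj[i]
        # gradient step on the CPL part, then proximal step for the L1 term
        w = [soft_threshold(wi - lr * gi, lr * lam) for wi, gi in zip(w, g)]
    return w

# Toy data: feature 0 separates the two classes; feature 1 is pure noise.
X = [[2.0, 0.3], [1.5, -0.4], [-2.0, 0.5], [-1.7, -0.2]]
y = [1, 1, -1, -1]
w = minimize_cpl_l1(X, y, lam=0.5)
print(w)  # the weight on the noise feature is suppressed toward zero
```

Features whose weights end up at exactly zero are the "unnecessary" ones the abstract refers to: they can be omitted without changing the neuron's decision rule.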
Pages: 343-353
Page count: 11
Related Papers
50 records in total
  • [31] An Interior Point Method for L1/2-SVM and Application to Feature Selection in Classification
    Yao, Lan
    Zhang, Xiongji
    Li, Dong-Hui
    Zeng, Feng
    Chen, Haowen
    JOURNAL OF APPLIED MATHEMATICS, 2014
  • [32] Embedded heterogeneous feature selection for conjoint analysis: A SVM approach using L1 penalty
    Maldonado, Sebastian
    Montoya, Ricardo
    Lopez, Julio
    APPLIED INTELLIGENCE, 2017, 46 (04) : 775 - 787
  • [33] l1/2,1 group sparse regularization for compressive sensing
    Liu, Shengcai
    Zhang, Jiangshe
    Liu, Junmin
    Yin, Qingyan
    SIGNAL IMAGE AND VIDEO PROCESSING, 2016, 10 (05) : 861 - 868
  • [34] SPARSE REPRESENTATION LEARNING OF DATA BY AUTOENCODERS WITH L1/2 REGULARIZATION
    Li, F.
    Zurada, J. M.
    Wu, W.
    NEURAL NETWORK WORLD, 2018, 28 (02) : 133 - 147
  • [35] Sparse kernel logistic regression based on L1/2 regularization
    Xu Chen
    Peng ZhiMing
    Jing WenFeng
    SCIENCE CHINA-INFORMATION SCIENCES, 2013, 56 (04) : 1 - 16
  • [36] A Sharp Nonasymptotic Bound and Phase Diagram of L1/2 Regularization
    Zhang, Hai
    Xu, Zong Ben
    Wang, Yao
    Chang, Xiang Yu
    Liang, Yong
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2014, 30 (07) : 1242 - 1258
  • [39] Structured Pruning of Convolutional Neural Networks via L1 Regularization
    Yang, Chen
    Yang, Zhenghong
    Khattak, Abdul Mateen
    Yang, Liu
    Zhang, Wenxin
    Gao, Wanlin
    Wang, Minjuan
    IEEE ACCESS, 2019, 7 : 106385 - 106394