Feature Selection with L1 Regularization in Formal Neurons

Cited by: 0
Authors
Bobrowski, Leon [1 ,2 ]
Affiliations
[1] Bialystok Tech Univ, Fac Comp Sci, Wiejska 45A, Bialystok, Poland
[2] Inst Biocybernet & Biomed Engn, PAS, Warsaw, Poland
Source
ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2024 | 2024, Vol. 2141
Keywords
high-dimensional data sets; formal neurons with a margin; feature selection; CPL criterion functions; L1 regularization
DOI
10.1007/978-3-031-62495-7_26
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Designing classifiers on high-dimensional learning data sets is an important task in artificial intelligence applications. For such data, classifier design combines the learning of hierarchical neural networks with feature selection, which aims to omit features that are unnecessary for a given problem. Feature selection in formal neurons can be achieved by minimizing convex and piecewise linear (CPL) criterion functions with L1 regularization. Because these criterion functions are convex and piecewise linear, their minimization can be reduced to computations on a finite number of vertices in the parameter space.
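The record does not reproduce the paper's exact CPL criterion or its basis-exchange algorithm, but the idea can be illustrated with a minimal sketch: a perceptron-style hinge criterion with a margin is convex and piecewise linear, and adding an L1 penalty makes the whole objective an LP, whose optimum sits at a vertex of the feasible polyhedron, echoing the abstract's vertex remark. The function name `cpl_l1_minimize` and the parameters `margin` and `lam` below are illustrative assumptions, and the LP solver stands in for the author's dedicated vertex-search method.

```python
import numpy as np
from scipy.optimize import linprog

def cpl_l1_minimize(X, y, margin=1.0, lam=0.1):
    """Minimize a perceptron-style CPL criterion with an L1 penalty
    by linear programming (hypothetical sketch, not the paper's
    basis-exchange algorithm).

    Criterion: sum_j max(0, margin - y_j * (w @ x_j - theta))
               + lam * ||w||_1
    LP variables: w = u - v with u, v >= 0 (linearizes |w_i|),
    theta = tp - tm with tp, tm >= 0, and slacks xi_j >= 0.
    """
    n, d = X.shape
    # objective: lam on u and v (L1 penalty), 0 on the threshold, 1 on slacks
    c = np.concatenate([lam * np.ones(2 * d), np.zeros(2), np.ones(n)])
    # hinge constraints y_j * ((u - v) @ x_j - (tp - tm)) + xi_j >= margin,
    # rewritten in linprog's A_ub @ z <= b_ub form
    Yx = y[:, None] * X
    A_ub = np.hstack([-Yx, Yx, y[:, None], -y[:, None], -np.eye(n)])
    b_ub = -margin * np.ones(n)
    bounds = [(0, None)] * (2 * d + 2 + n)  # all LP variables nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    w = res.x[:d] - res.x[d:2 * d]
    theta = res.x[2 * d] - res.x[2 * d + 1]
    selected = np.flatnonzero(np.abs(w) > 1e-8)  # features the L1 term kept
    return w, theta, selected

# toy usage: two informative features, three noise features
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])  # labels in {-1, +1}
w, theta, selected = cpl_l1_minimize(X, y, margin=1.0, lam=0.5)
print("selected features:", selected)
```

With a sufficiently large `lam`, the solver drives the weights of uninformative features to exactly zero, which is the feature-selection effect the abstract describes.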
Pages: 343-353
Number of pages: 11