A neurodynamic optimization approach to supervised feature selection via fractional programming

Cited by: 24
Authors
Wang, Yadi [1 ,2 ,3 ]
Li, Xiaoping [3 ,4 ]
Wang, Jun [5 ,6 ]
Affiliations
[1] Henan Univ, Henan Key Lab Big Data Anal & Proc, Kaifeng 475004, Peoples R China
[2] Henan Univ, Inst Data & Knowledge Engn, Sch Comp & Informat Engn, Kaifeng 475004, Peoples R China
[3] Southeast Univ, Sch Comp Sci & Engn, Nanjing 211189, Peoples R China
[4] Southeast Univ, Minist Educ, Key Lab Comp Network & Informat Integrat, Nanjing 211189, Peoples R China
[5] City Univ Hong Kong, Dept Comp Sci, Kowloon, Hong Kong, Peoples R China
[6] City Univ Hong Kong, Sch Data Sci, Kowloon, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature selection; Information-theoretic measures; Fractional programming; Neurodynamic optimization; RECURRENT NEURAL-NETWORK; LIMITING ACTIVATION FUNCTION; CONSTRAINED OPTIMIZATION; PSEUDOCONVEX OPTIMIZATION; NONLINEAR OPTIMIZATION; MUTUAL INFORMATION; DESIGN;
DOI
10.1016/j.neunet.2021.01.004
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Feature selection is an important issue in machine learning and data mining. Most existing feature selection methods are greedy in nature and are thus prone to sub-optimality. Although some global feature selection methods based on unsupervised redundancy minimization can improve clustering performance, their efficacy for classification may be limited. In this paper, a neurodynamics-based holistic feature selection approach is proposed via feature redundancy minimization and relevance maximization. An information-theoretic similarity coefficient matrix is defined based on multi-information and entropy to measure feature redundancy with respect to class labels. Supervised feature selection is formulated as a fractional programming problem based on the similarity coefficients. A neurodynamic approach based on two one-layer recurrent neural networks is developed for solving the formulated feature selection problem. Experimental results on eight benchmark datasets are discussed to demonstrate the global convergence of the neural networks and the superiority of the proposed neurodynamic approach over several existing feature selection methods in terms of classification accuracy, precision, recall, and F-measure. (C) 2021 Elsevier Ltd. All rights reserved.
Pages: 194-206
Number of pages: 13
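
A minimal sketch of the kind of information-theoretic fractional objective the abstract describes, assuming discrete features and plug-in (counting) estimates of entropy and mutual information. The function names, the +1 smoothing term in the denominator, and the greedy forward search are illustrative assumptions, not the authors' formulation: the paper defines a similarity coefficient matrix from multi-information and entropy and solves the resulting fractional program with two one-layer recurrent neural networks rather than a greedy search.

```python
import numpy as np


def entropy(x):
    """Shannon entropy (in nats) of a discrete 1-D array, estimated by counting."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))


def joint_entropy(x, y):
    """Joint entropy H(x, y) of two discrete 1-D arrays."""
    _, counts = np.unique(np.column_stack([x, y]), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))


def mutual_info(x, y):
    """Mutual information I(x; y) = H(x) + H(y) - H(x, y)."""
    return entropy(x) + entropy(y) - joint_entropy(x, y)


def fractional_score(X, y, subset):
    """Relevance / redundancy ratio for a candidate feature subset.

    Illustrative fractional objective only: sum of feature-label mutual
    information divided by (1 + sum of pairwise feature-feature mutual
    information). The +1 term is a hypothetical choice that avoids
    division by zero for single-feature subsets.
    """
    relevance = sum(mutual_info(X[:, j], y) for j in subset)
    redundancy = sum(mutual_info(X[:, i], X[:, j])
                     for i in subset for j in subset if i < j)
    return relevance / (1.0 + redundancy)


def greedy_select(X, y, k):
    """Greedy forward search over the fractional score (baseline only; the
    paper solves its fractional program globally via recurrent neural
    networks rather than greedily)."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = max(remaining, key=lambda j: fractional_score(X, y, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 8))   # discrete synthetic features
    y = (X[:, 0] + X[:, 3]) % 2             # label driven by features 0 and 3
    print(greedy_select(X, y, k=3))
```

In the synthetic example, only features 0 and 3 determine the label, so a reasonable selector should rank them first; continuous features would need discretization (or a different estimator) before the counting-based entropies above apply.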