Unsupervised Feature Selection for High-Order Embedding Learning and Sparse Learning

Cited: 0
Authors
Hu, Zebiao [1 ]
Wang, Jian [2 ]
Mandziuk, Jacek [3 ,4 ]
Ren, Zhongxin [5 ]
Pal, Nikhil R. [6 ,7 ]
Affiliations
[1] China Univ Petr East China, Coll Control Sci & Engn, Qingdao 266580, Peoples R China
[2] China Univ Petr East China, Coll Sci, Qingdao 266580, Peoples R China
[3] Warsaw Univ Technol, Fac Math & Informat Sci, PL-00662 Warsaw, Poland
[4] AGH Univ Krakow, Fac Comp Sci, PL-30059 Krakow, Poland
[5] West East Gas Pipeline Co, Natl Petr & Nat Gas Pipeline Network Grp, Tech Res Sci & Technol Innovat & Management Gas St, Beijing 100000, Peoples R China
[6] Techno India Univ, Comp Sci & Engn Dept, Kolkata 700091, India
[7] South Asian Univ, Dept Math, New Delhi 110068, India
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Optimization; Sparse matrices; Filtering algorithms; Clustering algorithms; Petroleum; Linear programming; Learning systems; High dimensional data; Classification algorithms; Embedding learning; feature selection; geometric structure; high-order similarity; sparse learning; REGRESSION;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation and Computer Technology];
Discipline code
0812 ;
Abstract
Most unsupervised feature selection methods explore only the first-order similarity of the data while ignoring the high-order similarity of the instances, which makes it easy to construct a suboptimal similarity graph. Moreover, such methods are often unsuitable for feature selection because of their high computational complexity, especially when the dimensionality of the data is high. To address these issues, a novel method, termed unsupervised feature selection for high-order embedding learning and sparse learning (UFSHS), is proposed to select useful features. More concretely, UFSHS first takes advantage of the high-order similarity of the original input to construct an optimal similarity graph that accurately reveals the essential geometric structure of high-dimensional data. Furthermore, it constructs a unified framework, integrating high-order embedding learning and sparse learning, to learn an appropriate projection matrix with row sparsity, which helps to select an optimal subset of features. Moreover, we design a novel alternating optimization method that applies different optimization strategies according to the relationship between the number of instances and the dimensionality, which significantly reduces the computational complexity of the model. Notably, the proposed optimization strategy is shown to be applicable to ridge regression, broad learning systems, and fuzzy systems. Extensive experiments are conducted on nine public datasets to illustrate the superiority and efficiency of our UFSHS.
Pages: 2355-2368
Page count: 14