Feature selection in high-dimensional classification via an adaptive multifactor evolutionary algorithm with local search

Times Cited: 0
Authors
Li, Zhihui [1 ]
Li, Hong [1 ]
Gao, Weifeng [1 ]
Xie, Jin [1 ]
Slowik, Adam [2 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710126, Peoples R China
[2] Koszalin Univ Technol, Dept Elect & Comp Sci, Koszalin, Poland
Funding
China Postdoctoral Science Foundation;
Keywords
Feature selection; Evolutionary multitasking; Multifactor optimization; Local search; High-dimensional dataset classification; Knowledge transfer; Differential evolution
DOI
10.1016/j.asoc.2024.112574
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
As datasets grow in dimension and sample size, feature selection becomes increasingly important in machine learning. Because features are often associated with multiple related tasks, casting feature selection in a multi-task optimization framework can improve classification performance. Multifactor optimization offers a powerful evolutionary multitasking paradigm capable of handling several related optimization tasks simultaneously. Motivated by these observations, this article proposes a parameter-adaptive multifactor feature selection algorithm (AMFEA). To escape local optima, AMFEA employs a local search strategy that guides the search toward the global optimum. In addition, AMFEA introduces an adaptive knowledge-transfer parameter matrix whose values are dynamically adjusted according to the population's fitness, controlling how often knowledge is transferred between tasks; this transfers knowledge effectively across tasks and accelerates convergence. Experimental results on 18 high-dimensional datasets show that AMFEA significantly improves classification accuracy compared with existing evolutionary algorithms and traditional feature selection methods.
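The adaptive knowledge-transfer mechanism described in the abstract can be pictured with a small illustration. The following Python snippet is a minimal sketch, not the authors' AMFEA implementation: it assumes a two-task multifactorial EA over binary feature masks, a placeholder fitness function standing in for classification accuracy, a simple one-bit-flip local search, and a transfer matrix rmp whose entries are raised when cross-task offspring improve fitness and decayed otherwise. All identifiers and constants (rmp, the 0.1 increment, the 0.9 decay, the population sizes) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    N_TASKS = 2            # two related feature-selection tasks
    DIM = 50               # number of candidate features
    POP_PER_TASK = 20
    GENERATIONS = 30

    def fitness(mask, task):
        # Stand-in objective: reward "informative" features, penalize subset size.
        # A real run would score a classifier (e.g. KNN accuracy) on the selected columns.
        scores = np.linspace(1.0, 0.1, DIM) if task == 0 else np.linspace(0.1, 1.0, DIM)
        return scores[mask.astype(bool)].sum() - 0.05 * mask.sum()

    def uniform_crossover(a, b):
        keep = rng.random(DIM) < 0.5
        return np.where(keep, a, b)

    def local_search(mask, task):
        # One-bit-flip hill climbing over a few random positions, a simple
        # stand-in for the local search component described in the abstract.
        best, best_f = mask.copy(), fitness(mask, task)
        for j in rng.permutation(DIM)[:5]:
            trial = best.copy()
            trial[j] ^= 1
            f = fitness(trial, task)
            if f > best_f:
                best, best_f = trial, f
        return best

    # One sub-population of binary masks per task, plus an adaptive transfer
    # matrix: rmp[t, s] = probability that a parent of task t mates with task s.
    pop = [rng.integers(0, 2, size=(POP_PER_TASK, DIM)) for _ in range(N_TASKS)]
    rmp = np.full((N_TASKS, N_TASKS), 0.3)

    for gen in range(GENERATIONS):
        gains = np.zeros((N_TASKS, N_TASKS))   # fitness gains credited to each pairing
        for t in range(N_TASKS):
            other = 1 - t
            for i in range(POP_PER_TASK):
                partner = other if rng.random() < rmp[t, other] else t
                p1 = pop[t][rng.integers(POP_PER_TASK)]
                p2 = pop[partner][rng.integers(POP_PER_TASK)]
                child = local_search(uniform_crossover(p1, p2), t)
                gain = fitness(child, t) - fitness(pop[t][i], t)
                if gain > 0:
                    gains[t, partner] += gain
                    pop[t][i] = child
            # Adaptive update: raise rmp when cross-task offspring helped more
            # than within-task ones, otherwise decay it toward a floor.
            if gains[t, other] > gains[t, t]:
                rmp[t, other] = min(1.0, rmp[t, other] + 0.1)
            else:
                rmp[t, other] = max(0.1, rmp[t, other] * 0.9)

    print("final knowledge-transfer matrix:")
    print(rmp)

In the paper's setting, the fitness would come from a classifier trained on the selected features, and the transfer matrix would be updated by the authors' adaptive rule rather than the simple increment/decay used here.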
Pages: 13