Feature Selection for Interval-Valued Data Based on D-S Evidence Theory

Cited by: 13
Authors
Peng, Yichun [1 ]
Zhang, Qinli [2 ]
Affiliations
[1] Yulin Normal Univ, Sch Comp Sci & Engn, Yulin 537000, Guangxi, Peoples R China
[2] Chizhou Univ, Sch Big Data & Artificial Intelligence, Chizhou 247000, Anhui, Peoples R China
Source
IEEE ACCESS | 2021, Vol. 9
Keywords
Interval-valued data; IVIS; D-S evidence theory; belief function; plausibility function; feature selection; ROUGH SET APPROACH; DEMPSTER-SHAFER THEORY; UNCERTAINTY MEASURES; ATTRIBUTE REDUCTION; KNOWLEDGE REDUCTION; INFORMATION-SYSTEMS; ENTROPY; APPROXIMATIONS;
DOI
10.1109/ACCESS.2021.3109013
CLC Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Feature selection is a basic and critical technique for data mining, especially in the current "big data" era. Rough set theory (RST) is sensitive to noise in feature selection because of the strict condition of the equivalence relation, whereas D-S evidence theory offers a flexible way to measure the uncertainty of information. This paper introduces the robust feature evaluation metrics "belief function" and "plausibility function" into the feature selection algorithm so that classification performance is less affected by noise. First, a similarity between information values in an interval-valued information system (IVIS) is defined, together with a variable parameter that controls the similarity of samples. Then, the θ-lower approximation and θ-upper approximation in an IVIS are put forward. Next, the belief function and plausibility function based on the θ-lower and θ-upper approximations are constructed. Finally, four feature selection algorithms in an IVIS based on D-S evidence theory are proposed. Experimental results on four real interval-valued datasets show that the proposed metrics are robust to noise and that the proposed algorithms are more effective than existing algorithms.
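To make the pipeline described in the abstract concrete, the following is a minimal Python sketch of the general idea: a θ-similarity relation on interval values, θ-lower and θ-upper approximations of each decision class, belief and plausibility derived from them, and a belief-driven greedy forward search. The overlap-ratio similarity, the stopping rule, and every function name here (interval_similarity, theta_class, select_features, ...) are illustrative assumptions rather than the paper's exact definitions or any of its four proposed algorithms.

# Illustrative sketch only (assumed details, not the paper's exact formulation):
# a theta-similarity on interval values, theta-lower/upper approximations,
# belief/plausibility of the decision partition, and a greedy forward search.

def interval_similarity(a, b):
    """Similarity of intervals a=(a1,a2), b=(b1,b2) as overlap / joint span
    (an assumed measure; the paper defines its own similarity)."""
    (a1, a2), (b1, b2) = a, b
    overlap = max(0.0, min(a2, b2) - max(a1, b1))
    span = max(a2, b2) - min(a1, b1)
    return 1.0 if span == 0 else overlap / span

def theta_class(U, i, features, theta):
    """Samples that are theta-similar to sample i on every feature in `features`."""
    return {j for j in range(len(U))
            if all(interval_similarity(U[i][f], U[j][f]) >= theta for f in features)}

def lower_upper(U, labels, cls, features, theta):
    """theta-lower / theta-upper approximations of the decision class `cls`."""
    lower, upper = set(), set()
    for i in range(len(U)):
        sim = theta_class(U, i, features, theta)
        if all(labels[j] == cls for j in sim):
            lower.add(i)
        if any(labels[j] == cls for j in sim):
            upper.add(i)
    return lower, upper

def belief_plausibility(U, labels, features, theta):
    """Belief / plausibility of the decision partition under a feature subset:
    sums of |lower| / |U| and |upper| / |U| over the decision classes."""
    n = len(U)
    bel = pl = 0.0
    for c in set(labels):
        lower, upper = lower_upper(U, labels, c, features, theta)
        bel += len(lower) / n
        pl += len(upper) / n
    return bel, pl

def select_features(U, labels, theta=0.5):
    """Greedy forward selection (an assumed search strategy): repeatedly add the
    feature that raises belief the most, and stop once belief reaches the value
    obtained with the full feature set."""
    all_feats = list(range(len(U[0])))
    target, _ = belief_plausibility(U, labels, all_feats, theta)
    chosen = []
    while True:
        bel_now, _ = belief_plausibility(U, labels, chosen, theta)
        if bel_now >= target:
            return chosen
        best_f, best_bel = None, bel_now
        for f in all_feats:
            if f in chosen:
                continue
            bel_f, _ = belief_plausibility(U, labels, chosen + [f], theta)
            if bel_f > best_bel:
                best_f, best_bel = f, bel_f
        if best_f is None:  # no single feature improves belief; stop early
            return chosen
        chosen.append(best_f)

if __name__ == "__main__":
    # Toy interval-valued dataset: 4 samples x 3 features, binary decision labels.
    U = [
        [(0.0, 1.0), (2.0, 3.0), (5.0, 6.0)],
        [(0.2, 1.1), (2.1, 2.9), (5.2, 6.1)],
        [(4.0, 5.0), (7.0, 8.0), (0.0, 0.5)],
        [(4.1, 5.2), (7.2, 8.1), (0.1, 0.6)],
    ]
    labels = [0, 0, 1, 1]
    print("selected features:", select_features(U, labels, theta=0.3))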
Pages: 122754-122765
Number of pages: 12
Related Papers
50 records in total
  • [1] Feature selection for set-valued data based on D-S evidence theory
    Wang, Yini
    Wang, Sichun
    ARTIFICIAL INTELLIGENCE REVIEW, 2023, 56 (03) : 2667 - 2696
  • [2] Interval Dominance-Based Feature Selection for Interval-Valued Ordered Data
    Li, Wentao
    Zhou, Haoxiang
    Xu, Weihua
    Wang, Xi-Zhao
    Pedrycz, Witold
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (10) : 6898 - 6912
  • [3] Incremental feature selection based on uncertainty measure for dynamic interval-valued data
    Shu, Wenhao
    Chen, Ting
    Cao, Dongtao
    Qian, Wenbin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (04) : 1453 - 1472
  • [4] Feature selection for dynamic interval-valued ordered data based on fuzzy dominance neighborhood rough set
    Sang, Binbin
    Chen, Hongmei
    Yang, Lei
    Li, Tianrui
    Xu, Weihua
    Luo, Chuan
    KNOWLEDGE-BASED SYSTEMS, 2021, 227
  • [5] Feature selection for interval-valued data via FRIC-model
    Hu, Chunjiao
    Huang, Hengjie
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2023, 44 (01) : 919 - 938
  • [6] Attribute reduction for set-valued data based on D-S evidence theory
    Zhang, Qinli
    Li, Lulu
    INTERNATIONAL JOURNAL OF GENERAL SYSTEMS, 2022, 51 (08) : 822 - 861
  • [7] Gene Selection in a Single Cell Gene Space Based on D-S Evidence Theory
    Li, Zhaowen
    Zhang, Qinli
    Wang, Pei
    Liu, Fang
    Song, Yan
    Wen, Ching-Feng
    INTERDISCIPLINARY SCIENCES-COMPUTATIONAL LIFE SCIENCES, 2022, 14 (03) : 722 - 744
  • [8] Maximum Information Coefficient Feature Selection Method for Interval-Valued Data
    Qi, Xiaobo
    Song, Jinyu
    Qi, Hui
    Shi, Ying
    IEEE ACCESS, 2024, 12 : 53752 - 53766