Joint Feature Transformation and Selection Based on Dempster-Shafer Theory

Cited: 3
Authors
Lian, Chunfeng [1 ,2 ]
Ruan, Su [2 ]
Denoeux, Thierry [1 ]
Affiliations
[1] Univ Technol Compiegne, Sorbonne Univ, CNRS, UMR 7253, F-60205 Compiegne, France
[2] Univ Rouen, QuantIF EA 4108 LITIS, F-76000 Rouen, France
Keywords
Belief functions; Dempster-Shafer theory; Feature transformation; Feature selection; Pattern classification; EVIDENTIAL C-MEANS; BELIEF FUNCTIONS; RULE; ALGORITHM; IMAGES;
DOI
10.1007/978-3-319-40596-4_22
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In statistical pattern recognition, feature transformation maps the original feature space to a low-dimensional subspace in which the newly created features are discriminative and non-redundant, thus improving the predictive power and generalization ability of subsequent classification models. Traditional transformation methods are not specifically designed to handle data containing unreliable and noisy input features. To deal with such inputs, a new approach based on Dempster-Shafer theory is proposed in this paper. A specific loss function is constructed to learn the transformation matrix; it includes a sparsity term that performs joint feature selection during the transformation, so as to limit the influence of unreliable input features on the output low-dimensional subspace. The proposed method has been evaluated on several synthetic and real datasets, showing good performance.
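The paper's actual loss function is built on Dempster-Shafer theory and is not reproduced here. As a rough illustration of the sparsity mechanism only, the sketch below learns a transformation matrix W under a row-wise L2,1 penalty (a standard group-sparsity surrogate, not the authors' evidential loss) on a least-squares objective with hypothetical toy data: rows of W driven exactly to zero correspond to de-selected (unreliable) input features, while the surviving rows define the low-dimensional projection.

```python
import numpy as np

def l21_prox(W, t):
    """Proximal operator of t * sum_i ||W_i||_2 (row-wise soft-thresholding)."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return W * np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)

def fit_sparse_transform(X, Y, lam, n_iter=500):
    """Learn W minimizing ||X W - Y||_F^2 + lam * sum_i ||W_i||_2
    by proximal gradient descent (ISTA). Rows of W shrunk exactly
    to zero correspond to de-selected input features."""
    d, k = X.shape[1], Y.shape[1]
    W = np.zeros((d, k))
    L = 2.0 * np.linalg.norm(X, 2) ** 2   # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ W - Y)
        W = l21_prox(W - grad / L, lam / L)
    return W

# Toy data: features 0 and 1 carry the class signal, features 2-4 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
labels = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[labels]                      # one-hot class targets
W = fit_sparse_transform(X, Y, lam=80.0)
selected = np.linalg.norm(W, axis=1) > 1e-8
```

With a sufficiently large penalty weight `lam`, the noise rows of W are thresholded to zero on every proximal step, so transformation (the projection X @ W) and feature selection (the zero/non-zero row pattern) are obtained jointly from one optimization, which is the coupling the abstract describes.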
Pages: 253-261
Number of pages: 9
Related Papers
50 records in total
  • [1] Dempster-Shafer Theory for Stock Selection
    Salehy, Nima
    Okten, Giray
    2021 IEEE 45TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2021), 2021, : 1729 - 1734
  • [2] Uncertainty based on Dempster-Shafer theory
    Xiao, MZ
    Chen, GJ
    ICEMI'2003: PROCEEDINGS OF THE SIXTH INTERNATIONAL CONFERENCE ON ELECTRONIC MEASUREMENT & INSTRUMENTS, VOLS 1-3, 2003, : 117 - 120
  • [3] Dempster-Shafer Theory Based Feature Selection with Sparse Constraint for Outcome Prediction in Cancer Therapy
    Lian, Chunfeng
    Ruan, Su
    Denoeux, Thierry
    Li, Hua
    Vera, Pierre
    MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION, PT III, 2015, 9351 : 695 - 702
  • [4] Feature selection for label distribution learning using Dempster-Shafer evidence theory
    Zhao, Zhengwei
    Wang, Rongrong
    Pang, Wei
    Li, Zhaowen
    APPLIED INTELLIGENCE, 2025, 55 (04)
  • [5] Feature Selection With Ensemble Learning Based on Improved Dempster-Shafer Evidence Fusion
    Zheng, Yifeng
    Li, Guohe
    Zhang, Wenjie
    Li, Ying
    Wei, Baoya
    IEEE ACCESS, 2019, 7 : 9032 - 9045
  • [6] AN EXERCISE IN DEMPSTER-SHAFER THEORY
    HAJEK, P
    HARMANEC, D
    INTERNATIONAL JOURNAL OF GENERAL SYSTEMS, 1992, 20 (02) : 137 - 142
  • [7] A clash in Dempster-Shafer theory
    Xiong, W
    Ju, S
    Luo, X
    10TH IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS, VOLS 1-3: MEETING THE GRAND CHALLENGE: MACHINES THAT SERVE PEOPLE, 2001, : 793 - 796
  • [8] Fundamentals of the Dempster-Shafer Theory
    Peri, Joseph S. J.
    SIGNAL PROCESSING, SENSOR FUSION, AND TARGET RECOGNITION XXI, 2012, 8392
  • [9] Categorification of the Dempster-Shafer Theory
    Peri, Joseph S. J.
    SIGNAL PROCESSING, SENSOR/INFORMATION FUSION, AND TARGET RECOGNITION XXIV, 2015, 9474
  • [10] Internal Feature Selection Method of CSP Based on L1-Norm and Dempster-Shafer Theory
    Jin, Jing
    Xiao, Ruocheng
    Daly, Ian
    Miao, Yangyang
    Wang, Xingyu
    Cichocki, Andrzej
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (11) : 4814 - 4825