Feature selections based on three improved condition entropies and one new similarity degree in interval-valued decision systems

Cited by: 3
Authors
Chen, Benwei [1 ,2 ]
Zhang, Xianyong [1 ,3 ]
Yang, Jilin [4 ]
Affiliations
[1] Sichuan Normal Univ, Sch Math Sci, Chengdu 610066, Sichuan, Peoples R China
[2] North Sichuan Coll Presch Teacher Educ, Dept Primary Educ, Guangyuan, Sichuan, Peoples R China
[3] Sichuan Normal Univ, Visual Comp & Virtual Real Key Lab Sichuan Prov, Chengdu 610066, Peoples R China
[4] Sichuan Normal Univ, Coll Comp Sci, Chengdu 610101, Sichuan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature selection; Interval-valued decision system; Uncertainty measurement; Similarity degree; Condition entropy; Granulation non-monotonicity; ATTRIBUTE REDUCTION; UNCERTAINTY MEASURES; ACCURACY;
DOI
10.1016/j.engappai.2023.107165
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Feature selection facilitates classification learning in diverse data environments. In interval-valued decision systems (IVDSs), feature selection relies on information measures and similarity degrees, but current selection algorithms built on the credibility-based condition entropy and the classical similarity degree suffer from measurement limitations and leave room for improvement. In this paper, based on IVDSs, three coverage-credibility-based condition entropies and one geometry-probabilistic similarity degree are proposed along the two dimensions of informationization and granulation, improving the existing condition entropy and similarity degree; accordingly, 4 x 2 feature-selection schemes emerge for optimization and applicability, systematically comprising one initial selection algorithm and seven new, more robust algorithms. First, three-way granular measures (i.e., credibility, coverage, and integrated coverage-credibility) are formulated in IVDSs, and three novel condition entropies are established by implementing three information structures on coverage-credibility. These condition entropies acquire in-depth improvements, hierarchical algorithms, size relationships, maximum/minimum conditions, and granulation non-monotonicity. Then, the probabilistic similarity degree is defined by a six-piece piecewise function with quadratic factors, and this new measure gains a geometry-probability mechanism and a high-quality improvement. Furthermore, feature selections are determined by preserving condition entropies and by mining feature significances, so eight selection algorithms are obtained by combining the condition entropies and similarity degrees. Finally, data experiments validate the relevant uncertainty measures and feature selections, and the seven constructed selection algorithms outperform three contrastive algorithms, achieving better classification performance.
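To make the abstract's pipeline concrete, the following is a minimal sketch of the generic machinery it builds on: an interval similarity degree, similarity-based tolerance classes, and a conditional entropy over those classes. It deliberately uses the classical overlap-over-union interval similarity and a standard Shannon-style conditional entropy, not the paper's six-piece probabilistic similarity or its coverage-credibility entropies, whose exact formulas are not given in this record; thresholds and function names here are illustrative assumptions.

```python
import math

def interval_similarity(a, b):
    """Classical similarity of intervals a=(a1,a2), b=(b1,b2):
    overlap length divided by union length (0.0 when disjoint,
    1.0 for identical degenerate intervals). A stand-in for the
    paper's six-piece probabilistic similarity degree."""
    (a1, a2), (b1, b2) = a, b
    overlap = max(0.0, min(a2, b2) - max(a1, b1))
    union = max(a2, b2) - min(a1, b1)
    return overlap / union if union > 0 else 1.0

def tolerance_classes(table, theta=0.5):
    """For each object, collect the objects whose similarity meets
    threshold theta on every feature (its tolerance class)."""
    n, m = len(table), len(table[0])
    return [{j for j in range(n)
             if all(interval_similarity(table[i][k], table[j][k]) >= theta
                    for k in range(m))}
            for i in range(n)]

def condition_entropy(classes, labels):
    """Shannon-style conditional entropy H(D|B): for each object's
    tolerance class, accumulate -p*log2(p) over the decision labels
    inside the class, averaged over the n objects."""
    n = len(classes)
    h = 0.0
    for cls in classes:
        for d in set(labels[j] for j in cls):
            p = sum(1 for j in cls if labels[j] == d) / len(cls)
            h -= p * math.log2(p) / n
    return h
```

When every tolerance class carries a single decision label the entropy is zero; mixing labels inside a class raises it. Entropy-preserving feature selection, as described in the abstract, keeps only those features whose removal would increase this value.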
Pages: 22