Fusing multiple interval-valued fuzzy monotonic decision trees

Cited by: 3
Authors
Chen, Jiankai [1 ,3 ]
Li, Zhongyan [2 ]
Wang, Xin [3 ]
Su, Han [1 ]
Zhai, Junhai [3 ]
Affiliations
[1] North China Elect Power Univ, Sch Control & Comp Engn, Beijing 102206, Peoples R China
[2] North China Elect Power Univ, Sch Math & Phys, Beijing 102206, Peoples R China
[3] Hebei Univ, Coll Math & Informat Sci, Hebei Key Lab Machine Learning & Computat Intellig, Baoding 071002, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Monotonic classification; Fuzzy dominance rough set; Interval-valued data; Feature selection; Fusion learning; Decision trees; FEATURE-SELECTION; ENTROPY;
DOI
10.1016/j.ins.2024.120810
CLC number
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
As a powerful knowledge-mining technique for ordinal classification tasks, dominance-based rough set theory has many advantages but also some issues. It is sensitive to noisy information: even a single mislabeled sample can cause substantial fluctuations in the approximation calculations. In addition, most monotonic classifiers can only handle real-valued data and cannot directly deal with interval-valued data, which is common in practical applications. To address these issues, a tree-based fusion learning method for monotonic classification of interval-valued attributes, named FM-IFMDT, is proposed. Its functional structure comprises three components: (i) the proposed robust β-precision interval-valued fuzzy dominance neighborhood rough set model (β-IFDNRS) adaptively identifies and filters noisy samples; (ii) an interval dominance discernibility matrix based on β-IFDNRS is developed for feature selection and generates a set of complete and diverse feature subsets; (iii) a novel interval-valued fuzzy monotonic decision tree (IFMDT) based on the probability distribution is trained on each feature subset and serves as the base classifier of a weighted voting fusion model. Extensive experiments show that the proposed fusion learning method has significant advantages.
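The fusion stage described above can be illustrated with a short sketch. The code below is not the paper's implementation: the IFMDT base learner, the β-IFDNRS noise filter, and the discernibility-matrix feature selection are replaced by a standard scikit-learn decision tree and hand-picked feature subsets, and the training-accuracy weights are one plausible choice rather than the paper's exact scheme. It only shows the overall structure of training one base classifier per feature subset and fusing their predictions by weighted voting.

    # Minimal sketch of a per-subset ensemble with weighted-voting fusion.
    # All names and the weighting rule are illustrative assumptions; the IFMDT
    # base classifier and beta-IFDNRS feature selection are not reproduced here.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Stand-in for the complete and diverse feature subsets produced by the
    # discernibility-matrix-based selection (component ii).
    feature_subsets = [[0, 1], [2, 3], [0, 2, 3]]

    models, weights = [], []
    for subset in feature_subsets:
        clf = DecisionTreeClassifier(random_state=0).fit(X_tr[:, subset], y_tr)
        models.append(clf)
        # Weight each base classifier by its training accuracy (one plausible
        # choice; the paper's weighting scheme may differ).
        weights.append(clf.score(X_tr[:, subset], y_tr))

    def weighted_vote(X_new):
        """Fuse per-subset class-probability estimates by weighted averaging."""
        scores = np.zeros((X_new.shape[0], len(np.unique(y_tr))))
        for clf, subset, w in zip(models, feature_subsets, weights):
            scores += w * clf.predict_proba(X_new[:, subset])
        return scores.argmax(axis=1)

    print("fusion accuracy:", (weighted_vote(X_te) == y_te).mean())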
Pages: 25