Multi-Label Learning with Distribution Matching Ensemble: An Adaptive and Just-In-Time Weighted Ensemble Learning Algorithm for Classifying a Nonstationary Online Multi-Label Data Stream

Cited by: 1
Authors
Shen, Chao [1 ]
Liu, Bingyu [1 ]
Shao, Changbin [1 ]
Yang, Xibei [1 ]
Xu, Sen [2 ]
Zhu, Changming [3 ]
Yu, Hualong [1 ]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Comp, Zhenjiang 212100, Peoples R China
[2] Yancheng Inst Technol, Sch Informat Technol, Yancheng 224051, Peoples R China
[3] Minzu Univ China, Key Lab Ethn language Intelligent Anal & Secur Gov, Beijing 100081, Peoples R China
Source
SYMMETRY-BASEL | 2025, Vol. 17, Iss. 02
Funding
National Natural Science Foundation of China;
Keywords
multi-label data stream; adaptive weighted ensemble; concept drift; distribution matching; Gaussian mixture model; Kullback-Leibler divergence; label distribution drift detection; CONCEPT DRIFT; CLASSIFICATION; MACHINE;
DOI
10.3390/sym17020182
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Science];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
Learning from a nonstationary data stream is challenging: the stream is generally considered endless, and the learning model must be constantly amended to adapt to shifting data distributions. When the stream carries multi-label data, the challenge is further intensified. In this study, an adaptive online weighted multi-label ensemble learning algorithm called MLDME (multi-label learning with distribution matching ensemble) is proposed. It calculates both the feature-matching level and the label-matching level between each reserved data block and the newly received data block, and then adaptively assigns decision weights to the ensemble classifiers according to these distribution similarities. Specifically, MLDME abandons the commonly used but not always correct hypothesis that each data block in a stream has the distribution most similar to the block that immediately follows it; thus, MLDME can provide a just-in-time decision for the newly received data block. In addition, to prevent the ensemble from growing without bound, a fixed-size buffer stores the classifiers, and three different dynamic classifier-updating rules are designed. Experimental results on nine synthetic and three real-world multi-label nonstationary data streams indicate that the proposed MLDME algorithm is superior to several popular and state-of-the-art online learning paradigms and algorithms, including two designed specifically for classifying nonstationary multi-label data streams.
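The distribution-matching weighting idea described in the abstract (weight each reserved block's classifier by how closely its distribution matches the newly received block, measured with Kullback-Leibler divergence) can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's implementation: each block is summarized by a single diagonal Gaussian rather than a Gaussian mixture model, only the feature distribution is matched (the paper also matches label distributions), and the helper names (`diag_gaussian`, `ensemble_weights`) are hypothetical.

```python
import math

def diag_gaussian(block):
    """Summarize a data block by per-feature mean and variance.
    (A stand-in for the GMM fitting used in the paper.)"""
    n, d = len(block), len(block[0])
    mu = [sum(x[j] for x in block) / n for j in range(d)]
    var = [max(sum((x[j] - mu[j]) ** 2 for x in block) / n, 1e-9)
           for j in range(d)]
    return mu, var

def kl_divergence(p, q):
    """Closed-form KL(p || q) between two diagonal Gaussians,
    summed over features."""
    (mu_p, var_p), (mu_q, var_q) = p, q
    return sum(
        0.5 * (math.log(vq / vp) + (vp + (mp - mq) ** 2) / vq - 1.0)
        for mp, vp, mq, vq in zip(mu_p, var_p, mu_q, var_q)
    )

def ensemble_weights(reserved_blocks, new_block):
    """Assign a decision weight to each reserved block's classifier:
    smaller KL divergence to the new block -> larger weight."""
    target = diag_gaussian(new_block)
    sims = [1.0 / (1.0 + kl_divergence(diag_gaussian(b), target))
            for b in reserved_blocks]
    total = sum(sims)
    return [s / total for s in sims]
```

In a full version of this scheme, each weight would multiply the corresponding classifier's label-score vector before the ensemble votes, so classifiers trained on blocks whose distribution resembles the incoming block dominate the decision.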
Pages: 25