Multi-Label Learning with Distribution Matching Ensemble: An Adaptive and Just-In-Time Weighted Ensemble Learning Algorithm for Classifying a Nonstationary Online Multi-Label Data Stream

Cited: 1
Authors
Shen, Chao [1 ]
Liu, Bingyu [1 ]
Shao, Changbin [1 ]
Yang, Xibei [1 ]
Xu, Sen [2 ]
Zhu, Changming [3 ]
Yu, Hualong [1 ]
Affiliations
[1] Jiangsu Univ Sci & Technol, Sch Comp, Zhenjiang 212100, Peoples R China
[2] Yancheng Inst Technol, Sch Informat Technol, Yancheng 224051, Peoples R China
[3] Minzu Univ China, Key Lab Ethn Language Intelligent Anal & Secur Gov, Beijing 100081, Peoples R China
Source
SYMMETRY-BASEL | 2025, Vol. 17, Issue 02
Funding
National Natural Science Foundation of China;
Keywords
multi-label data stream; adaptive weighted ensemble; concept drift; distribution matching; Gaussian mixture model; Kullback-Leibler divergence; label distribution drift detection; CONCEPT DRIFT; CLASSIFICATION; MACHINE;
DOI
10.3390/sym17020182
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07 ; 0710 ; 09 ;
Abstract
Learning from a nonstationary data stream is challenging: a data stream is generally considered endless, and the learning model must be constantly amended to adapt to shifting data distributions. When the stream carries multi-label data, the challenge is further intensified. In this study, an adaptive online weighted multi-label ensemble learning algorithm called MLDME (multi-label learning with distribution matching ensemble) is proposed. It simultaneously calculates the feature matching level and the label matching level between each reserved data block and the newly received data block, and then assigns adaptive decision weights to the ensemble classifiers based on these distribution similarities. Specifically, MLDME abandons the commonly used but not always correct underlying hypothesis that, in a data stream, each data block has the distribution most similar to that of the block emerging immediately after it; thus, MLDME can provide a just-in-time decision for the newly received data block. In addition, to avoid infinite growth of the ensemble, we use a fixed-size buffer to store the classifiers and design three different dynamic classifier-updating rules. Experimental results on nine synthetic and three real-world multi-label nonstationary data streams indicate that the proposed MLDME algorithm is superior to several popular and state-of-the-art online learning paradigms and algorithms, including two specifically designed for classifying nonstationary multi-label data streams.
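The abstract's core idea of weighting reserved blocks by feature- and label-distribution similarity to the incoming block can be illustrated with a minimal sketch. This is not the authors' implementation: it simplifies the paper's Gaussian-mixture feature model to a single diagonal-covariance Gaussian per block (which admits a closed-form Kullback-Leibler divergence), models label matching as summed KL between per-label Bernoulli marginals, and assumes the new block's labels are available for matching; the function names (`matching_weights`, `block_stats`) are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kl_diag(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(N_p || N_q) for diagonal-covariance Gaussians."""
    return 0.5 * np.sum(np.log(var_q / var_p)
                        + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

def bernoulli_kl(p, q, eps=1e-6):
    """Summed KL between per-label Bernoulli marginals, clipped for stability."""
    p, q = np.clip(p, eps, 1 - eps), np.clip(q, eps, 1 - eps)
    return np.sum(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q)))

def block_stats(X, Y):
    """Summarise a block: feature mean/variance and per-label frequencies."""
    return X.mean(axis=0), X.var(axis=0) + 1e-6, Y.mean(axis=0)

def matching_weights(reserved_blocks, X_new, Y_new):
    """Assign each reserved block a decision weight from its combined
    feature + label divergence to the newly received block."""
    mu_n, var_n, freq_n = block_stats(X_new, Y_new)
    divergences = []
    for X_r, Y_r in reserved_blocks:
        mu_r, var_r, freq_r = block_stats(X_r, Y_r)
        d_feat = gaussian_kl_diag(mu_n, var_n, mu_r, var_r)
        d_label = bernoulli_kl(freq_n, freq_r)
        divergences.append(d_feat + d_label)
    # Smaller divergence -> larger weight; normalise to sum to 1.
    scores = np.exp(-np.asarray(divergences))
    return scores / scores.sum()

# Demo: block A matches the new block's distribution, block B does not.
rng = np.random.default_rng(0)
Xa = rng.normal(0, 1, (200, 5)); Ya = (rng.random((200, 3)) < 0.3).astype(float)
Xb = rng.normal(3, 1, (200, 5)); Yb = (rng.random((200, 3)) < 0.7).astype(float)
Xn = rng.normal(0, 1, (200, 5)); Yn = (rng.random((200, 3)) < 0.3).astype(float)
w = matching_weights([(Xa, Ya), (Xb, Yb)], Xn, Yn)
```

The exponential map turns divergences into nonnegative weights, so a classifier trained on a block whose distribution drifted far from the current one is smoothly down-weighted rather than discarded outright.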
Pages: 25
Related Papers
50 records in total
  • [21] Learning with Latent Label Hierarchy from Incomplete Multi-Label Data
    Pei, Yuanli
    Fern, Xiaoli
    Raich, Raviv
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 2075 - 2080
  • [22] Adaptive Decision Threshold-Based Extreme Learning Machine for Classifying Imbalanced Multi-label Data
    Gao, Shang
    Dong, Wenlu
    Cheng, Ke
    Yang, Xibei
    Zheng, Shang
    Yu, Hualong
    NEURAL PROCESSING LETTERS, 2020, 52 (03) : 2151 - 2173
  • [23] Effective lazy learning algorithm based on a data gravitation model for multi-label learning
    Reyes, Oscar
    Morell, Carlos
    Ventura, Sebastian
    INFORMATION SCIENCES, 2016, 340 : 159 - 174
  • [24] Distributed Online Multi-Label Learning with Privacy Protection in Internet of Things
    Huang, Fan
    Yang, Nan
    Chen, Huaming
    Bao, Wei
    Yuan, Dong
    APPLIED SCIENCES-BASEL, 2023, 13 (04):
  • [25] Label correlation guided borderline oversampling for imbalanced multi-label data learning
    Zhang, Kai
    Mao, Zhaoyang
    Cao, Peng
    Liang, Wei
    Yang, Jinzhu
    Li, Weiping
    Zaiane, Osmar R.
    KNOWLEDGE-BASED SYSTEMS, 2023, 279
  • [26] Multi-label enhancement manifold learning algorithm for vehicle video
    Tan, Chao
    Ji, Genlin
    Zeng, Xiaoqian
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (13)
  • [27] Semi-supervised Online Classification Method for Multi-label Data Stream Based on Kernel Extreme Learning Machine
    Wang, Yuchen
    Qiu, Shiyuan
    Li, Peipei
    Hu, Xuegang
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2024, 37 (08): : 741 - 754
  • [28] Stable Label-Specific Features Generation for Multi-Label Learning via Mixture-Based Clustering Ensemble
    Wang, Yi-Bo
    Hang, Jun-Yi
    Zhang, Min-Ling
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2022, 9 (07) : 1248 - 1261
  • [29] An effective single-model learning for multi-label data
    Siahroudi, Sajjad Kamali
    Kudenko, Daniel
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 232
  • [30] Incremental deep forest for multi-label data streams learning
    Liang, Shunpan
    Pan, Weiwei
    You, Dianlong
    Liu, Ze
    Yin, Ling
    APPLIED INTELLIGENCE, 2022, 52 (12) : 13398 - 13414