Incremental Rebalancing Learning on Evolving Data Streams

Cited by: 17
Authors
Bernardo, Alessio [1 ]
Valle, Emanuele Della [1 ]
Bifet, Albert [2 ,3 ]
Affiliations
[1] DEIB, Politecnico di Milano, Milan, Italy
[2] University of Waikato, Hamilton, New Zealand
[3] Telecom ParisTech, LTCI, Palaiseau, France
Source
20th IEEE International Conference on Data Mining Workshops (ICDMW 2020) | 2020
Keywords
Evolving Data Stream; Streaming; Concept Drift; MOA; Balancing
DOI
10.1109/ICDMW51313.2020.00121
CLC number
TP301 [Theory and Methods]
Discipline code
081202
Abstract
Nowadays, every device connected to the Internet generates an ever-growing (formally, unbounded) stream of data. Machine learning on data streams is a grand challenge because of the resource constraints it imposes. Moreover, standard machine learning techniques cannot cope with data whose statistics change gradually or suddenly, and without warning (formally, concept drift). Massive Online Analysis (MOA) is both the collective name for, and a software library of, learners that can manage data streams. In this paper, we present a research study on streaming rebalancing. Data streams can be imbalanced just as static data can, yet no method exists to rebalance them incrementally. For this reason, we propose a new streaming approach able to rebalance data streams online. We evaluate the new methodology on synthetically generated datasets using prequential evaluation and show that it outperforms existing approaches.
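The abstract refers to prequential (interleaved test-then-train) evaluation of stream learners in MOA. The sketch below illustrates only that generic evaluation loop on a synthetic generator, not the rebalancing method proposed in the paper; it assumes a recent MOA release in which nextInstance() returns an example wrapping the Instance, and uses the standard MOA components SEAGenerator and HoeffdingTree purely for illustration.

    import com.yahoo.labs.samoa.instances.Instance;
    import moa.classifiers.Classifier;
    import moa.classifiers.trees.HoeffdingTree;
    import moa.streams.generators.SEAGenerator;

    public class PrequentialSketch {
        public static void main(String[] args) {
            // Synthetic stream generator and an incremental learner from MOA.
            SEAGenerator stream = new SEAGenerator();
            stream.prepareForUse();

            Classifier learner = new HoeffdingTree();
            learner.setModelContext(stream.getHeader());
            learner.prepareForUse();

            int correct = 0;
            int seen = 0;
            int maxInstances = 100_000;

            // Prequential loop: each instance is first used to test
            // the current model, then to train it.
            while (stream.hasMoreInstances() && seen < maxInstances) {
                Instance inst = stream.nextInstance().getData();
                if (learner.correctlyClassifies(inst)) {
                    correct++;
                }
                learner.trainOnInstance(inst);
                seen++;
            }

            double accuracy = 100.0 * correct / seen;
            System.out.println(seen + " instances, prequential accuracy " + accuracy + "%");
        }
    }

Class-imbalanced streams such as those studied in the paper would additionally require a rebalancing step before training, which this sketch deliberately omits.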
Pages: 844-850
Number of pages: 7