Prototype Selection for Multilabel Instance-Based Learning

Cited by: 3
Authors
Filippakis, Panagiotis [1 ]
Ougiaroglou, Stefanos [1 ]
Evangelidis, Georgios [2 ]
Affiliations
[1] Int Hellen Univ, Dept Informat & Elect Engn, Sch Engn, Thessaloniki 57400, Greece
[2] Univ Macedonia, Sch Informat Sci, Dept Appl Informat, 156 Egnatia St, Thessaloniki 54636, Greece
Keywords
data reduction techniques; instance reduction; multilabel classification; prototype selection; instance-based classification; binary relevance; CNN; IB2; BRkNN; data reduction; local sets; classification; generation; algorithm; kNN
DOI
10.3390/info14100572
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Reducing the size of the training set by replacing it with a condensed set is a widely adopted practice for enhancing the efficiency of instance-based classifiers while maintaining high classification accuracy. This objective can be achieved through the use of data reduction techniques, also known as prototype selection or generation algorithms. Although numerous algorithms in the literature effectively address single-label classification problems, most of them are not applicable to multilabel data, where an instance can belong to multiple classes. Well-known transformation methods cannot be combined with a data reduction technique for various reasons. The Condensed Nearest Neighbor rule is a popular parameter-free single-label prototype selection algorithm, and the IB2 algorithm is its one-pass variation. This paper proposes variations of these algorithms for multilabel data. Through an experimental study conducted on nine distinct datasets, along with statistical tests, we demonstrate that the eight proposed approaches (four for each algorithm) offer significant reduction rates without compromising classification accuracy.
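For orientation only, below is a minimal Python sketch of the two single-label base algorithms named in the abstract: the classic Condensed Nearest Neighbor (CNN) rule and its one-pass IB2 variant. The function names (cnn_condense, nearest_label), the Euclidean 1-NN rule, and the seeding choice are assumptions made for illustration; this does not reproduce the multilabel variants proposed in the paper.

```python
import numpy as np

def nearest_label(x, proto_X, proto_y):
    # 1-NN lookup: return the label of the prototype closest to x (Euclidean).
    dists = np.linalg.norm(proto_X - x, axis=1)
    return proto_y[int(np.argmin(dists))]

def cnn_condense(X, y, one_pass=False):
    # Seed the condensed set with the first training instance (arbitrary choice).
    proto_X, proto_y = [X[0]], [y[0]]
    changed = True
    while changed:
        changed = False
        for xi, yi in zip(X, y):
            # An instance is absorbed into the condensed set only if the
            # current prototypes misclassify it with 1-NN.
            if nearest_label(xi, np.asarray(proto_X), proto_y) != yi:
                proto_X.append(xi)
                proto_y.append(yi)
                changed = True
        if one_pass:   # IB2-style: stop after a single scan of the training data
            break
    return np.asarray(proto_X), np.asarray(proto_y)

# Hypothetical usage on synthetic data:
X = np.random.rand(200, 5)
y = (X[:, 0] > 0.5).astype(int)
PX, Py = cnn_condense(X, y)                   # multi-pass CNN condensing
PX2, Py2 = cnn_condense(X, y, one_pass=True)  # IB2-style single pass
```

The multi-pass loop keeps scanning until a full pass adds no new prototypes, whereas the one-pass variant trades some condensed-set quality for a single scan; both ideas are what the paper adapts to the multilabel setting.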
Pages: 22