Active learning with misclassification sampling based on committee

Cited by: 0
Authors
Long, Jun [1 ]
Yin, Jianping [1 ]
Zhu, En [1 ]
Zhao, Wentao [1 ]
Affiliations
[1] Natl Univ Def Technol, Sch Comp Sci, Changsha 410073, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
active learning; misclassification sampling; committee; version space reduction;
DOI
10.1142/S0218488508005248
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Active learning is an important approach to reducing data-collection costs in inductive learning: only the most informative instances are sampled for labeling. We focus here on the sampling criterion, i.e., how to select these most informative instances. This paper makes three contributions. First, in contrast to the leading strategy of halving the volume of version space, we present a strategy that reduces the volume of version space by more than half, under the assumption that the target function is drawn from a nonuniform distribution over version space. Second, we propose sampling the instances that are most likely to be misclassified. Third, we develop a sampling method named CBMPMS (Committee Based Most Possible Misclassification Sampling), which samples the instances that the current classifier is most likely to misclassify. Compared with existing active learning methods, CBMPMS requires fewer sampling rounds to reach the same classifier accuracy. Experiments show that the proposed method outperforms traditional sampling methods on most of the selected datasets.
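To make the sampling criterion concrete, here is a minimal sketch of committee-based misclassification sampling in the spirit of the abstract. It is not the paper's CBMPMS algorithm; the toy 1-D threshold task, the bootstrap-built committee, and all variable names are illustrative assumptions. The committee's vote fraction is used as an estimate of P(y = 1 | x), and the pool instance whose estimated probability of being misclassified by the current classifier is largest is selected for labeling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D task: the true label is 1 when x > true_threshold.
true_threshold = 0.6
X_pool = rng.uniform(0.0, 1.0, size=200)

# Small labeled seed set for training.
X_seed = rng.uniform(0.0, 1.0, size=10)
y_seed = (X_seed > true_threshold).astype(int)

def fit_threshold(x, y):
    """Pick the decision threshold minimizing training error on (x, y)."""
    candidates = np.sort(x)
    errors = [np.mean((x > t).astype(int) != y) for t in candidates]
    return candidates[int(np.argmin(errors))]

# Committee: threshold classifiers fit to bootstrap resamples of the seed set.
committee = []
for _ in range(7):
    idx = rng.integers(0, len(X_seed), size=len(X_seed))
    committee.append(fit_threshold(X_seed[idx], y_seed[idx]))

# Current classifier: trained on the full seed set.
current = fit_threshold(X_seed, y_seed)

# Committee votes on every pool instance; the vote fraction
# serves as an estimate of P(y = 1 | x).
votes = np.stack([(X_pool > t).astype(int) for t in committee])  # shape (7, 200)
p_pos = votes.mean(axis=0)
current_pred = (X_pool > current).astype(int)

# Estimated probability that the current classifier misclassifies each instance:
# the committee-estimated probability that the true label differs from its prediction.
p_misclassify = np.where(current_pred == 1, 1.0 - p_pos, p_pos)

# Query the instance most likely to be misclassified.
query_idx = int(np.argmax(p_misclassify))
print(f"query x = {X_pool[query_idx]:.3f}, "
      f"estimated misclassification prob = {p_misclassify[query_idx]:.2f}")
```

After the queried instance is labeled, it would be added to the seed set and both the committee and the current classifier retrained; repeating this loop is what lets the method reach a given accuracy with fewer labeled instances than uniform sampling.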
Pages: 55-70
Page count: 16
Related Papers
50 records in total
  • [31] Active learning by query by committee with robust divergences
    Hino H.
    Eguchi S.
    Information Geometry, 2023, 6 (1) : 81 - 106
  • [32] Congressional committee simulation: An active learning experiment
    Ciliotta-Rubery, A
    Levy, D
    PS-POLITICAL SCIENCE & POLITICS, 2000, 33 (04) : 847 - 851
  • [33] Efficient sampling-based Bayesian Active Learning for synaptic characterization
    Gontier, Camille
    Surace, Simone Carlo
    Delvendahl, Igor
    Mueller, Martin
    Pfister, Jean-Pascal
    PLOS COMPUTATIONAL BIOLOGY, 2023, 19 (08)
  • [34] Active AODE learning based on a novel sampling strategy and its application
    Wu, Jia
    Cai, Zhi-hua
    Chen, Xiao-lin
    Ao, Shuang
    INTERNATIONAL JOURNAL OF COMPUTER APPLICATIONS IN TECHNOLOGY, 2013, 47 (04) : 326 - 333
  • [35] Uncertainty Sampling Based Active Learning with Diversity Constraint by Sparse Selection
    Wang, Gaoang
    Hwang, Jenq-Neng
    Rose, Craig
    Wallace, Farron
    2017 IEEE 19TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING (MMSP), 2017,
  • [36] Contextual Bandit for Active Learning: Active Thompson Sampling
    Bouneffouf, Djallel
    Laroche, Romain
    Urvoy, Tanguy
    Feraud, Raphael
    Allesiardo, Robin
    NEURAL INFORMATION PROCESSING (ICONIP 2014), PT I, 2014, 8834 : 405 - 412
  • [37] Convergence of Uncertainty Sampling for Active Learning
    Raj, Anant
    Bach, Francis
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [38] Optimal sampling in unbiased active learning
    Imberg, Henrik
    Jonasson, Johan
    Axelson-Fisk, Marina
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 559 - 568
  • [39] Bucketized Active Sampling for learning ACOPF
    Klamkin, Michael
    Tanneau, Mathieu
    Mak, Terrence W. K.
    Van Hentenryck, Pascal
    ELECTRIC POWER SYSTEMS RESEARCH, 2024, 235
  • [40] Category Learning Through Active Sampling
    Markant, Doug
    Gureckis, Todd M.
    COGNITION IN FLUX, 2010, : 248 - 253