Stability-based Stopping Criterion for Active Learning

Cited by: 5
Authors
Wang, Wenquan [1]
Cai, Wenbin [1]
Zhang, Ya [1]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai Key Lab Multimedia Proc & Transmiss, Shanghai, People's Republic of China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM) | 2014
Keywords
Stopping criterion; Active learning; Stability;
DOI
10.1109/ICDM.2014.99
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
While active learning has drawn broad attention in recent years, there are relatively few studies on stopping criteria for active learning. We propose a novel model-stability-based stopping criterion, which considers the potential of each unlabeled example to change the model once added to the training set. The underlying motivation is that active learning should terminate when the model would not change much if the remaining examples were added. Inspired by the widely used stochastic gradient update rule, we use the gradient of the loss at each candidate example to measure its capability to change the classifier. Under this model-change rule, we stop active learning when the change capability of every remaining unlabeled example falls below a given threshold. We apply the stability-based stopping criterion to two popular classifiers, logistic regression and support vector machines (SVMs), and it can be generalized to a wide spectrum of learning models. Extensive experiments on various UCI benchmark data sets demonstrate that the proposed approach outperforms state-of-the-art methods in most cases.
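The abstract describes the criterion only in prose; a minimal sketch of such a gradient-based stopping check for logistic regression might look as follows. The function names, the use of an expected gradient norm over the unknown label, and the threshold value are illustrative assumptions of this sketch, not the paper's exact formulation.

```python
import numpy as np

def expected_gradient_norm(w, x, p_pos):
    # Expected norm of the logistic-loss gradient at an unlabeled example x,
    # averaging over the two possible labels weighted by the model's current
    # prediction p_pos = P(y = +1 | x). The true label is unknown, so this
    # expectation is one plausible way to score the example's potential to
    # change the model (an assumption of this sketch).
    margin = np.dot(w, x)
    grad_if_pos = -x / (1.0 + np.exp(margin))    # d/dw log(1 + exp(-w.x)), label y = +1
    grad_if_neg = x / (1.0 + np.exp(-margin))    # d/dw log(1 + exp(+w.x)), label y = -1
    return p_pos * np.linalg.norm(grad_if_pos) + (1.0 - p_pos) * np.linalg.norm(grad_if_neg)

def should_stop(w, unlabeled_X, threshold=1e-3):
    # Stop active learning once no remaining unlabeled example could change
    # the current model by more than `threshold` (illustrative value).
    p_pos = 1.0 / (1.0 + np.exp(-unlabeled_X @ w))   # predicted P(y = +1 | x) per row
    scores = [expected_gradient_norm(w, x, p) for x, p in zip(unlabeled_X, p_pos)]
    return max(scores) < threshold
```

In a full active-learning loop, such a check would run after each labeling round; for an SVM, the hinge-loss gradient could play the same role as the logistic-loss gradient here.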
Pages: 1019-1024
Number of pages: 6