Active learning for noisy oracle via density power divergence

Cited by: 4
Authors
Sogawa, Yasuhiro [1 ]
Ueno, Tsuyoshi [2 ]
Kawahara, Yoshinobu [1 ]
Washio, Takashi [1 ]
Affiliations
[1] Osaka Univ, Inst Sci & Ind Res, Osaka, Japan
[2] Japan Sci & Technol Agcy, Minato Discrete Struct Manipulat Syst Project, Kita Ku, Osaka, Japan
Keywords
Noisy oracle; Active learning; Density power divergence; Robust
DOI
10.1016/j.neunet.2013.05.007
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The accuracy of active learning is critically affected by noisy labels returned by a noisy oracle. In this paper, we propose a novel pool-based active learning framework built on robust measures derived from the density power divergence. By minimizing density power divergences, such as the β-divergence and the γ-divergence, one can estimate the model accurately even in the presence of noisy labels in the data. Accordingly, we develop query-selection measures for pool-based active learning using these divergences. In addition, we propose an evaluation scheme for these measures based on asymptotic statistical analyses, which enables us to perform active learning by directly evaluating an estimation error. Experiments on benchmark datasets and real-world image datasets show that our active learning scheme performs better than several baseline methods. (C) 2013 Elsevier Ltd. All rights reserved.
Pages: 133-143
Number of pages: 11
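The abstract describes the approach only at a high level. The following is a minimal, illustrative Python sketch of how a density-power-divergence (β-divergence) estimator can be combined with a pool-based query loop; it is not the authors' implementation. The loss form, the binary logistic model, and all names (dpd_loss, fit_dpd, active_learning, oracle) are assumptions introduced here, and the uncertainty-sampling query rule is a simple stand-in for the paper's asymptotic estimation-error measure.

import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    # Clip to avoid overflow in exp for extreme margins.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def dpd_loss(w, X, y, beta=0.5):
    # Density power divergence (beta-divergence) empirical loss for binary
    # logistic regression. As beta -> 0 it approaches the negative
    # log-likelihood; beta > 0 down-weights points the model deems
    # implausible, giving robustness to labels flipped by a noisy oracle.
    p1 = sigmoid(X @ w)                       # P(y = 1 | x)
    p_obs = np.where(y == 1, p1, 1.0 - p1)    # probability of the observed label
    bulk = (p1 ** (1 + beta) + (1.0 - p1) ** (1 + beta)) / (1 + beta)
    fit = (p_obs ** beta) / beta
    return np.mean(bulk - fit)

def fit_dpd(X, y, beta=0.5):
    # Minimize the DPD loss with a quasi-Newton solver (numerical gradients).
    res = minimize(dpd_loss, np.zeros(X.shape[1]), args=(X, y, beta),
                   method="L-BFGS-B")
    return res.x

def active_learning(X_pool, oracle, n_init=10, n_queries=30, beta=0.5, seed=0):
    # Pool-based loop: fit robustly on the labelled set, then query the
    # pool point whose predicted probability is closest to 0.5.
    rng = np.random.default_rng(seed)
    labeled = list(rng.choice(len(X_pool), size=n_init, replace=False))
    y_lab = [oracle(i) for i in labeled]      # oracle may return noisy labels
    for _ in range(n_queries):
        w = fit_dpd(X_pool[labeled], np.array(y_lab), beta=beta)
        uncertainty = np.abs(sigmoid(X_pool @ w) - 0.5)
        uncertainty[labeled] = np.inf         # never re-query a labelled point
        q = int(np.argmin(uncertainty))
        labeled.append(q)
        y_lab.append(oracle(q))
    return fit_dpd(X_pool[labeled], np.array(y_lab), beta=beta)

A typical use would pass a feature matrix X_pool and an oracle callable that returns a (possibly corrupted) label for a requested index, e.g. oracle = lambda i: y_true[i] with random label flips injected to simulate the noisy oracle; larger beta trades statistical efficiency for robustness.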