Temporal Output Discrepancy for Loss Estimation-Based Active Learning

Cited by: 0
Authors
Huang, Siyu [1 ]
Wang, Tianyang [2 ]
Xiong, Haoyi [3 ]
Wen, Bihan [4 ]
Huan, Jun [5 ]
Dou, Dejing [3 ]
Affiliations
[1] Harvard Univ, Harvard A John Paulson Sch Engn & Appl Sci, Cambridge, MA 02134 USA
[2] Austin Peay State Univ, Dept Comp Sci & Informat Technol, Clarksville, TN 37044 USA
[3] Baidu Res, Big Data Lab, Beijing 100193, Peoples R China
[4] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[5] Amazon, AWS AI Lab, Seattle, WA 98109 USA
Keywords
Active learning; loss estimation; model selection; semisupervised learning; temporal consistency regularization;
DOI
10.1109/TNNLS.2022.3186855
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
While deep learning succeeds in a wide range of tasks, it relies heavily on massive collections of annotated data, which are expensive and time-consuming to obtain. To lower the cost of data annotation, active learning has been proposed to interactively query an oracle to annotate a small proportion of informative samples in an unlabeled dataset. Inspired by the fact that samples with higher loss are usually more informative to the model than samples with lower loss, in this article we present a novel deep active learning approach that queries the oracle for data annotation when an unlabeled sample is believed to incur a high loss. The core of our approach is a measurement, temporal output discrepancy (TOD), that estimates the sample loss by evaluating the discrepancy between the outputs given by models at different optimization steps. Our theoretical investigation shows that TOD lower-bounds the accumulated sample loss, and thus it can be used to select informative unlabeled samples. On the basis of TOD, we further develop an effective unlabeled-data sampling strategy as well as an unsupervised learning criterion for active learning. Owing to the simplicity of TOD, our methods are efficient, flexible, and task-agnostic. Extensive experimental results demonstrate that our approach outperforms state-of-the-art active learning methods on image classification and semantic segmentation tasks. In addition, we show that TOD can be utilized to select, from a pool of candidate models, the model with potentially the highest testing accuracy.
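The selection criterion described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' released implementation: it assumes the per-sample model outputs at two optimization steps (t-T and t) have already been computed into arrays, measures TOD as the per-sample L2 distance between them, and queries the samples with the largest discrepancy. The array shapes and the `budget` parameter are assumptions for the example.

```python
import numpy as np

def temporal_output_discrepancy(out_prev, out_curr):
    """Per-sample TOD: L2 distance between the model's outputs for the
    same inputs at two different optimization steps."""
    return np.linalg.norm(out_curr - out_prev, axis=1)

def select_for_annotation(out_prev, out_curr, budget):
    """Query the `budget` unlabeled samples with the largest TOD,
    i.e. those estimated to carry the highest loss."""
    tod = temporal_output_discrepancy(out_prev, out_curr)
    return np.argsort(-tod)[:budget]

# Toy example: outputs for 5 unlabeled samples at steps t-T and t.
out_prev = np.array([[0.9, 0.1], [0.5, 0.5], [0.2, 0.8],
                     [0.6, 0.4], [0.1, 0.9]])
out_curr = np.array([[0.8, 0.2], [0.1, 0.9], [0.3, 0.7],
                     [0.6, 0.4], [0.9, 0.1]])
picked = select_for_annotation(out_prev, out_curr, budget=2)
# Samples 4 and 1 changed the most between the two steps,
# so they are queried first.
```

Because TOD only needs two forward passes per sample (one per stored checkpoint) and no labels, this criterion stays task-agnostic: the same discrepancy computation applies whether the outputs are classification logits or per-pixel segmentation maps.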
Pages: 2109-2123
Number of pages: 15