Temporal Output Discrepancy for Loss Estimation-Based Active Learning

Cited by: 0
Authors
Huang, Siyu [1 ]
Wang, Tianyang [2 ]
Xiong, Haoyi [3 ]
Wen, Bihan [4 ]
Huan, Jun [5 ]
Dou, Dejing [3 ]
Affiliations
[1] Harvard Univ, Harvard A John Paulson Sch Engn & Appl Sci, Cambridge, MA 02134 USA
[2] Austin Peay State Univ, Dept Comp Sci & Informat Technol, Clarksville, TN 37044 USA
[3] Baidu Res, Big Data Lab, Beijing 100193, Peoples R China
[4] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[5] Amazon, AWS AI Lab, Seattle, WA 98109 USA
Keywords
Active learning; loss estimation; model selection; semisupervised learning; temporal consistency regularization;
DOI
10.1109/TNNLS.2022.3186855
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
While deep learning succeeds in a wide range of tasks, it depends heavily on massive collections of annotated data, which are expensive and time-consuming to obtain. To lower the cost of data annotation, active learning has been proposed to interactively query an oracle to annotate a small proportion of informative samples in an unlabeled dataset. Inspired by the fact that samples with higher loss are usually more informative to the model than samples with lower loss, in this article we present a novel deep active learning approach that queries the oracle for annotation when an unlabeled sample is believed to incur high loss. The core of our approach is a measurement, temporal output discrepancy (TOD), which estimates the loss of a sample by evaluating the discrepancy between the outputs given by the model at different optimization steps. Our theoretical investigation shows that TOD lower-bounds the accumulated sample loss, so it can be used to select informative unlabeled samples. On the basis of TOD, we further develop an effective unlabeled-data sampling strategy as well as an unsupervised learning criterion for active learning. Owing to the simplicity of TOD, our methods are efficient, flexible, and task-agnostic. Extensive experimental results demonstrate that our approach outperforms state-of-the-art active learning methods on image classification and semantic segmentation tasks. In addition, we show that TOD can be used to select, from a pool of candidate models, the model with potentially the highest testing accuracy.
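The querying idea described in the abstract can be sketched in a few lines: record a model's outputs on the unlabeled pool at two optimization steps, score each sample by the distance between its two outputs, and query the highest-scoring samples. This is a minimal illustrative sketch, not the paper's implementation; the choice of Euclidean distance as the discrepancy measure, the greedy top-k selection, and all names and toy data below are assumptions for illustration.

```python
import math

def temporal_output_discrepancy(outputs_curr, outputs_prev):
    """Per-sample TOD: L2 distance between the model's outputs for the
    same sample taken at two different optimization steps."""
    return [
        math.sqrt(sum((c - p) ** 2 for c, p in zip(cur, prev)))
        for cur, prev in zip(outputs_curr, outputs_prev)
    ]

def select_queries(outputs_curr, outputs_prev, k):
    """Rank unlabeled samples by TOD (per the abstract, a lower bound on
    the accumulated sample loss) and return the indices of the top-k
    candidates to send to the oracle for annotation."""
    tod = temporal_output_discrepancy(outputs_curr, outputs_prev)
    return sorted(range(len(tod)), key=lambda i: tod[i], reverse=True)[:k]

# Toy data: softmax outputs of four unlabeled samples recorded at two
# training checkpoints. Sample 1 changes the most between checkpoints,
# so under TOD it is the most informative candidate for annotation.
prev = [[0.2, 0.8], [0.5, 0.5], [0.9, 0.1], [0.4, 0.6]]
curr = [[0.1, 0.9], [0.9, 0.1], [0.8, 0.2], [0.4, 0.6]]
queries = select_queries(curr, prev, k=2)
```

Because the score needs only two snapshots of model outputs and no labels, the selection step is task-agnostic, which matches the efficiency and flexibility claims in the abstract.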
Pages: 2109-2123
Number of pages: 15