Bridging the gap with grad: Integrating active learning into semi-supervised domain generalization

Cited by: 1
Authors
Li, Jingwei [1 ,2 ]
Li, Yuan [1 ,2 ]
Tan, Jie [1 ]
Liu, Chengbao [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
[3] Chinese Acad Sci, Inst Automat, 95 East Zhongguancun Rd, Beijing 100190, Peoples R China
Keywords
Domain generalization; Semi-supervised learning; Active learning;
DOI
10.1016/j.neunet.2023.12.017
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domain generalization (DG) aims to generalize from a large amount of fully annotated source data. However, collecting labels for all source data is laborious in practice. Some research draws inspiration from semi-supervised learning (SSL) and develops a new task called semi-supervised domain generalization (SSDG), in which unlabeled source data are trained jointly with labeled data to significantly improve performance. Nevertheless, different studies adopt different settings, leading to unfair comparisons. Moreover, the initial annotation of unlabeled source data is random, causing unstable and unreliable training. To this end, we first specify the training paradigm and then leverage active learning (AL) to handle these issues. We further develop a new task called Active Semi-supervised Domain Generalization (ASSDG), which consists of two parts, i.e., SSDG and AL. We delve into the commonalities of SSL and AL and propose a unified framework called Gradient-Similarity-based Sample Filtering and Sorting (GSSFS) to iteratively train the SSDG and AL parts. Gradient similarity is used to select reliable and informative unlabeled source samples for these two parts, respectively. Our method is simple yet effective, and extensive experiments demonstrate that it achieves the best results on DG datasets in the low-data regime without bells and whistles.
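The abstract describes GSSFS only at a high level. As a rough illustration of the gradient-similarity idea, the sketch below ranks unlabeled samples by the cosine similarity between each sample's gradient and a reference gradient computed on the labeled set, then keeps the top-k most similar samples as the "reliable" ones. The function names and the top-k selection rule are assumptions made for illustration, not the authors' GSSFS implementation.

```python
import numpy as np

def gradient_cosine_similarity(g_sample, g_reference):
    """Cosine similarity between a per-sample gradient vector and a
    reference gradient (e.g., the averaged gradient of the labeled set)."""
    num = float(np.dot(g_sample, g_reference))
    denom = float(np.linalg.norm(g_sample) * np.linalg.norm(g_reference) + 1e-12)
    return num / denom

def filter_and_sort_by_gradient(sample_grads, reference_grad, k):
    """Rank unlabeled samples by gradient similarity to the reference
    gradient; return the indices of the top-k most similar samples
    (most similar first) along with all similarity scores."""
    sims = np.array([gradient_cosine_similarity(g, reference_grad)
                     for g in sample_grads])
    order = np.argsort(-sims)  # descending similarity
    return order[:k], sims
```

In this toy ranking, samples whose gradients align with the labeled-set gradient are treated as reliable candidates for pseudo-labeling, while the least similar ones would be the informative candidates an AL step might send for annotation.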
Pages: 186-199 (14 pages)