Exploring Active Learning in Meta-learning: Enhancing Context Set Labeling

Cited by: 0
Authors
Bae, Wonho [1 ]
Wang, Jing [1 ]
Sutherland, Danica J. [1 ,2 ]
Affiliations
[1] Univ British Columbia, Vancouver, BC, Canada
[2] Alberta Machine Intelligence Inst Amii, Edmonton, AB, Canada
Source
COMPUTER VISION - ECCV 2024, PT LXXXIX | 2025 / Vol. 15147
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC);
Keywords
Meta-learning; Active learning; Low budget;
DOI
10.1007/978-3-031-73024-5_17
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Most meta-learning methods assume that the (very small) context set used to establish a new task at test time is passively provided. In some settings, however, it is feasible to actively select which points to label; the potential gain from a careful choice is substantial, but the setting requires major differences from typical active learning setups. We clarify the ways in which active meta-learning can be used to label a context set, depending on which parts of the meta-learning process use active learning. Within this framework, we propose a natural algorithm based on fitting Gaussian mixtures for selecting which points to label; though simple, the algorithm also has theoretical motivation. The proposed algorithm outperforms state-of-the-art active learning methods when used with various meta-learning algorithms across several benchmark datasets.
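The abstract's core idea of fitting a Gaussian mixture to choose which context points to label can be sketched roughly as follows. This is a minimal illustration only, not the paper's actual algorithm: the function name, the use of scikit-learn, and the "pick the point nearest each mixture component" heuristic are all assumptions made for the sake of a concrete example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def select_context_points(features, budget, seed=0):
    """Select `budget` unlabeled points for labeling by fitting a
    Gaussian mixture with `budget` components to the feature vectors
    and picking the point closest to each component mean.

    features: (n_points, dim) array of (e.g. embedded) unlabeled points.
    Returns a list of `budget` distinct indices into `features`.
    """
    gmm = GaussianMixture(n_components=budget, random_state=seed)
    gmm.fit(features)

    chosen = []
    for mean in gmm.means_:
        dists = np.linalg.norm(features - mean, axis=1)
        # Mask out already-chosen points so all selections are distinct.
        dists[chosen] = np.inf
        chosen.append(int(np.argmin(dists)))
    return chosen
```

Under this reading, the labeled selections would then form the context set handed to the meta-learned model at test time; the mixture components act as cluster representatives covering the unlabeled pool.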
Pages: 279-296
Page count: 18