Data Efficiency of Meta-learning

Cited by: 0
Authors
Al-Shedivat, Maruan [1 ]
Li, Liam [2 ]
Xing, Eric [1 ,3 ,4 ]
Talwalkar, Ameet [1 ,2 ]
Affiliations
[1] CMU, Pittsburgh, PA 15213 USA
[2] Determined AI, San Francisco, CA USA
[3] MBZUAI, Abu Dhabi, United Arab Emirates
[4] Petuum, Pittsburgh, PA USA
Source
24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS) | 2021 / Vol. 130
Funding
National Science Foundation (USA);
Keywords
STABILITY;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Meta-learning has enabled learning statistical models that can be quickly adapted to new prediction tasks. Motivated by use cases in personalized federated learning, we study an often-overlooked aspect of modern meta-learning algorithms: their data efficiency. To shed more light on which methods are more efficient, we use techniques from algorithmic stability to derive bounds on the transfer risk that have important practical implications, indicating how much supervision is needed and how it must be allocated for each method to attain the desired level of generalization. Further, we introduce a simple new framework for evaluating meta-learning methods under a limit on the available supervision, conduct an empirical study of MAML, Reptile, and PROTONETS, and demonstrate the differences in the behavior of these methods on few-shot and federated learning benchmarks. Finally, we propose active meta-learning, which incorporates active data selection into learning-to-learn, leading to better performance of all methods in the limited-supervision regime.
Pages: 11
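
To make the limited-supervision setting in the abstract concrete, below is a minimal sketch (not the authors' code) of the Reptile meta-update on toy 1-D linear regression tasks, with an explicit per-task label budget. The task distribution, step sizes, inner-loop length, and budget are all illustrative assumptions.

    # Minimal Reptile sketch on toy linear-regression tasks (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_task():
        # A task is a random linear function y = a*x + b.
        a, b = rng.uniform(-2.0, 2.0, size=2)
        return a, b

    def draw_labels(a, b, budget):
        # Draw `budget` labeled examples: the per-task supervision limit.
        x = rng.uniform(-1.0, 1.0, size=budget)
        y = a * x + b + 0.05 * rng.standard_normal(budget)
        return x, y

    def inner_sgd(w, x, y, lr=0.1, steps=5):
        # Adapt w = (slope, intercept) with a few SGD steps on mean squared error.
        for _ in range(steps):
            err = w[0] * x + w[1] - y
            grad = np.array([np.mean(2.0 * err * x), np.mean(2.0 * err)])
            w = w - lr * grad
        return w

    phi = np.zeros(2)            # meta-initialization
    meta_lr, budget = 0.5, 10    # outer step size and per-task label budget (assumed values)
    for _ in range(2000):
        a, b = sample_task()
        x, y = draw_labels(a, b, budget)
        w_adapted = inner_sgd(phi.copy(), x, y)
        phi = phi + meta_lr * (w_adapted - phi)   # Reptile: move init toward adapted weights

    print("meta-learned initialization (slope, intercept):", phi)

Under a fixed total label budget, this loop makes the allocation question studied in the paper concrete: supervision can be spent on more tasks (more outer iterations) or on more labels per task (a larger `budget`).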