Meta-Learning from Multimodal Task Distributions Using Multiple Sets of Meta-Parameters

Cited by: 2
Authors
Vettoruzzo, Anna [1]
Bouguelia, Mohamed-Rafik [1]
Rognvaldsson, Thorsteinn [1]
Affiliations
[1] Halmstad Univ, CAISR, Halmstad, Sweden
Source
2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN | 2023
Keywords
Meta-Learning; Few-Shot Learning; Transfer Learning; Task Representation; Multimodal Distribution
DOI
10.1109/IJCNN54540.2023.10191944
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Meta-learning, or learning to learn, involves training a model on various learning tasks in a way that allows it to quickly learn new tasks from the same distribution using only a small amount of training data (i.e., few-shot learning). Current meta-learning methods implicitly assume that the distribution over tasks is unimodal and consists of tasks belonging to a common domain, which significantly reduces the variety of task distributions they can handle. However, in real-world applications, tasks are often very diverse and come from multiple different domains, making it challenging to meta-learn common knowledge shared across the entire task distribution. In this paper, we propose a method for meta-learning from a multimodal task distribution. The proposed method learns multiple sets of meta-parameters (acting as different initializations of a neural network model) and uses a task encoder to select the best initialization to fine-tune for a new task. More specifically, given a few training examples from a task sampled from an unknown mode, the proposed method predicts which set of meta-parameters (i.e., which model initialization) would lead to fast adaptation and good post-adaptation performance on that task. We evaluate the proposed method on a diverse set of few-shot regression and image classification tasks. The results demonstrate the superiority of the proposed method compared to other state-of-the-art meta-learning methods and the benefit of learning multiple model initializations when tasks are sampled from a multimodal task distribution.
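The core idea in the abstract, selecting among multiple meta-learned initializations based on a task's few support examples, can be sketched on a toy problem. The sketch below is purely illustrative and is not the paper's implementation: the task distribution is a hypothetical bimodal family of 1-D linear-regression tasks (slopes near +2 or -2), the two initializations are fixed by hand rather than meta-learned, and the paper's learned task encoder is replaced by an oracle-style proxy that picks the initialization with the lowest post-adaptation loss on the support set.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapt(w, X, y, lr=0.1, steps=5):
    """Fine-tune linear weights w on a small support set by gradient descent on MSE."""
    w = w.copy()
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Hypothetical multimodal task distribution: 1-D linear-regression tasks whose
# slopes cluster near +2 (mode 0) or -2 (mode 1). One initialization per mode,
# fixed by hand here; the paper meta-learns these sets of meta-parameters.
meta_params = [np.array([2.0]), np.array([-2.0])]

def select_initialization(X, y, inits):
    """Stand-in for the paper's learned task encoder: score each initialization
    by its post-adaptation loss on the support set and pick the best one."""
    losses = [mse(adapt(w, X, y), X, y) for w in inits]
    return int(np.argmin(losses))

# A new few-shot task sampled from the negative-slope mode (5 support examples).
X = rng.normal(size=(5, 1))
y = X @ np.array([-1.8]) + 0.01 * rng.normal(size=5)

k = select_initialization(X, y, meta_params)  # should pick the mode-1 initialization
w_adapted = adapt(meta_params[k], X, y)
print(f"selected initialization: {k}, support MSE after adaptation: {mse(w_adapted, X, y):.4f}")
```

Starting from the initialization belonging to the task's mode, a few gradient steps suffice to reach low error, whereas the wrong-mode initialization remains far from the task's solution; this gap is what makes the selection step informative.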
Pages: 8