Adversarial Task Up-sampling for Meta-learning

Cited by: 0
Authors
Wu, Yichen [1 ,2 ]
Huang, Long-Kai [2 ]
Wei, Ying [1 ]
Affiliations
[1] City Univ Hong Kong, Hong Kong, Peoples R China
[2] Tencent AI Lab, Shenzhen, Peoples R China
Keywords
DOI: Not available
Chinese Library Classification: TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The success of meta-learning on existing benchmarks is predicated on the assumption that the distribution of meta-training tasks covers meta-testing tasks. Frequent violation of this assumption in applications with either insufficient tasks or a very narrow meta-training task distribution leads to memorization or learner overfitting. Recent solutions have pursued augmentation of meta-training tasks, yet generating tasks that are both correct and sufficiently imaginary remains an open question. In this paper, we seek an approach that up-samples meta-training tasks from the task representation via a task up-sampling network. Moreover, the resulting approach, named Adversarial Task Up-sampling (ATU), generates tasks that maximally contribute to the latest meta-learner by maximizing an adversarial loss. On few-shot sine regression and image classification datasets, we empirically validate the marked improvement of ATU over state-of-the-art task augmentation strategies in both meta-testing performance and the quality of the up-sampled tasks.
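The sketch below is a minimal, self-contained PyTorch illustration of the adversarial up-sampling idea summarized in the abstract; it is not the authors' implementation. The TaskUpsampler module, the toy sine-regression setup, the network sizes, the label rule for synthetic tasks, and the alternating optimization schedule are all illustrative assumptions. It only shows the two opposing objectives: the up-sampler is trained to produce synthetic tasks on which the current meta-learner's loss is maximized, while the meta-learner is trained to minimize its loss on both the real and the up-sampled tasks.

    import torch
    import torch.nn as nn

    class TaskUpsampler(nn.Module):
        # Hypothetical network: maps the support inputs of a task, plus noise,
        # to a new set of synthetic task inputs of the same shape.
        def __init__(self, in_dim, noise_dim=8, hidden=64):
            super().__init__()
            self.noise_dim = noise_dim
            self.net = nn.Sequential(
                nn.Linear(in_dim + noise_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, in_dim),
            )

        def forward(self, task_x):                      # task_x: (k_shot, in_dim)
            z = torch.randn(task_x.size(0), self.noise_dim)
            return self.net(torch.cat([task_x, z], dim=-1))

    def task_loss(model, x, y):
        # Placeholder for the meta-learner's inner loss on a single task.
        return nn.functional.mse_loss(model(x), y)

    # Toy few-shot sine-regression setup with a tiny regressor as the meta-learner.
    meta_learner = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
    upsampler = TaskUpsampler(in_dim=1)
    opt_learner = torch.optim.Adam(meta_learner.parameters(), lr=1e-3)
    opt_upsampler = torch.optim.Adam(upsampler.parameters(), lr=1e-3)

    for step in range(1000):
        # Sample one "real" meta-training task: a random sine function.
        amp, phase = torch.rand(1) * 4.0 + 1.0, torch.rand(1) * 3.0
        x = torch.rand(10, 1) * 10.0 - 5.0
        y = amp * torch.sin(x + phase)

        # Adversarial up-sampler step: generate a synthetic task and MAXIMIZE
        # the current meta-learner's loss on it, so the generated task is
        # maximally informative for the latest learner (assumed label rule:
        # the same sine function relabels the up-sampled inputs).
        x_new = upsampler(x)
        y_new = amp * torch.sin(x_new + phase)
        opt_upsampler.zero_grad()
        (-task_loss(meta_learner, x_new, y_new)).backward()
        opt_upsampler.step()

        # Meta-learner step: minimize loss on the real and the up-sampled task.
        opt_learner.zero_grad()
        loss = task_loss(meta_learner, x, y) + \
               task_loss(meta_learner, x_new.detach(), y_new.detach())
        loss.backward()
        opt_learner.step()

In this reading, the up-sampled tasks play the role of hard, distribution-broadening training tasks; the actual paper additionally conditions generation on a task representation, which the sketch only approximates by feeding the real support inputs to the up-sampler.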
Pages: 14
Related Papers
50 items in total
  • [1] Image Up-Sampling for Super Resolution with Generative Adversarial Network
    Tsunekawa, Shohei
    Inoue, Katsufumi
    Yoshioka, Michifumi
    AI 2018: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, 11320 : 258 - 270
  • [2] Towards well-generalizing meta-learning via adversarial task augmentation
    Wang, Haoqing
    Mai, Huiyu
    Gong, Yuhang
    Deng, Zhi-Hong
    ARTIFICIAL INTELLIGENCE, 2023, 317
  • [3] Meta-Learning Adversarial Bandit Algorithms
    Khodak, Mikhail
    Osadchiy, Ilya
    Harris, Keegan
    Balcan, Maria-Florina
    Levy, Kfir Y.
    Meir, Ron
    Wu, Zhiwei Steven
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [4] Towards Task Sampler Learning for Meta-Learning
    Wang, Jingyao
    Qiang, Wenwen
    Su, Xingzhe
    Zheng, Changwen
    Sun, Fuchun
    Xiong, Hui
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132 (12) : 5534 - 5564
  • [5] Leveraging Task Variability in Meta-learning
    Aimen, A.
    Ladrecha, B.
    Sidheekh, S.
    Krishnan, N. C.
    SN COMPUTER SCIENCE, 4 (5)
  • [6] Meta-learning with an Adaptive Task Scheduler
    Yao, Huaxiu
    Wang, Yu
    Wei, Ying
    Zhao, Peilin
    Mahdavi, Mehrdad
    Lian, Defu
    Finn, Chelsea
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] Enhancing Fault Diagnosis in Industrial Processes through Adversarial Task Augmented Sequential Meta-Learning
    Sun, Dexin
    Fan, Yunsheng
    Wang, Guofeng
    APPLIED SCIENCES-BASEL, 2024, 14 (11)
  • [8] Improving progressive sampling via meta-learning
    Leite, R
    Brazdil, P
    PROGRESS IN ARTIFICIAL INTELLIGENCE-B, 2003, 2902 : 313 - 323
  • [9] Deep Fourier Up-Sampling
    Zhou, Man
    Yu, Hu
    Huang, Jie
    Zhao, Feng
    Gu, Jinwei
    Loy, Chen Change
    Meng, Deyu
    Li, Chongyi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [10] TASK2VEC: Task Embedding for Meta-Learning
    Achille, Alessandro
    Lam, Michael
    Tewari, Rahul
    Ravichandran, Avinash
    Maji, Subhransu
    Fowlkes, Charless
    Soatto, Stefano
    Perona, Pietro
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 6439 - 6448