Bayesian Multi-Task Transfer Learning for Soft Prompt Tuning

Cited by: 0
Authors
Lee, Haeju [1 ]
Jeong, Minchan [1 ]
Yun, Se-Young [1 ]
Kim, Kee-Eung [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Kim Jaechul Grad Sch AI, Seoul, South Korea
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Prompt tuning, in which prompts are optimized to adapt large-scale pre-trained language models to downstream tasks instead of fine-tuning the full model parameters, has been shown to be particularly effective when the prompts are trained in the multi-task transfer learning setting. These methods generally involve individually training prompts for each source task and then aggregating them to provide the initialization of the prompt for the target task. However, this approach critically ignores the fact that some of the source tasks could interfere with each other, either negatively or positively. We argue that when we extract knowledge from source tasks via training source prompts, we need to consider this correlation among source tasks for better transfer to target tasks. To this end, we propose a Bayesian approach in which we work with the posterior distribution of prompts across source tasks. We obtain representative source prompts corresponding to samples from the posterior using Stein Variational Gradient Descent (SVGD), and these are then aggregated to constitute the initial target prompt. We show extensive experimental results on standard benchmark NLP tasks, where our Bayesian multi-task transfer learning approach outperforms state-of-the-art methods in many settings. Furthermore, our approach requires no auxiliary models other than the prompt itself, achieving a high degree of parameter efficiency.
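To make the procedure described in the abstract concrete, below is a minimal, self-contained sketch of an SVGD update applied to a set of prompt "particles". This is an illustration under stated assumptions, not the authors' implementation: a Gaussian log-posterior stands in for the gradient that would come from backpropagating the source-task loss (plus a prior) through the frozen language model, the averaging step is one plausible aggregation rule, and names such as prompt_dim and n_particles are hypothetical.

```python
# Minimal SVGD sketch for sampling "source prompt" particles from a posterior.
# Illustrative only: the Gaussian target stands in for the true prompt posterior.
import numpy as np

def rbf_kernel(X):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j.

    X: (n, d) array of particles (flattened soft prompts).
    Bandwidth is set by the common median heuristic.
    """
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # (n, n)
    h = np.median(sq_dists) / np.log(X.shape[0] + 1) + 1e-8
    K = np.exp(-sq_dists / h)
    # grad_K[j, i, :] = grad_{x_j} k(x_j, x_i)
    grad_K = -2.0 / h * (X[:, None, :] - X[None, :, :]) * K[:, :, None]
    return K, grad_K

def svgd_step(X, grad_logp, step=1e-1):
    """One SVGD update: X <- X + step * phi(X), where
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X)
    phi = (K.T @ grad_logp + grad_K.sum(axis=0)) / n
    return X + step * phi

# Toy run: draw particles toward a Gaussian posterior N(mu, I), then aggregate
# them (here, by averaging) into an initial target prompt.
rng = np.random.default_rng(0)
prompt_dim, n_particles = 8, 16            # hypothetical sizes
mu = np.ones(prompt_dim)
particles = rng.normal(size=(n_particles, prompt_dim))
for _ in range(500):
    grad_logp = -(particles - mu)          # grad log N(mu, I); in practice, the
                                           # gradient of the negative task loss
    particles = svgd_step(particles, grad_logp)
target_prompt_init = particles.mean(axis=0)
print(np.round(target_prompt_init, 2))     # close to mu after convergence
```

The repulsive kernel-gradient term is what keeps the particles spread over the posterior rather than collapsing to a single mode, which is how this setup captures correlation across source tasks instead of a point estimate per task.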
Pages: 4942-4958
Page count: 17
Related Papers
50 items in total
  • [31] Aksman, Leon M.; Scelsi, Marzia A.; Marquand, Andre F.; Alexander, Daniel C.; Ourselin, Sebastien; Altmann, Andre. Modeling longitudinal imaging biomarkers with parametric Bayesian multi-task learning. Human Brain Mapping, 2019, 40(13): 3982-4000.
  • [32] Alabed, Sami; Yoneki, Eiko. High-Dimensional Bayesian Optimization with Multi-Task Learning for RocksDB. Proceedings of the 1st Workshop on Machine Learning and Systems (EuroMLSys'21), 2021: 111-119.
  • [33] Ramachandran, Anil; Gupta, Sunil; Rana, Santu; Venkatesh, Svetha. Information-Theoretic Multi-task Learning Framework for Bayesian Optimisation. AI 2019: Advances in Artificial Intelligence, 2019, 11919: 497-509.
  • [34] Alabed, Sami; Yoneki, Eiko. High-dimensional Bayesian optimization with multi-task learning for RocksDB. arXiv, 2021.
  • [35] Pattnaik, Upasana; Lee, Minwoo. Multi-task Transfer with Practice. 2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021), 2021.
  • [36] Lu, Yuxiang; Sirejiding, Shalayiding; Ding, Yue; Wang, Chunlin; Lu, Hongtao. Prompt Guided Transformer for Multi-Task Dense Prediction. IEEE Transactions on Multimedia, 2024, 26: 6375-6385.
  • [37] Li, Shuai; Zhu, Xiaobing; Li, Xi. Transfer Learning-Based Evolutionary Multi-task Optimization. Bio-Inspired Computing: Theories and Applications, Pt 1, BIC-TA 2023, 2024, 2061: 14-28.
  • [38] Denis, Nicholas; Fraser, Maia. Options in Multi-task Reinforcement Learning - Transfer via Reflection. Advances in Artificial Intelligence, 2019, 11489: 225-237.
  • [39] Chen, Qiong; Zheng, Zimu; Hu, Chuang; Wang, Dan; Liu, Fangming. Data-driven Task Allocation for Multi-task Transfer Learning on the Edge. 2019 39th IEEE International Conference on Distributed Computing Systems (ICDCS 2019), 2019: 1040-1050.
  • [40] Simoes, Rodolfo S.; Maltarollo, Vinicius G.; Oliveira, Patricia R.; Honorio, Kathia M. Transfer and Multi-task Learning in QSAR Modeling: Advances and Challenges. Frontiers in Pharmacology, 2018, 9.