Source Hypothesis Transfer for Zero-Shot Domain Adaptation

Cited by: 1
Authors
Sakai, Tomoya [1 ]
Affiliations
[1] NEC Corp Ltd, Tokyo, Japan
Keywords
Hypothesis transfer learning; Zero-shot domain adaptation; Unseen domains; Domain adaptation;
DOI
10.1007/978-3-030-86486-6_35
CLC classification
TP18 [Theory of artificial intelligence];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Making predictions in unseen target domains without training samples is a frequent need in real-world applications, such as sales prediction for new products. Zero-shot domain adaptation (ZSDA) has been studied to achieve this important but difficult task. One approach to ZSDA is to use data from multiple source domains together with domain attributes. Several recent domain adaptation studies have noted that, in practice, source domain data are often unavailable due to privacy, technical, and contractual issues. To address these issues, hypothesis transfer learning (HTL) has been gaining attention, since it does not require access to source domain data. HTL has shown its effectiveness in supervised/unsupervised domain adaptation; however, current HTL methods cannot be readily applied to ZSDA because no training data (not even unlabeled data) are available for the target domains. To solve this problem, we propose an HTL-based ZSDA method that connects multiple source hypotheses through domain attributes. Through theoretical analysis, we derive the convergence rate of the estimation error of the proposed method. Finally, we numerically demonstrate the effectiveness of the proposed HTL-based ZSDA method.
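The core idea described in the abstract — training one hypothesis per source domain, then linking the learned hypotheses to domain attributes so that a hypothesis for an unseen target domain can be predicted from its attributes alone, without any target data — can be sketched as follows. This is a minimal illustrative toy with linear hypotheses and least squares, not the paper's actual algorithm or theory; the synthetic data and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: each domain's true linear hypothesis is determined by
# its attribute vector through a hidden map M (an assumption for this toy).
n_domains, n_features, n_attrs, n_samples = 8, 5, 3, 200
M = rng.normal(size=(n_attrs, n_features))      # hidden attribute-to-weight map
attrs = rng.normal(size=(n_domains, n_attrs))   # observed domain attributes
true_w = attrs @ M                              # true per-domain weights

# Step 1 (source side): train one hypothesis per source domain.
# Only the learned weights are retained -- the source data themselves
# need not be shared, which is the HTL setting.
source_w = []
for d in range(n_domains):
    X = rng.normal(size=(n_samples, n_features))
    y = X @ true_w[d] + 0.01 * rng.normal(size=n_samples)
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # per-domain least squares
    source_w.append(w_hat)
source_w = np.array(source_w)                      # shape (n_domains, n_features)

# Step 2: connect the source hypotheses via domain attributes by
# regressing the learned weights on the attribute vectors.
G, *_ = np.linalg.lstsq(attrs, source_w, rcond=None)

# Step 3 (zero-shot): for an unseen domain, predict its hypothesis from
# its attributes alone -- no target training data, labeled or unlabeled.
a_target = rng.normal(size=n_attrs)
w_target = a_target @ G
error = np.linalg.norm(w_target - a_target @ M)
```

In this sketch the predicted target weights `w_target` recover the unseen domain's true weights up to the noise in the per-source estimates, illustrating why the estimation error shrinks as the source hypotheses improve.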
Pages: 570-586
Page count: 17
Related Papers
50 entries
  • [41] Hypernetworks for Zero-Shot Transfer in Reinforcement Learning
    Rezaei-Shoshtari, Sahand
    Morissette, Charlotte
    Hogan, Francois R.
    Dudek, Gregory
    Meger, David
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9579 - 9587
  • [42] Grounded Adaptation for Zero-shot Executable Semantic Parsing
    Zhong, Victor
    Lewis, Mike
    Wang, Sida I.
    Zettlemoyer, Luke
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 6869 - 6882
  • [43] Zero-Shot Task Adaptation with Relevant Feature Information
    Kumagai, Atsutoshi
    Iwata, Tomoharu
    Fujiwara, Yasuhiro
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13283 - 13291
  • [44] Zero-Shot Transfer Learning for Event Extraction
    Huang, Lifu
    Ji, Heng
    Cho, Kyunghyun
    Dagan, Ido
    Riedel, Sebastian
    Voss, Clare R.
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 2160 - 2170
  • [45] Combined scaling for zero-shot transfer learning
    Pham, Hieu
    Dai, Zihang
    Ghiasi, Golnaz
    Kawaguchi, Kenji
    Liu, Hanxiao
    Yu, Adams Wei
    Yu, Jiahui
    Chen, Yi-Ting
    Luong, Minh-Thang
    Wu, Yonghui
    Tan, Mingxing
    Le, Quoc V.
    NEUROCOMPUTING, 2023, 555
  • [46] Transfer Increment for Generalized Zero-Shot Learning
    Feng, Liangjun
    Zhao, Chunhui
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (06) : 2506 - 2520
  • [47] Zero-Shot Cross-Lingual Transfer in Legal Domain Using Transformer Models
    Shaheen, Zein
    Wohlgenannt, Gerhard
    Mouromtsev, Dmitry
    2021 INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND COMPUTATIONAL INTELLIGENCE (CSCI 2021), 2021, : 450 - 456
  • [48] Robust Zero-Shot Learning with Source Attributes Noise
    Yu, Jun
    Wu, Songsong
    Wang, Lu
    Jing, Xiao-Yuan
    PROCEEDINGS OF THE 2016 INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC), VOL 1, 2016, : 205 - 209
  • [49] Attribute fusion transfer for zero-shot fault diagnosis
    Fan, Linchuan
    Chen, Xiaolong
    Chai, Yi
    Lin, Wenyi
    ADVANCED ENGINEERING INFORMATICS, 2023, 58
  • [50] DARLA: Improving Zero-Shot Transfer in Reinforcement Learning
    Higgins, Irina
    Pal, Arka
    Rusu, Andrei
    Matthey, Loic
    Burgess, Christopher
    Pritzel, Alexander
    Botvinick, Matthew
    Blundell, Charles
    Lerchner, Alexander
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70