Domain Adaptation Transfer Learning by Kernel Representation Adaptation

Cited: 1
Authors
Chen, Xiaoyi [1 ]
Lengelle, Regis [1 ]
Affiliations
[1] Univ Technol Troyes, Charles Delaunay Inst, Res Team LM2S, ROSAS Dept, CNRS UMR 6281, 12 Rue Marie Curie, CS 42060, F-10004 Troyes, France
DOI
10.1007/978-3-319-93647-5_3
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Domain adaptation, where no labeled target data is available, is a challenging task. To solve this problem, we first propose a new SVM-based approach with a supplementary Maximum Mean Discrepancy (MMD)-like constraint. With this heuristic, source and target data are projected onto a common subspace of a Reproducing Kernel Hilbert Space (RKHS) in which both data distributions are expected to become similar. A classifier trained on the source data can therefore be expected to perform well on the target data, provided the conditional probabilities of labels are similar for source and target data, which is the main assumption of this paper. We demonstrate that adding this constraint preserves the quadratic nature of the optimization problem, so standard quadratic optimization tools can still be used. Secondly, building on the same idea that making source and target data similar should enable efficient transfer learning, and under the same assumption, we propose a Kernel Principal Component Analysis (KPCA)-based transfer learning method. Unlike the first heuristic, this second method also aligns higher-order moments in the RKHS, which leads to better performance. Here again, we select MMD as the similarity measure. A linear transformation is then applied to further improve the alignment between source and target data. We finally compare both methods with other transfer learning methods from the literature to show their efficiency on synthetic and real datasets.
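The MMD criterion used as the similarity measure above can be illustrated with its standard biased empirical estimate: the squared distance between the mean embeddings of the two samples in an RKHS, computed from pairwise kernel evaluations. The sketch below is a generic illustration with an RBF kernel, not the authors' exact formulation; the kernel choice, bandwidth `gamma`, and the synthetic `src`/`tgt` samples are assumptions for demonstration only.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel: k(a, b) = exp(-gamma * ||a - b||^2)
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    # Biased empirical estimate of squared MMD:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 2))  # hypothetical "source" sample
tgt = rng.normal(2.0, 1.0, size=(200, 2))  # mean-shifted "target" sample

print(mmd2(src, src))  # zero for identical samples
print(mmd2(src, tgt))  # clearly positive: distributions differ
```

A domain adaptation method in the spirit described above would seek a projection of both samples under which this quantity becomes small, while preserving enough structure for the source-trained classifier to remain discriminative.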
Pages: 45-61
Page count: 17
Related papers
50 records
  • [31] Domain Adaptation with Invariant Representation Learning: What Transformations to Learn?
    Stojanov, Petar
    Li, Zijian
    Gong, Mingming
    Cai, Ruichu
    Carbonell, Jaime G.
    Zhang, Kun
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [32] Joint predictive model and representation learning for visual domain adaptation
    Gheisari, Marzieh
    Baghshah, Mahdieh Soleymani
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2017, 58 : 157 - 170
  • [33] Domain Adaptation with Representation Learning and Nonlinear Relation for Time Series
    Hussein, Amir
    Hajj, Hazem
    ACM TRANSACTIONS ON INTERNET OF THINGS, 2022, 3 (02):
  • [34] Unsupervised Domain Adaptation in the Wild via Disentangling Representation Learning
    Haoliang Li
    Renjie Wan
    Shiqi Wang
    Alex C. Kot
    International Journal of Computer Vision, 2021, 129 : 267 - 283
  • [35] Representation learning via serial robust autoencoder for domain adaptation
    Yang, Shuai
    Zhang, Yuhong
    Wang, Hao
    Li, Peipei
    Hu, Xuegang
    EXPERT SYSTEMS WITH APPLICATIONS, 2020, 160
  • [36] Joint metric and feature representation learning for unsupervised domain adaptation
    Xie, Yue
    Du, Zhekai
    Li, Jingjing
    Jing, Mengmeng
    Chen, Erpeng
    Lu, Ke
    KNOWLEDGE-BASED SYSTEMS, 2020, 192
  • [37] Domain Adaptation for Graph Representation Learning: Challenges, Progress, and Prospects
    Shi, Bo-Shen
    Wang, Yong-Qing
    Guo, Fang-Da
    Xu, Bing-Bing
    Shen, Hua-Wei
    Cheng, Xue-Qi
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2025,
  • [38] Domain adaptation with transfer learning for pasture digital twins
    Pylianidis, Christos
    Kallenberg, Michiel G. J.
    Athanasiadis, Ioannis N.
    ENVIRONMENTAL DATA SCIENCE, 2024, 3
  • [39] Visual domain adaptation via transfer feature learning
    Tahmoresnezhad, Jafar
    Hashemi, Sattar
    KNOWLEDGE AND INFORMATION SYSTEMS, 2017, 50 (02) : 585 - 605
  • [40] Riemannian representation learning for multi-source domain adaptation
    Chen, Sentao
    Zheng, Lin
    Wu, Hanrui
    PATTERN RECOGNITION, 2023, 137