Lighter, Better, Faster Multi-source Domain Adaptation with Gaussian Mixture Models and Optimal Transport

Cited by: 0
Authors
Montesuma, Eduardo Fernandes [1]
Mboula, Fred Ngole [1]
Souloumiac, Antoine [1]
Affiliations
[1] Univ Paris Saclay, CEA, LIST, F-91120 Palaiseau, France
Keywords
Domain Adaptation; Optimal Transport; Gaussian Mixture Models
DOI
10.1007/978-3-031-70365-2_2
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we tackle Multi-Source Domain Adaptation (MSDA), a transfer learning task in which multiple heterogeneous, labeled source probability measures are adapted towards a different, unlabeled target measure. We propose a novel framework for MSDA based on Optimal Transport (OT) and Gaussian Mixture Models (GMMs). Our framework has two key advantages. First, OT between GMMs can be solved efficiently via linear programming. Second, it provides a convenient model for supervised learning, especially classification, as components in the GMM can be associated with existing classes. Building on the GMM-OT problem, we propose a novel technique for computing barycenters of GMMs. From this algorithm, we derive two new strategies for MSDA: GMM-Wasserstein Barycenter Transport (WBT) and GMM-Dataset Dictionary Learning (DaDiL). We empirically evaluate our proposed methods on four benchmarks in image classification and fault diagnosis, showing that we improve over the prior art while being faster and involving fewer parameters. Our code is publicly available at https://github.com/eddardd/gmm_msda.
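The GMM-OT problem referenced in the abstract reduces to a small linear program: the mixture weights play the role of histograms, and the ground cost is the squared 2-Wasserstein (Bures-Wasserstein) distance between component Gaussians. Below is a minimal Python sketch of that computation, assuming the mixture-Wasserstein formulation and the POT library's exact solver ot.emd; the helper names gaussian_w2_sq and gmm_ot are illustrative and not taken from the authors' repository.

    import numpy as np
    import ot  # Python Optimal Transport (POT)
    from scipy.linalg import sqrtm

    def gaussian_w2_sq(m1, S1, m2, S2):
        # Squared Bures-Wasserstein distance between N(m1, S1) and N(m2, S2).
        S1_half = sqrtm(S1)
        cross = sqrtm(S1_half @ S2 @ S1_half)
        bures = np.trace(S1 + S2 - 2.0 * np.real(cross))
        return float(np.sum((m1 - m2) ** 2) + bures)

    def gmm_ot(pi_a, means_a, covs_a, pi_b, means_b, covs_b):
        # Ground cost: pairwise Gaussian W2^2 between mixture components.
        C = np.array([[gaussian_w2_sq(ma, Sa, mb, Sb)
                       for mb, Sb in zip(means_b, covs_b)]
                      for ma, Sa in zip(means_a, covs_a)])
        # Exact OT over the component weights: a small linear program.
        plan = ot.emd(pi_a, pi_b, C)
        return plan, C

    # Toy usage: two 2-component GMMs in R^2.
    plan, C = gmm_ot(
        np.array([0.5, 0.5]), [np.zeros(2), np.ones(2)], [np.eye(2), np.eye(2)],
        np.array([0.3, 0.7]), [np.ones(2), 2 * np.ones(2)], [np.eye(2), np.eye(2)],
    )
    mw2_sq = np.sum(plan * C)  # squared mixture-Wasserstein cost

Because the linear program is posed over the component weights rather than over individual samples, its size depends only on the number of mixture components, which is consistent with the efficiency claim in the abstract.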
Pages: 21-38
Page count: 18
Related Papers
50 records in total
  • [1] Multi-Source Domain Adaptation with Mixture of Experts
    Guo, Jiang
    Shah, Darsh J.
    Barzilay, Regina
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 4694 - 4703
  • [2] Optimal Transport for Multi-source Domain Adaptation under Target Shift
    Redko, Ievgen
    Courty, Nicolas
    Flamary, Remi
    Tuia, Devis
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 849 - 858
  • [3] Leveraging Mixture Alignment for Multi-Source Domain Adaptation
    Dayal, Aveen
    Shrusti, S.
    Cenkeramaddi, Linga Reddy
    Mohan, C. Krishna
    Kumar, Abhinav
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2025, 34 : 885 - 898
  • [4] Multi-Source Domain Adaptation with Mixture of Joint Distributions
    Chen, Sentao
    PATTERN RECOGNITION, 2024, 149
  • [5] Multi-source Domain Adaptation via Optimal Transport for Brain Dementia Identification
    Guan, Hao
    Wang, Li
    Liu, Mingxia
    2021 IEEE 18TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI), 2021, : 1514 - 1517
  • [6] Multi-source Domain Adaptation via Weighted Joint Distributions Optimal Transport
    Turrisi, Rosanna
    Flamary, Remi
    Rakotomamonjy, Alain
    Pontil, Massimiliano
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, VOL 180, 2022, 180 : 1970 - 1980
  • [7] Class-aware sample reweighting optimal transport for multi-source domain adaptation
    Wang, Shengsheng
    Wang, Bilin
    Zhang, Zhe
    Heidari, Ali Asghar
    Chen, Huiling
    NEUROCOMPUTING, 2023, 523 : 213 - 223
  • [8] A survey of multi-source domain adaptation
    Sun, Shiliang
    Shi, Honglei
    Wu, Yuanbin
    INFORMATION FUSION, 2015, 24 : 84 - 92
  • [9] Multi-Source Distilling Domain Adaptation
    Zhao, Sicheng
    Wang, Guangzhi
    Zhang, Shanghang
    Gu, Yang
    Li, Yaxian
    Song, Zhichao
    Xu, Pengfei
    Hu, Runbo
    Chai, Hua
    Keutzer, Kurt
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 12975 - 12983