Unsupervised sub-domain adaptation using optimal transport

Cited by: 6
Authors
Gilo, Obsa [1 ]
Mathew, Jimson [1 ]
Mondal, Samrat [1 ]
Sanodiya, Rakesh Kumar [2 ]
Affiliations
[1] Indian Inst Technol Patna, Comp Sci & Engn, Bihar 801106, India
[2] Indian Inst Informat Technol, Comp Sci & Engn, Chittoor 801106, India
Keywords
Domain adaptation; Subdomain adaptation; Sliced Wasserstein metric; Optimal transport; Correlation alignment; Kernel
DOI
10.1016/j.jvcir.2023.103857
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
We focus on domain adaptation, a branch of transfer learning concerned with transferring knowledge from one domain to another when the data distributions differ. Specifically, we investigate unsupervised domain adaptation, in which abundant labeled examples from a source domain and only unlabeled examples from a target domain are available. We aim to minimize the distribution divergence between the domains using optimal transport combined with subdomain adaptation. Previous methods have mainly focused on reducing global distribution discrepancies between the domains, but such approaches cannot capture fine-grained information and do not consider the structure or geometry of the data. To address these limitations, we propose Optimal Transport via Subdomain Adaptation (OTSA). Our method uses the sliced Wasserstein metric to reduce transportation costs while preserving the geometric structure of the data, and the Local Maximum Mean Discrepancy (LMMD) to compute the local discrepancy within each category of the domains, which helps capture relevant features. Experiments were conducted on six standard domain adaptation datasets, and our method outperformed the majority of baselines. Our approach increased the average accuracy over the baselines on OfficeHome (67.7% to 68.31%), Office-Caltech10 (91.8% to 96.33%), ImageCLEF-DA (87.9% to 89.9%), VisDA-2017 (79.6% to 81.83%), Office31 (88.17% to 89.11%), and PACS (69.08% to 83.72%).
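The sliced Wasserstein metric mentioned in the abstract averages one-dimensional Wasserstein distances over random projections, which reduces to sorting the projected samples. A minimal NumPy sketch of this estimator follows; the function name and parameters are illustrative and not taken from the paper, which builds a full adaptation loss around this quantity:

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=0):
    """Monte Carlo estimate of the sliced Wasserstein-p distance between
    two empirical distributions X and Y with equal sample counts.

    X, Y: arrays of shape (n_samples, n_features).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw random directions and normalize them to unit vectors.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project both sample sets onto each direction (one column per slice).
    Xp = X @ theta.T
    Yp = Y @ theta.T
    # In 1D, optimal transport matches sorted samples to sorted samples.
    Xs = np.sort(Xp, axis=0)
    Ys = np.sort(Yp, axis=0)
    # Average the p-th power cost over samples and slices, then take the root.
    return np.mean(np.abs(Xs - Ys) ** p) ** (1.0 / p)
```

Because each slice only requires a sort, the estimator scales as O(k · n log n) for k projections, which is what makes it attractive as a transport cost compared with solving a full optimal transport problem.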
Pages: 12