RKHS subspace domain adaption via minimum distribution gap

Cited by: 0
Authors
Yanzhen Qiu
Chuangfeng Zhang
Chenkui Xiong
Zhengming Ma
Shaolin Liao
Affiliations
[1] Sun Yat-Sen University
Keywords
Domain adaption; RKHS; Maximum mean discrepancy (MMD); Lagrange multiplier method (LMM) optimization;
DOI
Not available
Abstract
Subspace learning in Reproducing Kernel Hilbert Space (RKHS) is among the most popular approaches to domain adaptation. The key goal is to embed the source- and target-domain samples into a common RKHS subspace in which their distributions match better. However, most existing domain adaptation measures are either based on first-order statistics, which cannot accurately quantify the difference between non-Gaussian distributions, or on complicated covariance matrices that are difficult to use and optimize. In this paper, we propose a neat and effective RKHS subspace domain adaptation measure, the Minimum Distribution Gap (MDG), for which a rigorous mathematical formula can be derived to learn the weighting matrix of the optimized orthogonal Hilbert subspace basis via the Lagrange Multiplier Method. To demonstrate the effectiveness of the proposed MDG measure, extensive numerical experiments on different datasets have been performed, and comparisons with four other state-of-the-art algorithms in the literature show that the proposed MDG measure is very promising.
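The abstract does not spell out the MDG objective, so the sketch below only illustrates the general recipe it describes: embed source and target samples into a shared RKHS subspace, measure the distribution gap with an MMD-style statistic, and obtain the weighting matrix of the subspace basis from the Lagrange-multiplier stationarity condition, which reduces to a generalized eigenproblem. The function names (`rbf_kernel`, `mmd_subspace`), the RBF kernel choice, the `mu` regularizer, and the TCA-style trace objective are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import eigh


def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)


def mmd_subspace(Xs, Xt, dim=10, gamma=1.0, mu=1.0):
    """TCA-style sketch (not the paper's exact MDG method): learn a weighting
    matrix W over the kernel basis so that the embedded samples Z = K @ W have
    a small empirical MMD gap while retaining variance in the shared subspace."""
    ns, nt = len(Xs), len(Xt)
    n = ns + nt
    X = np.vstack([Xs, Xt])
    K = rbf_kernel(X, X, gamma)                 # kernel over all samples

    # MMD coefficient matrix: tr(W.T @ K @ L @ K @ W) equals the squared
    # empirical MMD between the embedded source and target samples.
    e = np.vstack([np.full((ns, 1), 1.0 / ns),
                   np.full((nt, 1), -1.0 / nt)])
    L = e @ e.T

    H = np.eye(n) - np.ones((n, n)) / n         # centering matrix

    # Setting the gradient of the Lagrangian to zero yields a generalized
    # eigenproblem: retained (centered) variance per unit of distribution gap,
    # with a mu*I regularizer keeping the denominator well conditioned.
    vals, vecs = eigh(K @ H @ K, K @ L @ K + mu * np.eye(n))
    W = vecs[:, -dim:]                          # eigenvalues are ascending

    Z = K @ W                                   # embedded samples
    return Z[:ns], Z[ns:], W


# Usage with random stand-ins for source / target features:
rng = np.random.default_rng(0)
Zs, Zt, W = mmd_subspace(rng.normal(size=(80, 20)),
                         rng.normal(1.0, 1.0, size=(60, 20)), dim=5)
```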
Pages: 1425 - 1439
Number of pages: 14
Related papers
50 records in total
  • [41] Ship Detection in Low-Quality SAR Images via an Unsupervised Domain Adaption Method
    Pu, Xinyang
    Jia, Hecheng
    Xin, Yu
    Wang, Feng
    Wang, Haipeng
    REMOTE SENSING, 2023, 15 (13)
  • [42] Bridging the Domain Gap: Improve Informal Language Translation via Counterfactual Domain Adaptation
    Wang, Ke
    Chen, Guandan
    Huang, Zhongqiang
    Wan, Xiaojun
    Huang, Fei
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 13970 - 13978
  • [43] A decoupled method for power system time domain simulation via invariant subspace partition
    Yang, D
    Ajjarapu, V
    2005 IEEE POWER ENGINEERING SOCIETY GENERAL MEETING, VOLS 1-3, 2005, : 1330 - 1335
  • [44] Semi-Supervised Domain Adaptation via Joint Transductive and Inductive Subspace Learning
    Luo, Hao
    Tian, Zhiqiang
    Zhang, Kaibing
    Wang, Guofa
    Du, Shaoyi
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 10431 - 10445
  • [45] On the Minimum Subspace Coding Capacity of Multiplicative Finite-Field Matrix Channels with a Given Rank Distribution
    Liu, Chenchen
    Li, Xiaolin
    Zhou, Baojian
    Mow, Wai Ho
    2016 22ND ASIA-PACIFIC CONFERENCE ON COMMUNICATIONS (APCC), 2016, : 302 - 306
  • [46] Robust unsupervised feature selection via sparse and minimum-redundant subspace learning with dual regularization
    Zeng, Congying
    Chen, Hongmei
    Li, Tianrui
    Wan, Jihong
    NEUROCOMPUTING, 2022, 511 : 1 - 21
  • [47] Addressing Domain Gap via Content Invariant Representation for Semantic Segmentation
    Gao, Li
    Zhang, Lefei
    Zhang, Qian
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 7528 - 7536
  • [48] STRUCTURAL IMPLICATIONS OF THE PRICE-TO-MINIMUM-COST GAP IN ANHYDROUS AMMONIA PRODUCTION AND DISTRIBUTION
    WALSH, RG
    RATHJEN, RA
    JOURNAL OF FARM ECONOMICS, 1963, 45 (05): 1380 - 1385
  • [49] Lifelong Domain Adaptation via Consolidated Internal Distribution
    Rostami, Mohammad
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [50] Cross Domain Distribution Adaptation via Kernel Mapping
    Zhong, Erheng
    Fan, Wei
    Peng, Jing
    Zhang, Kun
    Ren, Jiangtao
    Turaga, Deepak
    Verscheure, Olivier
    KDD-09: 15TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2009, : 1027 - 1035