Domain Invariant Transfer Kernel Learning

Cited by: 175
Authors
Long, Mingsheng [1 ,2 ]
Wang, Jianmin [1 ,3 ]
Sun, Jiaguang [1 ,3 ]
Yu, Philip S. [4 ]
Affiliations
[1] Tsinghua Univ, Sch Software, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Dept Comp Sci, Beijing 100084, Peoples R China
[3] Tsinghua Univ, Tsinghua Natl Lab Informat Sci & Technol, Beijing 100084, Peoples R China
[4] Univ Illinois, Dept Comp Sci, Chicago, IL 60607 USA
Funding
U.S. National Science Foundation
Keywords
Transfer learning; kernel learning; Nyström method; text mining; image classification; video recognition; regularization; framework
DOI: 10.1109/TKDE.2014.2373376
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Domain transfer learning generalizes a learning model across training data and testing data drawn from different distributions. A general principle for tackling this problem is to reduce the distribution difference between training data and testing data so that the generalization error can be bounded. Current methods typically model the sample distributions in the input feature space, which relies on a nonlinear feature mapping to embody the distribution discrepancy. However, this nonlinear feature space may not be optimal for kernel-based learning machines. To this end, we propose a transfer kernel learning (TKL) approach to learn a domain-invariant kernel by directly matching source and target distributions in the reproducing kernel Hilbert space (RKHS). Specifically, we design a family of spectral kernels by extrapolating the target eigensystem onto the source samples with Mercer's theorem. The spectral kernel minimizing the approximation error to the ground-truth kernel is selected to construct domain-invariant kernel machines. Comprehensive experimental evidence on a large number of text categorization, image classification, and video event recognition datasets verifies the effectiveness and efficiency of the proposed TKL approach over several state-of-the-art methods.
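The core construction in the abstract — extrapolating the target kernel's eigensystem onto the source samples via the Nyström method, then re-fitting the eigenspectrum so the resulting kernel approximates the ground-truth source kernel — can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's exact algorithm: the RBF kernel, the Gaussian toy data, and the clipped least-squares re-fit of the spectral coefficients (in place of the paper's constrained eigenspectrum relaxation) are all choices made here for brevity.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Pairwise RBF kernel k(a, b) = exp(-gamma * ||a - b||^2) between rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (60, 5))   # source samples
Xt = rng.normal(0.3, 1.2, (50, 5))   # target samples (shifted distribution)

# 1. Eigendecompose the target kernel (empirical Mercer eigensystem of the target domain).
Ktt = rbf_kernel(Xt, Xt)
lam, U = np.linalg.eigh(Ktt)
keep = lam > 1e-8                    # keep only numerically positive eigenvalues
lam, U = lam[keep], U[:, keep]

# 2. Nystrom-extrapolate the target eigenvectors onto the source samples:
#    u_i(x) ~ sum_j k(x, z_j) U[j, i] / lam_i.
Kst = rbf_kernel(Xs, Xt)
Us = (Kst @ U) / lam

# 3. Re-fit nonnegative spectral coefficients so the extrapolated spectral kernel
#    approximates the ground-truth source kernel (a least-squares stand-in for the
#    paper's eigenspectrum-relaxation step).
Kss = rbf_kernel(Xs, Xs)
Phi = np.stack([np.outer(Us[:, i], Us[:, i]).ravel() for i in range(Us.shape[1])], axis=1)
coef, *_ = np.linalg.lstsq(Phi, Kss.ravel(), rcond=None)
coef = np.clip(coef, 0.0, None)      # enforce a positive semi-definite spectral kernel

# Domain-invariant kernel on the source samples, built from the target eigensystem.
K_tkl = Us @ np.diag(coef) @ Us.T
err = np.linalg.norm(K_tkl - Kss) / np.linalg.norm(Kss)
print(f"relative approximation error to the source kernel: {err:.3f}")
```

Because `K_tkl` is assembled from the target domain's eigenfunctions, a kernel machine (e.g. an SVM) trained with it operates in an RKHS matched to the target distribution rather than the source one, which is the intuition behind the domain-invariant kernel in the abstract.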
Pages: 1519-1532
Number of pages: 14