TSTELM: Two-Stage Transfer Extreme Learning Machine for Unsupervised Domain Adaptation

Times Cited: 6
Authors
Zang, Shaofei [1 ]
Li, Xinghai [1 ]
Ma, Jianwei [1 ]
Yan, Yongyi [1 ]
Gao, Jiwei [1 ]
Wei, Yuan [2 ]
Affiliations
[1] Henan Univ Sci & Technol, Coll Informat Engn, Luoyang 471000, Peoples R China
[2] Henan Univ Sci & Technol, Coll Vehicle & Traff Engn, Luoyang 471000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
BAYESIAN CLASSIFICATION; SWARM OPTIMIZATION; KERNEL; MODEL;
DOI
10.1155/2022/1582624
CLC Number
Q [Biological Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
As a single-layer feedforward network (SLFN), the extreme learning machine (ELM) has been successfully applied to classification and regression in machine learning owing to its fast training speed and good generalization. However, its performance degrades in domain adaptation, where the distributions of the training and testing data are inconsistent. In this article, we propose a novel ELM, the two-stage transfer extreme learning machine (TSTELM), to address this problem. At the statistical matching stage, we adopt maximum mean discrepancy (MMD) to narrow the distribution difference of the output layer between domains. At the subspace alignment stage, we align the source and target model parameters, design a target cross-domain mean approximation term, and add an output-weight approximation term to further promote knowledge transfer across domains. Moreover, the prediction for a test sample is jointly determined by the ELM parameters generated at the two stages. Finally, we investigate the proposed approach on classification tasks and conduct experiments on four public domain adaptation datasets. The results indicate that TSTELM effectively enhances the knowledge transfer ability of ELM, achieving higher accuracy than existing transfer and non-transfer classifiers.
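To make the statistical matching stage concrete, the sketch below shows an ELM whose output weights are solved with a ridge penalty on the source data plus an MMD penalty over the combined source and target output layer. It is only an illustration of the MMD-regularized ELM idea under assumed choices (tanh activation, hyperparameters n_hidden, lam, gamma), not the paper's full two-stage algorithm; the subspace alignment stage and the output-weight approximation are omitted.

# Minimal numpy sketch of an MMD-regularized ELM (statistical matching idea only).
# Hyperparameters and the ridge/MMD formulation are illustrative assumptions.
import numpy as np

def mmd_matrix(ns, nt):
    # Block-constant coefficient matrix M such that beta^T H^T M H beta equals
    # the squared difference between source and target output means.
    e = np.concatenate([np.full(ns, 1.0 / ns), np.full(nt, -1.0 / nt)])
    return np.outer(e, e)

def fit_transfer_elm(Xs, Ys, Xt, n_hidden=100, lam=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((Xs.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                   # random biases
    Hs = np.tanh(Xs @ W + b)                            # source hidden outputs
    Ht = np.tanh(Xt @ W + b)                            # target hidden outputs
    H = np.vstack([Hs, Ht])
    M = mmd_matrix(len(Xs), len(Xt))
    # Ridge-regularized least squares on source labels plus the MMD penalty
    # on the combined output layer: closed-form solution for beta.
    A = Hs.T @ Hs + lam * np.eye(n_hidden) + gamma * H.T @ M @ H
    beta = np.linalg.solve(A, Hs.T @ Ys)
    return lambda X: np.tanh(X @ W + b) @ beta

# Toy usage: one-hot source labels, unlabeled (shifted) target samples.
Xs, Xt = np.random.randn(80, 5), np.random.randn(60, 5) + 0.5
Ys = np.eye(2)[np.random.randint(0, 2, 80)]
predict = fit_transfer_elm(Xs, Ys, Xt)
print(predict(Xt).argmax(axis=1)[:10])

Increasing gamma pulls the source and target output-layer means together at the cost of a larger fitting error on the source labels, which is the basic trade-off the statistical matching stage exploits.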
Pages: 18
Related Papers
50 records in total
  • [41] Receiver-Agnostic Radio Frequency Fingerprinting Based on Two-stage Unsupervised Domain Adaptation and Fine-tuning
    Bao, Jiazhong
    Xie, Xin
    Lu, Zhaoyi
    Hong, Jianan
    Hua, Cunqing
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 6085 - 6090
  • [42] Representation learning for unsupervised domain adaptation
    Xu Y.
    Yan H.
    Harbin Gongye Daxue Xuebao/Journal of Harbin Institute of Technology, 2021, 53 (02): : 40 - 46
  • [43] Two-stage machine learning model for guideline development
    Mani, S
    Shankle, WR
    Dick, MB
    Pazzani, MJ
    ARTIFICIAL INTELLIGENCE IN MEDICINE, 1999, 16 (01) : 51 - 71
  • [44] A Two-Stage Machine Learning Approach for Pathway Analysis
    Zhang, Wei
    Emrich, Scott
    Zeng, Erliang
    2010 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2010, : 274 - 279
  • [45] Unsupervised Domain Adaptation with Similarity Learning
    Pinheiro, Pedro O.
    2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2018, : 8004 - 8013
  • [46] Transfer Domain Class Clustering for Unsupervised Domain Adaptation
    Fan, Yunxin
    Yan, Gang
    Li, Shuang
    Song, Shiji
    Wang, Wei
    Peng, Xinping
    PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON ELECTRICAL AND INFORMATION TECHNOLOGIES FOR RAIL TRANSPORTATION (EITRT) 2017: ELECTRICAL TRACTION, 2018, 482 : 827 - 835
  • [47] Unsupervised Domain Adaptation for Neural Machine Translation
    Yang, Zhen
    Chen, Wei
    Wang, Feng
    Xu, Bo
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 338 - 343
  • [48] A novel automatic two-stage locally regularized classifier construction method using the extreme learning machine
    Du, Dajun
    Li, Kang
    Irwin, George W.
    Deng, Jing
    NEUROCOMPUTING, 2013, 102 : 10 - 22
  • [49] A Two-stage Deep Domain Adaptation Method for Hyperspectral Image Classification
    Li, Zhaokui
    Tang, Xiangyi
    Li, Wei
    Wang, Chuanyun
    Liu, Cuiwei
    He, Jinrong
    REMOTE SENSING, 2020, 12 (07)
  • [50] Decentralized Two-Stage Federated Learning with Knowledge Transfer
    Jin, Tong
    Chen, Siguang
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3181 - 3186