Transfer Extreme Learning Machine with Output Weight Alignment

Cited by: 7
Authors
Zang, Shaofei [1 ]
Cheng, Yuhu [2 ]
Wang, Xuesong [2 ]
Yan, Yongyi [1 ]
Affiliations
[1] Henan Univ Sci & Technol, Dept Informat, Engn Coll, Luoyang 471000, Peoples R China
[2] China Univ Min & Technol, Dept Informat & Control, Engn Coll, Xuzhou 221116, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
CLASSIFICATION; KERNEL;
DOI
10.1155/2021/6627765
Chinese Library Classification (CLC)
Q [Biological Sciences];
Subject discipline codes
07; 0710; 09;
Abstract
Extreme Learning Machine (ELM) is a fast and efficient neural network model for pattern recognition and machine learning, but its performance declines when labeled training samples are insufficient. Transfer learning helps the target task learn a reliable model by exploiting plentiful labeled samples from a different but related domain. In this paper, we propose a supervised Extreme Learning Machine with knowledge transferability, called Transfer Extreme Learning Machine with Output Weight Alignment (TELM-OWA). First, it reduces the distribution difference between domains by aligning the output weight matrices of the ELMs trained on labeled samples from the source and target domains. Second, an approximation term between the interdomain ELM output weight matrices is added to the objective function to further realize cross-domain knowledge transfer. Third, we formulate the objective as a least-squares problem and transform it into a standard ELM model that can be solved efficiently. Finally, the effectiveness of the proposed algorithm is verified by classification experiments on 16 image datasets and 6 text datasets; the results demonstrate the competitive performance of our method with respect to other ELM models and transfer learning approaches.
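The steps the abstract describes (train per-domain ELMs, penalize the gap between their output weight matrices, and solve the combined objective in closed form as a least-squares problem) can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's exact TELM-OWA formulation: the function name `transfer_elm`, the shared random hidden layer, and the single alignment weight `lam` are all hypothetical choices made for the sketch.

```python
import numpy as np

def transfer_elm(Xs, Ys, Xt, Yt, n_hidden=50, C=1.0, lam=1.0, seed=0):
    """Sketch of an ELM with an output-weight alignment term (assumed
    simplification of the TELM-OWA idea, not the paper's exact model).

    Source and target share one random hidden layer. The source ELM is a
    plain ridge least-squares fit; the target output weights are then
    solved with an extra penalty lam * ||beta_t - beta_s||^2 pulling them
    toward the source weights.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((Xs.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                 # random hidden biases

    def hidden(X):
        # Hidden-layer output matrix H for a batch of samples.
        return np.tanh(X @ W + b)

    Hs, Ht = hidden(Xs), hidden(Xt)
    I = np.eye(n_hidden)

    # Source ELM: beta_s = (Hs^T Hs + I/C)^{-1} Hs^T Ys  (ridge least squares).
    beta_s = np.linalg.solve(Hs.T @ Hs + I / C, Hs.T @ Ys)

    # Target objective  ||Ht b - Yt||^2 + (1/C)||b||^2 + lam ||b - beta_s||^2
    # has the closed-form solution below (set the gradient to zero).
    beta_t = np.linalg.solve(Ht.T @ Ht + (1.0 / C + lam) * I,
                             Ht.T @ Yt + lam * beta_s)

    return lambda X: hidden(X) @ beta_t  # predictor for new samples
```

With `lam=0` this reduces to an ordinary regularized ELM trained only on the target samples; increasing `lam` shrinks the target output weights toward the source ones, which is the cross-domain transfer mechanism the abstract outlines.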
Pages: 14