Approximate empirical kernel map-based iterative extreme learning machine for clustering

Cited by: 2
Authors
Chen, Chuangquan [1 ]
Vong, Chi-Man [1 ]
Wong, Pak-Kin [2 ]
Tai, Keng-Iam [1 ]
Affiliations
[1] Univ Macau, Dept Comp Informat Sci, Macau, Peoples R China
[2] Univ Macau, Dept Electromech Engn, Macau, Peoples R China
Keywords
Maximum margin clustering; Extreme learning machine; Approximate empirical kernel map; Kernel learning; Compact model; Nyström method; Matrix
DOI
10.1007/s00521-019-04295-6
CLC classification: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Maximum margin clustering (MMC) is a recent approach that applies the margin-maximization principle of supervised learning to unsupervised learning, aiming to partition data into clusters with high discrimination. Recently, the extreme learning machine (ELM) has been applied to MMC (called iterative ELM clustering, or ELMCIter), which maximizes data discrimination by iteratively training a weighted extreme learning machine (W-ELM). In this way, ELMCIter achieves a substantial reduction in training time and provides a unified model for both binary and multi-class clustering. However, two issues remain in ELMCIter: (1) the random feature mappings adopted in ELMCIter cannot reliably produce high-quality discriminative features for clustering, and (2) a large model is usually required because its performance depends on the number of hidden nodes, and training such a model becomes relatively slow. In this paper, the hidden layer in ELMCIter is encoded by an approximate empirical kernel map (AEKM) rather than random feature mappings, in order to resolve these two issues. AEKM is generated from a low-rank approximation of the kernel matrix, which is derived from the input data through a kernel function. The proposed method is called iterative AEKM for clustering (AEKMCIter), whose contributions are: (1) AEKM extracts discriminative and robust features from the kernel matrix, so that AEKMCIter consistently achieves better performance, and (2) AEKMCIter requires an extremely small number of hidden nodes, yielding low memory consumption and fast training. Detailed experiments verify the effectiveness and efficiency of the approach. As an illustration, on the MNIST10 dataset, AEKMCIter improves clustering accuracy over ELMCIter by up to 5%, while reducing the training time to about 1/7 and the memory consumption (i.e., the number of hidden nodes) to about 1/20 of those of ELMCIter.
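The abstract's core idea, building a compact feature map from a low-rank approximation of the kernel matrix, can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes an RBF kernel and a Nyström-style landmark approximation, and all names (`rbf_kernel`, `aekm_features`, `n_landmarks`) are hypothetical:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def aekm_features(X, n_landmarks=20, gamma=0.1, seed=0):
    """Nyström-style approximate empirical kernel map: project each
    sample onto the eigenvectors of a small landmark kernel matrix."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_landmarks, replace=False)
    L = X[idx]                        # landmark points
    W = rbf_kernel(L, L, gamma)       # small n_landmarks x n_landmarks kernel
    vals, vecs = np.linalg.eigh(W)    # eigendecomposition (ascending order)
    keep = vals > 1e-10               # drop numerically zero eigenvalues
    vals, vecs = vals[keep][::-1], vecs[:, keep][:, ::-1]
    C = rbf_kernel(X, L, gamma)       # cross-kernel between data and landmarks
    # Feature map Phi = C V diag(1/sqrt(lambda)), so Phi @ Phi.T ~ K
    return C @ vecs / np.sqrt(vals)

X = np.random.default_rng(1).normal(size=(200, 5))
Phi = aekm_features(X)        # compact feature matrix, at most 20 columns
K_approx = Phi @ Phi.T        # low-rank approximation of the full kernel
```

The resulting `Phi` plays the role of the hidden-layer output: its width is bounded by the number of landmarks rather than by a large count of random hidden nodes, which is the source of the memory and speed gains the abstract describes.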
Pages: 8031-8046 (16 pages)
Related papers
50 in total
  • [31] The adaptive kernel-based extreme learning machine for state of charge estimation
    Zhang, Yanxin
    Zhang, Zili
    Chen, Jing
    Liao, Cuicui
    IONICS, 2023, 29 (05) : 1863 - 1872
  • [32] Fast kernel extreme learning machine for ordinal regression
    Shi, Yong
    Li, Peijia
    Yuan, Hao
    Miao, Jianyu
    Niu, Lingfeng
    KNOWLEDGE-BASED SYSTEMS, 2019, 177 : 44 - 54
  • [33] Spectra data classification with kernel extreme learning machine
    Zheng, Wenbin
    Shu, Hongping
    Tang, Hong
    Zhang, Haiqing
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2019, 192
  • [34] Online sequential reduced kernel extreme learning machine
    Deng, Wan-Yu
    Ong, Yew-Soon
    Tan, Puay Siew
    Zheng, Qing-Hua
    NEUROCOMPUTING, 2016, 174 : 72 - 84
  • [35] Large-scale kernel extreme learning machine
    Deng, Wan-Yu
    Zheng, Qing-Hua
    Chen, Lin
    Jisuanji Xuebao/Chinese Journal of Computers, 2014, 37 (11): 2235 - 2246
  • [36] Motor Imagery EEG Classification Based on Kernel Hierarchical Extreme Learning Machine
    Duan, Lijuan
    Bao, Menghu
    Cui, Song
    Qiao, Yuanhua
    Miao, Jun
    COGNITIVE COMPUTATION, 2017, 9 (06) : 758 - 765
  • [37] A multi-label classification algorithm based on kernel extreme learning machine
    Luo, Fangfang
    Guo, Wenzhong
    Yu, Yuanlong
    Chen, Guolong
    NEUROCOMPUTING, 2017, 260 : 313 - 320
  • [38] Multiple-Instance Learning via an RBF Kernel-Based Extreme Learning Machine
    Wang J.
    Cai L.
    Zhao X.
    Cai, Liangjian (cailiangjian@outlook.com), 1600, Walter de Gruyter GmbH (26): 185 - 195
  • [39] Stock Volatility Prediction using Multi-Kernel Learning based Extreme Learning Machine
    Wang, Feng
    Zhao, Zhiyong
    Li, Xiaodong
    Yu, Fei
    Zhang, Hao
    PROCEEDINGS OF THE 2014 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2014, : 3078 - 3085
  • [40] A clustering based ensemble of weighted kernelized extreme learning machine for class imbalance learning
    Choudhary, Roshani
    Shukla, Sanyam
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 164