Approximate empirical kernel map-based iterative extreme learning machine for clustering

Cited by: 0
Authors
Chuangquan Chen
Chi-Man Vong
Pak-Kin Wong
Keng-Iam Tai
Affiliations
[1] University of Macau, Department of Computer and Information Science
[2] University of Macau, Department of Electromechanical Engineering
Source
Neural Computing and Applications | 2020 / Volume 32
Keywords
Maximum margin clustering; Extreme learning machine; Approximate empirical kernel map; Kernel learning; Compact model;
DOI
Not available
Abstract
Maximum margin clustering (MMC) is a recent approach that carries margin maximization from supervised learning over to unsupervised learning, aiming to partition the data into clusters with high discrimination. Recently, the extreme learning machine (ELM) has been applied to MMC (called iterative ELM clustering, or ELMCIter), which maximizes the data discrimination by iteratively training a weighted extreme learning machine (W-ELM). In this way, ELMCIter achieves a substantial reduction in training time and provides a unified model for both binary and multi-class clustering. However, two issues remain in ELMCIter: (1) the random feature mappings adopted in ELMCIter cannot reliably produce high-quality discriminative features for clustering, and (2) a large model is usually required because performance depends on the number of hidden nodes, and training such a model becomes relatively slow. In this paper, the hidden layer in ELMCIter is encoded by an approximate empirical kernel map (AEKM) rather than by random feature mappings, in order to resolve these two issues. The AEKM is generated from a low-rank approximation of the kernel matrix, which is derived from the input data through a kernel function. The proposed method is called iterative AEKM for clustering (AEKMCIter), and its contributions are: (1) the AEKM extracts discriminative and robust features from the kernel matrix, so AEKMCIter consistently achieves better performance, and (2) AEKMCIter requires an extremely small number of hidden nodes, yielding low memory consumption and fast training. Detailed experiments verify the effectiveness and efficiency of the approach. As an illustration, on the MNIST10 dataset, AEKMCIter improves clustering accuracy over ELMCIter by up to 5%, while reducing the training time and the memory consumption (i.e., the number of hidden nodes) to as little as 1/7 and 1/20, respectively.
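To make the AEKM idea concrete, the sketch below builds an approximate empirical kernel map from a low-rank (Nyström-style) approximation of an RBF kernel matrix: eigendecompose the kernel matrix over a small set of landmark points and map each sample so that inner products of the mapped features approximate kernel values. This is only an illustrative sketch under assumed choices (RBF kernel, uniformly sampled landmarks, the function names shown); it is not the paper's implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel matrix: k(a, b) = exp(-gamma * ||a - b||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def aekm_features(X, landmarks, gamma=1.0, tol=1e-10):
    # Approximate empirical kernel map from a low-rank sketch of the
    # kernel matrix: eigendecompose k(Z, Z) over m landmarks Z and map
    # each x to Lambda^{-1/2} U^T k(Z, x), so that
    # phi(x) . phi(y) ~= k(x, y).  The m (or fewer) resulting features
    # play the role of the hidden layer, replacing random mappings.
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    w, U = np.linalg.eigh(K_mm)
    keep = w > tol                      # drop near-zero eigenvalues
    W = U[:, keep] / np.sqrt(w[keep])   # U * Lambda^{-1/2}
    K_nm = rbf_kernel(X, landmarks, gamma)
    return K_nm @ W                     # n x r feature matrix, r <= m

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = X[rng.choice(100, size=20, replace=False)]  # 20 landmark points
Phi = aekm_features(X, Z, gamma=0.5)
# Phi @ Phi.T approximates the full 100 x 100 kernel matrix,
# but only 20 "hidden nodes" (columns of Phi) are stored.
```

The compactness claimed in the abstract follows from this construction: the feature width is bounded by the number of landmarks (here 20) rather than by the large hidden-node counts that random ELM mappings typically need.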
Pages: 8031-8046 (15 pages)