Local Sample-Weighted Multiple Kernel Clustering With Consensus Discriminative Graph

Cited by: 44
Authors
Li, Liang [1 ]
Wang, Siwei [1 ]
Liu, Xinwang [1 ]
Zhu, En [1 ]
Shen, Li [1 ]
Li, Kenli [2 ,3 ]
Li, Keqin [4 ]
Affiliations
[1] Natl Univ Def Technol, Sch Comp, Changsha 410073, Peoples R China
[2] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha 410073, Peoples R China
[3] Hunan Univ, Supercomp & Cloud Comp Inst, Changsha 410073, Peoples R China
[4] SUNY Coll New Paltz, Dept Comp Sci, New Paltz, NY 12561 USA
Funding
National Natural Science Foundation of China;
Keywords
Graph learning; localized kernel; multiview clustering; multiple kernel learning; matrix
DOI
10.1109/TNNLS.2022.3184970
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multiple kernel clustering (MKC) aims to achieve optimal information fusion from a set of base kernels. Constructing precise, localized kernel matrices has proven vitally important in applications, since unreliable similarity estimates between distant samples degrade clustering performance. Although existing localized MKC algorithms outperform their globally designed competitors, most adopt the KNN mechanism to localize the kernel matrix by retaining only the tau-nearest neighbors. However, this coarse strategy assigns equal importance to all retained neighbors regardless of their ranking, which is impractical in applications. To alleviate these problems, this article proposes a novel local sample-weighted MKC (LSWMKC) model. We first construct a consensus discriminative affinity graph in kernel space that reveals the latent local structures. An optimal neighborhood kernel is then learned from this affinity graph, with natural sparsity and a clear block-diagonal structure. Moreover, LSWMKC implicitly optimizes adaptive weights over the different neighbors of each sample. Experimental results demonstrate that LSWMKC yields a better local manifold representation and outperforms existing kernel- and graph-based clustering algorithms. The source code of LSWMKC is publicly available at https://github.com/liliangnudt/LSWMKC.
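For context, the KNN-style kernel localization that the abstract argues against can be sketched as follows. This is a minimal illustration, not the authors' implementation: `knn_localize_kernel` and its arguments are hypothetical names, and the snippet only shows the equal-weight baseline, not the adaptive per-neighbor weighting LSWMKC learns.

```python
import numpy as np

def knn_localize_kernel(K, tau):
    """Localize a kernel matrix by keeping, for each sample, only the
    entries for its tau nearest neighbors (largest kernel values).
    Every retained neighbor gets the same weight -- the coarse
    equal-importance strategy the paper critiques."""
    n = K.shape[0]
    mask = np.zeros_like(K)
    for i in range(n):
        # indices of row i sorted by similarity, most similar first
        order = np.argsort(K[i])[::-1]
        neighbors = [j for j in order if j != i][:tau]
        mask[i, neighbors] = 1.0
        mask[i, i] = 1.0  # always keep the diagonal (self-similarity)
    # symmetrize the neighbor mask so the localized kernel stays symmetric
    mask = np.maximum(mask, mask.T)
    return K * mask
```

LSWMKC replaces the binary 0/1 mask above with learned, sample-specific neighbor weights obtained from the consensus affinity graph, so closer neighbors can contribute more than distant ones.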
Pages: 1721-1734
Page count: 14