Improved Deep Embedded K-Means Clustering with Implicit Orthogonal Space Transformation

Cited by: 0
Authors
Liu, Xinrui [1 ,2 ]
Liu, Wenzheng [1 ]
Li, Yuxiang [1 ]
Tang, Xiaoyong [1 ]
Deng, Tan [1 ]
Cao, Ronghui [1 ]
Institutions
[1] Changsha Univ Sci & Technol, Sch Comp & Commun Engn, Changsha, Peoples R China
[2] Changsha Univ Sci & Technol, Hunan Int Sci & Technol Innovat Cooperat Base Adv, Changsha, Hunan, Peoples R China
Source
2023 IEEE 47TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE, COMPSAC | 2023
Keywords
deep clustering; implicit transformation; K-means; orthogonal transformation matrix;
DOI
10.1109/COMPSAC57700.2023.00047
CLC Classification Number
TP39 [Computer Applications];
Discipline Classification Code
081203; 0835;
Abstract
Deep clustering algorithms learn latent embedded features of the data through an autoencoder and cluster the data according to the similarity of those latent features. However, the features the autoencoder extracts are not necessarily valuable for the clustering algorithm and may be poorly suited to clustering, which greatly reduces clustering quality. This paper proposes a deep K-means clustering algorithm with an implicit embedded-space transformation to address this problem. We implicitly transform the latent feature space into a new space that is more amenable to the clustering task while preserving spatial invariance. The implicit transformation is performed by an orthogonal transformation matrix composed of the eigenvectors of the intra-class and inter-class scatter matrices. In the new space, clusters are better separated in terms of within-cluster cohesion and between-cluster difference. We alternately optimize feature acquisition and clustering to adjust the embedding space and disperse the embedding points, enriching the clustering information in the latent feature space. Experimental results show that the proposed algorithm produces higher-quality clusters than many current related clustering algorithms on the same experimental datasets.
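The core idea in the abstract (an orthogonal transformation built from the eigenvectors of the intra-class and inter-class scatter matrices, applied to the autoencoder embeddings before K-means) can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' implementation: the function name `orthogonal_transform`, the use of the classical LDA-style eigenproblem on `pinv(Sw) @ Sb`, and the QR re-orthogonalization step are all assumptions made here for a self-contained example.

```python
import numpy as np

def orthogonal_transform(Z, labels):
    """Rotate embeddings Z into a more cluster-friendly space.

    Builds the within-class scatter Sw and between-class scatter Sb
    from the current cluster assignment, takes the eigenvectors of
    pinv(Sw) @ Sb (classical discriminant directions), and
    re-orthogonalizes them via QR so the result Q is orthogonal.
    An orthogonal Q preserves pairwise distances ("space invariance").
    Hypothetical sketch; the paper's exact construction may differ.
    """
    d = Z.shape[1]
    mu = Z.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(labels):
        Zc = Z[labels == c]          # points currently assigned to cluster c
        mc = Zc.mean(axis=0)         # cluster centroid
        Sw += (Zc - mc).T @ (Zc - mc)
        Sb += len(Zc) * np.outer(mc - mu, mc - mu)
    # Eigenvectors of pinv(Sw) @ Sb, sorted by decreasing eigenvalue.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-eigvals.real)
    V = eigvecs[:, order].real
    # QR factorization yields a proper orthogonal matrix (Q.T @ Q = I).
    Q, _ = np.linalg.qr(V)
    return Z @ Q, Q
```

In the alternating scheme the abstract describes, a step like this would be interleaved with K-means assignment updates and autoencoder fine-tuning: cluster in the rotated space, recompute the scatter matrices from the new assignment, and rebuild Q.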
Pages: 304-309 (6 pages)
Related Papers
24 in total
  • [1] Alqahtani, A., 2018, IEEE IMAGE PROC, p. 4058, DOI 10.1109/ICIP.2018.8451506
  • [2] Ashfahani, A., Pratama, M., Lughofer, E., Ong, Y. S. DEVDAN: Deep evolving denoising autoencoder. NEUROCOMPUTING, 2020, 390: 297-314
  • [3] Chen, X. K., 2022, arXiv:2202.03026
  • [4] Eckle, K., Schmidt-Hieber, J. A comparison of deep networks with ReLU activation function and linear spline-type methods. NEURAL NETWORKS, 2019, 110: 232-242
  • [5] Guo, W., Lin, K., Ye, W. Deep Embedded K-Means Clustering. 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2021), 2021: 686-694
  • [6] Guo, X. F., 2017, PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, p. 1753
  • [7] Guo, X., Liu, X., Zhu, E., Yin, J. Deep Clustering with Convolutional Autoencoders. NEURAL INFORMATION PROCESSING (ICONIP 2017), PT II, 2017, 10635: 373-382
  • [8] Haldorai, A., Ramu, A. Canonical Correlation Analysis Based Hyper Basis Feedforward Neural Network Classification for Urban Sustainability. NEURAL PROCESSING LETTERS, 2021, 53(4): 2385-2401
  • [9] Hull, J. J. A database for handwritten text recognition research. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1994, 16(5): 550-554
  • [10] Jais, L. K. M., 2019, KNOWL ENG DATA SCI, 2: 41, DOI 10.17977/UM018V2I12019P41-46