SDAC-DA: Semi-Supervised Deep Attributed Clustering Using Dual Autoencoder

Cited by: 18
Authors
Berahmand, Kamal [1 ]
Bahadori, Sondos [2 ]
Abadeh, Maryam Nooraei [3 ]
Li, Yuefeng [1 ]
Xu, Yue [1 ]
Affiliations
[1] Queensland Univ Technol QUT, Fac Sci, Sch Comp Sci, Brisbane, Qld 4000, Australia
[2] Islamic Azad Univ, Dept Comp Engn, Ilam Branch, J9QJ 3Q4, Ilam, Iran
[3] Islamic Azad Univ, Dept Comp Engn, Abadan Branch, Abadan 6317836531, Iran
Keywords
Vectors; Clustering algorithms; Image edge detection; Clustering methods; Transforms; Task analysis; STEM; Attributed network; deep attributed clustering; semi-supervised clustering; pairwise constraints; COMMUNITY DETECTION; GRAPH; NETWORK;
DOI
10.1109/TKDE.2024.3389049
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Attributed graph clustering aims to group nodes into disjoint categories using deep learning to represent node embeddings and has shown promising performance across various applications. However, two main challenges hinder further performance improvement. First, reliance on unsupervised methods impedes the learning of low-dimensional, clustering-specific features in the representation layer, limiting clustering performance. Second, the predominant use of approaches that learn representations and clusters in separate stages leads to suboptimal embeddings that are insufficient for the subsequent clustering step. To address these limitations, we propose a novel method called Semi-supervised Deep Attributed Clustering using Dual Autoencoder (SDAC-DA). This approach enables semi-supervised, end-to-end deep clustering in attributed networks, promoting high structural cohesiveness and attribute homogeneity. SDAC-DA transforms the attributed network into a dual-view network, applies a semi-supervised autoencoder layering approach to each view, and integrates the dimensionality reduction matrices by considering the complementary views. The resulting representation layer contains highly clustering-friendly embeddings, which are optimized through a unified end-to-end clustering process to effectively identify clusters. Extensive experiments on both synthetic and real networks demonstrate the superiority of the proposed method over seven state-of-the-art approaches.
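For readers who want a concrete picture of the dual-view idea sketched in the abstract (one autoencoder per view, embeddings fused across the complementary views, and pairwise must-link/cannot-link constraints guiding the representation), the following minimal PyTorch sketch illustrates the general pattern. It is a hypothetical illustration only: the network sizes, concatenation-based fusion, hinge-style constraint penalty, and all hyperparameters are assumptions and are not taken from the SDAC-DA paper.

```python
# Hypothetical sketch only: layer sizes, concatenation-based view fusion, the
# hinge-style cannot-link penalty, and all hyperparameters are assumptions made
# for illustration; this does not reproduce the SDAC-DA architecture or losses.
import torch
import torch.nn as nn

class ViewAutoencoder(nn.Module):
    """One autoencoder per view (structure view or attribute view)."""
    def __init__(self, in_dim, emb_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                     nn.Linear(64, emb_dim))
        self.decoder = nn.Sequential(nn.Linear(emb_dim, 64), nn.ReLU(),
                                     nn.Linear(64, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def pairwise_constraint_loss(z, must_link, cannot_link, margin=1.0):
    """Pull must-link pairs together; push cannot-link pairs at least `margin` apart."""
    ml = torch.stack([((z[i] - z[j]) ** 2).sum() for i, j in must_link]).mean()
    cl = torch.stack([torch.clamp(margin - ((z[i] - z[j]) ** 2).sum(), min=0.0)
                      for i, j in cannot_link]).mean()
    return ml + cl

# Toy data: one adjacency row per node (structure view) and one attribute row per node.
n_nodes, attr_dim = 30, 10
A = (torch.rand(n_nodes, n_nodes) > 0.8).float()      # structure view
X = torch.rand(n_nodes, attr_dim)                     # attribute view
must_link, cannot_link = [(0, 1), (2, 3)], [(0, 4)]   # pairwise supervision

ae_struct, ae_attr = ViewAutoencoder(n_nodes), ViewAutoencoder(attr_dim)
optimizer = torch.optim.Adam(list(ae_struct.parameters()) + list(ae_attr.parameters()), lr=1e-3)
mse = nn.MSELoss()

for _ in range(200):
    z_s, rec_s = ae_struct(A)
    z_a, rec_a = ae_attr(X)
    z = torch.cat([z_s, z_a], dim=1)                  # fuse the complementary views
    loss = (mse(rec_s, A) + mse(rec_a, X)
            + 0.1 * pairwise_constraint_loss(z, must_link, cannot_link))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The fused embedding z would then feed a clustering step (e.g., k-means).
```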
Pages: 6989-7002
Number of pages: 14