End-to-end variational graph clustering with local structural preservation

Times Cited: 5
Authors
Guo, Lin [1 ]
Dai, Qun [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Graph convolutional neural network; Variational graph embedding; Graph clustering; Variational graph auto-encoder;
DOI
10.1007/s00521-021-06639-7
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Graph clustering, a basic problem in machine learning and artificial intelligence, facilitates a variety of real-world applications. Performing graph clustering with high-quality optimization decisions, while using graph information both effectively and efficiently to obtain a better assignment of discrete points, remains a nontrivial challenge. Many prominent graph clustering methods neglect an essential issue: the defined clustering loss may corrupt the feature space, yielding unrepresentative, meaningless features and, in turn, poor partitioning decisions. Here, we propose an end-to-end variational graph clustering (EVGC) algorithm that focuses on preserving the original information of the graph. Specifically, a KL loss with an auxiliary distribution guides the embedding space and disperses the data points; a graph auto-encoder helps to maximally retain the local structure of the graph's generative distribution; and each node is represented as a Gaussian distribution, separating the true embedding position from its uncertainty. Experimental results reveal the importance of preserving local structure, and our EVGC system outperforms state-of-the-art approaches.
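To make the abstract's description concrete, below is a minimal PyTorch sketch of the kind of objective it outlines: a variational graph auto-encoder whose nodes are Gaussian-distributed embeddings, an adjacency-reconstruction term for local structure, and a KL clustering loss against a sharpened auxiliary distribution. This is not the authors' implementation; the class names (GCNLayer, VGAEClusterSketch), layer structure, Student's-t soft assignment, and the target-distribution rule are assumptions inferred from the abstract and from common VGAE/DEC practice.

```python
# Minimal sketch of a VGAE-style clustering objective (assumed design, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """Plain graph convolution: A_hat @ X @ W, with A_hat a normalized adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, a_hat):
        return a_hat @ self.weight(x)


class VGAEClusterSketch(nn.Module):
    """Encodes each node as a Gaussian N(mu, sigma^2); cluster centers
    support a DEC-style soft assignment (hypothetical layer sizes)."""
    def __init__(self, in_dim, hid_dim, lat_dim, n_clusters):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hid_dim)
        self.gc_mu = GCNLayer(hid_dim, lat_dim)
        self.gc_logvar = GCNLayer(hid_dim, lat_dim)
        self.centers = nn.Parameter(torch.randn(n_clusters, lat_dim))

    def encode(self, x, a_hat):
        h = F.relu(self.gc1(x, a_hat))
        mu, logvar = self.gc_mu(h, a_hat), self.gc_logvar(h, a_hat)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return z, mu, logvar

    def soft_assign(self, z):
        # Student's t kernel between embeddings and cluster centers (DEC-style).
        q = 1.0 / (1.0 + torch.cdist(z, self.centers) ** 2)
        return q / q.sum(dim=1, keepdim=True)


def target_distribution(q):
    # Sharpened auxiliary distribution P used as the target of the clustering KL term.
    w = q ** 2 / q.sum(dim=0)
    return w / w.sum(dim=1, keepdim=True)


def losses(model, x, a_hat, adj_label):
    """x: node features, a_hat: normalized adjacency, adj_label: 0/1 float adjacency."""
    z, mu, logvar = model.encode(x, a_hat)
    # Adjacency reconstruction encourages preservation of local graph structure.
    recon = torch.sigmoid(z @ z.t())
    recon_loss = F.binary_cross_entropy(recon, adj_label)
    # Standard VAE KL term towards the unit Gaussian prior.
    prior_kl = -0.5 * torch.mean(1 + logvar - mu ** 2 - logvar.exp())
    # Clustering KL between soft assignments Q and the auxiliary target P.
    q = model.soft_assign(z)
    p = target_distribution(q).detach()
    cluster_kl = F.kl_div(q.log(), p, reduction="batchmean")
    return recon_loss + prior_kl + cluster_kl
```

In this sketch the three terms are simply summed; in practice the paper would weight them, and the target distribution P is typically recomputed only every few epochs to stabilize training.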
Pages: 3767-3782
Number of Pages: 16