PPNE: Property Preserving Network Embedding

Cited by: 67
Authors
Li, Chaozhuo [1 ]
Wang, Senzhang [2 ,3 ]
Yang, Dejian [1 ]
Li, Zhoujun [1 ]
Yang, Yang [1 ]
Zhang, Xiaoming [1 ]
Zhou, Jianshe [4 ]
Affiliations
[1] Beihang Univ, State Key Lab Software Dev Environm, Beijing, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, Nanjing, Jiangsu, Peoples R China
[3] Collaborat Innovat Ctr Novel Software Technol & I, Nanjing, Jiangsu, Peoples R China
[4] Capital Normal Univ, Beijing 100048, Peoples R China
Source
DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2017), PT I | 2017, Vol. 10177
Funding
National Natural Science Foundation of China; National High Technology Research and Development Program of China (863 Program);
Keywords
DIMENSIONALITY REDUCTION;
DOI
10.1007/978-3-319-55753-3_11
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Network embedding aims at learning a distributed representation vector for each node in a network, a task that has been increasingly recognized as important in the network analysis area. Most existing embedding methods focus on encoding only the topology information into the representation vectors. In reality, nodes in a network may carry rich properties, which could potentially contribute to learning better representations. In this paper, we study the novel problem of property preserving network embedding and propose a general model, PPNE, to effectively incorporate rich types of node properties. We formulate the learning of representation vectors as a joint optimization problem, in which a topology-derived objective function and a property-derived objective function are optimized jointly with shared parameters. By solving this joint optimization problem with an efficient stochastic gradient descent algorithm, we obtain representation vectors that incorporate both network topology and node property information. We extensively evaluate our framework through two data mining tasks on five datasets. Experimental results show the superior performance of PPNE.
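The joint optimization the abstract describes can be sketched in a few lines. The following is a minimal illustration under assumed objectives, not the paper's exact formulation: the toy graph, the logistic edge loss standing in for the topology-derived objective, the property-similarity pull term standing in for the property-derived objective, and all hyperparameters are assumptions chosen for illustration. The key structural point it shows is the shared parameter matrix `emb` updated by SGD against both objectives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes in a ring; each node carries a 2-d property vector.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
props = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0], [0.1, 1.0]])

n_nodes, dim, lr, alpha = 4, 8, 0.05, 0.5
emb = rng.normal(scale=0.1, size=(n_nodes, dim))  # shared parameters

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed property similarity: cosine similarity of property vectors.
unit = props / np.linalg.norm(props, axis=1, keepdims=True)
sim = unit @ unit.T

for epoch in range(200):
    # Topology-derived term: logistic loss pushing connected nodes'
    # dot products toward 1 (a real skip-gram-style objective would
    # also draw negative samples; omitted here for brevity).
    for u, v in edges:
        grad = sigmoid(emb[u] @ emb[v]) - 1.0
        gu, gv = grad * emb[v], grad * emb[u]
        emb[u] -= lr * gu
        emb[v] -= lr * gv
    # Property-derived term: pull embeddings of property-similar nodes
    # together, weighted by similarity, on the same shared matrix.
    for u in range(n_nodes):
        for v in range(n_nodes):
            if u != v:
                emb[u] -= lr * alpha * sim[u, v] * (emb[u] - emb[v])

# Nodes 0 and 1 have near-identical properties; 0 and 2 are dissimilar.
d_sim = np.linalg.norm(emb[0] - emb[1])
d_dis = np.linalg.norm(emb[0] - emb[2])
```

After training, the property-similar pair should sit closer in the embedding space than the dissimilar pair, even though both pairs are equally well connected in the ring: that gap is exactly the information the topology-only objective cannot supply.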
Pages: 163-179
Page count: 17