Graph Attention Auto-Encoders

Cited by: 62
Authors
Salehi, Amin [1]
Davulcu, Hasan [1]
Affiliations
[1] Arizona State University, Computer Science & Engineering, Tempe, AZ 85281, USA
Source
2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI) | 2020
DOI
10.1109/ICTAI50040.2020.00154
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Auto-encoders have emerged as a successful framework for unsupervised learning. However, conventional auto-encoders are incapable of utilizing explicit relations in structured data. To take advantage of relations in graph-structured data, several graph auto-encoders have recently been proposed, but they neglect to reconstruct either the graph structure or node attributes. In this paper, we present the graph attention auto-encoder (GATE), a neural network architecture for unsupervised representation learning on graph-structured data. Our architecture is able to reconstruct graph-structured inputs, including both node attributes and the graph structure, through stacked encoder/decoder layers equipped with self-attention mechanisms. In the encoder, by considering node attributes as initial node representations, each layer generates new representations of nodes by attending over their neighbors' representations. In the decoder, we attempt to reverse the encoding process to reconstruct node attributes. Moreover, node representations are regularized to reconstruct the graph structure. Our proposed architecture does not need to know the graph structure upfront, and thus it can be applied to inductive learning. Our experiments demonstrate competitive performance on several node classification benchmark datasets for transductive and inductive tasks, even exceeding the performance of supervised learning baselines in most cases.
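To make the mechanism described in the abstract concrete, below is a minimal NumPy sketch of one attention-based encoder layer plus the two reconstruction losses. It is an illustration of the idea only, not the authors' GATE implementation: the additive attention scoring, the tanh/ReLU nonlinearities, and every function and parameter name here are assumptions consistent with the abstract's high-level description.

import numpy as np

def attention_layer(H, adj, W, a_self, a_neigh):
    """One layer: each node's new representation is an attention-weighted
    sum of its (projected) neighbor representations, self-loop included."""
    Z = H @ W                                  # project node representations
    H_new = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        nbrs = np.union1d(np.flatnonzero(adj[i]), [i])   # neighbors plus self
        e = np.tanh(Z[i] @ a_self + Z[nbrs] @ a_neigh)   # additive attention scores (assumed form)
        alpha = np.exp(e - e.max())
        alpha /= alpha.sum()                   # softmax over the neighborhood
        H_new[i] = alpha @ Z[nbrs]             # attend over neighbor representations
    return np.maximum(H_new, 0.0)              # ReLU output nonlinearity (assumed)

def reconstruction_losses(X, X_hat, H, adj):
    """Attribute loss: squared error between inputs and decoder outputs.
    Structure loss: -log sigmoid(h_i . h_j) over observed edges, pushing
    connected nodes toward similar representations."""
    attr_loss = np.mean((X - X_hat) ** 2)
    src, dst = np.nonzero(adj)
    logits = np.sum(H[src] * H[dst], axis=1)
    struct_loss = -np.mean(np.log(1.0 / (1.0 + np.exp(-logits)) + 1e-12))
    return attr_loss, struct_loss

# Toy usage: 4 nodes, 5 attributes, 3-dimensional hidden representations.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 5))
adj = np.array([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]])
W_enc, W_dec = rng.normal(size=(5, 3)), rng.normal(size=(3, 5))
a1, a2 = rng.normal(size=3), rng.normal(size=3)
b1, b2 = rng.normal(size=5), rng.normal(size=5)

H = attention_layer(X, adj, W_enc, a1, a2)       # encoder: attributes -> representations
X_hat = attention_layer(H, adj, W_dec, b1, b2)   # decoder mirrors the encoder
print(reconstruction_losses(X, X_hat, H, adj))

In this sketch the decoder simply reuses the same attention layer with its own weights to map hidden representations back to the attribute space, mirroring the abstract's description of reversing the encoding process, while the structure loss regularizes the representations toward reconstructing the graph.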
Pages: 989-996
Number of pages: 8
相关论文
共 51 条
[21]  
Huang X., 2017, P 2017 SIAM INT C DA, P633, DOI DOI 10.1137/1.9781611974973.71
[22]   Label Informed Attributed Network Embedding [J].
Huang, Xiao ;
Li, Jundong ;
Hu, Xia .
WSDM'17: PROCEEDINGS OF THE TENTH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2017, :731-739
[23]  
King DB, 2015, ACS SYM SER, V1214, P1, DOI 10.1021/bk-2015-1214.ch001
[24]  
Kipf Thomas N, 2017, 5 INT C LEARN REPR I
[25]  
Kipf TN, 2016, NEURIPS WORKSH BAYES
[26]  
Manning C. D, 2015, P 2015 C EMP METH NA
[27]   Geometric deep learning on graphs and manifolds using mixture model CNNs [J].
Monti, Federico ;
Boscaini, Davide ;
Masci, Jonathan ;
Rodola, Emanuele ;
Svoboda, Jan ;
Bronstein, Michael M. .
30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, :5425-5434
[28]   Asymmetric Transitivity Preserving Graph Embedding [J].
Ou, Mingdong ;
Cui, Peng ;
Pei, Jian ;
Zhang, Ziwei ;
Zhu, Wenwu .
KDD'16: PROCEEDINGS OF THE 22ND ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2016, :1105-1114
[29]  
Pan SR, 2018, PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, P2609
[30]   DeepWalk: Online Learning of Social Representations [J].
Perozzi, Bryan ;
Al-Rfou, Rami ;
Skiena, Steven .
PROCEEDINGS OF THE 20TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'14), 2014, :701-710