Factor Graph Neural Networks

Cited by: 0
Authors
Zhang, Zhen [1 ,2 ]
Dupty, Mohammed Haroon [3 ]
Wu, Fan [4 ]
Shi, Javen Qinfeng [1 ,2 ]
Lee, Wee Sun [3 ]
Affiliations
[1] Australian Inst Machine Learning, Adelaide, Australia
[2] Univ Adelaide, Adelaide, Australia
[3] Natl Univ Singapore, Singapore, Singapore
[4] Univ Illinois, Champaign, IL USA
Funding
National Research Foundation, Singapore; Australian Research Council;
Keywords
Graphical Models; Belief Propagation; Graph Neural Networks; EFFICIENT BELIEF PROPAGATION; MAX-PRODUCT; INFERENCE; ALGORITHM;
DOI
Not available
CLC Classification
TP [Automation Technology; Computer Technology];
Discipline Code
0812 ;
Abstract
In recent years, we have witnessed a surge of Graph Neural Networks (GNNs), most of which can learn powerful representations in an end-to-end fashion, with great success in many real-world applications. They bear a resemblance to Probabilistic Graphical Models (PGMs) but break free from some of their limitations. By aiming to provide expressive methods for representation learning rather than computing marginals or most likely configurations, GNNs allow flexibility in the choice of information-flow rules while maintaining good performance. Despite their success, GNNs lack efficient ways to represent and learn higher-order relations among variables/nodes. More expressive higher-order GNNs, which operate on k-tuples of nodes, require increased computational resources to process higher-order tensors. We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning. To do so, we first derive an efficient approximate Sum-Product loopy belief propagation inference algorithm for discrete higher-order PGMs. We then neuralize the novel message passing scheme into a Factor Graph Neural Network (FGNN) module by allowing richer representations of the message update rules; this facilitates both efficient inference and powerful end-to-end learning. We further show that, with a suitable choice of message aggregation operators, our FGNN can also represent Max-Product belief propagation, providing a single family of architectures that can represent both Max- and Sum-Product loopy belief propagation. Our extensive experimental evaluation on synthetic as well as real datasets demonstrates the potential of the proposed model.
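The abstract's observation that a single architecture can express both Sum- and Max-Product belief propagation rests on a property of the classical algorithm: only the aggregation operator changes between the two. As an illustrative sketch of that classical algorithm only (not the paper's approximate or neuralized variant), the message updates on a small discrete factor graph can be written as follows; all function and variable names here are hypothetical:

```python
import itertools

def run_bp(n_vars, n_states, factors, iters=20, mode="sum"):
    """Classical loopy belief propagation on a discrete factor graph.

    factors: list of (var_tuple, table), where table maps a joint
    assignment tuple of the factor's variables to a potential value.
    mode="sum" gives Sum-Product (marginals); mode="max" gives
    Max-Product (max-marginals). Only the aggregation operator differs.
    """
    agg = sum if mode == "sum" else max
    # Initialize all variable-to-factor and factor-to-variable messages uniformly.
    v2f = {(v, fi): [1.0] * n_states
           for fi, (vs, _) in enumerate(factors) for v in vs}
    f2v = {(fi, v): [1.0] * n_states
           for fi, (vs, _) in enumerate(factors) for v in vs}
    for _ in range(iters):
        # Factor-to-variable: aggregate the potential times incoming
        # messages over all joint states of the *other* variables.
        for fi, (vs, table) in enumerate(factors):
            for v in vs:
                msg = []
                for s in range(n_states):
                    terms = []
                    for assign in itertools.product(range(n_states),
                                                    repeat=len(vs)):
                        if assign[vs.index(v)] != s:
                            continue
                        val = table[assign]
                        for u in vs:
                            if u != v:
                                val *= v2f[(u, fi)][assign[vs.index(u)]]
                        terms.append(val)
                    msg.append(agg(terms))
                z = sum(msg)
                f2v[(fi, v)] = [m / z for m in msg]
        # Variable-to-factor: product of messages from the other factors.
        for fi, (vs, _) in enumerate(factors):
            for v in vs:
                msg = [1.0] * n_states
                for fj, (ws, _) in enumerate(factors):
                    if fj != fi and v in ws:
                        for s in range(n_states):
                            msg[s] *= f2v[(fj, v)][s]
                z = sum(msg)
                v2f[(v, fi)] = [m / z for m in msg]
    # Beliefs: normalized product of all incoming factor messages.
    beliefs = []
    for v in range(n_vars):
        b = [1.0] * n_states
        for fi, (vs, _) in enumerate(factors):
            if v in vs:
                for s in range(n_states):
                    b[s] *= f2v[(fi, v)][s]
        z = sum(b)
        beliefs.append([x / z for x in b])
    return beliefs
```

On a tree-structured graph (e.g. two binary variables, one unary and one pairwise factor) Sum-Product recovers the exact marginals and Max-Product the MAP assignment. The FGNN module described in the abstract generalizes this scheme by replacing the fixed product/aggregate operations with learned neural message updates.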
Pages: 54