Distributed Graph Learning From Smooth Data: A Bayesian Framework

Cited: 0
Authors
Zhang, Jiayin [1 ]
Wu, Nan [1 ]
Zhang, Tingting [1 ]
Li, Bin [1 ]
Yan, Qinsiwei [1 ]
Ma, Xiaoli [2 ]
Affiliations
[1] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
[2] Georgia Inst Technol, Sch Elect & Comp Engn, Atlanta, GA 30332 USA
Funding
National Natural Science Foundation of China;
Keywords
Probabilistic logic; Topology; Bayes methods; Signal processing algorithms; Laplace equations; Data models; Symmetric matrices; Message passing; Vectors; Uncertainty; Graph learning; smooth graph signal; graph signal processing; probabilistic model; variational expectation maximization; factor graph; generalized approximate message passing; MESSAGE-PASSING ALGORITHMS; SIGNAL; APPROXIMATION; PROPAGATION; CONVERGENCE; INFERENCE; SELECTION; FIELD;
DOI
10.1109/TSP.2025.3553915
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
The emerging field of graph learning, which aims to learn reasonable graph structures from data, plays a vital role in Graph Signal Processing (GSP) and finds applications in various data processing domains. However, existing approaches have primarily focused on learning deterministic graphs, and thus are not suitable for applications involving topological stochasticity, such as epidemiological models. In this paper, we develop a hierarchical Bayesian model for the graph learning problem. Specifically, the generative model of smooth signals is formulated by transforming the graph topology into self-expressiveness coefficients and incorporating individual noise for each vertex. Tailored probability distributions are imposed on each edge to characterize the valid graph topology constraints along with edge-level probabilistic information. Building upon this, we derive the Bayesian Graph Learning (BGL) approach to efficiently estimate the graph structure in a distributed manner. In particular, based on the specific probabilistic dependencies, we derive a series of message passing rules, combining Generalized Approximate Message Passing (GAMP) and Belief Propagation (BP) messages, to iteratively approximate the posterior probabilities. Numerical experiments with both artificial and real data demonstrate that BGL learns more accurate graph structures and enhances machine learning tasks compared to state-of-the-art methods.
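The abstract builds on the standard GSP notion of signal smoothness, usually quantified by the Laplacian quadratic form tr(X^T L X): signals are "smooth" when they vary little across strongly weighted edges. The sketch below illustrates that measure only; it is not the paper's BGL algorithm, and the toy graph and signals are invented for illustration.

```python
import numpy as np

def laplacian(W):
    """Combinatorial Laplacian L = D - W of a symmetric weight matrix W."""
    return np.diag(W.sum(axis=1)) - W

def smoothness(X, W):
    """Laplacian quadratic form tr(X^T L X).

    Equals (1/2) * sum_{i,j} W[i,j] * ||x_i - x_j||^2, so small values
    mean the signal changes little across strongly weighted edges.
    """
    return np.trace(X.T @ laplacian(W) @ X)

# Toy example: a 3-vertex path graph 0 -- 1 -- 2 with unit weights.
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

x_smooth = np.array([[1.0], [1.0], [1.0]])   # constant signal
x_rough  = np.array([[1.0], [-1.0], [1.0]])  # alternating signal

print(smoothness(x_smooth, W))  # 0.0 (perfectly smooth)
print(smoothness(x_rough, W))   # 8.0 (large variation across edges)
```

Smoothness-based graph learning methods (such as those cited in the paper) invert this logic: given observed signals X, they search for a graph whose Laplacian makes X smooth, typically with regularizers that rule out trivial topologies.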
Pages: 1626 - 1642
Page count: 17