Probabilistic inference with noisy-threshold models based on a CP tensor decomposition

Cited: 6
Authors
Vomlel, Jiri [1 ]
Tichavsky, Petr [1 ]
Affiliations
[1] Inst Informat Theory & Automat AS CR, Prague 8, 18208, Czech Republic
Keywords
Bayesian networks; Probabilistic inference; CANDECOMP-PARAFAC tensor decomposition; Symmetric tensor rank; Causal independence; Expert systems; Rank; Propagation; Trees
DOI
10.1016/j.ijar.2013.12.002
CLC classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease this difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. With standard inference techniques, the inference complexity is exponential in the number of parents of a variable, but more efficient techniques can be employed for CPTs that take a special form. CPTs can be viewed as tensors, and tensors can be decomposed into linear combinations of rank-one tensors, where a rank-one tensor is an outer product of vectors. Such a decomposition is referred to as the Canonical Polyadic or CANDECOMP-PARAFAC (CP) decomposition. The tensor decomposition offers a compact representation of CPTs which can be efficiently utilized in probabilistic inference. In this paper we propose a CP decomposition of tensors corresponding to CPTs of threshold functions, exactly l-out-of-k functions, and their noisy counterparts. We prove results about the symmetric rank of these tensors in the real and complex domains. The proofs are constructive and provide methods for the CP decomposition of these tensors. An analytical and experimental comparison with the parent-divorcing method (which also has polynomial complexity) shows the superiority of the CP decomposition-based method. The experiments were performed on subnetworks of the well-known QMR-DT network, generalized by replacing noisy-or models with noisy-threshold models. (C) 2013 Elsevier Inc. All rights reserved.
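The abstract's key idea, viewing a CPT as a tensor and writing it as a short linear combination of rank-one tensors (outer products of vectors), can be illustrated on the simplest canonical model. The following NumPy sketch, with illustrative variable names, builds the full noisy-or CPT tensor and verifies a known rank-two CP decomposition of it (of the kind exploited by Savicky and Vomlel, Kybernetika, 2007); it is a minimal demonstration of the representation, not the paper's inference algorithm.

```python
import numpy as np
from functools import reduce

def noisy_or_cpt(q):
    """Full CPT tensor T[y, x1, ..., xk] of a noisy-or node with k parents:
    P(Y=0 | x) = prod_i q_i^{x_i}, where q_i is the inhibition probability
    of parent i (the chance parent i fails to activate Y)."""
    k = len(q)
    T = np.zeros((2,) + (2,) * k)
    for x in np.ndindex(*(2,) * k):
        p0 = np.prod([q[i] ** x[i] for i in range(k)])
        T[(0,) + x] = p0        # P(Y=0 | x)
        T[(1,) + x] = 1.0 - p0  # P(Y=1 | x)
    return T

def outer(vecs):
    """Outer product of a list of vectors, i.e. a rank-one tensor."""
    return reduce(lambda a, b: np.multiply.outer(a, b), vecs)

# Rank-two CP decomposition of the noisy-or CPT:
#   T = u (x) 1 (x) ... (x) 1  +  w (x) v1 (x) ... (x) vk,
# with u = (0, 1), w = (1, -1), v_i = (1, q_i).
q = [0.2, 0.5, 0.7]               # illustrative inhibition probabilities
T = noisy_or_cpt(q)
ones = np.ones(2)
term1 = outer([np.array([0.0, 1.0])] + [ones] * len(q))
term2 = outer([np.array([1.0, -1.0])] + [np.array([1.0, qi]) for qi in q])
assert np.allclose(T, term1 + term2)  # two rank-one terms reproduce the CPT
```

The point of the decomposition is the size reduction: the full CPT has 2^(k+1) entries, while the two rank-one terms need only O(k) numbers, which is what makes inference with such representations tractable for large parent sets.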
Pages: 1072-1092 (21 pages)