Learning first-order probabilistic models with combining rules

Cited by: 0
Authors: Sriraam Natarajan; Prasad Tadepalli; Thomas G. Dietterich; Alan Fern
Affiliation: [1] Oregon State University, School of Electrical Engineering and Computer Science
Keywords: First-order probabilistic models; Quantified conditional influence statements; Directed graphs; 03B48
DOI: Not available
Abstract
Many real-world domains exhibit rich relational structure and stochasticity and motivate the development of models that combine predicate logic with probabilities. These models describe probabilistic influences between attributes of objects that are related to each other through known domain relationships. To keep these models succinct, each such influence is considered independent of others, which is called the assumption of “independence of causal influences” (ICI). In this paper, we describe a language that consists of quantified conditional influence statements and captures most relational probabilistic models based on directed graphs. The influences due to different statements are combined using a set of combining rules such as Noisy-OR. We motivate and introduce multi-level combining rules, where the lower level rules combine the influences due to different ground instances of the same statement, and the upper level rules combine the influences due to different statements. We present algorithms and empirical results for parameter learning in the presence of such combining rules. Specifically, we derive and implement algorithms based on gradient descent and expectation maximization for different combining rules and evaluate them on synthetic data and on a real-world task. The results demonstrate that the algorithms are able to learn both the conditional probability distributions of the influence statements and the parameters of the combining rules.
Published in: Annals of Mathematics and Artificial Intelligence, 2008, 54(1-3), pp. 223-256 (33 pages)
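
As an illustration of the two-level combination described in the abstract, the following is a minimal Python sketch, not code from the paper: a Noisy-OR rule combines the influences of the ground instances of each quantified influence statement, and a weighted mean combines the resulting statement-level probabilities. The helper names (noisy_or, weighted_mean, combine_two_level) and the numbers in the example are hypothetical; the paper considers several combining rules and learns both the conditional distributions and the combining-rule parameters (such as the weights below) by gradient descent or EM.

# Illustrative two-level combination of causal influences for a Boolean target.
# Hypothetical helper names; the combining rules shown (Noisy-OR, weighted mean)
# are among those discussed in the paper, but this is only a sketch.

def noisy_or(probs):
    """Lower-level rule: combine the ground instances of one influence statement.
    Each p is P(target = true | one ground instance); Noisy-OR assumes the
    instances independently fail to cause the target."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

def weighted_mean(probs, weights):
    """Upper-level rule: combine statement-level probabilities with learnable,
    normalized weights."""
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, probs)) / total

def combine_two_level(instance_probs_per_statement, statement_weights):
    """instance_probs_per_statement: one list of instance-level probabilities
    per quantified conditional influence statement."""
    statement_probs = [noisy_or(ps) for ps in instance_probs_per_statement]
    return weighted_mean(statement_probs, statement_weights)

if __name__ == "__main__":
    # Two influence statements with 3 and 2 ground instances, respectively.
    p = combine_two_level([[0.2, 0.1, 0.4], [0.3, 0.5]], statement_weights=[1.0, 2.0])
    print(f"P(target = true) = {p:.3f}")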