Domain-Lifted Sampling for Universal Two-Variable Logic and Extensions

Cited by: 0
Authors
Wang, Yuanhong [1 ]
van Bremen, Timothy [2 ]
Wang, Yuyi [3 ,4 ]
Kuzelka, Ondrej [5 ]
Affiliations
[1] Beihang Univ, Beijing, Peoples R China
[2] Katholieke Univ Leuven, Leuven, Belgium
[3] CRRC Zhuzhou Inst, Zhuzhou, Peoples R China
[4] Swiss Fed Inst Technol, Zurich, Switzerland
[5] Czech Tech Univ, Prague, Czech Republic
Source
THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2022
Funding
European Research Council;
Keywords
GENERATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Given a first-order sentence Γ and a domain size n, how can one sample a model of Γ on the domain {1, ..., n} efficiently as n scales? We consider two variants of this problem: the uniform sampling regime, in which the goal is to sample a model uniformly at random, and the symmetric weighted sampling regime, in which models are weighted according to the number of groundings of each predicate appearing in them. Solutions to this problem have applications to the scalable generation of combinatorial structures, as well as sampling in several statistical-relational models such as Markov logic networks and probabilistic logic programs. In this paper, we identify certain classes of sentences that are domain-liftable under sampling, in the sense that they admit a sampling algorithm that runs in time polynomial in n. In particular, we prove that every sentence of the form ∀x∀y : ψ(x, y), for some quantifier-free formula ψ(x, y), is domain-liftable under sampling. We then further show that this result continues to hold in the presence of one or more cardinality constraints as well as a single tree axiom constraint.
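To make the uniform sampling problem concrete, the following is a minimal brute-force sketch (not the paper's lifted algorithm, which runs in time polynomial in n): it enumerates every interpretation of a single, illustrative binary predicate R on a small domain, keeps those satisfying a universal two-variable sentence ∀x∀y : ψ(x, y), and samples one uniformly. The predicate R and the choice of ψ (symmetry of R) are assumptions for illustration; the enumeration is exponential in n, which is exactly the cost that domain-lifted sampling avoids.

```python
import itertools
import random

def models(n, psi):
    """Enumerate all interpretations of one binary predicate R on the
    domain {0, ..., n-1} satisfying the sentence: forall x forall y, psi(x, y)."""
    pairs = list(itertools.product(range(n), repeat=2))
    out = []
    # Try every truth assignment to the n*n ground atoms R(x, y).
    for bits in itertools.product([False, True], repeat=len(pairs)):
        R = dict(zip(pairs, bits))
        if all(psi(R, x, y) for x, y in pairs):
            out.append(R)
    return out

# Illustrative sentence: forall x forall y : R(x, y) -> R(y, x)  (R is symmetric).
psi = lambda R, x, y: (not R[(x, y)]) or R[(y, x)]

n = 3
ms = models(n, psi)
sample = random.choice(ms)  # uniform over all models of the sentence
print(len(ms))              # 2^(n(n+1)/2) = 64 symmetric relations on 3 elements
```

The sampler is uniform because every model is listed exactly once before `random.choice` picks one; the paper's contribution is achieving the same distribution without materializing the exponentially large model set.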
Pages: 10070-10079
Number of pages: 10