Concepts and Algorithms for Computing Maximum Entropy Distributions for Knowledge Bases with Relational Probabilistic Conditionals

Author
Finthammer, M.
Pub. date
April 2017
Pages
334
Binding
softcover
Volume
342 of Dissertations in Artificial Intelligence
ISBN print
978-1-61499-749-8
Subject
Artificial Intelligence, Computer Science
 
This book contains a subject index
€50 / US$58 / £43 Excl. VAT

Many practical problems are concerned with incomplete and uncertain knowledge about domains in which relations among different objects play an important role. Relational probabilistic conditionals provide an adequate way to express such uncertain, rule-like knowledge of the form "If A holds, then B holds with probability p". Recently, the aggregating semantics for such conditionals has been proposed, which, combined with the principle of maximum entropy (ME), allows probabilistic reasoning in a relational domain. However, until now no specialized algorithms existed that make ME reasoning under aggregating semantics feasible in practice.
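To make the ME principle concrete, the following sketch solves a toy propositional instance of the optimization problem the book addresses: given a single conditional (B|A)[0.8], find the distribution over the four worlds that satisfies P(B|A) = 0.8 and otherwise maximizes entropy. The ME solution has Gibbs form, so the sketch solves the one-dimensional dual by bisection. The atoms, feature encoding, and numeric tolerances here are illustrative assumptions, not the book's relational algorithms.

```python
import math
from itertools import product

# Hypothetical toy knowledge base: one conditional (B|A)[0.8],
# i.e. "if A holds, then B holds with probability 0.8".
P_TARGET = 0.8

worlds = list(product([False, True], repeat=2))  # order: FF, FT, TF, TT as (A, B)

def feature(world):
    """Standard ME feature of (B|A)[p]: 1-p on verifying worlds (A and B),
    -p on falsifying worlds (A and not B), 0 where A is false."""
    a, b = world
    if a and b:
        return 1.0 - P_TARGET
    if a and not b:
        return -P_TARGET
    return 0.0

def me_distribution(lam):
    """Gibbs form of the ME distribution: P(w) proportional to exp(lam * f(w))."""
    weights = [math.exp(lam * feature(w)) for w in worlds]
    z = sum(weights)
    return [wt / z for wt in weights]

# Solve the dual: find lam with expected feature value 0 (bisection suffices,
# since the expectation is monotone in lam for a single conditional).
lo, hi = -20.0, 20.0
for _ in range(100):
    mid = (lo + hi) / 2
    p = me_distribution(mid)
    if sum(pi * feature(w) for pi, w in zip(p, worlds)) < 0:
        lo = mid
    else:
        hi = mid

p = me_distribution((lo + hi) / 2)
# Conditional is satisfied: P(A and B) / P(A) == 0.8.
print(p[3] / (p[2] + p[3]))
```

Note how the two worlds where A is false end up with equal probability: the ME distribution stays maximally non-committal on everything the conditional does not constrain, which is exactly the behavior the book's relational algorithms must preserve at far larger scale.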


The main topic of this publication is the development, implementation, evaluation, and improvement of the very first algorithms tailor-made for solving the ME optimization problem under aggregating semantics. We demonstrate how the equivalence of worlds can be exploited to compute the ME distribution more efficiently. We further introduce an algorithm which works on weighted conditional impacts (WCI) instead of worlds, and we present a novel algorithm which computes the WCI of a conditional by combinatorial means. These algorithms allow us to process larger examples which previously could not be computed at all, and they can also be beneficial for other relational ME semantics.
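The idea of working on weighted conditional impacts rather than on worlds can be illustrated with a deliberately simple stand-in: when the ME distribution depends on a world only through an impact value, all worlds sharing that value can be collapsed into one class carrying a combinatorial weight. The impact function below (the number of true ground atoms) and the domain size are illustrative assumptions, not the book's WCI construction.

```python
import math
from collections import Counter
from itertools import product

n = 10  # hypothetical domain: ground atoms P(c1), ..., P(cn)

# Naive route: enumerate all 2^n worlds and count how many share each
# impact value (here simply the number of true atoms).
counts = Counter(sum(w) for w in product([0, 1], repeat=n))

# Combinatorial route: the number of worlds with impact k is C(n, k),
# so the weighted impacts can be obtained without touching any world.
for k in range(n + 1):
    assert counts[k] == math.comb(n, k)

print(len(counts), "impact classes instead of", 2 ** n, "worlds")
```

The payoff is the gap between the two numbers: an optimization loop that iterates over n + 1 weighted classes instead of 2^n worlds scales to instances that are otherwise out of reach, which is the effect the WCI-based algorithms exploit.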