Learning and Modeling with Probabilistic Conditional Logic

Author: Fisseler, J.
Pub. date: February 2010
Pages: 236
Binding: softcover
Volume: 328 of Dissertations in Artificial Intelligence
ISBN (print): 978-1-60750-098-8
Subject: Artificial Intelligence, Computer & Communication Sciences, Computer Science
Price: €50 / US$73 (excl. VAT)

Conditionals, also called if-then-rules, are a popular concept for knowledge representation. They have intuitive semantics and can also be annotated with probabilities, which, when combined with the principle of maximum entropy, yields a powerful formalism for representing uncertain knowledge.
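To make the idea concrete, here is a minimal sketch (not taken from the book) of the maximum-entropy principle for a single probabilistic conditional (B|A)[0.8] over the four worlds of two propositional atoms. The function name `maxent_conditional` and the bisection approach are illustrative assumptions; the MaxEnt solution has the standard exponential form with one Lagrange multiplier per conditional, which we solve for numerically.

```python
import math

def maxent_conditional(p=0.8):
    """Maximum-entropy distribution over the worlds {ab, a¬b, ¬ab, ¬a¬b}
    subject to the single constraint P(b|a) = p.

    Illustrative sketch: the MaxEnt solution weights each world by
    exp(λ·f(ω)), where f(ω) = 1-p if ω ⊨ a∧b, -p if ω ⊨ a∧¬b, and 0
    otherwise. We find the multiplier λ by bisection.
    """
    def cond(lam):
        # P(b|a) induced by multiplier lam (increases monotonically in lam)
        w_ab = math.exp(lam * (1 - p))    # weight of world a∧b
        w_anb = math.exp(-lam * p)        # weight of world a∧¬b
        return w_ab / (w_ab + w_anb)

    lo, hi = -50.0, 50.0
    for _ in range(200):                  # bisection on λ
        mid = (lo + hi) / 2
        if cond(mid) < p:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2

    # Unnormalized weights; worlds not satisfying A are unaffected (f = 0)
    w = {"ab": math.exp(lam * (1 - p)),
         "a¬b": math.exp(-lam * p),
         "¬ab": 1.0,
         "¬a¬b": 1.0}
    z = sum(w.values())                   # partition function
    return {world: weight / z for world, weight in w.items()}
```

Note how the constraint touches only the worlds where A holds: the two ¬A-worlds keep equal probability, which is exactly the "as unbiased as possible" behavior that makes maximum entropy attractive for completing partially specified knowledge.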
This dissertation discusses several issues pertaining to probabilistic conditionals: learning them from data and using them for modeling. The first part of this thesis presents the implementation of a method for learning probabilistic conditionals from data. In the second part, this learning technique is applied to the problem of fusing data originating from different sources. The third part is the focal point of the thesis. Here, an extension of a propositional probabilistic conditional logic to a first-order probabilistic conditional logic is developed and an approach to reduce the complexity of computing the maximum entropy model of a set of first-order probabilistic conditionals is devised.