Learning and Modeling with Probabilistic Conditional Logic
Conditionals, also called if-then rules, are a popular concept for knowledge representation. They have an intuitive semantics and can be annotated with probabilities; combined with the principle of maximum entropy, this yields a powerful formalism for representing uncertain knowledge.
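To illustrate the idea on a toy case not taken from the thesis: a single probabilistic conditional such as (flies | bird)[0.9] imposes the linear constraint P(flies | bird) = 0.9 on distributions over possible worlds, and the maximum entropy model is the unique distribution satisfying it that is otherwise maximally uninformed. The sketch below (all names are illustrative) computes this model over the four worlds built from the atoms bird and flies, solving for the single Lagrange multiplier by bisection.

```python
import math

# Toy example: one probabilistic conditional (flies | bird)[0.9] over two
# atoms. Worlds are (bird, flies) pairs. The maximum entropy model has the
# form P(w) proportional to exp(lam * f(w)), where the feature
#   f(w) = 1{bird and flies} - 0.9 * 1{bird}
# encodes P(flies | bird) = 0.9 as the linear constraint E_P[f] = 0.

P_COND = 0.9
WORLDS = [(b, fl) for b in (0, 1) for fl in (0, 1)]

def feature(b, fl):
    # Zero expectation of this feature is equivalent to P(flies | bird) = 0.9.
    return (1.0 if b and fl else 0.0) - P_COND * (1.0 if b else 0.0)

def expectation(lam):
    # E[f] under the exponential-family distribution with multiplier lam.
    weights = [math.exp(lam * feature(b, fl)) for b, fl in WORLDS]
    z = sum(weights)
    return sum(w * feature(b, fl) for w, (b, fl) in zip(weights, WORLDS)) / z

# Solve E[f] = 0 for the Lagrange multiplier by bisection:
# expectation(0) < 0 and expectation(100) > 0 for this constraint.
lo, hi = 0.0, 100.0
for _ in range(200):
    mid = (lo + hi) / 2
    if expectation(mid) < 0:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

weights = {w: math.exp(lam * feature(*w)) for w in WORLDS}
z = sum(weights.values())
P = {w: wt / z for w, wt in weights.items()}

p_cond = P[(1, 1)] / (P[(1, 1)] + P[(1, 0)])
print(f"P(flies | bird) = {p_cond:.4f}")  # should be close to 0.9
# Maximum entropy leaves the unconstrained non-bird worlds uniform:
print(f"P(not bird, flies) = {P[(0, 1)]:.4f}")
print(f"P(not bird, not flies) = {P[(0, 0)]:.4f}")
```

Note the characteristic maximum entropy behaviour: only the worlds mentioned by the conditional are reweighted, while the non-bird worlds remain equally probable.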
This dissertation discusses several issues pertaining to probabilistic conditionals: learning them from data and using them for modeling. The first part of the thesis presents the implementation of a method for learning probabilistic conditionals from data. In the second part, this learning technique is applied to the problem of fusing data originating from different sources. The third part, the focal point of the thesis, extends a propositional probabilistic conditional logic to a first-order probabilistic conditional logic and devises an approach to reducing the complexity of computing the maximum entropy model of a set of first-order probabilistic conditionals.