Unifying Probability and Logic for Learning

Date

2013

Authors

Hutter, Marcus
Lloyd, John
Ng, Kee Siong
Uther, William T.B.

Publisher

AAAI Press

Abstract

Uncertain knowledge can be modeled by using graded probabilities rather than binary truth-values, but so far a completely satisfactory integration of logic and probability has been lacking. In particular, the inability to confirm universal hypotheses has plagued most, if not all, systems so far. We address this problem head on. The main technical problem to be discussed is the following: Given a set of sentences, each having some probability of being true, what probability should be ascribed to other (query) sentences? A natural wish-list, among others, is that the probability distribution (i) is consistent with the knowledge base, (ii) allows for a consistent inference procedure and in particular (iii) reduces to deductive logic in the limit of probabilities being 0 and 1, (iv) allows (Bayesian) inductive reasoning and (v) learning in the limit, and in particular (vi) allows confirmation of universally quantified hypotheses/sentences. We show that probabilities satisfying (i)-(vi) exist, and present necessary and sufficient conditions (Gaifman and Cournot). The theory is a step towards a globally consistent and empirically satisfactory unification of probability and logic.
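The core idea of the setup above — sentences get probabilities, and deductive logic is recovered when those probabilities are 0 or 1 — can be illustrated with a minimal propositional sketch. This is a hypothetical toy, not the paper's construction: we fix a distribution over the models (truth assignments) of two atoms, and the probability of any sentence is the total mass of the models satisfying it.

```python
from itertools import product

# Atoms of a tiny propositional language (illustrative names).
atoms = ["rain", "wet"]

# An assumed distribution over the four models (truth assignments).
# If all mass sat on a single model, every sentence would get
# probability 0 or 1, recovering ordinary deductive logic.
model_prob = {
    (True, True): 0.55,
    (True, False): 0.05,
    (False, True): 0.10,
    (False, False): 0.30,
}

def prob(sentence):
    """Probability of a sentence = mass of the models where it holds."""
    return sum(p for model, p in model_prob.items()
               if sentence(dict(zip(atoms, model))))

p_rain = prob(lambda m: m["rain"])                    # P(rain)
p_cond = prob(lambda m: (not m["rain"]) or m["wet"])  # P(rain -> wet)
```

The paper's actual concern is the harder first-order case, where universally quantified sentences range over infinitely many instances and conditions such as Gaifman's are needed; this finite sketch only shows how a single distribution simultaneously grades all sentences consistently.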

Source

IJCAI International Joint Conference on Artificial Intelligence

Type

Conference paper

Access Statement

Open Access
