
Distribution of mutual information

Hutter, Marcus


The mutual information of two random variables ı and ȷ with joint probabilities {π_ij} is commonly used in learning Bayesian nets as well as in many other fields. The chances π_ij are usually estimated by the empirical sampling frequency n_ij/n, leading to a point estimate I(n_ij/n) for the mutual information. To answer questions like "is I(n_ij/n) consistent with zero?" or "what is the probability that the true mutual information is much larger than the point estimate?" one has to go beyond the point estimate.
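The abstract contrasts the plug-in estimate I(n_ij/n) with the full posterior distribution p(I|n) of the mutual information. The paper itself derives fast analytic approximations for that distribution (mean, variance, and higher moments); the sketch below instead illustrates the same quantities by brute force, sampling the joint probabilities from a Dirichlet posterior under a uniform (non-informative) prior and evaluating the mutual information on each draw. The function names, the prior choice, and the example counts are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mutual_information(p):
    """Mutual information I(pi) of a joint probability matrix pi_ij (in nats)."""
    pi = p / p.sum()
    pi_i = pi.sum(axis=1, keepdims=True)   # row marginals pi_{i+}
    pi_j = pi.sum(axis=0, keepdims=True)   # column marginals pi_{+j}
    mask = pi > 0                          # skip zero cells (0 * log 0 = 0)
    return float((pi[mask] * np.log(pi[mask] / (pi_i @ pi_j)[mask])).sum())

def posterior_mi_samples(counts, prior=1.0, n_samples=10_000, rng=None):
    """Monte Carlo draws from p(I|n): sample pi from the Dirichlet posterior
    (symmetric Dirichlet prior with concentration `prior` per cell, an assumed
    non-informative choice) and evaluate I(pi) on each draw."""
    rng = np.random.default_rng(rng)
    alpha = counts.flatten() + prior
    draws = rng.dirichlet(alpha, size=n_samples)
    return np.array([mutual_information(d.reshape(counts.shape)) for d in draws])

# Hypothetical 2x3 contingency table of counts n_ij
n = np.array([[12, 7, 3],
              [4, 9, 15]], dtype=float)
point_estimate = mutual_information(n)        # plug-in estimate I(n_ij/n)
samples = posterior_mi_samples(n, prior=1.0, rng=0)
print(f"I(n_ij/n)      = {point_estimate:.4f} nats")
print(f"posterior mean = {samples.mean():.4f}, std = {samples.std():.4f}")
print(f"P(I > 2 * I(n_ij/n) | n) ~= {(samples > 2 * point_estimate).mean():.3f}")
```

The last line answers, by simulation, the kind of question posed in the abstract ("what is the probability that the true mutual information is much larger than the point estimate?"); the paper's contribution is to answer it with closed-form approximations instead of sampling.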

Collections: ANU Research Publications
Date published: 2002
Type: Conference paper


There are no files associated with this item.

Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
