Distribution of Mutual Information

Hutter, Marcus

Description

The mutual information of two random variables ı and ȷ with joint probabilities {π_ij} is commonly used in learning Bayesian nets as well as in many other fields. The chances π_ij are usually estimated by the empirical sampling frequency n_ij/n, leading to a point estimate I(n_ij/n) for the mutual information. To answer questions like “is I(n_ij/n) consistent with zero?” or “what is the probability that the true mutual information is much larger than the point estimate?”, one has to go beyond the point estimate.
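
For illustration, the plug-in point estimate I(n_ij/n) and a simple Monte Carlo check of its uncertainty can be sketched as below. This is a minimal example with a hypothetical 2x3 count table; the sampling step uses a symmetric Dirichlet prior as one standard Bayesian treatment, whereas the paper itself derives fast analytic approximations to the distribution of the mutual information rather than relying on sampling.

import numpy as np

# Hypothetical 2x3 contingency table of joint counts n_ij (illustrative only).
n_ij = np.array([[12.0, 7.0, 3.0],
                 [4.0, 9.0, 15.0]])
n = n_ij.sum()

def mutual_information(p):
    """Mutual information I(p) of a joint distribution p_ij, in nats."""
    pi = p.sum(axis=1, keepdims=True)   # row marginals p_i+
    pj = p.sum(axis=0, keepdims=True)   # column marginals p_+j
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (pi @ pj)[mask])))

# Point estimate I(n_ij/n) from the empirical sampling frequencies.
I_hat = mutual_information(n_ij / n)

# Crude Monte Carlo over a Dirichlet posterior (symmetric prior, alpha = 1)
# to gauge how far the true mutual information may lie from the point estimate.
rng = np.random.default_rng(0)
alpha = 1.0
samples = rng.dirichlet(n_ij.ravel() + alpha, size=20000)
I_samples = np.array([mutual_information(s.reshape(n_ij.shape)) for s in samples])

print(f"point estimate I(n/n)     : {I_hat:.4f} nats")
print(f"posterior mean of I       : {I_samples.mean():.4f} nats")
print(f"posterior std of I        : {I_samples.std():.4f} nats")
print(f"P(I > 2 * point estimate) : {(I_samples > 2 * I_hat).mean():.3f}")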

Collections: ANU Research Publications
Date published: 2002
Type: Conference paper
URI: http://hdl.handle.net/1885/15095

Download

There are no files associated with this item.


Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.
