On the Entropy Rate of Word-Valued Sources
A word-valued source Y is a discrete, finite-alphabet random process created by encoding a discrete random process X with a symbol-to-word function f. In information theory (in particular, source coding), it is of interest to know which word-valued sources possess an entropy rate H̄(Y). Nishiara and Morita showed that if X is independent and identically distributed and f is prefix free, then H̄(Y) exists and is equal to H̄(X) divided by the expected codeword length. This "conservation of...
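The Nishiara–Morita formula in the abstract can be illustrated numerically. The example below is a hypothetical sketch (the distribution and code are not from the paper): an i.i.d. source X over three symbols is encoded by a prefix-free binary code f, and the entropy rate of the resulting word-valued source Y is computed as H(X) divided by the expected codeword length. For this dyadic distribution the code is Huffman-optimal, so the rate comes out to exactly 1 bit per output symbol.

```python
from math import log2

# Hypothetical example (not from the paper): an i.i.d. source X over
# {a, b, c} encoded into bits by a prefix-free symbol-to-word map f.
p = {"a": 0.5, "b": 0.25, "c": 0.25}   # distribution of X
f = {"a": "0", "b": "10", "c": "11"}   # prefix-free codewords

# Entropy of X in bits per source symbol.
H_X = -sum(px * log2(px) for px in p.values())

# Expected codeword length E[|f(X)|].
E_L = sum(p[x] * len(f[x]) for x in p)

# Nishiara-Morita: entropy rate of the word-valued source Y.
rate = H_X / E_L
print(H_X, E_L, rate)  # 1.5 1.5 1.0
```

Here H(X) = 1.5 bits and E[|f(X)|] = 1.5, giving H̄(Y) = 1.0 bit per output symbol, as expected since Y is an i.i.d. fair-coin sequence under this code.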
Collections: ANU Research Publications
Source: Proceedings of the Australasian Telecommunication Networks and Applications Conference
Files: 01_Timo_On_the_Entropy_Rate_of_2007.pdf (149.56 kB, Adobe PDF); 02_Timo_On_the_Entropy_Rate_of_2007.pdf (225.44 kB, Adobe PDF)
Items in Open Research are protected by copyright, with all rights reserved, unless otherwise indicated.