Title: Mutual Information Functions versus Correlation Functions
Author: Li W.
Journal of Statistical Physics, Vol. 60, Nos. 5/6, 1990, pp. 823-837.
This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not imply M(d) = 0. This linear but not general independence between symbols separated by a distance is studied for ternary sequences. Also included is an estimation of the finite-size effect on calculating mutual information. Finally, the concept of "symbolic noise" is discussed.
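To make the two quantities concrete, below is a minimal sketch (not taken from the paper) of how the mutual information function M(d) and the correlation function Γ(d) can be estimated from a finite symbolic sequence. Probabilities are estimated from symbol-pair counts at separation d; the function names and the use of base-2 logarithms are illustrative choices, not the paper's notation or code.

```python
import numpy as np

def mutual_information(seq, d):
    """Estimate M(d): mutual information (in bits) between symbols
    separated by distance d, using empirical pair frequencies."""
    n = len(seq) - d
    left, right = seq[:n], seq[d:]
    pairs = list(zip(left, right))
    symbols = sorted(set(seq))
    m = 0.0
    for a in symbols:
        pa = left.count(a) / n          # marginal P(a) at position i
        for b in symbols:
            pb = right.count(b) / n     # marginal P(b) at position i+d
            pab = pairs.count((a, b)) / n  # joint P(a, b)
            if pab > 0:
                m += pab * np.log2(pab / (pa * pb))
    return m

def correlation(seq, d):
    """Estimate Γ(d) = <s_i s_{i+d}> - <s_i><s_{i+d}> for a sequence
    whose symbols have been mapped to numeric values."""
    x = np.asarray(seq, dtype=float)
    n = len(x) - d
    return np.mean(x[:n] * x[d:]) - np.mean(x[:n]) * np.mean(x[d:])

# A perfectly alternating binary sequence: symbols at distance 1 are
# fully determined by each other, so M(1) is close to 1 bit.
seq = [0, 1] * 500
print(mutual_information(seq, 1))  # ≈ 1.0 bit
print(correlation(seq, 1))         # ≈ -0.25
```

For the alternating sequence both estimators agree that distance-1 symbols are strongly dependent; as the abstract notes, for binary sequences M(d) and Γ(d) are tied by an exact relation, whereas for three or more symbols a vanishing Γ(d) need not imply a vanishing M(d). Note also that with a finite sequence the empirical M(d) is biased upward even for independent symbols, which is the finite-size effect the paper estimates.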