Mutual information is equivalent to Mutual cross information applied to a signal x and a copy
of itself. Thus the mutual information of a discrete time series is defined by

    I(tau) = sum over x1, x2 of  P(x1, x2) * log2( P(x1, x2) / (P(x1) * P(x2)) )

where P(x1) is the probability for the scalar observable to take the value x1,
P(x2) is the probability for the observable to take the value x2 after a time
delay tau, and P(x1, x2) is the joint probability of both events. I(tau) is
equal to the mean amount of information the state x(t) contains about x(t+tau).
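As an illustration of the formula above, the following Python sketch estimates I(tau) from a 2D histogram of the signal against its delayed copy. The function name `mutual_information` and the histogram-based estimator are assumptions for demonstration; the toolbox's own routine may use a different estimator.

```python
import numpy as np

def mutual_information(x, tau, bins=16):
    """Estimate the time-delayed mutual information I(tau), in bits,
    of a scalar time series via a 2D histogram (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    a, b = x[:-tau], x[tau:]          # x(t) and x(t + tau), tau >= 1
    # joint probability P(x1, x2) from a 2D histogram
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)              # marginal P(x1)
    py = pxy.sum(axis=0)              # marginal P(x2)
    nz = pxy > 0                      # avoid log(0)
    # I(tau) = sum P(x1,x2) * log2( P(x1,x2) / (P(x1) * P(x2)) )
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))
```

Because the histogram bins are derived from the data range, scaling or shifting the signal leaves the estimate essentially unchanged, consistent with the scale/shift invariance noted below for 'pure' time series.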
Mutual information operates on a single signal. The parameters are
- delay tau for phase space reconstruction (in samples),
- embedding dimension m,
- relative radius r,
- maximum relative shift maxdelta in samples and
- step size (stepsize) in samples.
Because the formula above uses the base-2 (binary) logarithm, the output unit is 'bits'.
Note that since this function operates on a 'pure' time series, scaling or shifting the given signal does not affect the result.
s = MutualInfo(x,tau,m,r,maxdelta,stepsize);
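A common use of the delay parameter listed above is to choose the embedding delay tau for phase space reconstruction as the first local minimum of I(tau). The sketch below scans delays with a histogram-based estimator; the helper name `first_minimum_delay` and the estimator are assumptions for illustration, not the toolbox's API.

```python
import numpy as np

def first_minimum_delay(x, max_tau=50, bins=16):
    """Return the smallest delay tau at which the time-delayed mutual
    information I(tau) has a local minimum (a common heuristic for
    choosing the embedding delay); falls back to the global minimum."""
    x = np.asarray(x, dtype=float)

    def mi(a, b):
        # histogram-based I estimate in bits (illustrative sketch)
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

    vals = [mi(x[:-t], x[t:]) for t in range(1, max_tau + 1)]
    for t in range(1, len(vals) - 1):
        if vals[t] < vals[t - 1] and vals[t] <= vals[t + 1]:
            return t + 1              # vals index 0 corresponds to tau = 1
    return int(np.argmin(vals)) + 1
```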
See also: Mutual cross information.

References: Shannon/Weaver; Liebert/Schuster.