Mutual cross information
Entropy-based mutual cross information
"Mutual cross information" calculates the Shannon cross-entropy as a measure of
the correlation between two signals x1 and x2 by means of phase space
The result is plotted as a function of the relative shift of the input signals.
The formula is of the same Shannon form as in Mutual information.
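Assuming p_1 and p_2 denote the two marginal phase space densities, p_{12} their joint density, and \delta the relative shift of the signals (these symbol names are illustrative, not taken from the original page), it can be written as

I_{12}(\delta) = \sum_{i,j} p_{12}(i,j;\delta)\,\log_2 \frac{p_{12}(i,j;\delta)}{p_1(i)\,p_2(j)}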
For a description of the symbols in the formula above,
refer to Mutual information.
Unlike in Mutual information, there are two phase space densities, one for each of the systems x1 and x2.
The parameters are:
- delay (tau) for phase space reconstruction (in samples),
- embedding dimension m,
- relative radius r (of neighbourhood in phase space),
- maximum relative shift maxdelta (in number of samples),
- step size stepsize (number of samples between successive calculations).
Because of the base-2 (dual) logarithm in the formula above, the output is given in bits.
Note that, since this function operates on the 'pure' time series, scaling and shifting the input signals does not affect the result.
y = MutualCrossInfo([x1,x2],tau,m,r,maxdelta,stepsize);
Here maxdelta and stepsize are integer-valued (numbers of samples).
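The software's internal estimator is not documented here, so the following Python sketch only illustrates one common way to obtain such a curve: delay embedding with delay tau and dimension m, a simple box partition of relative width r as the phase space density estimate, and a scan over non-negative relative shifts up to maxdelta in steps of stepsize. The function and variable names (mutual_cross_info, delay_embed, box_labels) are illustrative and not part of the software.

import numpy as np

def delay_embed(x, tau, m):
    # Phase space reconstruction: row i is (x[i], x[i+tau], ..., x[i+(m-1)*tau]).
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[k * tau : k * tau + n] for k in range(m)])

def box_labels(points, r):
    # Coarse-grain each delay vector into a box of relative width r; this is a
    # simple box-counting density estimate (the software may use a different
    # neighbourhood estimator).
    lo = points.min(axis=0)
    span = points.max(axis=0) - lo
    span[span == 0] = 1.0
    idx = np.floor((points - lo) / (r * span)).astype(int)
    # Collapse the m-dimensional box index into a single integer label per point.
    _, labels = np.unique(idx, axis=0, return_inverse=True)
    return labels.ravel()

def mutual_info_bits(a, b):
    # Shannon mutual information of two label sequences; log base 2 gives bits.
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1.0)
    joint /= joint.sum()
    p1 = joint.sum(axis=1, keepdims=True)   # marginal density of the x1 boxes
    p2 = joint.sum(axis=0, keepdims=True)   # marginal density of the x2 boxes
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p1 @ p2)[nz])))

def mutual_cross_info(x1, x2, tau, m, r, maxdelta, stepsize):
    # Mutual cross information as a function of the relative shift
    # delta = 0, stepsize, 2*stepsize, ..., maxdelta (non-negative shifts only).
    l1 = box_labels(delay_embed(np.asarray(x1, float), tau, m), r)
    l2 = box_labels(delay_embed(np.asarray(x2, float), tau, m), r)
    deltas = np.arange(0, maxdelta + 1, stepsize)
    values = []
    for d in deltas:
        n = min(len(l1), len(l2) - d)       # pair point i of x1 with point i + d of x2
        values.append(mutual_info_bits(l1[:n], l2[d:d + n]))
    return deltas, np.array(values)

# Illustrative call, mirroring y = MutualCrossInfo([x1,x2],tau,m,r,maxdelta,stepsize);
# the parameter values are arbitrary examples, not recommendations.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(2000)
    x1 = np.sin(0.05 * t) + 0.1 * rng.standard_normal(2000)
    x2 = 3.0 * np.roll(x1, 40) + 5.0        # circularly shifted, rescaled, offset copy of x1
    delta, y = mutual_cross_info(x1, x2, tau=5, m=3, r=0.1, maxdelta=100, stepsize=5)
    print(delta[np.argmax(y)])              # expected to peak near the imposed 40-sample shift

In this synthetic example, x2 is a shifted, rescaled and offset copy of x1; the curve peaks near the imposed shift, and because the box partition is defined relative to each signal's own range, the rescaling and offset do not change the result, consistent with the note on scale and shift invariance above.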
See also: Conditional coupling divergence (PCCD), Delta test, Pointwise transinformation, Post event scan, Synchronicity histogram, Transinformation.
References: Shannon/Weaver, Abarbanel.