Mutual cross information
Purpose
Entropy-based mutual cross information
Description
"Mutual cross information" calculates the Shannon cross-entropy as a measure of
the correlation between two signals x1 and x2 by means of phase space
embedding.
The result is plotted as a function of the relative shift of the input signals.
The formula is

    I12 = SUM_i p12,i * log2( p12,i / (p1,i * p2,i) )

evaluated for each relative shift delta of x2 against x1.
For a description of the symbols in the formula above,
refer to Mutual information.
Unlike in Mutual information, there are two phase space densities,
p1 and p2, for the systems x1 and x2, respectively.
The parameters are:
- delay (tau) for phase space reconstruction (in samples),
- embedding dimension m,
- relative radius r (of neighbourhood in phase space),
- maximum relative shift maxdelta (in number of samples),
- step size stepsize (number of samples between successive calculations).
Due to the dual (base-2) logarithm in the formula above, the output is given in bits.
Note that, since this function operates on normalized ('pure') time series, scaling and shifting the input signals do not affect the result.
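The calculation described above can be sketched in Python. This is an illustrative sketch only: the function and helper names are made up, and the phase space densities are estimated here by coarse-graining the normalized embeddings into boxes of relative size r, a simple stand-in for the module's neighbourhood count of relative radius r.

```python
import numpy as np
from collections import Counter

def embed(x, m, tau):
    """Delay-embed a 1-D signal into m-dimensional phase space."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def mutual_cross_info(x1, x2, tau, m, r, maxdelta, stepsize):
    """Mutual cross information (in bits) as a function of relative shift.

    Densities p1, p2 and the joint density p12 are estimated by
    box-counting on embeddings normalized to [0, 1), so the result is
    invariant under scaling and shifting of the input signals."""
    nbox = max(int(1.0 / r), 1)

    def box_indices(v):
        # Normalize each coordinate to [0, 1) and assign a box index.
        span = np.ptp(v, axis=0)
        span[span == 0] = 1.0
        u = (v - v.min(axis=0)) / span
        return np.minimum((u * nbox).astype(int), nbox - 1)

    result = []
    for delta in range(0, maxdelta + 1, stepsize):
        # Shift x2 against x1 by delta samples, then embed both.
        a = embed(x1[: len(x1) - delta], m, tau)
        b = embed(x2[delta:], m, tau)
        n = min(len(a), len(b))
        ka = [tuple(row) for row in box_indices(a[:n])]
        kb = [tuple(row) for row in box_indices(b[:n])]
        p12 = Counter(zip(ka, kb))
        p1, p2 = Counter(ka), Counter(kb)
        # I12 = sum_i p12_i * log2( p12_i / (p1_i * p2_i) )
        info = 0.0
        for (i, j), c in p12.items():
            pj = c / n
            info += pj * np.log2(pj * n * n / (p1[i] * p2[j]))
        result.append(info)
    return np.array(result)
```

For a signal compared with itself at delta = 0 the result reduces to the entropy of its phase space distribution, while independent signals give values near zero (up to estimation bias).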
Macro Synopsis
y = MutualCrossInfo([x1,x2],tau,m,r,maxdelta,stepsize);
signal x1,x2,y;
int tau,m;
float r;
int maxdelta, stepsize;
Modules
Nonlinear
Related Functions
Mutual information,
Conditional coupling divergence (PCCD), Delta test,
Pointwise transinformation, Post event scan,
Synchronicity histogram, Transinformation.
References
Shannon/Weaver [45], Abarbanel [46], Vandenhouten [21].