Quantifying 'causality' in complex systems: understanding transfer entropy

Fatimah Abdul Razak, Henrik Jeldtoft Jensen

Abstract

'Causal' direction is of great importance when dealing with complex systems. Often large volumes of data in the form of time series are available, and it is important to develop methods that can inform about possible causal connections between the different observables. Here we investigate the ability of the Transfer Entropy measure to identify causal relations embedded in emergent coherent correlations. We do this by first applying Transfer Entropy to an amended Ising model. In addition, we use a simple Random Transition model to test the reliability of Transfer Entropy as a measure of 'causal' direction in the presence of stochastic fluctuations. In particular, we systematically study the effect of the finite size of data sets.
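
For concreteness, the following is a minimal sketch, in Python, of the standard plug-in (histogram) estimator of Schreiber's Transfer Entropy for already-discretized time series. It is not the paper's own implementation of equation (5); the function name, the one-step history length and the conversion to bits are illustrative assumptions.

    import numpy as np
    from collections import Counter

    def transfer_entropy(source, target, lag=1, base=2):
        # Plug-in estimate of T_{source -> target} (Schreiber 2000):
        # sum of p(x_{t+lag}, x_t, y_t) * log[ p(x_{t+lag}|x_t, y_t) / p(x_{t+lag}|x_t) ],
        # where x is the target series and y the source series.
        x, y = np.asarray(target), np.asarray(source)
        n = len(x) - lag
        triples = Counter(zip(x[lag:lag + n], x[:n], y[:n]))  # (x_{t+lag}, x_t, y_t)
        pairs_xy = Counter(zip(x[:n], y[:n]))                 # (x_t, y_t)
        pairs_xx = Counter(zip(x[lag:lag + n], x[:n]))        # (x_{t+lag}, x_t)
        singles = Counter(x[:n])                              # x_t
        te = 0.0
        for (x_next, x_now, y_now), count in triples.items():
            p_joint = count / n                                       # p(x_{t+lag}, x_t, y_t)
            p_cond_full = count / pairs_xy[(x_now, y_now)]            # p(x_{t+lag} | x_t, y_t)
            p_cond_self = pairs_xx[(x_next, x_now)] / singles[x_now]  # p(x_{t+lag} | x_t)
            te += p_joint * np.log(p_cond_full / p_cond_self)
        return te / np.log(base)                              # nats -> chosen base (bits by default)

Comparing such estimates in both directions between two observables, and scanning over the lag as in Figures 13, 14, 21 and 22, is how a preferred 'causal' direction and its time lag would be read off.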

Conflict of interest statement

Competing Interests: The authors have declared that no competing interests exist.

Figures

Figure 1. Susceptibility of the Ising model with lengths L=10,25,50,100, obtained using equation (9). Peaks can be seen at the respective size-dependent critical temperatures.
Figure 2. Covariance on the Ising model with lengths L=10,25,50,100, obtained using equation (10).
Figure 3. Mutual Information on the Ising model with lengths L=10,25,50,100, obtained using equation (4).
Figure 4. Transfer Entropy in both directions on the Ising model of length L=50, obtained using equation (5). The peaks for both directions occur at the same temperature.
Figure 5. Transfer Entropy on the Ising model with lengths L=10,25,50,100, obtained using equation (5). Peaks can be seen at the respective size-dependent critical temperatures.
Figure 6. Transfer Entropy on the Ising model with lengths L=10,25,50,100, obtained using equation (5). Peaks can be seen at the respective size-dependent critical temperatures.
Figure 7. Susceptibility of the amended Ising model with lengths L=10,25,50,100, obtained using equation (9). Peaks can be seen at the respective size-dependent critical temperatures.
Figure 8. Covariance on the amended Ising model with lengths L=10,25,50,100, obtained using equation (10). Peaks can be seen at the respective size-dependent critical temperatures, similar to Figure 2 for the Ising model.
Figure 9. Mutual Information on the amended Ising model with lengths L=10,25,50,100, obtained using equation (4). The results differ little from those for the Ising model in Figure 3.
Figure 10. Transfer Entropy in both directions on the amended Ising model, obtained using equation (5). The direction associated with the implanted time lag is indicated. The result is very different from that for the Ising model in Figure 4.
Figure 11. Transfer Entropy on the amended Ising model with lengths L=10,25,50,100, obtained using equation (5). Values continue to increase beyond the critical temperature, which is very different from Figure 5.
Figure 12. Transfer Entropy on the amended Ising model with lengths L=10,25,50,100, obtained using equation (5). Peaks can be seen at the respective size-dependent critical temperatures, similar to the Ising model results in Figure 6.
Figure 13. Transfer Entropy versus temperature for different time lags in the amended Ising model, obtained using equation (5). The figure shows the effect of separation in time.
Figure 14. A different view of Figure 13, in which Transfer Entropy versus time lag is plotted for different temperatures instead. The figure highlights the detection of the implanted time lag.
Figure 15. The curves of Figure 17 extended to higher temperatures. Transfer Entropy stabilizes because the Boltzmann distribution approaches a uniform distribution at high temperatures.
Figure 16. Transfer Entropy between three sites in the Ising model. The ordering of the curves is due to distance (separation) in space: one source site is closer to the target site than the other, so the nearest-neighbour effect is observed.
Figure 17. Transfer Entropy between the same three sites in the amended Ising model. The ordering of the curves is due to the implanted 'causal' lag; the effect of separation in space is no longer visible.
Figure 18. Expected rate of change of three sites in the amended Ising model.
Figure 19. Results on the amended Ising model displaying phase-transition-like behaviour.
Figure 20. Results on the amended Ising model; all curves show a phase-transition-like jump.
Figure 21. Analytical Transfer Entropy versus time lag for the Random Transition model, using equation (16), with one model parameter varied while the other is held fixed. The Transfer Entropy is monotonically increasing and is affected by the varied parameter. The figure illustrates how the internal dynamics of the target variable influence the Transfer Entropy: it changes even though the external influence is constant.
Figure 22. Analytical Transfer Entropy versus time lag for the Random Transition model, using equation (16), with the previously varied parameter now held fixed and the other varied. Only at the implanted time lag is the Transfer Entropy unaffected by the varied parameter, so the values remain constant there; at the other lags the Transfer Entropy is affected and the curves coincide. The figure shows how the internal dynamics of the source variable influence the Transfer Entropy.
Figure 23. Transfer Entropy versus number of states (number of chosen bins) for two cases in which the variables are uniformly distributed. Analytical values are obtained by substituting the respective values into equation (17). Simulated values are obtained using equation (5) on simulated data of varying sample size (length of time series). Error bars show two standard deviations above and two standard deviations below the mean (some bars are so small they can barely be seen). The aim is primarily to show that the number of states has to be chosen according to the length of the available time series. For large sample sizes the error bars become smaller than the width of the curve.
Figure 24. Transfer Entropy using equation (17) on a simulated null model with varying sample size (length of time series). The analytical values are all zero. Error bars in the first panel show two standard deviations above and two standard deviations below the mean. For large sample sizes the error bars become smaller than the width of the curve. In order to use the null model as surrogates, the number of states still has to be chosen in accordance with the sample size (see the numerical sketch after this figure list).
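
To illustrate the finite-size effect behind Figures 23 and 24, here is a minimal numerical sketch that reuses the transfer_entropy function sketched after the Abstract. It estimates the bias of the plug-in estimator on a null model of two independent, uniformly distributed discrete series: the true Transfer Entropy is zero, so the mean estimate is pure finite-sample bias. The sample sizes, numbers of states and number of trials are illustrative, not the values used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def null_model_bias(n_samples, n_states, n_trials=20):
        # Two independent uniform series: the true transfer entropy is zero,
        # so any nonzero mean estimate is finite-sample bias.
        estimates = []
        for _ in range(n_trials):
            x = rng.integers(0, n_states, size=n_samples)
            y = rng.integers(0, n_states, size=n_samples)
            estimates.append(transfer_entropy(y, x, lag=1))
        return np.mean(estimates), np.std(estimates)

    # The bias grows with the number of states and shrinks with the series length,
    # which is why the number of bins has to be chosen with the data size in mind.
    for n_samples in (1000, 10000, 100000):
        for n_states in (2, 4, 8):
            mean_te, std_te = null_model_bias(n_samples, n_states)
            print(f"N={n_samples:>6}  Q={n_states}  TE = {mean_te:.4f} +/- {2 * std_te:.4f}")

Estimates obtained in this way on independent or shuffled surrogate data give a baseline against which Transfer Entropy values measured on real time series of the same length and binning can be compared.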
