I want to use the VG algorithm (visibility graph) to convert EEG time series to graphs while preserving dynamic characteristics such as complexity. Now I want to know what the important complexity measures are.
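For context, this is the construction I mean, in a minimal brute-force form (the function name and the use of networkx are just my choices for illustration):

```python
import networkx as nx
import numpy as np

def natural_visibility_graph(y):
    """Natural visibility graph of a 1-D series y (brute force).

    Nodes are sample indices; i and j are linked when every sample
    between them lies strictly below the straight line joining
    (i, y[i]) and (j, y[j]) -- the standard visibility criterion.
    """
    n = len(y)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

# toy usage on one short channel-like segment
rng = np.random.default_rng(0)
g = natural_visibility_graph(rng.standard_normal(200))
print(g.number_of_nodes(), g.number_of_edges())
```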
There is another measure called Kolmogorov complexity, which is essentially the minimum length to which a string of bits may be compressed; for the same bit string this is the same as, or very similar to, Shannon's entropy. As Mohamed said, they are all much the same! One important measure for testing the randomness of a chaotic series is the Kolmogorov-Sinai (K-S) entropy. K-S entropy is a little different in that it takes the sequence of probabilities into account. For a regular series, K-S entropy is zero; for a chaotic series it is positive and finite; and for a random signal it is infinite. The concept of entropy has been applied in physics, mathematics, and many other fields, and as John von Neumann (possibly the greatest of all mathematicians/physicists...) said, "no one knows what entropy really is, so in a debate you'll always have the advantage" (Tribus & McIrvine 1971: 180). Much of the confusion arises because people apply "entropy," like "dimension," to many different concepts or measures. Tsallis entropy is another useful measure which you might want to look up. I attach a paper that might be of use.
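To make the Shannon side of that correspondence concrete, here is a toy sketch (my own illustration, not taken from the attached paper). Note that this order-zero symbol entropy ignores the ordering of the bits, which is precisely what K-S entropy adds back:

```python
import math
from collections import Counter

def shannon_entropy(bits):
    """Empirical Shannon entropy (bits/symbol) of a string like '0110...'."""
    counts = Counter(bits)
    n = len(bits)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(shannon_entropy('0101010101'))  # 1.0 bit/symbol, yet highly compressible as a sequence
print(shannon_entropy('0000000001'))  # biased string: ~0.47 bits/symbol
```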
If I understand you correctly, you want to calculate complexity measures from EEG time series in a graph fashion. This is possible by obtaining adjacency matrices via synchronization/connectivity measures among channels (check the NBToolbox). It may also be useful for you to use the BCToolbox, which provides complexity measures based on graph theory (integration, segregation, etc.). Check the info attached.
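Something like the following, for illustration (a sketch using networkx rather than the BCToolbox itself; the adjacency matrix here is only a random placeholder for a thresholded connectivity matrix):

```python
import networkx as nx
import numpy as np

# placeholder: a binary adjacency matrix; in practice you would threshold
# a 21x21 synchronization/connectivity matrix computed across channels
rng = np.random.default_rng(1)
a = (rng.random((21, 21)) > 0.7).astype(int)
a = np.triu(a, 1)
a = a + a.T                      # symmetric, zero diagonal
g = nx.from_numpy_array(a)

print("avg clustering (segregation):", nx.average_clustering(g))
if nx.is_connected(g):
    print("char path length (integration):", nx.average_shortest_path_length(g))
```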
You probably know about this already, but I'll mention it anyway; have a look at the following site:
http://cvr.yorku.ca/webpages/wilson.htm#book
Hugh R. Wilson's really excellent book on nonlinear dynamics with EEG material, Spikes, Decisions, and Actions, may be downloaded for free (thanks to H. Wilson!), along with his Matlab files.
As for complexity measures: I developed the ht-index, as an alternative to fractal dimension, for characterizing the complexity of geographic features, or of fractals in general:
Jiang B. and Yin J. (2014), Ht-index for quantifying the fractal or scaling structure of geographic features, Annals of the Association of American Geographers, 104(3), 530–541.
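In simplified form, the head/tail-breaks computation behind the ht-index looks roughly like this (a sketch only; the 40% head limit and the function below are my simplifications, so please consult the paper for the exact definition):

```python
import numpy as np

def ht_index(values, head_limit=0.4):
    """Rough head/tail-breaks sketch: split the data around its mean and
    recurse on the head while the head remains a minority (here < 40%).
    ht-index = number of successful splits + 1 (simplified reading)."""
    values = np.asarray(values, dtype=float)
    h = 1
    while len(values) > 1:
        head = values[values > values.mean()]
        if len(head) == 0 or len(head) / len(values) > head_limit:
            break
        h += 1
        values = head
    return h

print(ht_index(np.random.pareto(1.5, 10_000)))  # heavy-tailed data: several levels
```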
Thank you so much for your comment. Yes, you understood my purpose completely correctly, but I don't want to compute correlation measures among all channels (for example, the 21 channels of EEG for epilepsy): I have already used several measures such as PLI, ImC, WPLI, etc. (which are less sensitive to the volume conduction problem) and tried to construct a whole-network adjacency matrix. Because of the problems with those methods, and also the problem of the active reference electrode, I now want to use the visibility graph to convert each channel into a separate, unique graph and then find the complexity of each graph (each signal).
Afraimovich V., Glebsky L. and Vazquez R. (2010), Measures related to metric complexity, Discrete and Continuous Dynamical Systems, 28(4), 1299–1309.
Afraimovich V. and Glebsky L. (2008), Measures related to n-complexity functions, Discrete and Continuous Dynamical Systems, 22(1–2).
Well, you want to analyze your EEG data in an isolated, per-time-series fashion. One way is to quantify the chaoticity via the largest Lyapunov exponent, or the fractal nature via the correlation dimension. In both cases, you need to map the data from the time domain into a vector space to obtain an attractor; in this way you obtain a geometric object that represents the characteristics of the time series in a vector space. I know this works nicely in epilepsy cases, for example, where the signal is full of singularities, but in other cases it is not so useful. Otherwise, Lempel-Ziv complexity or detrended fluctuation analysis could also be useful, and with those you don't need to reconstruct your data in a vector space.
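The mapping to a vector space is usually a time-delay embedding; a minimal sketch (the delay tau and dimension m below are placeholders; in practice they would be chosen via mutual information and false-nearest-neighbour criteria):

```python
import numpy as np

def delay_embed(x, m=3, tau=10):
    """Map a 1-D series into m-dimensional delay vectors
    [x[t], x[t+tau], ..., x[t+(m-1)*tau]] (Takens-style reconstruction)."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

x = np.sin(np.linspace(0, 50, 2000))   # stand-in for one EEG channel
pts = delay_embed(x, m=3, tau=25)      # attractor points in R^3
print(pts.shape)
```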
As you know, the VG algorithm can simply convert each signal to its related graph. I have constructed those graphs and now want to know which graph measure can capture complexity; I mean, which characteristic of the VG graph? For example, the eigenvalues of the adjacency matrix, or something else?
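To make the candidates concrete, these are the kinds of descriptors I have in mind (a sketch; which of them best tracks EEG complexity is exactly my question):

```python
import networkx as nx
import numpy as np

def vg_complexity_measures(g):
    """A few candidate complexity descriptors for a visibility graph."""
    degs = np.array([d for _, d in g.degree()])
    _, counts = np.unique(degs, return_counts=True)
    p = counts / counts.sum()
    degree_entropy = float(-(p * np.log2(p)).sum())   # entropy of the degree distribution
    a = nx.to_numpy_array(g)
    spectral_radius = float(np.max(np.abs(np.linalg.eigvals(a))))
    return {
        "degree_entropy": degree_entropy,
        "spectral_radius": spectral_radius,
        "avg_clustering": nx.average_clustering(g),
    }
```

From what I have read, the degree distribution is a common starting point in the VG literature, since its shape is related to the fractal character of the underlying series.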
By the way, I think it can be applicable to AD as well, in frequency sub-bands.
My group works on exactly what you are asking about. We develop complexity measures based on the Bandt and Pompe probability distribution, and one of our latest papers (the one you'll find attached) links horizontal visibility graphs and measures from information theory. I think you could be interested in it. Let me know if you want to talk more about this.
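For reference, the Bandt-Pompe distribution itself is easy to sketch; here is a minimal normalized permutation entropy (the embedding order m and delay tau are illustrative defaults; the HVG-based quantifiers in the attached paper are more involved than this):

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(x, m=3, tau=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series.

    Each length-m window is reduced to the permutation that sorts it;
    the entropy of the pattern distribution is normalized by log(m!).
    """
    patterns = Counter(
        tuple(np.argsort(x[i : i + m * tau : tau]))
        for i in range(len(x) - (m - 1) * tau)
    )
    n = sum(patterns.values())
    h = -sum(c / n * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(m))

rng = np.random.default_rng(2)
print(permutation_entropy(rng.standard_normal(5000)))    # ~1 for white noise
print(permutation_entropy(np.sin(np.arange(5000) / 5)))  # well below 1
```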
Please find attached two more papers about our work.
Many thanks for your comments and for providing those excellent papers. Indeed, I'm really eager to learn more about this; I'll contact you after reading the articles.
Hi Negar... yes, I mean something like this. Although I don't know the VG method, the idea of DFA is to analyse self-similarity at different scales, as a complexity metric similar to the Hurst exponent.
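In case it helps, a bare-bones DFA sketch (first-order detrending only; the scales below are placeholders and a serious analysis should verify the scaling range before fitting):

```python
import numpy as np

def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
    """Estimate the DFA scaling exponent alpha of a 1-D series.

    Integrate the mean-removed series, detrend it linearly in windows
    of each scale, and fit log(fluctuation) vs log(scale).
    alpha ~ 0.5 for white noise, ~1 for 1/f noise, > 1 for nonstationary signals.
    """
    y = np.cumsum(x - np.mean(x))          # integrated profile
    fs = []
    for s in scales:
        n = len(y) // s
        f2 = []
        for w in range(n):
            seg = y[w * s : (w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        fs.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fs), 1)
    return alpha

rng = np.random.default_rng(3)
print(dfa_alpha(rng.standard_normal(8192)))  # ~0.5 for white noise
```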
I have applied differential graph theory to time series data, and the result is a scale-dependent (endogenous) factor structure in which the entropy of the system can be conserved, say by user distribution properties (complexity). Please refer to my paper on nanostructure on my RG webpage. I hope it helps your research.