By definition, the normalized cross-correlation is a number between -1 and 1, while mutual information is non-negative (and, in its normalized form, lies between 0 and 1). It strikes me that the two extreme cases match: (1) cross-correlation CC = 0 corresponds to mutual information M = 0, and (2) absolute cross-correlation |CC| = 1 corresponds to M = 1. Although mutual information is a much broader concept than cross-correlation, under some assumptions there could be a functional relationship between the two, for example M = |CC| or something similar.
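One setting where such a functional relationship is exact is the bivariate Gaussian case: there the mutual information (in nats) is MI = -1/2 ln(1 - CC^2), so conversely |CC| = sqrt(1 - exp(-2 MI)). Below is a minimal Python sketch (function names are purely illustrative) that checks this round trip numerically under the Gaussian assumption.

```python
import numpy as np

def gaussian_mutual_information(rho):
    """Closed-form mutual information (in nats) of a bivariate
    Gaussian pair with correlation coefficient rho."""
    return -0.5 * np.log(1.0 - rho**2)

def correlation_from_mi(mi):
    """Invert the Gaussian relationship: |rho| = sqrt(1 - exp(-2*MI))."""
    return np.sqrt(1.0 - np.exp(-2.0 * mi))

# Round-trip check for a few correlation values
for rho in [0.0, 0.3, 0.7, 0.99]:
    mi = gaussian_mutual_information(rho)
    print(f"rho = {rho:4.2f}  MI = {mi:6.3f} nats  recovered |rho| = {correlation_from_mi(mi):4.2f}")
```

For general (non-Gaussian) data no such exact formula holds, e.g. variables can be uncorrelated (CC = 0) yet share nonzero mutual information, which is presumably where the "under some assumptions" caveat comes in.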
