With correlation, there may or may not be a genuine interdependence, but with mutual information, a nonzero value always indicates dependence. If a dependence does underlie a correlation, modelling that relationship becomes a regression study.
The mutual information of two random variables is a measure of the mutual dependence between the two variables. It quantifies the "amount of information" obtained about one random variable by observing the other. The concept of mutual information is linked to information theory, where entropy quantifies the expected "amount of information" held in a random variable.
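For discrete variables, the standard definition in terms of the joint and marginal distributions is

    I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

which is zero if and only if X and Y are independent, i.e. p(x,y) = p(x)p(y) everywhere.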
Correlation coefficients are used to measure how strong a relationship is between the relative movements of two variables. A correlation coefficient of 1 means that for every positive increase in one variable, there is a positive increase of a fixed proportion in the other.
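For the Pearson coefficient specifically, this is the covariance normalised by the two standard deviations,

    \rho_{X,Y} = \frac{\operatorname{cov}(X,Y)}{\sigma_X \, \sigma_Y}

which captures only linear association, so it can be zero even when the variables are strongly dependent.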
As Mr. Andrew Paul McKenzie Pegman mentioned, a correlation coefficient (CC) may or may not reflect an interdependence, while nonzero mutual information (MI) always implies dependence.
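A minimal sketch of this difference in Python (assuming NumPy and scikit-learn are available; the 20 equal-width bins used for the plug-in MI estimate are an arbitrary illustrative choice):

    import numpy as np
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 10_000)
    y = x ** 2  # deterministic but nonlinear dependence

    # Pearson correlation is near zero because the relationship is not linear
    r = np.corrcoef(x, y)[0, 1]

    # Crude plug-in MI estimate: discretise both variables into 20 bins
    x_bins = np.digitize(x, np.linspace(-1, 1, 20))
    y_bins = np.digitize(y, np.linspace(0, 1, 20))
    mi = mutual_info_score(x_bins, y_bins)  # in nats; clearly greater than zero

    print(f"Pearson r = {r:.3f}, MI = {mi:.3f} nats")

Here y is completely determined by x, yet the correlation coefficient comes out roughly zero; the positive MI estimate correctly reveals the dependence.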