Maximum likelihood is a strategy for estimating the parameters of a probability model. Let p(X | \theta) be the probability model of the experiment, where X is the random variable (realized in your data) and \theta is the vector of model parameters. Given data, the idea is to find the parameter values most likely to have generated the data you observed.

This objective is formalized by writing the likelihood function of the observed data. Say you observed x_1, ..., x_n. Under the standard assumption that the data are independent and identically distributed, the likelihood is the product L(\theta | x_1, ..., x_n) = \prod_{i=1}^n p(x_i | \theta). Since the data are already realized, L( . ) is viewed as a function of \theta alone. The maximum likelihood estimate of \theta is the value that maximizes L(\theta | x_1, ..., x_n); in practice one usually maximizes the log-likelihood, which has the same maximizer and is numerically better behaved. Insert any distribution function p( . ) and away you go!

For your problem, p( . ) appears to be a stochastic process, and the 'equal covariance' statement in the title appears to impose specific constraints on the model parameters \theta.
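
To make the recipe concrete, here is a minimal sketch assuming a Gaussian p(x | \theta) with \theta = (\mu, \sigma) purely for illustration (your model will differ). For the Gaussian the maximizers have a closed form, the sample mean and the (biased) sample standard deviation, and you can check that they do beat nearby parameter values in log-likelihood:

```python
import math
import random

def normal_log_likelihood(data, mu, sigma):
    """log L(mu, sigma | x_1, ..., x_n) under i.i.d. N(mu, sigma^2)."""
    n = len(data)
    return (-n / 2 * math.log(2 * math.pi * sigma ** 2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2))

def normal_mle(data):
    """Closed-form Gaussian MLE: sample mean and biased sample std dev."""
    n = len(data)
    mu_hat = sum(data) / n
    sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in data) / n)
    return mu_hat, sigma_hat

# Simulate data from N(5, 2^2) so we know the 'true' parameters.
random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(10_000)]

mu_hat, sigma_hat = normal_mle(data)

# The MLE should score at least as high as any perturbed parameter values.
best = normal_log_likelihood(data, mu_hat, sigma_hat)
assert best >= normal_log_likelihood(data, mu_hat + 0.1, sigma_hat)
assert best >= normal_log_likelihood(data, mu_hat, sigma_hat * 1.1)
```

For models without a closed-form maximizer, the same idea applies: write down the log-likelihood and hand it to a numerical optimizer.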