By coding the messages x_i (i = 1, 2, …, N) of a finite information source X, having probabilities p_i = p(x_i), with letters y_j (j = 1, 2, …, D) of a finite coding alphabet Y, we obtain a new information source Y. I need to characterize Y as an information source with memory, in order to verify the relation H(X) = L·H(Y), where L is the average codeword length and H(X) and H(Y) are the entropies of X and Y, respectively. My questions: How can the steady states of the information source Y be obtained? How can the matrix of transition probabilities between the states of Y be determined? How are the probabilities of the steady states of Y computed? Does anyone have results on this topic?
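One common way to answer all three questions at once is to model Y as a Markov source whose states are the proper prefixes of the codewords (the internal nodes of the code tree, with the empty prefix as the root): emitting a letter moves from a prefix to its extension, and completing a codeword returns to the root. Below is a minimal sketch under that convention; the source probabilities and the binary prefix code are hypothetical, chosen only for illustration. The transition matrix P and the stationary (steady-state) probabilities π (obtained from πP = π, Σπ = 1) then give the entropy rate H(Y), which can be checked against H(X)/L.

```python
import numpy as np

# Hypothetical source and prefix code (illustrative, not from the question)
p = {'x1': 0.5, 'x2': 0.25, 'x3': 0.25}
code = {'x1': '0', 'x2': '10', 'x3': '11'}
D = ['0', '1']  # coding alphabet

# States of Y: every proper prefix of a codeword, including the root ''
states = sorted({w[:k] for w in code.values() for k in range(len(w))})
idx = {s: i for i, s in enumerate(states)}
n = len(states)

def mass(prefix):
    """Total source probability of messages whose codeword starts with prefix."""
    return sum(p[x] for x, w in code.items() if w.startswith(prefix))

# Transition matrix: from state s, letter y is emitted with prob mass(s+y)/mass(s);
# if s+y completes a codeword, the chain returns to the root state ''
P = np.zeros((n, n))
for s in states:
    for y in D:
        ext = s + y
        pr = mass(ext) / mass(s)
        if pr == 0:
            continue
        nxt = ext if ext in idx else ''
        P[idx[s], idx[nxt]] += pr

# Stationary probabilities: solve pi @ P = pi together with sum(pi) = 1
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

def H(qs):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(q * np.log2(q) for q in qs if q > 0)

# Entropy rate of Y (average over states), entropy of X, average length L
HY = sum(pi[idx[s]] * H([mass(s + y) / mass(s) for y in D]) for s in states)
HX = H(p.values())
L = sum(p[x] * len(code[x]) for x in code)
print("states:", states, "pi:", pi, "H(Y):", HY, "H(X):", HX, "L:", L)
```

For this example the states are '' and '1', with stationary probabilities 2/3 and 1/3, and the check H(X) = L·H(Y) holds exactly (1.5 = 1.5 × 1.0) because the code is matched to dyadic probabilities; for a non-optimal code H(Y) < H(X)/L in general.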