There are two notions which can, at least in part, be compared:
- the Markov chain is a sequence indexed by an ordered set, such as an interval of integers. Let this be X_n, n \in N_0. Then for any n, the conditional probability distribution (p.d.) of X_{n+1} given X_0, X_1, ..., X_n equals the conditional p.d. of X_{n+1} given X_n alone.
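As a minimal sketch of this defining condition, consider a homogeneous chain on three states. The transition probabilities below are made-up illustration values; the point is that the sampling step consults only the current state, never the earlier history.

```python
import random

# Hypothetical transition probabilities for a 3-state chain (illustration only):
# P[j][k] = Pr{X_{n+1} = k | X_n = j}, each row summing to 1.
P = {
    0: {0: 0.5, 1: 0.3, 2: 0.2},
    1: {0: 0.1, 1: 0.6, 2: 0.3},
    2: {0: 0.4, 1: 0.4, 2: 0.2},
}

def step(current, rng):
    # X_{n+1} is sampled using only X_n; X_0, ..., X_{n-1} are never consulted,
    # which is exactly the Markov property described above.
    states = list(P[current])
    weights = [P[current][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

rng = random.Random(0)
path = [0]
for _ in range(10):
    path.append(step(path[-1], rng))
print(path)
```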
- the Markov field is a sequence indexed by an at most countable set equipped with a neighborhood structure. Let the index set be Z^2, and for a given point (m,n) \in Z^2 let its neighbor set consist of the four nearest points, those differing by one in exactly one coordinate. The defining condition for the Markov field is that for any subset A \subset Z^2, the conditional p.d. of the random elements X_{m,n} with (m,n) \in A, given the outer random elements X_{m,n} with (m,n) \in Z^2 \setminus A, equals the conditional p.d. of the X_{m,n} with (m,n) \in A given only those X_{m,n} with (m,n) \in Z^2 \setminus A which are neighbors of at least one point of A.
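The neighborhood structure on Z^2 is easy to make concrete. The sketch below (function names are my own) computes the four-nearest-neighbor set of a point and the "boundary" of a finite set A, i.e. the outer points that neighbor A; by the field property, conditioning on the boundary is as good as conditioning on all of Z^2 \setminus A.

```python
def neighbors(m, n):
    # The four nearest points of (m, n) in Z^2: differ by 1 in one coordinate only.
    return {(m - 1, n), (m + 1, n), (m, n - 1), (m, n + 1)}

def boundary(A):
    # Points outside A that are neighbors of at least one point of A.
    # The Markov field condition says: conditioning X on these boundary values
    # is equivalent to conditioning on everything outside A.
    A = set(A)
    return {q for p in A for q in neighbors(*p) if q not in A}

A = {(0, 0), (0, 1)}
print(sorted(boundary(A)))
```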
In general, the relation between these notions is not direct. However, one of the basic facts says that if X_n, n \in N_0, is a Markov chain with countably many states, then it is also a Markov field (with the neighbors of n being n-1 and n+1 for n = 1, 2, 3, ..., and just the point 1 for n = 0); see the note below.
The value of the notion of Markov fields is that it extends Markov-type properties to much more complicated neighborhood structures. I suggest looking for papers by D. Ruelle, R. L. Dobrushin, Ya. G. Sinai, R. A. Minlos, and a book by Ch. J. Preston (on Gibbs states - rigorous results) from the late 60's and early 70's of the last century. (E.g., start by Googling "Minlos".)
NOTE. If P(j,k) = Pr\{X_{n+1} = k | X_n = j\}, n = 0, 1, 2, ..., is the transition matrix for a homogeneous Markov chain, then
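To make the object in the note concrete, here is a tiny sketch of such a transition matrix for a hypothetical 2-state chain (the entries are illustrative, not from the text). Each row is a probability distribution over the next state, and matrix multiplication gives the two-step transition probabilities.

```python
# Hypothetical transition matrix: P[j][k] = Pr{X_{n+1} = k | X_n = j}.
# Values are made up for illustration.
P = [
    [0.9, 0.1],
    [0.4, 0.6],
]

# Each row must be a probability distribution over the next state.
for row in P:
    assert all(p >= 0 for p in row) and abs(sum(row) - 1.0) < 1e-12

# Two-step transition probabilities: (P^2)[j][k] = sum_i P[j][i] * P[i][k].
P2 = [[sum(P[j][i] * P[i][k] for i in range(2)) for k in range(2)]
      for j in range(2)]
print(P2)
```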
HMM and CRF are two state-of-the-art models: the HMM uses Markov chain theory, and the CRF uses conditional random field theory. Both work well, but the CRF is usually better for NLP tasks.