I want to know whether a Markov process far from equilibrium corresponds to a non-equilibrium thermodynamic process, or whether the two merely have something in common.
Actually, a time-homogeneous Markov chain converges (under suitable ergodicity conditions) to a unique stationary probability distribution, which can be fixed e.g. by imposing the detailed balance condition. If this distribution is chosen to be the Boltzmann distribution, the chain converges to the Gibbs equilibrium measure, i.e. thermal equilibrium. You can also choose this distribution to be something completely different, e.g. a non-equilibrium steady-state distribution corresponding to some physical system. This is how the connection to non-equilibrium thermodynamics is made.
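To make the first case concrete, here is a minimal sketch of a Metropolis chain whose stationary distribution is the Boltzmann distribution. The three energy levels and the inverse temperature are invented illustration values, not taken from any particular system:

```python
import math
import random

# Illustrative values: a three-level system at inverse temperature beta.
random.seed(42)
energies = [0.0, 1.0, 2.0]
beta = 1.0

def metropolis_step(state):
    # Propose a uniformly random state; accept with prob min(1, e^{-beta*dE}).
    # This acceptance rule satisfies detailed balance w.r.t. Boltzmann weights.
    proposal = random.randrange(len(energies))
    d_e = energies[proposal] - energies[state]
    if d_e <= 0 or random.random() < math.exp(-beta * d_e):
        return proposal
    return state

# Run the chain and histogram the visited states.
counts = [0] * len(energies)
state = 0
for _ in range(200_000):
    state = metropolis_step(state)
    counts[state] += 1
empirical = [c / sum(counts) for c in counts]

# Compare with the exact Boltzmann weights e^{-beta E_i} / Z.
z = sum(math.exp(-beta * e) for e in energies)
boltzmann = [math.exp(-beta * e) / z for e in energies]
print(empirical, boltzmann)
```

The empirical histogram approaches the Boltzmann weights as the chain length grows, which is exactly the "convergence to the Gibbs equilibrium measure" described above.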
A Markov process is defined by a family of transition probabilities (the probability of being in a state, given the past). These transition probabilities can depend explicitly on time, corresponding to a non-stationary situation. Attached to this family (when they exist) are one or several probability distributions weighting the set of states from time $-\infty$ to $+\infty$ and satisfying compatibility conditions. Let $\mu$ be such a probability: given the probability of a past under $\mu$, you can compute the probability of the future under $\mu$ by multiplying by transition probabilities. When the transition probabilities do not depend explicitly on time (homogeneous Markov chain), a classical example of such a probability is the invariant probability of the chain: it exists and is unique e.g. if all transition probabilities are positive (a sufficient condition). When the transition probabilities do depend explicitly on time, such probabilities correspond to the notion of a Gibbs distribution in the spatio-temporal domain (non-equilibrium). For a nice mathematical review of these aspects, see the work of Grégory Maillard and Roberto Fernandez (which actually extends to systems with infinite memory): http://www.latp.univ-mrs.fr/~maillard/Greg/Publications_files/cccjspfin.pdf
In this context, it is possible to define entropy production, currents, and e.g. Onsager relations. In the slightly different context of hyperbolic dynamical systems, see the work of Gallavotti and the nice review by D. Ruelle: http://www.ihes.fr/~ruelle/PUBLICATIONS/124nesm.pdf
What physical situations does detailed balance correspond to in general? If I choose the Boltzmann distribution as the stationary distribution of a Markov process, can I translate physical quantities into the Markov-process picture? E.g. can entropy and temperature be defined for a Markov process?
To describe e.g. the dynamics of first-order phase transitions (such as the condensation of water vapour into water droplets), a classical approach is the Becker-Döring theory, which describes how e.g. an H2O-monomer gas clusters into a "large cluster" (the water droplet). In the original theory, with its grand-canonical ensemble formulation (a reservoir with a constant supply of monomers is attached), the dynamical equations are just modified birth-death equations. Such birth-death equations are typical subjects of the theory of Markov processes. However, if one wants to include memory effects or path dependencies, one has to move on to "quasi-birth-death" processes (which are non-Markovian), a branch of probability theory that has been developing since the late 1990s.
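A birth-death process of the kind underlying this picture can be simulated directly with the Gillespie algorithm. This is a minimal sketch with invented rates: a constant attachment rate `lam` (the monomer reservoir) and a detachment rate `mu * n` proportional to cluster size, for which the stationary distribution is Poisson with mean `lam / mu`:

```python
import random

# Invented illustration rates, not from the Becker-Döring literature.
random.seed(1)
lam, mu = 3.0, 1.0  # birth rate (reservoir) and per-particle death rate

def gillespie(n0, t_end):
    """Simulate the birth-death chain; return (state, dwell-time) pairs."""
    t, n = 0.0, n0
    samples = []
    while t < t_end:
        birth, death = lam, mu * n
        total = birth + death
        # Exponential waiting time until the next event.
        dt = random.expovariate(total)
        # Record the time spent in the current state (capped at t_end).
        samples.append((n, min(dt, t_end - t)))
        t += dt
        if random.random() < birth / total:
            n += 1
        else:
            n -= 1
    return samples

samples = gillespie(n0=0, t_end=20_000.0)
total_time = sum(w for _, w in samples)
mean_n = sum(n * w for n, w in samples) / total_time
print(mean_n)  # time-averaged cluster size, close to lam / mu
```

The time-weighted average of the state recovers the stationary mean `lam / mu`, which is the kind of steady-state quantity these birth-death models are used to compute.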
"...could I transform physical quantities into Markov process, e.g. entropy, temperature can be defined in Markov process?"
The Markov process is just a tool to create an ensemble. It does not (in general) model a physical process; rather, it provides (correlated) micro-states according to a certain probability distribution.
Detailed balance ensures that the probability flow between any two states is balanced, so that the overall distribution does not change. This corresponds to equilibrium, but not necessarily to a specific ensemble (canonical, grand-canonical, etc.). If you choose the Boltzmann distribution as the stationary distribution of the Markov process, then you can derive quantities like temperature (which you already know) and entropy from the ensemble(s) thus created.
To add to Stefan's answer: if you choose the Boltzmann distribution to be the invariant distribution of the Markov chain, then sampling the states (with detailed balance) corresponds to sampling the phase-space states of your physical system. From this sampling you can then calculate all thermodynamic quantities as simple averages over the states.
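To illustrate "thermodynamic quantities as simple averages": once Boltzmann-distributed states are available, the mean energy is a plain average and the heat capacity follows from energy fluctuations. In this sketch the Markov sampler is replaced by direct draws from the exact Boltzmann weights of a made-up three-level system (with k_B = 1), just to keep the example self-contained:

```python
import math
import random

# Made-up three-level system at inverse temperature beta (k_B = 1).
random.seed(7)
energies = [0.0, 1.0, 2.0]
beta = 1.0
z = sum(math.exp(-beta * e) for e in energies)
weights = [math.exp(-beta * e) / z for e in energies]

# Stand-in for the Markov chain: direct Boltzmann-distributed draws.
draws = random.choices(energies, weights=weights, k=100_000)

# Internal energy U = <E>; heat capacity C = beta^2 (<E^2> - <E>^2),
# both computed as simple averages over the sampled states.
u = sum(draws) / len(draws)
c = beta**2 * (sum(e * e for e in draws) / len(draws) - u * u)
print(u, c)
```

With a real detailed-balance chain (e.g. Metropolis) in place of `random.choices`, the same two averages would give the same answers, up to correlations between successive samples.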