The principle of detailed balance is formulated for kinetic systems which are decomposed into elementary processes.


In 1901, Rudolf Wegscheider introduced the principle of detailed balance for chemical kinetics. In particular, he demonstrated that irreversible cycles A1 → A2 → ... → An → A1 are impossible and found explicitly the relations between kinetic constants that follow from the principle of detailed balance. The principle of detailed balance has been used in Markov chain Monte Carlo methods since their invention in 1953. In particular, in the Metropolis–Hastings algorithm and in its important special case, Gibbs sampling, it is used as a simple and reliable condition to guarantee the desired equilibrium state.
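As a concrete illustration of how Metropolis-style samplers use detailed balance, here is a minimal random-walk Metropolis sketch (not the historical 1953 implementation; the function name `metropolis_sample` and the standard-normal target are illustrative assumptions). The symmetric proposal plus the min(1, π(x′)/π(x)) acceptance rule makes each move satisfy detailed balance with respect to the target density π.

```python
import math
import random

def metropolis_sample(log_density, x0, steps, step_size=1.0):
    """Minimal random-walk Metropolis sampler (illustrative sketch).

    A symmetric proposal plus the min(1, pi(x')/pi(x)) acceptance
    rule makes each transition satisfy detailed balance with respect
    to the target density pi, so pi is the equilibrium distribution.
    """
    x = x0
    samples = []
    for _ in range(steps):
        proposal = x + random.uniform(-step_size, step_size)
        delta = log_density(proposal) - log_density(x)
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if delta >= 0 or random.random() < math.exp(delta):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal density, known only up to a constant.
random.seed(0)
samples = metropolis_sample(lambda x: -0.5 * x * x, x0=0.0, steps=20000)
mean = sum(samples) / len(samples)
```

After many steps the sample mean should be close to the target mean 0, since the chain's equilibrium distribution is the standard normal.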


The concept of detailed balance: each process is equilibrated by its reverse process.



Reversible Markov Chains

(http://en.wikipedia.org/wiki/Markov_chain#Reversible_Markov_chain)

Reversibility in Markov chains arises from Kolmogorov's criterion which demands that the product of transition rates over any closed loop of states must be the same for the chain to be reversible. A Markov process satisfies detailed balance equations if and only if it is a reversible Markov chain.
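Kolmogorov's criterion can be checked numerically. A sketch for a three-state chain (the helper name and example matrices are hypothetical): the product of transition probabilities around the single 3-cycle 0 → 1 → 2 → 0 must equal the product in the reverse direction.

```python
import numpy as np

def kolmogorov_criterion_3cycle(P):
    """Check Kolmogorov's criterion on the 3-cycle 0 -> 1 -> 2 -> 0:
    the loop product forward must equal the loop product backward.
    (For larger chains, every closed loop must be checked.)"""
    forward = P[0, 1] * P[1, 2] * P[2, 0]
    backward = P[0, 2] * P[2, 1] * P[1, 0]
    return bool(np.isclose(forward, backward))

# Reversible example: a symmetric transition matrix.
P_rev = np.array([[0.50, 0.25, 0.25],
                  [0.25, 0.50, 0.25],
                  [0.25, 0.25, 0.50]])

# Irreversible example: probability circulates one way around the cycle.
P_irrev = np.array([[0.1, 0.8, 0.1],
                    [0.1, 0.1, 0.8],
                    [0.8, 0.1, 0.1]])
```

For P_irrev the forward loop product (0.8³) exceeds the backward one (0.1³), so the chain cannot be reversible.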


A sufficient condition for a distribution π to be stationary is that the detailed balance equation (the reversibility condition) holds for all i and j. A Markov chain is said to be reversible if there is a probability distribution over states, π, such that

πi Pr(Xn+1 = j | Xn = i) = πj Pr(Xn+1 = i | Xn = j), i.e., πi pij = πj pji,

for all times n and all states i and j. (In short, this means the quantity exchanged between any pair of nodes (states) is identical in both directions, i.e., the chain has converged.)
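The detailed balance condition is easy to verify numerically: the "flow" matrix whose (i, j) entry is πi pij must be symmetric. A small sketch (the function name and example chain are illustrative assumptions):

```python
import numpy as np

def satisfies_detailed_balance(pi, P, tol=1e-12):
    """Check pi_i * p_ij == pi_j * p_ji for all states i, j.
    Equivalently: the flow matrix diag(pi) @ P must be symmetric."""
    flow = np.diag(pi) @ P
    return bool(np.allclose(flow, flow.T, atol=tol))

# Illustrative two-state chain: p01 = 0.3, p10 = 0.6.
P = np.array([[0.7, 0.3],
              [0.6, 0.4]])
# For a two-state chain, pi is proportional to (p10, p01) = (0.6, 0.3).
pi = np.array([2 / 3, 1 / 3])
```

Here πi pij = (2/3)(0.3) = (1/3)(0.6) = 0.2 in both directions, so detailed balance holds; the uniform distribution (0.5, 0.5) would fail the check.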

This reversibility condition implies π = πP, since the jth element of πP is

(πP)j = Σi πi pij = Σi πj pji = πj Σi pji = πj.

(This is the definition of a stationary distribution.)


P = {pij} is the Markov transition matrix (transition probabilities), i.e., pij = p(Xt = j | Xt-1 = i); and πi and πj are the equilibrium probabilities of being in states i and j, respectively. When p(Xt-1 = i) = πi for all i, this is equivalent to the joint probability matrix p(i, j) = p(Xt-1 = i, Xt = j) being symmetric in i and j, or equivalently symmetric in t−1 and t.
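To see π = πP concretely, one can sketch computing the stationary distribution as the left eigenvector of P for eigenvalue 1 and then check both stationarity and the symmetry of the joint matrix p(i, j) = πi pij (the helper name and the example matrix are illustrative assumptions):

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi = pi @ P: take the left eigenvector of P for the
    eigenvalue closest to 1, normalized to sum to 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, idx])
    return pi / pi.sum()

P = np.array([[0.7, 0.3],
              [0.6, 0.4]])
pi = stationary_distribution(P)

# Joint probability matrix p(i, j) = pi_i * p_ij at stationarity;
# for a reversible chain it is symmetric in i and j.
joint = pi[:, None] * P
```

For this chain π = (2/3, 1/3), the stationarity check π = πP passes, and the joint matrix is symmetric, consistent with detailed balance.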


A simple two-state Markov chain


Reducibility of Markov Chains

A Markov chain is said to be irreducible if its state space is a single communicating class; in other words, if it is possible to get to any state from any state.
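Irreducibility can be sketched as a reachability check: state j is reachable from state i if some power Pᵏ (k ≤ n, where n is the number of states) has a positive (i, j) entry. A minimal sketch (the helper name and example matrices are hypothetical):

```python
import numpy as np

def is_irreducible(P):
    """A chain is irreducible iff every state is reachable from every
    other state.  Accumulating positive entries of P, P^2, ..., P^n
    covers all paths of length up to n."""
    n = len(P)
    reach = np.zeros_like(P, dtype=bool)
    Pk = np.eye(n)
    for _ in range(n):
        Pk = Pk @ P
        reach |= Pk > 0
    return bool(reach.all())

# Irreducible two-state chain vs. a chain with an absorbing state 0.
P_irr = np.array([[0.5, 0.5],
                  [0.5, 0.5]])
P_abs = np.array([[1.0, 0.0],
                  [0.5, 0.5]])
```

In the second matrix, state 0 is absorbing, so state 1 can never be reached from it and the state space splits into more than one communicating class.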



References