Markov chain

A Markov chain is a mathematical object that consists of a sequence of states and a set of transition probabilities that describe how to move among the states. A chain is Markovian if the probability of moving to any other state depends only on the current state. Given such a chain, we can perform a random walk by choosing a starting point and moving to other states according to the transition probabilities. If we somehow find a Markov chain with transitions proportional to the distribution we want to sample from (the posterior distribution in Bayesian analysis), sampling simply becomes a matter of moving between states in this chain.
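The random walk described above can be sketched in a few lines of NumPy. The 3-state transition matrix here is a made-up example (not taken from the text): row i gives the probabilities of moving from state i to each state, and each step looks only at the current state, which is exactly the Markov property.

```python
import numpy as np

# Hypothetical 3-state transition matrix: T[i, j] is the probability
# of moving from state i to state j. Each row sums to 1.
T = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

def random_walk(T, start, n_steps, rng):
    """Walk n_steps through the chain; each move uses only the
    current state's row of T (the Markov property)."""
    states = [start]
    for _ in range(n_steps):
        current = states[-1]
        states.append(rng.choice(len(T), p=T[current]))
    return np.array(states)

rng = np.random.default_rng(42)
walk = random_walk(T, start=0, n_steps=50_000, rng=rng)

# The long-run fraction of time spent in each state approaches the
# chain's stationary distribution -- the distribution we would be
# sampling from if we used this chain for inference.
visit_freq = np.bincount(walk, minlength=3) / len(walk)
print(visit_freq)
```

Running the walk long enough makes the visit frequencies converge to the chain's stationary distribution, which is the sense in which "moving between states" becomes "sampling from a distribution".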

So, how do we find this chain if we do not know the posterior in the first place? Well, there is something known as the detailed balance condition. Intuitively, this condition says that we should move in a reversible way (a reversible process is a common approximation in physics). That is, the probability of being in state x and moving to state y should be the same as the probability of being in state y and moving to state x. This condition is not strictly necessary, but it is sufficient and usually easier to prove, so it is used as a guide in the design of most of the popular MCMC methods.
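The detailed balance condition can be checked numerically. This sketch uses a hypothetical 3-state target distribution pi (not from the text) and builds a transition matrix with the Metropolis rule (propose a different state uniformly, accept with probability min(1, pi[y]/pi[x])), then verifies that the probability flow between every pair of states is symmetric:

```python
import numpy as np

# Hypothetical target distribution over 3 states.
pi = np.array([0.2, 0.3, 0.5])
n = len(pi)

# Transition matrix implied by the Metropolis rule:
# propose y != x with probability 1/(n-1), accept with min(1, pi[y]/pi[x]).
T = np.zeros((n, n))
for x in range(n):
    for y in range(n):
        if x != y:
            T[x, y] = (1 / (n - 1)) * min(1.0, pi[y] / pi[x])
    T[x, x] = 1.0 - T[x].sum()  # rejected proposals stay put

# Detailed balance: pi(x) * T(x -> y) == pi(y) * T(y -> x) for all pairs,
# i.e., the flow matrix is symmetric.
flow = pi[:, None] * T
assert np.allclose(flow, flow.T)

# A consequence: pi is stationary -- one step of the chain leaves it unchanged.
assert np.allclose(pi @ T, pi)
print("detailed balance holds")
```

The last assertion shows why detailed balance is sufficient: if the pairwise flows balance, then the total probability entering each state equals the total leaving it, so pi does not change as the chain runs.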

In summary, if we manage to create a Markov chain that satisfies detailed balance, we can sample from that chain with the guarantee that we will get samples from the correct distribution. This is a truly remarkable result, and it is the basic engine under the hood of software like PyMC3.

The most popular Markov chain Monte Carlo method is probably the Metropolis-Hastings algorithm, and we will discuss it in the following section.
