Bayes rule

Bayes rule is one of the building blocks of probability theory. It follows from the definitions of conditional and joint probability, and it extends well beyond them.

We will explain this in a simple way by again taking an example from cricket. In cricket, the pitch condition varies from one place to another, and it is one of the factors that can be significant when selecting the team. The outcome of the game can also depend on it.

Let's say the Indian team goes to Australia for a game and we have to state our belief that a given Indian player will score a century (100 runs) in the game. If that player has experience of playing in that country, we might believe strongly that he will score a century. But there is another player who is a first-timer in this country. What would the prior belief be for him? Of course, many would have less belief that he would score a century.

However, our prior belief will change as we observe how the player performs. That is, more data about the player will be at our disposal as he plays more games. Based on that data, the posterior belief keeps getting updated, driven largely by the probability of the observations under each hypothesis (which is called the likelihood). Bayes rule is built from these concepts.
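This game-by-game updating can be sketched in a few lines of Python. The hypothesis ("the player is in good form"), the conditional probabilities, and the match results below are all made-up numbers for illustration:

```python
# Hypothetical sketch of updating belief game by game.
# Hypothesis: "the player is in good form"; evidence per game: century (1) or not (0).
prior = 0.5                           # initial belief P(good form)
p_century = {True: 0.4, False: 0.1}   # assumed P(century | form)

for scored in [1, 1, 0, 1]:           # made-up results from four games
    like_good = p_century[True] if scored else 1 - p_century[True]
    like_bad = p_century[False] if scored else 1 - p_century[False]
    # The posterior after each game becomes the prior for the next game
    prior = like_good * prior / (like_good * prior + like_bad * (1 - prior))

print(round(prior, 3))
```

After three centuries in four games, the belief in "good form" has risen well above the initial 0.5, which is exactly the posterior updating described above.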

Let's say that the events A1, A2, ..., An are mutually exclusive and exhaustive, and that B is any event. Then B can be written as the union of its intersections with each Ai:

    B = (B ∩ A1) ∪ (B ∩ A2) ∪ ... ∪ (B ∩ An)

The probability of B will be as follows:

    P(B) = P(B ∩ A1) + P(B ∩ A2) + ... + P(B ∩ An)        ... (1)

From the definition of conditional probability, we can write each intersection like so:

    P(B | Ai) = P(B ∩ Ai) / P(Ai)        ... (2)

Hence:

    P(B ∩ Ai) = P(B | Ai) P(Ai)

Now, extracting the value of P(B ∩ Ai) from equation 2 and putting it in equation 1, we get this:

    P(B) = P(B | A1) P(A1) + P(B | A2) P(A2) + ... + P(B | An) P(An)

In the same way, the definition of conditional probability gives P(Ai | B) = P(B ∩ Ai) / P(B). After replacing the value of P(B ∩ Ai) from equation 2 and the value of P(B) from the preceding equation, we get this:

    P(Ai | B) = P(B | Ai) P(Ai) / [ P(B | A1) P(A1) + ... + P(B | An) P(An) ]        ... (3)

Have a look at equation 3. This is called Bayes rule.
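Equation 3 can be checked numerically. The following Python sketch computes the posterior over a partition; the partition (experienced player vs. first-timer) and all the probabilities are hypothetical numbers, not data:

```python
# Minimal sketch of Bayes rule (equation 3) over a partition A1..An.
# priors[i] = P(Ai), likelihoods[i] = P(B | Ai); all numbers are illustrative.

def bayes_posterior(priors, likelihoods):
    """Return P(Ai | B) for each i using equation 3."""
    # Marginal likelihood: P(B) = sum over j of P(B | Aj) * P(Aj)
    marginal = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / marginal for p, l in zip(priors, likelihoods)]

# Hypothetical partition: A1 = "player has prior experience", A2 = "first-timer"
priors = [0.6, 0.4]          # P(A1), P(A2)
likelihoods = [0.30, 0.10]   # P(century | A1), P(century | A2)

posteriors = bayes_posterior(priors, likelihoods)
print(posteriors)
```

Note that the posteriors always sum to 1, because the denominator in equation 3 is exactly the total probability of the evidence over the whole partition.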

P(A|B) is called the posterior, which is what we want to estimate. In the preceding example, this would be the probability of the player scoring a century, given that he has earlier experience of playing there.

P(B|A) is called the likelihood, which is the probability of observing the evidence, given our hypothesis. For example, the probability that the player has previous experience of playing there, given that he scores a century.

P(A) is called the prior, which is the probability of our hypothesis before we see any evidence.

P(B) is called the marginal likelihood, which is the total probability of observing the evidence.
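Putting the four terms together for the cricket example, with A = "the player scores a century" and B = "the player has prior experience" (all numbers here are assumptions for illustration):

```python
# Each named term of Bayes rule for the cricket example (hypothetical numbers).
prior = 0.3            # P(A): player scores a century, before seeing evidence
likelihood = 0.8       # P(B | A): prior experience, given a century
p_b_given_not_a = 0.4  # P(B | not A): needed to compute the marginal likelihood

# Marginal likelihood P(B), via the two-event partition {A, not A}
marginal = likelihood * prior + p_b_given_not_a * (1 - prior)

# Posterior P(A | B) from Bayes rule
posterior = likelihood * prior / marginal
print(round(posterior, 3))
```

Seeing the evidence raises the belief in a century from the prior of 0.3 to a noticeably larger posterior, because the evidence is more likely under A than under not A.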
