
Bayes' Theorem


Bayes' theorem is an important result in statistics and probability theory. It is also known as Bayes' law or Bayes' rule.

The theorem admits two interpretations, the Bayesian interpretation and the frequentist interpretation:
1) Under the Bayesian interpretation, the theorem describes how a subjective degree of belief should change rationally to account for new evidence.

2) Under the frequentist interpretation, the theorem relates the inverse representations of the probabilities concerning two events.

The Bayesian interpretation has applications in almost every important field: science, engineering, law, medicine, and so on. Bayesian inference is the application of Bayes' theorem to updating beliefs as evidence accumulates.

An example is discussed below to illustrate the kind of problem Bayes' theorem addresses.

Suppose Tom says that he had a pleasant conversation on the bus. With no further information, the probability that the person he spoke to was a woman is 50%. Now suppose Tom adds that this person was going to visit New York; this extra information can change that probability. Let w be the event "Tom talked to a woman" and q the event "the person is a visitor of New York". The prior probability of w is P(w) = 0.5.

The updated probability that the person Tom spoke to is a woman, given that he or she is a visitor of New York, is P(w|q), which can be calculated with the help of Bayes' law:

P(w |q) = $\frac{P(q |w) P(w)} {P(q)}$ = $\frac{P(q |w) P(w)} {P(q |w) P(w) + P(q |m) P(m)}$

Here m represents the event "Tom talked to a man", the complement of w. Since P(m) = P(w) = 0.5, if P(q|w) is much greater than P(q|m), the updated probability P(w|q) will be close to 1.
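A short numerical sketch of this update in Python (the likelihood values below are invented purely for illustration):

```python
# Bayes' rule for the bus-conversation example.
# Prior: P(w) = P(m) = 0.5, as in the text.
# The likelihoods P(q|w) and P(q|m) are assumed values, chosen
# only so that P(q|w) is much greater than P(q|m).
p_w, p_m = 0.5, 0.5
p_q_given_w = 0.90  # assumed
p_q_given_m = 0.05  # assumed

p_q = p_q_given_w * p_w + p_q_given_m * p_m  # total probability of q
p_w_given_q = p_q_given_w * p_w / p_q        # Bayes' law: P(w|q)
print(round(p_w_given_q, 3))  # 0.947 -- close to 1, as the text notes
```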

The above example conveys the basic idea behind the theorem. The statement and interpretation of Bayes' theorem are given below.

In mathematics, Bayes' theorem gives the relationship between the probabilities of two events. Consider two events G and H. The probability of event G is denoted P(G) and the probability of event H is P(H). The conditional probabilities of G given H and of H given G are written P(G | H) and P(H | G).
In the general form it is:

P (G | H) = $\frac{P (H | G) P (G)}{P (H)}$

And P (H | G) = $\frac{P (G | H) P (H)} {P (G)}$

The meaning attached to these probabilities depends on the interpretation of probability that is adopted.
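The general form can be sketched as a small helper function (the function name and the numeric values are illustrative):

```python
def bayes(p_h_given_g: float, p_g: float, p_h: float) -> float:
    """Return P(G|H) = P(H|G) * P(G) / P(H)."""
    return p_h_given_g * p_g / p_h

# The two symmetric forms recover each other:
p_g, p_h = 0.2, 0.4   # assumed marginal probabilities
p_h_given_g = 0.9     # assumed conditional probability
p_g_given_h = bayes(p_h_given_g, p_g, p_h)  # P(G|H)
back = bayes(p_g_given_h, p_h, p_g)         # recovers P(H|G)
print(round(p_g_given_h, 2), round(back, 2))  # 0.45 0.9
```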
Now consider one more notion needed here: the probability that G is true, given that a new piece of evidence is true. This is called a conditional probability: the probability that one proposition is true given that another proposition is definitely true.

For example: suppose Bella draws a card from a deck of 52 cards without showing it to another person. Assuming the deck has been shuffled, the other person should believe the probability that the card is a Jack is 4/52, or 1/13, since there are 4 Jacks in the deck. But now suppose Bella tells the other person that the card is a face card. Then the probability that the card is a Jack, given that it is a face card, is 4/12, or 1/3, since there are 12 face cards in a deck. This conditional probability is denoted P(J | F), the probability that the card is a Jack given that it is a face card.
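The card example can be verified by direct counting over the whole deck (a small sketch):

```python
# Verify P(Jack | face card) by counting over a standard 52-card deck.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = [(rank, suit) for rank in ranks for suit in suits]

face_cards = [card for card in deck if card[0] in ("J", "Q", "K")]
jacks_among_faces = [card for card in face_cards if card[0] == "J"]

print(len(deck))        # 52
print(len(face_cards))  # 12
print(len(jacks_among_faces), "/", len(face_cards))  # 4 / 12, i.e. 1/3
```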


Statement

Let S be the sample space and let $E_1, E_2, \ldots, E_n$ be n mutually exclusive and exhaustive events associated with a random experiment. If A is any event which occurs together with $E_1$ or $E_2$ or … or $E_n$, then

P($E_i$ | A) = $\frac{P(E_i) P(A|E_i)}{\sum_{i=1}^n P(E_i) P(A|E_i)}$ for i = 1, 2, …, n

Proof

Since $E_1, E_2, \ldots, E_n$ are n mutually exclusive and exhaustive events, we have

S = $E_1 \cup E_2 \cup \ldots \cup E_n$, where $E_i \cap E_j = \emptyset$ for i $\neq$ j

A = A $\cap$ S

A = (A $\cap E_1$) $\cup$ (A $\cap E_2$) $\cup \ldots \cup$ (A $\cap E_n$)

P (A) = P (A $\cap E_1$) + P (A $\cap E_2$) + … + P (A $\cap E_n$) [by the addition theorem, since the events are mutually exclusive]

P (A) = $\sum_{i=1}^nP(A\cap E_i)$

P (A) = $\sum_{i=1}^n P(E_i) P(A|E_i)$
[since P(A $\cap E_i$) = P($E_i$) P($A|E_i$)]

Now, using multiplication theorem of probability, we have

P(A$\cap E_i$ ) = P (A) P ($E_i$ | A) for i = 1, 2, ….., n

P ($E_i$ | A) = $\frac{P(A\cap E_i)} {P(A)}$

P ($E_i$ | A) = $\frac{P(E_i) P(A|E_i)}{P(A)}$

P ($E_i$ | A) = $\frac{P(E_i) P(A|E_i)}{\sum_{i=1}^nP(E_i) P(A|E_i)}$

Hence, P ($E_i$|A) =$\frac{P(E_i) P(A|E_i)}{\sum_{i=1}^nP(E_i) P(A|E_i)}$ for i = 1, 2, …… n
[Note: P ($E_i$|A) is the probability of occurrence of $E_i$ given that A has already occurred; it is known as a conditional probability.]
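A minimal sketch of this statement as code, assuming the priors P($E_i$) and the likelihoods P(A|$E_i$) are given as lists (the function name and the numbers in the usage example are illustrative):

```python
def bayes_posterior(priors, likelihoods):
    """Return the posteriors P(E_i|A) for a partition E_1, ..., E_n,
    given the priors P(E_i) and the likelihoods P(A|E_i)."""
    p_a = sum(p * l for p, l in zip(priors, likelihoods))  # total probability of A
    return [p * l / p_a for p, l in zip(priors, likelihoods)]

# Illustrative three-event partition with assumed numbers:
posteriors = bayes_posterior([0.5, 0.3, 0.2], [0.1, 0.4, 0.7])
print(round(sum(posteriors), 6))  # posteriors over a partition sum to 1
```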

Simply put, Bayes' theorem shows the relation between a conditional probability and its reverse.

Examples

Example 1: A company has two plants to manufacture scooters. Plant 1 manufactures 70% of the scooters and plant 2 manufactures 30%. At plant 1, 80% of the scooters are rated as standard quality, and at plant 2, 90% of the scooters are rated as standard quality. A scooter is chosen at random and is found to be of standard quality. What is the probability that it has come from plant 2?

Solution:
Let $E_1$, $E_2$ and A be the following events:

$E_1$ = plant 1 is chosen,
$E_2$ = plant 2 is chosen, and
A = the scooter is of standard quality.

P($E_1$) = $\frac{70}{100}$, P($E_2$) = $\frac{30}{100}$, P(A|$E_1$) = $\frac{80}{100}$, P(A|$E_2$) = $\frac{90}{100}$

We are required to find P($E_2$ | A).

By Bayes' theorem, we have
P($E_2$ | A) = $\frac{P(E_2)P(A|E_2)}{P(E_1)P(A|E_1)+ P(E_2)P(A|E_2)}$

P($E_2$ | A) = $\frac{\frac{30}{100}\frac{90}{100}}{\frac{70}{100} \frac{80}{100}+ \frac{30}{100}\frac{90}{100}}$

=$\frac{27}{56+27}$

= $\frac{27}{83}$
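The arithmetic of this example can be checked exactly with Python's `fractions` module:

```python
from fractions import Fraction

# Priors and likelihoods from the scooter example.
p_e1, p_e2 = Fraction(70, 100), Fraction(30, 100)
p_a_given_e1, p_a_given_e2 = Fraction(80, 100), Fraction(90, 100)

# Bayes' theorem: P(E_2|A) = P(E_2) P(A|E_2) / sum of P(E_i) P(A|E_i)
posterior = (p_e2 * p_a_given_e2) / (p_e1 * p_a_given_e1 + p_e2 * p_a_given_e2)
print(posterior)  # 27/83
```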

Applications

Applications of Bayes' theorem:
  • Bayes' theorem is used to determine the probability of an event occurring, given relevant evidence.
  • Bayes' theorem helps us update the prior probabilities of the events we are concerned with as new evidence arrives.
  • Using Bayes' rule we can predict the outcome of events such as sporting events, political elections, etc.
The interpretations of probability are discussed below:

1) Bayesian interpretation: This interpretation is based on Bayes' law. Here, probability measures a degree of belief. Bayes' theorem links the degree of belief in a proposition before and after accounting for the given evidence. For instance, suppose person A proposes that a biased coin is twice as likely to land heads as tails. The initial degree of belief that this is true might be 50%. The coin is then flipped many times to collect evidence, and the belief may rise to, say, 70% if the evidence supports the proposition.
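This belief-updating process can be sketched as follows (the flip sequence below is invented for illustration):

```python
# Two hypotheses about the coin:
#   biased: P(heads) = 2/3 (heads twice as likely as tails)
#   fair:   P(heads) = 1/2
# Start from a 50% degree of belief in the biased hypothesis, as in the
# text, and update it after each observed flip using Bayes' rule.
p_biased = 0.5
flips = ["H", "H", "T", "H", "H"]  # invented evidence

for flip in flips:
    like_biased = 2/3 if flip == "H" else 1/3
    like_fair = 1/2  # fair coin: both outcomes equally likely
    numerator = like_biased * p_biased
    # posterior is proportional to likelihood times prior
    p_biased = numerator / (numerator + like_fair * (1 - p_biased))

print(round(p_biased, 3))  # belief has risen above the initial 0.5
```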
For a proposition G and evidence H:
a) P (G), the prior, is the initial degree of belief in proposition G.

b) P (G|H), the posterior, is the degree of belief after accounting for the evidence H.

c) The ratio P (H|G) / P (H) represents the support the evidence H provides for proposition G.
2) Frequentist interpretation: Probability is defined with respect to a large number of trials, where each trial produces one outcome from the set of all possible outcomes. An event is a subset of the set of all possible outcomes. The probability of an event G, P (G), is the proportion of all trials that produce an outcome in G; similarly, the probability of H is P (H). If only the trials in which event G occurs are considered, the proportion of those in which event H also occurs is P (H|G). Similarly, if only the trials in which event H occurs are considered, the proportion of those in which event G also occurs is P (G|H). So this interpretation relates the inverse representations of the probabilities concerning the two events.
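The frequentist reading can be simulated directly: run many trials and measure proportions. The die-rolling experiment below is an invented illustration, with G = "the roll is even" and H = "the roll is at least 3":

```python
import random

random.seed(0)  # reproducible trials

trials = 100_000
count_g = count_h = count_gh = 0
for _ in range(trials):
    roll = random.randint(1, 6)
    g = roll % 2 == 0  # event G: roll is even
    h = roll >= 3      # event H: roll is at least 3
    count_g += g
    count_h += h
    count_gh += g and h

# Proportions over the trials approximate the probabilities:
print(round(count_gh / count_g, 2))  # P(H|G): exact value is 2/3
print(round(count_gh / count_h, 2))  # P(G|H): exact value is 1/2
```

Note that P(H|G) and P(G|H) differ; Bayes' theorem is precisely the relation between the two.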