
Statement of theorem
Bayes' theorem is stated mathematically as the following equation:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},$$

where $A$ and $B$ are events and $P(B) \neq 0$.

$P(A \mid B)$ is a conditional probability: the probability of event $A$ occurring given that $B$ is true. It is also called the posterior probability of $A$ given $B$.
$P(B \mid A)$ is also a conditional probability: the probability of event $B$ occurring given that $A$ is true. It can also be interpreted as the likelihood of $A$ given a fixed $B$ because $P(B \mid A) = L(A \mid B)$.
$P(A)$ and $P(B)$ are the probabilities of observing $A$ and $B$ respectively without any given conditions; they are known as the prior probability and marginal probability.
Proof
[Figure: Visual proof of Bayes' theorem]
For events
Bayes' theorem may be derived from the definition of conditional probability:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \quad \text{if } P(B) \neq 0,$$

where $P(A \cap B)$ is the probability of both $A$ and $B$ being true. Similarly,

$$P(B \mid A) = \frac{P(A \cap B)}{P(A)}, \quad \text{if } P(A) \neq 0.$$

Solving for $P(A \cap B)$ and substituting into the above expression for $P(A \mid B)$ yields Bayes' theorem:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}, \quad \text{if } P(B) \neq 0.$$
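A minimal numerical check of this derivation, assuming a small made-up pair of events with the probabilities below (illustrative values only):

```python
# Illustrative probabilities for two events A and B (made-up values).
p_a_and_b = 0.12   # P(A ∩ B)
p_a = 0.30         # P(A)
p_b = 0.40         # P(B)

# Definition of conditional probability.
p_a_given_b = p_a_and_b / p_b   # P(A | B)
p_b_given_a = p_a_and_b / p_a   # P(B | A)

# Bayes' theorem recovers P(A | B) from P(B | A), P(A) and P(B).
p_a_given_b_bayes = p_b_given_a * p_a / p_b

print(p_a_given_b, p_a_given_b_bayes)  # both 0.3
assert abs(p_a_given_b - p_a_given_b_bayes) < 1e-12
```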
For continuous random variables
For two continuous random variables X and Y, Bayes' theorem may be analogously derived from the definition of conditional density:

$$f_{X \mid Y=y}(x) = \frac{f_{X,Y}(x,y)}{f_Y(y)}, \qquad f_{Y \mid X=x}(y) = \frac{f_{X,Y}(x,y)}{f_X(x)}.$$

Therefore,

$$f_{X \mid Y=y}(x) = \frac{f_{Y \mid X=x}(y)\,f_X(x)}{f_Y(y)}.$$
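As a sketch of the continuous form, the following Python snippet (assuming SciPy is available; the Gaussian model is just an illustrative choice) checks the density identity pointwise for $X \sim N(0,1)$ and $Y \mid X = x \sim N(x, 1)$, for which marginally $Y \sim N(0, 2)$ and $X \mid Y = y \sim N(y/2, 1/2)$.

```python
from scipy.stats import norm

# Illustrative model: X ~ N(0, 1) and Y | X = x ~ N(x, 1), so marginally Y ~ N(0, 2).
x, y = 0.7, 1.5  # arbitrary evaluation points

f_x = norm(0, 1).pdf(x)            # prior density f_X(x)
f_y_given_x = norm(x, 1).pdf(y)    # conditional density f_{Y|X=x}(y)
f_y = norm(0, 2 ** 0.5).pdf(y)     # marginal density f_Y(y), std = sqrt(2)

# Right-hand side of Bayes' theorem for densities.
posterior = f_y_given_x * f_x / f_y

# Known closed form for this model: X | Y = y ~ N(y / 2, 1/2).
closed_form = norm(y / 2, 0.5 ** 0.5).pdf(x)

print(posterior, closed_form)
assert abs(posterior - closed_form) < 1e-12
```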
General case
Let $P_Y^x$ be the conditional distribution of $Y$ given $X = x$ and let $P_X$ be the distribution of $X$. The joint distribution is then $P_{X,Y}(dx, dy) = P_Y^x(dy)\,P_X(dx)$. The conditional distribution $P_X^y$ of $X$ given $Y = y$ is then determined by

$$P_X^y(A) = \operatorname{E}\!\left(1_A(X) \mid Y = y\right).$$
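As a rough illustration of this identity (not of the measure-theoretic machinery itself), the following Python sketch estimates $E(1_A(X) \mid Y = y)$ by averaging the indicator $1_A(X)$ over simulated pairs whose $Y$ falls in a narrow window around $y$; the Gaussian model, the event $A$, and the window width are assumptions made for the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative model (an assumption for this sketch): X ~ N(0, 1) and Y | X = x ~ N(x, 1).
n = 2_000_000
x = rng.normal(0.0, 1.0, n)
y = rng.normal(x, 1.0)

# Condition on Y = y0 approximately, by keeping samples whose Y lies in a narrow window.
y0, eps = 1.0, 0.01
near = np.abs(y - y0) < eps

# Event A = {X > 0}; estimate P_X^y(A) as the average of the indicator 1_A(X) over that window.
estimate = np.mean(x[near] > 0.0)

# For this model X | Y = y0 ~ N(y0 / 2, 1/2), so the conditional probability has a closed form.
exact = 1 - norm(y0 / 2, np.sqrt(0.5)).cdf(0.0)

print(f"Monte Carlo estimate: {estimate:.3f}  (closed form: {exact:.3f})")
```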
Existence and uniqueness of the needed conditional expectation is a consequence of the Radon–Nikodym theorem. This was formulated by Kolmogorov in his famous 1933 book. Kolmogorov underlines the importance of conditional probability by writing "I wish to call attention to ... and especially the theory of conditional probabilities and conditional expectations ..." in the Preface. Bayes' theorem determines the posterior distribution from the prior distribution. It can be generalized to include improper prior distributions such as the uniform distribution on the real line. Modern Markov chain Monte Carlo methods have boosted the importance of Bayes' theorem, including in cases with improper priors.
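As a small sketch of that last point (a toy model, not any particular library's method): a random-walk Metropolis sampler targeting the posterior of a Gaussian mean under an improper flat prior on the real line, which becomes a proper posterior once data are observed. The data, step size, and chain length are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data from a Gaussian with unknown mean and known standard deviation 1.
data = rng.normal(2.0, 1.0, size=50)

def log_posterior(mu):
    # Improper flat prior on the real line: log prior is a constant (taken as 0).
    # The posterior is proportional to the likelihood, which is proper given the data.
    return -0.5 * np.sum((data - mu) ** 2)

# Random-walk Metropolis sampler.
n_steps, step = 20_000, 0.5
mu = 0.0
samples = []
for _ in range(n_steps):
    proposal = mu + rng.normal(0.0, step)
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    samples.append(mu)

samples = np.array(samples[5000:])  # discard burn-in
# With a flat prior, the posterior of the mean is N(sample mean, 1/n).
print(f"posterior mean ≈ {samples.mean():.3f}, sample mean = {data.mean():.3f}")
```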