
Definition
Conditioning on an event
Kolmogorov definition
Given two events A and B from the sigma-field of a probability space, with the unconditional probability of B being greater than zero (i.e., P(B) > 0), the conditional probability of A given B, written $P(A \mid B)$, is the probability of A occurring if B has or is assumed to have happened. Conditioning on B restricts or reduces the sample space to the outcomes in B, and A is assessed within this reduced sample space. The conditional probability can be found as the quotient of the probability of the intersection of events A and B, that is, $P(A \cap B)$, the probability that A and B occur together, and the probability of B:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$
[Figure: Illustration of conditional probabilities with an Euler diagram. The unconditional probability P(A) = 0.30 + 0.10 + 0.12 = 0.52. However, the conditional probabilities P(A|B1) = 1, P(A|B2) = 0.12 ÷ (0.12 + 0.04) = 0.75, and P(A|B3) = 0.]
[Figure: On a tree diagram, branch probabilities are conditional on the event associated with the parent node. Here, the overbars indicate that the event does not occur.]
For a sample space consisting of equally likely outcomes, the probability of the event A is understood as the fraction of the number of outcomes in A to the number of all outcomes in the sample space. Then, this equation is understood as the fraction of the set $A \cap B$ to the set B. Note that the above equation is a definition, not just a theoretical result. We denote the quantity $\frac{P(A \cap B)}{P(B)}$ as $P(A \mid B)$ and call it the "conditional probability of A given B".
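To make the counting interpretation concrete, here is a minimal Python sketch; the two-dice sample space and the specific events A and B are our own illustration, not taken from the original text:

```python
from fractions import Fraction
from itertools import product

# Sample space: the 36 equally likely outcomes of rolling two fair dice.
omega = set(product(range(1, 7), repeat=2))

# Illustrative events: A = "the sum is 8", B = "the first die is even".
A = {o for o in omega if o[0] + o[1] == 8}
B = {o for o in omega if o[0] % 2 == 0}

def prob(event):
    """P(event) as a fraction of outcome counts (valid for equally likely outcomes)."""
    return Fraction(len(event), len(omega))

# Kolmogorov definition: P(A | B) = P(A ∩ B) / P(B), which requires P(B) > 0.
print(prob(A & B) / prob(B))  # 1/6: of the 18 outcomes in B, 3 also lie in A
```

Using Fraction keeps the arithmetic exact, so the quotient is literally the ratio of outcome counts rather than a rounded float.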
As an axiom of probability
[Figure: Venn pie chart describing conditional probabilities.]
Some authors, such as de Finetti, prefer to introduce conditional probability as an axiom of probability:

$$P(A \cap B) = P(A \mid B)\,P(B).$$
This equation for a conditional probability, although mathematically equivalent, may be intuitively easier to understand. It can be interpreted as "the probability of B occurring multiplied by the probability of A occurring, provided that B has occurred, is equal to the probability of the A and B occurrences together, although not necessarily occurring at the same time". Additionally, this may be preferred philosophically; under major probability interpretations, such as the subjective theory, conditional probability is considered a primitive entity. Moreover, this "multiplication rule" can be practically useful in computing the probability of $A \cap B$ and introduces a symmetry with the summation axiom for the Poincaré formula:

$$P(A \cup B) = P(A) + P(B) - P(A \cap B).$$

Thus the equations can be combined to find a new representation of $P(A \cap B)$:

$$P(A \cap B) = P(A) + P(B) - P(A \cup B) = P(A \mid B)\,P(B),$$

and hence

$$P(A \cup B) = P(A) + P(B) - P(A \mid B)\,P(B).$$
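Continuing the same hypothetical dice illustration from above, a short numeric check of the multiplication rule and the combined identity (an assumed example, not from the source):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))
A = {o for o in omega if o[0] + o[1] == 8}   # "sum is 8"
B = {o for o in omega if o[0] % 2 == 0}      # "first die is even"

def prob(event):
    return Fraction(len(event), len(omega))

p_a_given_b = prob(A & B) / prob(B)

# Multiplication rule: P(A ∩ B) = P(A | B) P(B).
assert prob(A & B) == p_a_given_b * prob(B)

# Combined identity: P(A ∪ B) = P(A) + P(B) - P(A | B) P(B).
# (Note: "A | B" in Python below is set union, not conditioning.)
assert prob(A | B) == prob(A) + prob(B) - p_a_given_b * prob(B)
```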
As the probability of a conditional event
Conditional probability can be defined as the probability of a conditional event $A_B$. The Goodman–Nguyen–Van Fraassen conditional event can be defined as:

$$A_B = \bigcup_{i \ge 1} \left( \bigcap_{j < i} \overline{B}_j,\; A_i B_i \right),$$

where $A_i$ and $B_i$ represent states or elements of A or B.
It can be shown that

$$P(A_B) = \frac{P(A \cap B)}{P(B)},$$

which meets the Kolmogorov definition of conditional probability.
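The conditional-event reading suggests a simple frequentist simulation: repeat the experiment until B first occurs, and record whether A occurred on that trial. Below is a hedged Python sketch of this idea, reusing the invented dice events from earlier; the long-run frequency should match $P(A \cap B)/P(B)$:

```python
import random

def trial():
    """One run of the underlying experiment: roll two fair dice."""
    return random.randint(1, 6), random.randint(1, 6)

def conditional_event():
    """Repeat trials until B first occurs; report whether A occurred there."""
    while True:
        d1, d2 = trial()
        if d1 % 2 == 0:            # B: the first die is even
            return d1 + d2 == 8    # A: the sum is 8

random.seed(0)
runs = 100_000
estimate = sum(conditional_event() for _ in range(runs)) / runs
print(estimate)  # close to 1/6 = P(A ∩ B) / P(B)
```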
Conditioning on an event of probability zero
If $P(B) = 0$, then according to the definition, $P(A \mid B)$ is undefined.
The case of greatest interest is that of a random variable Y, conditioned on a continuous random variable X resulting in a particular outcome x. The event $\{X = x\}$ has probability zero and, as such, cannot be conditioned on.
Instead of conditioning on X being exactly x, we could condition on it being closer than distance $\epsilon$ away from x. The event $\{x - \epsilon < X < x + \epsilon\}$ will generally have nonzero probability and hence can be conditioned on. We can then take the limit

$$\lim_{\epsilon \to 0} P(A \mid x - \epsilon < X < x + \epsilon).$$
For example, if two continuous random variables X and Y have a joint density $f_{X,Y}(x, y)$, then by L'Hôpital's rule and the Leibniz integral rule, upon differentiation with respect to $\epsilon$:

$$\lim_{\epsilon \to 0} P(Y \in U \mid x - \epsilon < X < x + \epsilon)
= \lim_{\epsilon \to 0} \frac{\int_{x-\epsilon}^{x+\epsilon} \int_U f_{X,Y}(x', y)\,dy\,dx'}{\int_{x-\epsilon}^{x+\epsilon} \int_{-\infty}^{\infty} f_{X,Y}(x', y)\,dy\,dx'}
= \frac{\int_U f_{X,Y}(x, y)\,dy}{\int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy}.$$

The resulting limit is the conditional probability distribution of Y given X and exists when the denominator, the probability density $f_X(x)$, is strictly positive.
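The shrinking-window limit can be checked numerically. The following Monte Carlo sketch conditions a bivariate normal sample on $|X - x_0| < \epsilon$ for decreasing $\epsilon$; the distribution, the correlation, and the interval U are our own assumed choices, not from the source:

```python
import math
import random

random.seed(1)
rho = 0.8  # assumed correlation of a standard bivariate normal (X, Y)

def sample():
    x = random.gauss(0.0, 1.0)
    y = rho * x + math.sqrt(1.0 - rho**2) * random.gauss(0.0, 1.0)
    return x, y

pts = [sample() for _ in range(500_000)]
x0, U = 1.0, (0.0, 1.0)  # condition near X = 1; estimate P(Y in (0, 1) | ...)

for eps in (0.5, 0.1, 0.02):
    strip = [(x, y) for x, y in pts if abs(x - x0) < eps]
    hits = sum(1 for _, y in strip if U[0] < y < U[1])
    print(eps, hits / len(strip))

# As eps shrinks, the estimates approach the exact conditional probability
# computed from Y | X = x0 ~ Normal(rho * x0, 1 - rho**2), about 0.539 here.
```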
It is tempting to define the undefined probability $P(A \mid X = x)$ using this limit, but this cannot be done in a consistent manner. In particular, it is possible to find random variables X and W and values x, w such that the events $\{X = x\}$ and $\{W = w\}$ are identical but the resulting limits are not:

$$\lim_{\epsilon \to 0} P(A \mid x - \epsilon \le X \le x + \epsilon) \neq \lim_{\epsilon \to 0} P(A \mid w - \epsilon \le W \le w + \epsilon).$$
The Borel–Kolmogorov paradox demonstrates this with a geometrical argument.
Conditioning on a discrete random variable
Let X be a discrete random variable and its possible outcomes denoted V. For example, if X represents the value of a rolled die, then V is the set $\{1, 2, 3, 4, 5, 6\}$. Let us assume for the sake of presentation that each value in V has a nonzero probability.
For a value x in V and an event A, the conditional probability is given by $P(A \mid X = x)$. Writing

$$c(x, A) = P(A \mid X = x)$$

for short, we see that it is a function of two variables, x and A.
For a fixed A, we can form the random variable $Y = c(X, A)$. It represents an outcome of $P(A \mid X = x)$ whenever a value x of X is observed.
The conditional probability of A given X can thus be treated as a random variable Y with outcomes in the interval $[0, 1]$. From the law of total probability, its expected value is equal to the unconditional probability of A.
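A minimal Python sketch of this construction (the die follows the text's example, but the event A, the helper c, and all names are our own hypothetical choices):

```python
from fractions import Fraction

V = range(1, 7)                        # possible outcomes of a fair die X
p_x = {x: Fraction(1, 6) for x in V}   # distribution of X

# Hypothetical event A: an independent second fair die shows a value
# strictly less than X, so that P(A | X = x) = (x - 1) / 6.
def c(x):
    return Fraction(x - 1, 6)

# Y = c(X, A) takes values in [0, 1]; by the law of total probability,
# E[P(A | X)] = sum over x of P(A | X = x) P(X = x) = P(A).
p_a = sum(c(x) * p_x[x] for x in V)
print(p_a)  # 5/12, the unconditional probability of A
```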
Partial conditional probability
The partial conditional probability $P(A \mid B_1 \equiv b_1, \ldots, B_m \equiv b_m)$ is about the probability of event A given that each of the condition events $B_i$ has occurred to a degree $b_i$ (degree of belief, degree of experience) that might be different from 100%. Frequentistically, partial conditional probability makes sense if the conditions are tested in experiment repetitions of appropriate length $n$. Such $n$-bounded partial conditional probability can be defined as the conditionally expected average occurrence of event A in testbeds of length n that adhere to all of the probability specifications $B_i \equiv b_i$, i.e.:

$$P^n(A \mid B_1 \equiv b_1, \ldots, B_m \equiv b_m) = \operatorname{E}\!\left(\overline{A}^n \mid \overline{B}_1^n = b_1, \ldots, \overline{B}_m^n = b_m\right)$$
Based on that, partial conditional probability can be defined as

$$P(A \mid B_1 \equiv b_1, \ldots, B_m \equiv b_m) = \lim_{n \to \infty} P^n(A \mid B_1 \equiv b_1, \ldots, B_m \equiv b_m),$$

where $b_i n \in \mathbb{N}$.
Jeffrey conditionalization is a special case of partial conditional probability, in which the condition events must form a partition:

$$P(A \mid B_1 \equiv b_1, \ldots, B_m \equiv b_m) = \sum_{i=1}^{m} b_i\, P(A \mid B_i).$$
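As a worked illustration of this formula, a small Python sketch; the partition weights $b_i$ and the conditionals $P(A \mid B_i)$ are invented numbers, chosen only to show the computation:

```python
from fractions import Fraction

# Invented partition weights b_i (degrees of belief, summing to 1) and
# ordinary conditional probabilities P(A | B_i) for a three-event partition.
b = [Fraction(7, 10), Fraction(2, 10), Fraction(1, 10)]
p_a_given_bi = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 10)]

assert sum(b) == 1  # the condition events must form a partition

# Jeffrey conditionalization: P(A | B_1 = b_1, ..., B_m = b_m) = sum b_i P(A | B_i)
p_a = sum(bi * pi for bi, pi in zip(b, p_a_given_bi))
print(p_a)  # 41/100
```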