### Unit 6: Introduction to Counting and Probability

Counting is not always as easy as it sounds. Say, for example, you are asked to arrange three balls of three different colors. What are the possibilities? What if two of them have the same color? This is an example of a set of problems known as "counting problems". In this unit, we will learn different types of counting using possibility trees and general counting theorems. Once we understand the concepts of counting, we will discuss the probability of event occurrence, which refers to the likelihood that a certain outcome for a given problem set will occur. Events can be dependent or independent, making them subject to different sets of rules. Throughout the unit, we will relate counting and probability to computer science topics such as counting list and array elements, password computation, and brute force attacks.

In this unit, we will rely on the Devadas and Lehman reference as the primary text; use the Bender and Williamson reference as a supplement.

**Completing this unit should take you approximately 15 hours.**

Upon successful completion of this unit, you will be able to:

- solve counting problems;
- compute the number of permutations and combinations;
- compute conditional probabilities; and
- compute the probability of independent events.

### 6.1: Definitions and Basic Counting

Read from the Introduction to Probability through Section 1.2. This section motivates our study of probability and counting, because counting often comes up in the analysis of problems that we solve using probability.

### 6.1.1: What Is a Sample Space?

Read Section 1.3 on pages 3–5, which discusses the first step of the four-step process for building a probabilistic model to analyze and solve problems using probability.

### 6.1.2: What Is an Event?

Read Section 1.4 on pages 5–6, which discusses the second step of the four-step process for building a probabilistic model for solving probability problems.

### 6.1.3: Equally Likely Probability Formula

Read Sections 1.5 and 1.6 on pages 6–9. These discuss the third and fourth steps of the four-step process for building a probabilistic model for solving probability problems. In the third step, probabilities are assigned to the edges of the possibility tree drawn during steps one and two. At each level of the tree, the branches leaving a node represent the possible outcomes at that node, and the probabilities on those branches must add to 1. If every branch leaving a node is assigned the same probability, we say that the branches represent equally likely outcomes.
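For equally likely outcomes, the probability of an event reduces to counting: P(E) = |E| / |S|. A minimal sketch with a fair die (the die example is an illustration, not taken from the reading):

```python
from fractions import Fraction

# Sample space: all outcomes of rolling one fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}

# Event: the roll is even.
event = {x for x in sample_space if x % 2 == 0}

# Equally likely outcomes: P(E) = |E| / |S|.
p_event = Fraction(len(event), len(sample_space))
print(p_event)  # 1/2
```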

### 6.1.4: Counting Elements of Lists and Sublists

Read through Section 1.4. This work presents some strategies and rules that aid us in counting members of sets arising in the analysis of probability problems.

Section 1.3 illustrates the bijection rule for arrays and lists. The bijection rule can be applied to counting the elements of an array. The elements of a list can be mapped via a bijection to a one-dimensional array. Thus, because we can count the elements of a list, we can count the elements of such an array. (We count the elements of a list, by walking down the list and incrementing a tally, initialized to zero, by one for each item of the list.)
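The tally procedure described above can be sketched directly; the function name is our own, chosen for illustration:

```python
def count_elements(items):
    """Count the elements of a list by walking down it and
    incrementing a tally, initialized to zero, once per item."""
    tally = 0
    for _ in items:
        tally += 1
    return tally

colors = ["red", "green", "blue"]
print(count_elements(colors))  # 3
```

Because a one-dimensional array maps via a bijection to a list, the same tally counts the array's elements.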

### 6.2: Possibility Trees and Advanced Counting

Each of the analyses discussed in "Counting I" can be illustrated by drawing a tree. Take another look at Section 2.1 and Section 2.2 on the sum rule and the product rule. These explain what a possibility tree is. If you illustrate these rules by drawing a tree, it will depict all the elements of a union of disjoint sets (sum rule) and of the product of sets (multiplication rule). If these sets represent outcomes for events, then the tree is called a possibility tree.

### 6.2.1: Possibility Trees and the Multiplication Rule

Read up to Section 1.3. Look at the line drawings beginning in Section 1.3 and in following sections; these are possibility trees. Realize that they are just representations of functions used in modeling a problem. Note the modeling advice and steps 1–4 on pages 1–9.

Multiplication often applies to each level of the tree: each node at level 1 is expanded (multiplied) by the same number of branches at level 2, and so on, for each level. A tree that models the outcomes of an event for a probability problem in this way is called a possibility tree.
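The multiplication rule can be checked by enumerating every root-to-leaf path of such a tree; the three levels below are a made-up example, not drawn from the reading:

```python
from itertools import product

# Branching factors per level: 2 coin faces, 3 ball colors, 2 boxes.
levels = [["H", "T"], ["red", "green", "blue"], ["box1", "box2"]]

# Every root-to-leaf path of the possibility tree.
leaves = list(product(*levels))
print(len(leaves))  # 2 * 3 * 2 = 12
```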

### 6.2.2: Permutation

Read Section 1.3 for the definition of a permutation, how to count permutations, and a reminder of Stirling's formula, which approximates n!.
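A quick sanity check of both ideas: the number of permutations of n distinct items is n!, and Stirling's formula approximates it. The choice n = 7 is ours, for illustration:

```python
import math
from itertools import permutations

n = 7
# Number of permutations of n distinct items is n!.
count = len(list(permutations(range(n))))
assert count == math.factorial(n)

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)**n
stirling = math.sqrt(2 * math.pi * n) * (n / math.e) ** n
print(math.factorial(n), round(stirling))  # 5040 vs. roughly 4980
```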

### 6.2.3: Counting Elements of Sets: Addition, Product, and Division Rules

Revisit Section 2, "Two Basic Counting Rules", to reinforce your understanding of the counting rules, focusing on the sum rule and the product rule.

Read from the beginning through Section 1.2; then study the division rule covered in Section 2. Lastly, study Section 3, which discusses counting elements of the union of sets, extending the addition rule: disjoint sets and, then, non-disjoint sets.

The difference rule is as follows: the number of elements in B − A equals (the number of elements in B) minus (the number of elements in B intersect A). In symbols, #(B − A) = #(B) − #(B ∩ A).
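The difference rule can be verified directly with Python's set operations (the two sets below are arbitrary examples of our own):

```python
# Difference rule: #(B - A) = #(B) - #(B ∩ A).
B = {1, 2, 3, 4, 5, 6}
A = {4, 5, 6, 7, 8}

assert len(B - A) == len(B) - len(B & A)
print(len(B - A))  # 3
```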

### 6.2.4: Combinations

Read Section 1 on pages 1–3. Section 1.1 and Section 1.2 discuss counting sequences; Section 1.3 discusses counting combinations.
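A small check of the combination count C(n, k) = n! / (k! (n − k)!), using a 5-element set of our own choosing:

```python
import math
from itertools import combinations

# Number of 3-element subsets (combinations) of a 5-element set: C(5, 3).
items = ["a", "b", "c", "d", "e"]
subsets = list(combinations(items, 3))
print(len(subsets))  # 10

assert len(subsets) == math.comb(5, 3)
# C(n, k) = n! / (k! * (n - k)!)
assert math.comb(5, 3) == math.factorial(5) // (math.factorial(3) * math.factorial(2))
```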

### 6.2.5: The Algebra of Combinations

Read this article for more information on the algebra of combinations.

Pascal's triangle is a simple, manual way to calculate binomial coefficients.

Look at the table at the bottom of the article. The first column, (0, 1, 2, 3, 4, 5), contains the numbers of the rows (the 0th row, 1st row, 2nd row, and so on) and corresponds to the exponent n in (x + y)^n. Ignore the first column for the moment. Take the nth row and shift it n spaces to the left: the 0th row is not shifted, the 1st row is shifted 1 space to the left, the 2nd row is shifted 2 spaces to the left, the 3rd row is shifted 3 spaces to the left, etc. This shifting results in the shape of a triangle: '1' is at the top of the triangle in the 0th row, '1 1' is next in the 1st row offset by one space (so that the '1' at the top is above the space in '1 1'), etc. This triangle for n from 0 to 5 generalizes to any n and is called Pascal's triangle. Its numbers are the coefficients for the binomial expansion presented next.

Read Section 2, which defines the binomial expansion expressed in the binomial theorem, that is, the expansion of (a + b)^n. The binomial expansion is very important because it is used to make approximations in many domains, including the sciences, engineering, weather forecasting, economics, and polling. The expansion of (a + b)^n is a polynomial whose coefficients are the integers in the nth row of Pascal's triangle.
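Pascal's triangle is easy to build by hand or in code: each interior entry is the sum of the two entries above it. A sketch (the helper function is our own illustration):

```python
import math

def pascal_triangle(rows):
    """Build Pascal's triangle: each interior entry is the sum
    of the two entries above it in the previous row."""
    triangle = [[1]]
    for _ in range(rows - 1):
        prev = triangle[-1]
        row = [1] + [prev[i] + prev[i + 1] for i in range(len(prev) - 1)] + [1]
        triangle.append(row)
    return triangle

t = pascal_triangle(6)
print(t[5])  # [1, 5, 10, 10, 5, 1], the coefficients of (a + b)**5

# Row n matches the binomial coefficients C(n, k).
assert t[5] == [math.comb(5, k) for k in range(6)]
```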

### 6.3: Probability Process

Read Section 2, which presents a four-step process for solving probability problems. This process incorporates basic probability axioms, albeit indirectly, as part of the process steps.

### 6.3.1: Probability Axioms

Read Section 4 on pages CL-28 through CL-30.

Some useful results are summarized here:

- ∑_{x ∈ S} P(x) = 1, where S is the sample space; this is also written P(S) = 1.
- P(x) ≥ 0, for every x in the sample space.
- Let E be an event, defined as a subset of S. Then P(E) = ∑_{x ∈ E} P(x).
- If E and D are events such that E is contained in D, then P(E) ≤ P(D).
- P(E ∪ D) = P(E) + P(D) for E, D disjoint.

From the above, the following can be proved. Let E be an event and E' = S − E its complement. Then P(E ∪ E') = P(E) + P(E'), and since S = E ∪ E', P(E ∪ E') = 1; thus 1 = P(E) + P(E') and P(E') = 1 − P(E).

The probability of the complement of an event E is 1 - (probability of E). This result is often very useful, because for some problems the probability of E may be difficult to calculate, but the probability of the complement of E may be easy to calculate.
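A classic illustration of the complement trick (our own example, not from the reading): the probability of getting at least one head in four fair coin flips is awkward to sum directly, but trivial via the complement.

```python
from fractions import Fraction

# P(at least one head in 4 fair flips) = 1 - P(no heads) = 1 - (1/2)**4.
p_no_heads = Fraction(1, 2) ** 4
p_at_least_one = 1 - p_no_heads
print(p_at_least_one)  # 15/16
```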

### 6.3.2: The Probability of a General Union of Two Events

Read Section 4, "Conditional Probability Pitfalls", in particular "Conditional Probability Theorem 2" and "Theorem 3" on page 12. This will serve as a lead-in to conditional probability, which is covered next.

### 6.4: Conditional Probability and Independent Events

Focus on the modeling advice and steps 1–4 on pages 1–9.

Read this lecture. Try to understand the concepts of conditional probability, and just scan the examples. We'll take a closer look at the problems in the next section.

### 6.4.1: Computing a Conditional Probability

Work through the following examples in this lecture:

- Section 1, "The Halting Problem" (fictitious name of a hockey team), on page 2;
- Section 2.1, "A Coin Problem", on page 6;
- Section 2.2, "A Variant of the Two Coins Problem", on page 8;
- Section 3, "Medical Testing", on page 9;
- Section 4.1, "Carnival Dice", on page 11;
- Section 4.3, "Discrimination Lawsuit", on page 14; and
- Section 4.4, "On-Time Airlines", on page 15.

Understanding the examples is critical to really understanding conditional probability.

### 6.4.2: Bayes' Theorem

Read Section 3 on pages DT-28 through DT-35.

Bayes' theorem is a famous theorem for computing conditional probabilities. Assume E_i are mutually exclusive events for i = 1, ..., n and ∪_i E_i = D, for an event D. Then

P(E_i | D) = P(D | E_i) P(E_i) / [P(D | E_1) P(E_1) + ... + P(D | E_n) P(E_n)].

Statements of Bayes' theorem are given on pages DT-28 and DT-32. Like the binomial theorem, Bayes' theorem is very useful in calculating probabilities for many applications, for example, in diagnosis and in decision theory.
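The formula above can be sketched as a small function. The two-hypothesis numbers below are entirely hypothetical, chosen only to exercise the formula:

```python
from fractions import Fraction

def bayes(priors, likelihoods):
    """P(E_i | D) for mutually exclusive events E_i covering D.

    priors[i]      = P(E_i)
    likelihoods[i] = P(D | E_i)
    """
    total = sum(p * l for p, l in zip(priors, likelihoods))  # denominator P(D)
    return [p * l / total for p, l in zip(priors, likelihoods)]

# Hypothetical numbers: P(E1) = 1/100, P(E2) = 99/100,
# P(D | E1) = 9/10, P(D | E2) = 1/10.
posterior = bayes([Fraction(1, 100), Fraction(99, 100)],
                  [Fraction(9, 10), Fraction(1, 10)])
print(posterior[0])  # P(E1 | D) = 9/108 = 1/12
```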

### 6.4.3: Computing the Probability of Independent Events

Read this lecture for practice with the probability of independent events.
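Independence means P(A ∩ B) = P(A) · P(B). This can be checked by brute-force enumeration of a small sample space; the two-dice events below are our own example:

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice: 36 equally likely outcomes.
space = list(product(range(1, 7), repeat=2))

A = [(x, y) for x, y in space if x == 6]       # first die shows 6
B = [(x, y) for x, y in space if y % 2 == 0]   # second die is even
AB = [o for o in A if o in B]                  # both events occur

def p(event):
    return Fraction(len(event), len(space))

assert p(AB) == p(A) * p(B)  # 1/12 == 1/6 * 1/2
print(p(AB))  # 1/12
```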

### Unit 6 Assessment

Take this assessment to see how well you understood this unit.

- This assessment **does not count towards your grade**. It is just for practice!
- You will see the correct answers when you submit your answers. Use this to help you study for the final exam!
- You can take this assessment as many times as you want, whenever you want.