Probabilities in simple random samples

1 Introduction

When it comes to math, one of the things that people are phenomenally bad at – aside from the obvious everything – is probabilities. Specifically, calculating probabilities. People are bad enough at estimating chances, but properly calculating them has its own hurdles.

In this particular case, I'd like to take a look at the probability of a simple random sample: getting a particular distribution of results from a number of samples, like getting 3 sixes in a certain number of rolls. Specifically, I'll look at the case with replacement. In this case, the sample pool remains the same after each individual sample. The case without replacement (like picking colored marbles from a vase without putting them back) is a bit trickier, so I won't cover that now.

The binomial distribution

One of the equations that is relevant here is the binomial distribution (see Eq 1), the scourge of math students everywhere. What I'd like to do is explain why it looks the way it does, what the terms mean and how you can generalize it to cover more than two outcomes (that's what the “bi-” stands for: two). Basically, I'd like to show you how to “read” the equation and to interpret the terms in it, which will make it easier to deal with the subject.

(1) P(W=k) = {n\choose k} p^k (1-p)^{n-k}

First, a few words about Eq 1 itself. Say you have a random event that can have two outcomes: a coin-flip, a true/false question, but also a die-roll if you just look at an X vs non-X result. Or, in the parlance of our times, whether you have a win (W) or a fail (F). What Eq 1 gives you is the probability of getting exactly k wins in n trials.

 

That's all well and good, of course, but unless you actually understand how the equation is built up and what all the terms in it mean, it'll just be yet another magic formula that someone dreamed up to torture you with. In practical terms, it also means that it's harder to apply it correctly. So let's look at what it means, starting with what each individual symbol means.

Table 1: Explanation of symbols in Eq 1.
Symbol  Meaning
W Win index, or number of wins.
F Fail index, or number of fails.
n Number of trials.
k Selection value (usually linked to the number of wins).
p Single-trial probability (usually linked to wins).
P(cond) Total probability, with the condition.

Here's an example of its use. Say you have 7 dice (7d6) and you want to know the probability of throwing 3 sixes. This means you have n = 7, k = 3 and p = 1/6, and the condition is W = k = 3.
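To make that concrete, here's a minimal Python sketch of Eq 1 (the function name `binom_pmf` is mine), evaluated for the 7d6 example:

```python
from math import comb

def binom_pmf(n, k, p):
    # Eq 1: (n choose k) * p^k * (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 7 dice, exactly 3 sixes: n = 7, k = 3, p = 1/6
print(binom_pmf(7, 3, 1/6))  # ≈ 0.0781
```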

Divide and conquer

The next step is to split the equation into the two parts that are of true relevance, namely:


Fig 1. Dividing the probability function into two parts. One for the base chance per solution, and one for how many solutions there are.
  1. The base chance of getting a specific result that satisfies your conditions.
  2. The number of results that satisfies the conditions (each with the same base probability).

I've always found this division to make the most sense. Note that I've split it into two parts, not three. The terms involving p and (1−p) belong in the same group, as I'll explain later. The nice thing about this division is that the two parts neatly correspond to two concepts of mathematics: probability and combinatorics. If you understand those two, equations like Eq 1 should start to make sense.

2 Probability and its rules

When I say you need to understand probability theory to understand the binomial distribution, what I really mean is that you understand the rules of probability, and how to combine single chance occurrences into larger conditions. See also wiki:probability.

2.1 Probability definition

The probability of a selected case x is the number of times it occurs relative to the total pool of cases. This is the fraction given in Eq 2. All other rules are the result of this one.

(2) P(X) = { \# \text{ Selected cases} \over \# \text{ Total cases} }

For example, we roll a d6 and we're interested in getting a six. Since there are 6 sides and only 1 six, the probability is P(1 six) = 1/6.

2.2 Probability range

Since the total number of items is at least as large as the selection, each probability itself is at most 1. A probability can't be lower than zero either, because counts are natural numbers.

This also holds for the sum probability of multiple items. Also, the probability to get any result at all is 1, because obviously something needs to happen. If you sum up probabilities and you get an answer greater than 1, some of the cases you've counted actually overlap. If you add up all probabilities and you get less than 1, you've missed a few cases.

(3) 0 \le P(X) \le 1

2.3 Complementary rule : NOT

If the probability of some event is P(X), and the sum probability of all events is 1, then the probability of it not happening is 1 − P(X). The symbol for “not” is ¬.

(4) P(\text{not}\,X) = P(\neg X) = 1 - P(X)

For example, if the probability of a six on a d6 is 1/6, the probability of not getting a six is P(¬six) = 5/6.

2.4 Disjoint conditions : XOR

If you have multiple conditions that are mutually exclusive (disjoint), the probabilities add. This corresponds to an “or” case.

Please note the “mutually” here. It must be either X or Y. For the programmers: the English word “or” actually means an exclusive OR ( ^ in C), not an inclusive OR ( | ). This difference is the main reason why calculated probabilities can end up greater than 1. There are several symbols that can be used for a disjoint condition, all of which are problematic. I'll use the boolean XOR symbol, ⊕, because even though it may be a little obscure for most people, at least it's unambiguous.

(5) P(X\text{ or }Y) = P(X \oplus Y) = P(X) + P(Y)

For example, the chance of getting a 6 or a 5 on a single d6 is P(6 or 5) = 1/6 + 1/6 = 1/3. You can also approach it from the definition of Eq 2: you still have a total of 6 cases, but now you have 2 valid ones, leading directly to 2/6.

Language issues and silly symbols

The term "or" is one of those areas where human languages, mathematics and logic are at odds. As said, the word “or” actually means an exclusive or (one of the two, but not both), even though in math and logic it's used for the union (the full combination; and/or).

It gets even more vexing when one looks at the symbols used. The C equivalent for OR (the inclusive OR) is | or ||. However, the | symbol is also used for a conditional probability. So that's out. I can't use the C XOR symbol, ^, either, as that actually means AND in logic. There's also the union symbol, ∪, but I think that stands for inclusive OR, not exclusive. So in the end it comes down to the O-plus, ⊕. While relatively unknown, at least it only has a single meaning.

If anyone has a better idea, I'd love to hear it.

2.5 Joint conditions : AND

When you have two independent conditions that both need to be true, the probabilities multiply. Think of it as a sub-selection, where you have to get a fraction of a fraction. This corresponds to an “and” case. The logical symbol for it is a cap ( ∩ ), but since it looks too much like a cup, I'll use the actual AND-symbol ( & ).

(6) P(X\text{ and }Y) = P(X \& Y) = P(X) \cdot P(Y)

An example of this rule is the chance to throw two sixes with two dice. First, you need to throw a six with one die for a 1/6 chance, and in the event that that happens, there's another 1/6 chance for the other, combining into P(six and six) = 1/6 · 1/6 = 1/36. You can also get this result from the definition. With 2 dice you have 36 possibilities in total, and only one of them will contain (6,6). Note that for a 5 and a 6 the count is different, since both (5,6) and (6,5) are possible there.
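If you'd rather see this from the definition side, a quick brute-force check in Python (variable names are mine) enumerates all 36 ordered pairs:

```python
from itertools import product

# all 36 ordered outcomes of two dice
outcomes = list(product(range(1, 7), repeat=2))

double_six = [o for o in outcomes if o == (6, 6)]
five_and_six = [o for o in outcomes if sorted(o) == [5, 6]]

print(len(outcomes))      # 36
print(len(double_six))    # 1 -> P = 1/36
print(len(five_and_six))  # 2 -> both (5,6) and (6,5), so P = 2/36
```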

2.6 Usage in general and binomials in particular

When considering probabilities, you'll often have multiple trials/samples and a different type of answer for each trial. For example, say you have three different options, a, b and c. Each option has a different probability, pa, pb, and pc. Now I'm looking for the probability of getting 2 as, 3 bs and 4 cs. In other words, the number of results for each possibility is ka = 2, kb = 3 and kc = 4, for a total number of trials n = Σi ki = 9. Now look at one possible result that satisfies the condition, say, "aabbbcccc". This is an AND case, so probabilities multiply. Each trial result gets the probability belonging to that answer, which in this case leads to

P(\text{exactly } aabbbcccc) = p_a^2\,p_b^3\,p_c^4.

The generalized case for this has m classes, labelled Xj, and the number of outcomes we're looking for in each class is kj. The probability for a single result that satisfies this condition is Eq 7. It may look a little hairy, but all it really says is that you should multiply the probabilities of all the results in the condition.

(7) P(k_1 X_1\,\&\,k_2 X_2\,\& ... \&\,k_m X_m) = p_1^{k_1} \,\times\, p_2^{k_2} \,\times\, ... \,\times\, p_m^{k_m} = \prod_{j=1}^m p_j^{k_j}
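Eq 7 is just a product over the classes, which translates directly into code. A sketch (function name mine, with assumed example probabilities pa = 0.2, pb = 0.3, pc = 0.5):

```python
from math import prod

def single_result_prob(ps, ks):
    # Eq 7: multiply p_j^(k_j) over all classes j
    return prod(p**k for p, k in zip(ps, ks))

# one specific string with 2 as, 3 bs and 4 cs, e.g. "aabbbcccc"
print(single_result_prob([0.2, 0.3, 0.5], [2, 3, 4]))
```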

Now, look back at the binomial probability. In that case, you have two options, Win and Fail, with probabilities pw and pf, respectively. Let's look at the case for k wins. That is, W = kw = k. Now, because there are only two options (that's the “bi-” in binomial), the probabilities of the fail-case are now also set. The probability for failing is pf = p¬w = 1 − pw. As for the number of failed cases, remember that the binomial looks for exactly k wins, meaning the rest of the cases fail, leading to kf = n − k. This little fact is never mentioned explicitly, but it shouldn't be overlooked.

In the end, what you get is Eq 8. The only difference with the actual binomial equation is that the subscript has been removed in the latter. Never underestimate the mathematician's laziness when it comes to writing stuff down.

(8) P(W=k_w\,\&\, F=k_f) = p_w^{k_w}\,p_f^{k_f} = p_w^k\,(1-p_w)^{n-k}

There's still one thing missing though: both Eq 7 and Eq 8 only cover the probability of one specific valid result. In most cases, there will be others as well. In the a,b,c case, for example, I only looked at "aabbbcccc", but "bbbaacccc", "abcabcbcc", or "ccccbbbaa" would have worked as well, as would any other combination. You need to take all of these into account, but for that you need a way to quickly find all variations first. And that's the topic of the next section: combinatorics.

3 Combinatorics : counting evolved

Very briefly, combinatorics is the study of counting discrete variations. In particular, how you can quickly count them all in an orderly and correct fashion. In this case, what I'll need is a way to quickly count the different possible ways of distributing items over a number of bins. What also matters is whether the items are unique (that is to say, whether you can tell them apart from other items) or non-unique.

I'll start with unique items and then show how to incorporate non-unique ones. This will involve factorials (n!). Along the way, we'll pick up permutations (nPr) and combinations (nCr), which are merely special cases of the general formula.

3.1 Arrangements with unique items

The specific task here is to find out in how many ways you can distribute n unique objects over n different slots. This is a case of combinatorics without replacement, because once you've placed an object in a particular slot, you can't use it anywhere else.

I'll start by looking at an example with four objects, a, b, c and d. Naturally, there will also be four slots to put them in. To count the possibilities, we'll make a tree (see Fig 2). There are two ways to fill the tree: by slot or by item. When working without replacement, it's more convenient to fill by item. Right now it won't matter that much, but when you start dealing with groups of non-unique items, filling by item has a nice way of grouping all of those at the same depth.


Fig 2. Tree with 4 unique items. Branching per item placed, not by slot. For each successive item, there's one less slot left.

As you can see in Fig 2, we start with four open slots and four items. First we place a, which has a choice of all four slots. For b, only three slots are left open; c has only two, and d takes the last remaining slot. The point here is that each next item has one fewer slot. The last column has all the different arrangements, and as you can see there are 24 in total. The really important point, though, is that all the branches at a given level have the same number of sub-branches. This means that you can simply multiply the number of slots left, leading to 4·3·2·1. In other words, it's simply the factorial: 4!.

Generalizing this, given n unique items, the number of variations is n!.
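You can verify the n! count by brute force with itertools (a throwaway sketch):

```python
from itertools import permutations
from math import factorial

# all distinct arrangements of four unique items
items = "abcd"
arrangements = set(permutations(items))

print(len(arrangements))  # 24
print(factorial(4))       # 24
```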

3.2 Arrangements with non-unique items

Now let's look at what happens when there are also identical-looking items. For example, let's take one a, one b and three cs, for a total of five items (see Fig 3). Technically, for five items you'd get 5! variations; however, because the cs are identical, some solutions are actually the same: we've overcounted. To resolve this, we need to know how many variations are actually the same. This is simply the number of variations within the c-group. With 3 cs, that means 3!. So to get the final answer, we divide by 3!.


Fig 3. Tree with 5 items in 3 groups: 1×a, 1×b, 3×c (c1, c2, c3). Technically the number of branches is 5!, but the identical cs overcount by a factor 3!, so it's actually 5!/3!.

This is becoming similar to the case in section 2.6. Again, take the case that you have m groups, kj identical items per group, and n = Σj kj items in total. If all items were unique, you'd get n! variations, but in each group you're counting kj! times too many. The proper number is n! divided by all the overcounts. This term is called the multinomial coefficient, and is defined as:

(9) A(k_1 X_1, ..., k_m X_m) = { n \choose k_1,\,...,\,k_m } = { (k_1 + ... + k_m)! \over k_1! \,\cdot\, ... \,\cdot\, k_m! } = { {n!} \over { \prod_{j=1}^m\, k_j!} }

As an extra example of this, consider again the case with 2 as, 3 bs and 4 cs. That is, ka = 2, kb = 3 and kc = 4, and n = 9. The number of arrangements is n! / (ka! kb! kc!) =  9!/(2! 3! 4!) = 1260.
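The multinomial coefficient of Eq 9 is easy to compute, and small enough here to cross-check against a brute-force count of distinct arrangements (function name mine):

```python
from math import factorial
from itertools import permutations

def multinomial(ks):
    # Eq 9: n! divided by the overcount k_j! of each group
    n = sum(ks)
    denom = 1
    for k in ks:
        denom *= factorial(k)
    return factorial(n) // denom

print(multinomial([2, 3, 4]))               # 1260
# brute force: distinct rearrangements of "aabbbcccc"
print(len(set(permutations("aabbbcccc"))))  # 1260
```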

 

There's also a neat little way of describing this graphically, via a table (see Table 2). First, you take the most regular arrangement, where everything's divided into groups, like aabbbcccc. Write the indices above each term, ultimately resulting in n!. Now break up the string into groups and index each group, giving you all the kj! terms. The result is the full indexed part divided by the group parts.

Table 2. Multinomial expressed in tabular form.
9 8 7 6 5 4 3 2 1
a a b b b c c c c
2 1 3 2 1 4 3 2 1
\;\;\Rightarrow\;\;{ 9! \over 2! 3! 4!}

3.3 Permutations and combinations

Permutations and combinations are merely special cases of Eq 9. This is not how it's generally taught, but it's still true. In a permutation, you have r unique items and n slots. In other words, you have n − r identical, empty slots. You only have an overcount for the empty slots, so you end up with

(10) ^nP_r = { { n!} \over { \underbrace{1!...1!}_{r\,\text{times}} (n-r)! } } = { { n!} \over {(n-r)! } }.

This also has another name: the falling factorial, written as n(r). It's called that because it's basically the top r terms of a factorial. For example, 7P3 = 7(3) = 7·6·5. It's written out in the equation below. Notice how the lower terms cancel out.

^7P_3 = \frac{7!}{(7-3)!} = \frac{7!}{4!} = \frac{7 \cdot 6 \cdot 5 \,\cdot\, 4 \cdot 3 \cdot 2 \cdot 1}{4 \cdot 3 \cdot 2 \cdot 1} = 7 \cdot 6 \cdot 5

For combinations, you have r non-unique items over n slots. In this case you have two groups with overcounts, one of size r and one of size n−r. This becomes

(11) ^nC_r = { n \choose r } = { n! \over r! (n-r)! } = {^nP_r \over r!}.
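Python's math module happens to ship both of these (math.perm and math.comb, available from Python 3.8 on), which makes it easy to check the relations above:

```python
from math import perm, comb, factorial

n, r = 7, 3
print(perm(n, r))  # 210 = 7*6*5, the falling factorial
print(comb(n, r))  # 35 = 210 / 3!

# the relations from Eq 10 and Eq 11
assert perm(n, r) == factorial(n) // factorial(n - r)
assert comb(n, r) == perm(n, r) // factorial(r)
assert comb(n, r) == comb(n, n - r)  # symmetry in r and n-r
```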

The combination has three interesting properties.

  1. It can be interpreted as a variation on the permutation, namely by switching from unique to non-unique items. That's where the division by r! comes from.
  2. It's symmetric between r and n−r. That is, nCr and nCn−r are identical.
  3. It uses two groups of identical items, which is exactly what you have in the case of the binomial distribution: one set of wins and one set of fails. This is why it shows up as a multiplier in Eq 1, and why the combination is also called the binomial coefficient.
 

Instead of looking at unique versus non-unique, you can also interpret permutation and combination by looking at whether the order of the positions matter. If it does (that is, "123" is not the same as "132"), then it's a permutation. If the order doesn't matter, you have a combination.

 

In the above, I've considered the binomial coefficient as a special case of the multinomial, but you can also do the reverse: string multiple binomials together to form a multinomial. In this case, you first distribute k1 elements over the whole n slots. This can be done in nCk1 ways. At this point, you've got n−k1 slots remaining, over which you distribute k2. This leaves n−k1−k2 slots for group 3, and so on until you've got no slots or items left. This is illustrated by Eq 12. Note how part of the denominator of one term cancels out against the numerator of the next.

(12) \begin{eqnarray} { n \choose k_1 } { n-k_1 \choose k_2 } { n-k_1-k_2 \choose k_3 } \dots { k_m \choose k_m } &=& { n! \over k_1! (n-k_1)! }\, { (n-k_1)! \over k_2! (n-k_1-k_2)! } \,\cdots\, { k_m! \over k_m!\, 0! } \\ &=& { n! \over k_1!\,k_2!\,\cdots\,k_m! } \\ &=& { n \choose k_1,\, k_2, \,...\, ,\, k_m } \end{eqnarray}

The binomial and multinomial coefficients are usually written as a 2-by-1 matrix between parentheses (see Eq 13). The construct is pronounced “n choose k”. No, I don't know why either.

(13) \begin{eqnarray} \text{Binomial: }& { n \choose k}= { n! \over k!\,(n-k)! } \\ \text{Multinomial: }& { n \choose k_1, k_2, \,...\,, k_m }= { n! \over k_1!\,k_2!\,\cdots\,k_m! } \end{eqnarray}

Back to the binomial

Now that both parts of the division have been covered, let's go back to the binomial distribution and see how they fit together. For convenience, here's what we were talking about again.

(14) \begin{eqnarray} P(W=k \text{ in } n) &=& {n\choose k} p^k (1-p)^{n-k} \\ \text{or} \\ P(W=k, F=n-k) &=& {n\choose k} p_w^k \, p_f^{n-k} \end{eqnarray}
  1. The base chance of getting a specific result that satisfies your conditions.
  2. The number of results that satisfies the conditions (each with the same base probability).

The thing about these two parts is that they actually represent two of the rules of probability, namely those describing how probabilities combine. Part 1, the base chance, is a joint condition: an AND case. Part 2 covers all the mutually exclusive ways that the joint condition can be satisfied.

For example with 2 as and 1 b, the ways the cases combine are aab, aba or baa. If you were to write it out completely, you'd get

P(A=2,B=1) = \begin{eqnarray} (\,P(A) \text{ and } P(A) \text{ and } P(B)\,) &\text{ or } \\ (\,P(A) \text{ and } P(B) \text{ and } P(A)\,) &\text{ or } \\ (\,P(B) \text{ and } P(A) \text{ and } P(A)\,) & \end{eqnarray}

Obviously, you're not going to write this out all the time, especially not when the numbers get large, but this is really what you're looking for. The base chance and the combinations are ways of short-circuiting this. As said, the base chance is the probability of the string of ANDs, which will be the same for each case. Because of this, the addition of all the (X)ORs simply becomes a multiplication by the number of (X)ORs that you have, which is given by the binomial coefficient, or the multinomial if you have more than two options.

 

Another interesting point is that the binomial and multinomial coefficients themselves can be read as probabilities as well. It's just that because we're certain kj items will exist if we're just arranging items, the probability for each group is 1, so the whole probability part is unnecessary.

4 Examples

Because I know all of this can be a little mesmerizing at first, I thought I'd give a few examples as well.

4.1 Beastly dice

First, a simple example of the binomial distribution. Say we have five dice (5d6) and we're looking at the different ways of getting exactly three sixes. Note that this also means the remaining dice should be non-sixes!

First, the probability of a six is p6 = 1/6, and of a non-six p¬6 = 5/6. As for the group-counts, we have 2 groups with k = k6 = 3 and k¬6 = 2, and n = 5. Putting this together results in Eq 15. Note the base chance of getting any specific triple-six result, and the number of ways you can vary this over five dice.

(15) P(3 \times 6 \text{ in } 5) = P(3 \times 6 \,\&\, 2 \times \neg 6) = { 5 \choose 3} p_6^3 \, p_{\neg 6}^2 = 10 \cdot \left(\frac16\right)^3 \left(\frac56\right)^2 \approx 3.21\%

There's also another way of getting to this result, namely by direct use of the definition of probability, Eq 2. To do this, we need to know the total number of possibilities, and how many will result in 3 sixes. With five dice, the total is T = 6^5 = 7776. There are 5C3 = 10 ways of placing exactly 3 sixes among five dice, and 5 non-six options for each of the remaining two dice, leading to a total of valid cases of 10·5² = 250.

P(3 \times 6 \text{ in } 5) = { \#(3\times 6 \text{ in } 5) \over \#\text{total} } = { 10 \cdot 5^2 \over 6^5 } = \frac{250}{7776} \approx 3.21\%
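The direct count is small enough to brute-force, which makes a nice sanity check of both routes (a throwaway sketch):

```python
from itertools import product

rolls = list(product(range(1, 7), repeat=5))  # all 6^5 = 7776 outcomes
hits = [r for r in rolls if r.count(6) == 3]  # exactly three sixes

print(len(hits))               # 250
print(len(hits) / len(rolls))  # ≈ 0.0321
```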

4.2 Yahtzee : full house

More dice, this time looking at the yahtzee game. A full house is three of a kind plus a pair, like 66655, 11122 or 22333, and any variation thereof. In this particular case I want to look at getting a full house in one go, because to take all three turns into account is way too hairy.

The case is actually similar to the previous one in that you have 3 of one group and 2 of another, but the probabilities are a little different. For a specific full house, say 11122, the base chance per arrangement is (1/6)^5. However, we're not looking for a specific one, but any full house. This means that there will be 6 options for the first group, and 5 for the second. The final probability is

(16) P(\text{any full house}) = 6 \cdot 5 \times P(\text{full house}) = 30 \cdot { 5 \choose 3 } \times \left(\frac16\right)^5 = 30 \cdot 10 \times \frac1{7776} \approx 3.86\%

An alternative approach is to consider the group-determining dice to be “free”, giving the probability 6/6 and 5/6 for the first and second group, respectively. After that, the remaining dice have to follow suit to give (1/6)3.
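A brute-force count confirms Eq 16 (sketch; the `[2, 3]` comparison picks out exactly a pair plus a triple, so five-of-a-kind doesn't sneak in):

```python
from itertools import product
from collections import Counter

# count full houses among all 6^5 outcomes
full_houses = sum(1 for r in product(range(1, 7), repeat=5)
                  if sorted(Counter(r).values()) == [2, 3])

print(full_houses)         # 300 = 6 * 5 * (5 choose 3)
print(full_houses / 6**5)  # ≈ 0.0386
```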

4.3 Multinomial : spinning wheels


Fig 4. 10-lettered wheel; na = 2, nb = 3, nc = 5.

And now for an application of the multinomial. Consider the wheel from Fig 4. It has 2 as, 3 bs and 5 cs: na = 2, nb = 3, nc = 5. The probabilities for each letter follow from the probability definition, resulting in pa = 2/10 = 20%, pb = 3/10 = 30% and pc = 5/10 = 50%.

The question is: what's the chance of getting 3 as, 2 cs in 9 spins?

 

Well, first you have to note that 3 + 2 is less than 9, so there's a hidden condition: the remaining 4 spins should be bs. This transforms the question into 3 as, 4 bs and 2 cs, or ka = 3, kb = 4, kc = 2. Also note that the requested counts have nothing to do with how many of each letter appear on the wheel.

(17) \begin{eqnarray} P(A=3,C=2 \text{ in } 9) &=& P(A=3,B=4,C=2) \\ &=& { (k_a + k_b + k_c)! \over k_a!\,k_b!\,k_c! } \, p_a^{k_a}\,p_b^{k_b}\,p_c^{k_c} \\ &=& { (3+4+2)! \over 3!\,4!\,2! } \left(\frac2{10}\right)^3 \left(\frac3{10}\right)^4 \left(\frac5{10}\right)^2 \\ &=& 1260 \cdot \frac{8 \cdot 81 \cdot 25 }{10^9} \approx 2.04\% \end{eqnarray}
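Eq 17 generalizes into a small multinomial probability function (name mine): the multinomial coefficient times the base chance.

```python
from math import factorial

def multinomial_pmf(ks, ps):
    # multinomial coefficient: n! / (k_1! ... k_m!)
    n = sum(ks)
    denom = 1
    for k in ks:
        denom *= factorial(k)
    coef = factorial(n) // denom
    # base chance: product of p_j^(k_j)
    base = 1.0
    for k, p in zip(ks, ps):
        base *= p**k
    return coef * base

# 3 as, 4 bs, 2 cs on the wheel: p = 2/10, 3/10, 5/10
print(multinomial_pmf([3, 4, 2], [0.2, 0.3, 0.5]))  # ≈ 0.0204
```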

5 Summary

As this post covers quite a bit, I think a summary might be in order.

  • Rules of probability:
    1. Definition: the probability of event X is the number of occurrences over the total number of events.
      P(X) = { \# \text{ Xs} \over \# \text{ All events} }
      Do not forget this rule, as it's the basis for all the others.
    2. Complementary rule (NOT). The probability of not getting X is the rest of the cases.
      P(\neg X) = 1 - P(X)
    3. Disjoint/Parallel condition ((X)OR) : union of mutually exclusive cases; probabilities add.
      P(X \oplus Y) = P(X) + P(Y)
    4. Joint/Serial condition (AND) : intersection of cases; probabilities multiply.
      P(X \& Y) = P(X) \cdot P(Y)
    The NOT, XOR and AND terms have commonalities with those used in boolean logic, and electrical engineering.
  • Compound-condition probabilities usually involve both AND and OR conditions, making up the two parts of a binomial/multinomial probability.
    1. The AND conditions results in the probability of one solution, which multiplies all the probabilities of each sub-sample.
    2. The OR condition counts how many solutions there are, given by the bi-/multinomial coefficient.
 
  • Factorial ( n! ). The factorial represents the number of ways you can arrange n unique/distinct items.
    n! = 1\cdot2\cdot \,...\, \cdot(n-1)\cdot n
  • Permutation ( nPr ). Permutation refers to how many ways you can distribute r unique/distinct items over n slots, or when the order of the elements matters.
    ^nP_r = n_{(r)} = { { n!} \over {(n-r)! } }
  • Combination ( nCr ). Combination refers to how many ways you can distribute r non-unique/indistinct items over n slots, or when the order of the elements does not matter.
    ^nC_r = {^nP_r}/r! = { n! \over r!(n-r)! }
  • Binomial / Multinomial coefficient ( n choose k, or n choose k1, k2, …, km ). The binomial is the number of ways you can arrange 2 groups with k and n−k non-unique elements over n bins. This is basically the same thing as the combination. The multinomial is the extension to m groups, with kj elements in each group. In both cases you take the base n! for the unique variations, and divide by kj! for the overcounts of all the groups.
    \begin{eqnarray} \text{Binomial: }& { n \choose k}= { n! \over k!\,(n-k)! } \\ \text{Multinomial: }& { n \choose k_1, k_2, \,...,\, k_m }= { n! \over k_1!\,k_2!\,\cdots\,k_m! } \end{eqnarray}

And lastly,

  • Bi-/multinomial probability. When you have m groups with probability pj for each group and you want to know the chance of drawing exactly kj items in n = Σj kj samples, the probability is:
    (18) \begin{eqnarray} P(X_1=k_1\,...\,X_m=k_m) &=& { n \choose k_1,\,...,\,k_m } &\cdot& \prod_{j=1}^m p_j^{k_j} \\ &=& { n! \over k_1!\cdot\,...\, \cdot k_m! } &\cdot& p_1^{k_1}\cdot\,...\,\cdot p_m^{k_m} \end{eqnarray}

    And remember that this holds for just two groups (the binomial case) as well, except that often they'll just pretend that the second group isn't really there. Which is unfortunate, as that's where the (1−p) and (n−k) terms come from.
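As a final sanity check tying back to the range rule of section 2.2: summing the binomial probabilities over every possible k must give exactly 1 (a quick sketch):

```python
from math import comb

# sum of Eq 1 over k = 0..n should be 1 (section 2.2)
n, p = 7, 1/6
total = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(total)  # ≈ 1.0
```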

4 thoughts on “Probabilities in simple random samples”

  1. hum, your conclusion miss the most important things....

    don't play loto...

  2. You could have talked about or for non disjoint probabilities: P(A or B) = P(A) + P(B) - P(A&B)
    Still, very good. If only I read this before my test last Friday...

  3. @ elwing

    I think the phrase is "Gambling is a tax for people that are bad at math."

    @ Tyr

    Oh, disjoint! So that's what it's called. I'd looked for a different term than "mutually exclusive", but couldn't find it.

    About non-disjoint cases, I did cover this in an earlier draft, but then discovered that it gets complicated really fast and decided against including it. For two conditions, it is indeed
    P(A or B) = P(A) + P(B) - P(A \& B) .
    The minus is to get rid of the intersection that was counted twice by using P(A)+P(B). The problem is that this doesn't extend well into more conditions. For three, you get
    P(A\text{ or }B\text{ or }C) = P(A) + P(B) + P(C) \,-\,P(A\&B)-P(A\&C)-P(B\&C) \,+\,P(A\&B\&C)
    Three subtractions because to correct for the overlaps, and then an additional addition to correct for the correction, yiich. I don't even know what it'd be for 4 or more cases. I think in these cases it'll be easier to break up the OR cases into pure disjoint (XOR) and joint (AND) cases. Or even use the complement rule if possible.

    If only I read this before my test last Friday...

    Sorry :\. I'd hoped I got it out on time for any upcoming exams, but I guess not.

  4. Fantastic article. I hope I got the details right. Miss one small thing and your probability calculations give answers that are just plain wrong.

    I think the conclusion is play lotto, but make sure you are the house. =) [Details relating public law, politics, and economics omitted.]
