When it comes to math, one of the things that people are phenomenally bad at – aside from the obvious everything – is probabilities. Specifically, calculating probabilities. People are bad enough at estimating chances, but properly calculating them has its own hurdles.
In this particular case, I'd like to take a look at the probability of a [wiki]Simple random sample[/wiki]: getting a particular distribution of results from a number of samples, like getting 3 sixes in a certain number of rolls. Specifically, I'll look at the case with replacement, where the sample pool remains the same after each individual sample. The case without replacement (like picking colored marbles from a vase without putting them back) is a bit trickier, so I won't cover that now.
[h2 id="sec-binom"]The binomial distribution[/h2]
One of the equations that is relevant here is the [wiki]binomial distribution[/wiki] (see [$EQ][[ref:eq-binom-std]]), the scourge of math students everywhere. What I'd like to do is explain why it looks the way it does, what the terms mean and how you can generalize it to cover more than two outcomes (that's what the “bi-” stands for: two). Basically, I'd like to show you how to “read” the equation and to interpret the terms in it, which will make it easier to deal with the subject.
[eq id="eq-binom-std"] P(W=k) = {n\choose k} p^k (1-p)^{n-k} [/eq]First, a few words about the [$EQ][[ref:eq-binom-std]] itself. Say you have a random event that can have two outcomes: coin-flip, true/false question, but also a die-roll if you just look at an X vs non-X result. Or, in the parlance of our times, whether you have a win (W) or a fail (F). What [$EQ][[ref:eq-binom-std]] gives you is the probability of getting exactly k wins in n trials.
[br]That's all well and good, of course, but unless you actually understand how the equation is built up and what all the terms in it mean, it'll just be yet another magic formula that someone dreamed up to torture you with. In practical terms, it also means that it's harder to apply it correctly. So let's look at what it means, starting with the individual symbols.
Symbol | Meaning |
---|---|
W | Win index, or number of wins. |
F | Fail index, or number of fails. |
n | Number of trials. |
k | Selection value (usually linked to the number of wins). |
p | Single-trial probability (usually linked to wins). |
P(cond) | Total probability, with the condition. |
Here's an example of its use. Say you have 7 dice (7d6) and you want to know the probability of throwing 3 sixes. This means you have n[=]7, k[=]3 and p[=]1/6, and the condition is W[=]k[=]3.
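If you want to check numbers like these without doing the arithmetic by hand, the formula translates directly into a few lines of Python. This is just an illustrative sketch (the function name binom_prob is mine; math.comb is the standard-library binomial coefficient):
[code]
from math import comb

def binom_prob(n, k, p):
    # P(W = k): exactly k wins in n trials with single-trial probability p
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 7d6, exactly 3 sixes: n = 7, k = 3, p = 1/6
print(binom_prob(7, 3, 1/6))   # 0.0781..., i.e. about a 7.8% chance
[/code]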
[h2 id="sec-div"]Divide and conquer[/h2]
The next step is to split the equation into the two parts that are of true relevance, namely:
[img id="img-binom-div" type="cptfr" src="[$IMGBASE]binom-div.png" cap="Dividing the probability function into two parts. One for the base chance per solution, and one for how many solutions there are." /]- The base chance of getting a specific result that satisfies your conditions.
- The number of results that satisfy the conditions (each with the same base probability).
I've always found this division to make the most sense. Note that I've split it into two parts, not three. The terms involving p and (1[-]p) belong in the same group, as I'll explain later. The nice thing about this division is that the two parts neatly correspond to two concepts of mathematics: probability and combinatorics. If you understand those two, equations like [$EQ][[ref:eq-binom-std]] should start to make sense.
When I say you need to understand probability theory to understand the binomial distribution, what I really mean is that you understand the rules of probability, and how to combine single chance occurrences into larger conditions. See also [wiki cap="wiki:probability"]Probability#Mathematical_treatment[/wiki].
[h3 id="ssec-prob-def"]Probability definition[/h3]The probability of a selected case x is the number of times it occurs relative to the total pool of cases. This is the fraction given in [$EQ][[ref:eq-prob-def]]. All other rules are the result of this one.
[eq id="eq-prob-def"] P(X) = \dfrac{\# \text{Selected case}}{\# \text{Total cases}} [/eq]For example, we roll a d6 and we're interested in getting a six. Since there are 6 sides and only 1 six, the probability is P(1[ ]six)[=]1/6.
[h3 id="ssec-prob-range"]Probability range[/h3]Since the total number of items is at least as large as the selection, each probability itself is at most 1. A probability can't be lower than zero either, because counts are natural numbers.
This also holds for the sum probability of multiple items. Also, the probability to get any result at all is 1, because obviously something needs to happen. If you sum up probabilities and you get an answer greater than 1, some of the cases you've counted actually overlap. If you add up all probabilities and you get less than 1, you've missed a few cases.
[eq id="eq-prob-range"] 0 \le P(X) \le 1 [/eq] [h3 id="ssec-prob-not"]Complementary rule : NOT[/h3]If the probability of some event is P(X) and the sum probability of all events, then the probability of it not happening is 1[-]P(X). The symbol for “not” is ¬.
[eq id="eq-prob-not"] P(\text{not}\,X) = P(\neg X) = 1 - P(X) [/eq]For example, if the probability of a six on a d6 is 1/6, the probability of not getting a six is P(~six)[=]5/6.
[h3 id="ssec-prob-xor"]Disjoint conditions : XOR[/h3]If you have multiple conditions that are mutually exclusive (disjoint), the probabilities add. This corresponds to an “or” case.
Please note the “mutually” here. It must be either X or Y. For the programmers: the English word “or” actually means an exclusive OR ( ^ in C), not an inclusive OR ( | ). This difference is the main reason why calculated probabilities can end up greater than 1. There are several symbols that can be used for a disjoint condition, all of which are problematic. I'll use the boolean XOR symbol, ⊕, because even though it may be a little obscure for most people, at least it's unambiguous.
[eq id="eq-prob-or"] P(X\text{ or }Y) = P(X \oplus Y) = P(X) + P(Y) [/eq]For example, the chance of getting a 6 or a 5 on a single d6. is P(six or 5)[=]1/6[+]1/6[=]1/3. You can also approach it from the definition of [$EQ][[ref:eq-prob-def]], where you still have a total of 6, but now you have 2 valid cases, leading directly to 2/6.
[note id="nt-xor" title="Language issues and silly symbols" type="nhcare"]The term "or" is one of those areas where human languages, mathematics and logic are at odds. As said, the word “or” actually means an exclusive or (one of the two, but not both), even though in math and logic it's used for the union (the full combination; and/or).
It gets even more vexing when one looks at the symbols used. The C equivalent for OR (the inclusive OR) is | or ||. However, the | symbol is also used for a [wiki]conditional probability[/wiki], so that's out. I can't use the C XOR symbol, ^, either, as that actually means AND in logic. There's also the union symbol, ∪, but I think that stands for inclusive OR, not exclusive. So in the end it comes down to the O-plus, ⊕. While relatively unknown, at least it only has a single meaning.
If anyone has a better idea, I'd love to hear it.
[/note] [h3 id="ssec-prob-and"]Joint conditions : AND[/h3]When you have two independent conditions that both need to be true, the probabilities multiply. Think of it as a sub-selection, where you have to get a fraction of a fraction. This corresponds to an “and” case. The logical symbol for it is a cap ( ∩ ), but since it looks too much like a cup, I'll use the actual AND-symbol ( & ).
[eq id="eq-prob-and"] P(X\text{ and }Y) = P(X \& Y) = P(X) \cdot P(Y) [/eq]An example of this rule is the chance to throw two sixes with two dice. First, you need to throw a six with one for a 1/6 chance, and in the event that that happens, there's another 1/6 chance for the other, combining into P(six and six)[=]1/6[ ]·[ ]1/6[=]1/36. You can also get this result from the definition. With 2 dice you have 36 possibilities in total, and only one of them will contain (6,6). Note that for a 5 and 6 the rules are different, since both (5,6) and (6,5) are possible here.
[h3 id="ssec-prob-use"]Usage in general and binomials in particular[/h3]When considering probabilities, you'll often have multiple trials/samples and a diffent type of answer for each trial. For example, say you have three different options, a, b and c. Each option has a different probability, pa, pb, and pc. Now I'm looking for the probability of getting 2 a, 3 bs and 4 cs. In other words, the number of results for each possibility is ka[=]2, kb[=]3 and kc[=]4, for a total number of trials n[=]Σi[ ]ki[=]10. Now look at one possible result that satisfies the condition, say, "aabbbcccc". This is an AND case, so probabilities multiply. Each trial result gets the probability belonging to that answer, which in this case leads to
[eq] P(\text{exactly } aabbbcccc) = p_a^2\,p_b^3\,p_c^4. [/eq]The generalized case for this has m classes, labelled Xj, and the number of outcomes we're looking for in each class is kj. The probability for a single result that satisfies this condition is [$EQ][[ref:eq-prob-single]]. It may look a little hairy, but all it really says is that you should multiply the probabilities of all the results in the condition.
[eq id="eq-prob-single"] P(k_1 X_1\,\&\,k_2 X_2\,\& ... \&\,k_m X_m) = p_1^{k_1} \,\times\, p_2^{k_2} \,\times\, ... \,\times\, p_m^{k_m} = \prod_{j=1}^m p_j^{k_j} [/eq]Now, look back at the binomial probability. In that case, you have two options, Win and Fail, with probabilities pw and pf, respectively. Let's look at the case for k wins. That is W[=]kw[=]k. Now, because there are only two options (that's the “bi-” in binomial), the probabilities of the fail-case are now also set. The probability for failing is pf[=]p¬w[=]1[-]pw. As for the number of failed cases, remember that the binomial looks for exactly k wins, meaning the rest of the cases fail, leading to kf[=]n−k. This little fact is never mentioned explicitly, but it shouldn't be overlooked.
In the end, what you get is [$EQ][[ref:eq-prob-binom-single]]. The only difference with the actual binomial equation is that the subscript has been removed in the latter. Never underestimate the mathematician's laziness when it comes to writing stuff down.
[eq id="eq-prob-binom-single"] P(W=k_w\,\&\, F=k_f) = p_w^{k_w}\,p_f^{k_f} = p_w^k\,(1-p_w)^{n-k} [/eq]There's still one thing missing though: both [$EQ][[ref:eq-prob-single]] and [$EQ][[ref:eq-prob-binom-single]] only cover the possibility of one specific valid result. In most cases, there will be others as well. In the a,b,c case, for example, I only looked at "aabbbcccc", but "bbbaacccc", "abcabcbcc", or "cccbbaaa" would have worked as well, as would any other combination. You need take all of these into account, but for that you need a way to quickly find all variations first. And that's the topic of the next section: combinatorics.
[h2 id="sec-cmb"]Combinatorics : counting evolved[/h2]Very briefly, [wiki]combinatorics[/wiki] is the study of counting discrete variations. In particular, how you can quickly count them all in an orderly and correct fashion. In this case, what I'll need is a way to quickly count the different possible ways of distributing items over a number of bins. What also matters is whether the items are unique (that is to say, whether you can tell them apart from other items) or non-unique.
I'll start with unique items and then show how to incorporate non-unique ones. This will involve factorials (n!). Along the way, we'll pick up permutations (nPr) and combinations (nCr), which are merely special cases of the general formula.
[h3 id="ssec-cmb-unique"]Arrangements with unique items[/h3]The specific task here is to find out in how many ways you can distribute n unique objects over n different slots. This is a case of combinatorics without replacement, because once you've placed an object in a particular slot, you can't use it anywhere else.
I'll start by looking at an example with four objects, a, b, c and d. Naturally, there will also be four slots to put them in. To count the possibilities, we'll make a tree (see [$FIG][[ref:img-tree-abcd]]). There are two ways to fill the tree: by slot or by item. When working without replacement, it's more convenient to fill by item. Right now it won't matter that much, but when you start dealing with groups of non-unique items, filling by item has a nice way of grouping all of those on the same depth.
As you can see in [$FIG][[ref:img-tree-abcd]], we start with four open slots and four items. First we place a, which has a choice of all four slots. For b, only three slots are left open; c has only two and d takes the last remaining slot. The point here is that each next item has one fewer slot. The last column has all the different arrangements, and as you can see there are 24 in total. The really important point, though, is to note that all the branches for a given level have the same number of sub-branches. This means that you can simply multiply the number of slots left, leading to 4·3·2·1. In other words, it's simply the factorial: 4!.
Generalizing this, given n unique items, the number of variations is n!.
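This is easy to verify by brute force; a small Python check of my own against [$FIG][[ref:img-tree-abcd]]:
[code]
from itertools import permutations
from math import factorial

arrangements = set(permutations("abcd"))   # every ordering of 4 unique items
print(len(arrangements), factorial(4))     # 24 24
[/code]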
[h3 id="ssec-cmb-non-unique"]Arrangements with non-unique items[/h3]Now lets look at what happens when there are also identical-looking items. For example, lets take one a, one b and three cs, for a total of five items (see [$FIG][[ref:img-tree-abccc]]). Technically, for five items you'd get 5! variations, however because the cs are identical some solutions are actually the same: we've overcounted. To resolve this, we need to know how many variations are actually the same. This is simply the number of variations within the c-group. With 3 cs, that means 3!. So to get the final answer, we divide by 3!.
This is becoming similar to the case in section [[ref:ssec-prob-use]]. Again, take the case that you have m groups, kj identical items per group, and n[=]Σj[ ]kj items in total. If all items were unique, you'd get n! variations, but in each group you're counting kj! times too many. The proper number is n! divided by all the overcounts. This term is called the [wiki cap="multinomial coefficient"]Multinomial_theorem[/wiki], and defined as:
[eq id="eq-cmb-mnom"] A(k_1 X_1, ..., k_m X_m) = { n \choose k_1,\,...,\,k_m } = \dfrac{ (k_1 + ... + k_m)! }{ k_1! \,\cdot\, ... \,\cdot\, k_m! } = \dfrac{n!}{ \prod_{j=1}^m\, k_j!} [/eq]As an extra example of this, consider again the case with 2 as, 3 bs and 4 cs. That is, ka[=]2, kb[=]3 and kc[=]4, and n[=]9. The number of arrangements is n! / (ka! kb! kc!)[=] 9!/(2! 3! 4!) = 1260.
[br]There's also a neat little way of describing this graphically, via a table (see [$TBL][[ref:tbl-mnom]]). First, you take the most regular answer, where everything's divided into groups, like aabbbcccc. Write the indices above each term, ultimately resulting in n!. Now break up the string into groups and index each group, giving you all the kj! terms. The result is the full indexed part divided by the group parts.
[tbl id="tbl-mnom"]
\overset{1\,2\,3\,4\,5\,6\,7\,8\,9}{a\,a\,b\,b\,b\,c\,c\,c\,c} | \;\;\Rightarrow\;\; 9! |
\overset{1\,2}{a\,a}\;\overset{1\,2\,3}{b\,b\,b}\;\overset{1\,2\,3\,4}{c\,c\,c\,c} | \;\;\Rightarrow\;\; 2!\,3!\,4! |
| \;\;\Rightarrow\;\;{ 9! \over 2!\,3!\,4!} |
[/tbl]
Permutations and combinations are merely special cases of [$EQ][[ref:eq-cmb-mnom]]. This is not how it's generally taught, but it's still true. In permutation, you have r unique items and n slots. In other words, you have n[-]r identical, empty slots. You only have an overcount for the empty slots, so you end up with
[eq id="eq-cmb-perm"] ^nP_r = { { n!} \over { \underbrace{1!...1!}_{r\,\text{times}} (n-r)! } } = { { n!} \over {(n-r)! } }. [/eq]This also has another name: the [wiki]falling factorial[/wiki], written as n(r). It's called that because it's basically the top r terms of a factorial. For example, 7P3 = 7(3) = 7·6·5. It's written out in the equation below. Notice how the lower terms cancel out.
[eq] ^7P_3 = \frac{7!}{(7-3)!} = \frac{7!}{4!} = \frac{7 \cdot 6 \cdot 5 \cdot 4 \cdot 3 \cdot 2 \cdot 1}{4 \cdot 3 \cdot 2 \cdot 1} = 7 \cdot 6 \cdot 5 [/eq]For combinations, you have r non-unique items over n slots. In this case you have two groups with overcounts, one of size r and one of size n[-]r. This becomes
[eq id="eq-cmb-combi"] ^nC_r = { n \choose r } = { n! \over r! (n-r)! } = {^nP_r \over r!}. [/eq]The combination has three interesting properties.
- One, it can be interpreted as a variation of the permutation, namely by switching from unique to non-unique items. That's where the division by r! comes from.
- Two, it's symmetric between r and n[-]r. That is, nCr and nCn[-]r are identical.
- Three, it uses two groups of identical items, which is exactly what you have in the case of the binomial distribution: one set of wins and one set of fails. This is why it shows up as a multiplier in [$EQ][[ref:eq-binom-std]], and why the combination is also called the binomial coefficient.
Instead of looking at unique versus non-unique, you can also interpret permutation and combination by looking at whether the order of the positions matters. If it does (that is, "123" is not the same as "132"), then it's a permutation. If the order doesn't matter, you have a combination.
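Both counts are available directly in Python's standard library (math.perm and math.comb, since Python 3.8), which makes the relations between them easy to check:
[code]
from math import comb, perm, factorial

n, r = 7, 3
print(perm(n, r))                              # 210 = 7*6*5, the falling factorial
print(comb(n, r), perm(n, r) // factorial(r))  # 35 35: nCr = nPr / r!
print(comb(n, r) == comb(n, n - r))            # True: symmetry between r and n-r
[/code]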
[br]In the above, I've considered the binomial coefficient as a special case of the multinomial, but you can also do the reverse: string multiple binomials together to form a multinomial. In this case, you first distribute k1 elements over the whole n slots. This can be done in nCk1 ways. At this point, you've got n[-]k1 slots remaining, over which you distribute k2. This leaves n[-]k1[-]k2 for group 3, and so on until you've got no slots or items left. This is illustrated by [$EQ][[ref:eq-cmb-mnom2]]. Note how part of the denominator of one term cancels out against the numerator of the next.
[eq id="eq-cmb-mnom2"] \begin{array}{} { n \choose k_1 } { n-k_1 \choose k_2 } { n-k_1-k_2 \choose k_3 } \dots { k_m \choose k_m } &=& { n! \over k_1! (n-k_1)! }\, { (n-k_1)! \over k_2! (n-k_1-k_2)! } \,\cdots\, { k_m! \over k_m! 0! } \\ &=& { n! \over k_1!\,k_2!\,\cdots\,k_m! } \\ &=& { n \choose k_1,\, k2, \,...\, ,\, k_m } \end{array} [/eq]The binomial and multinomial coefficients are usually written as a 2-by-1 matrix between parentheses (see [$EQ][[ref:eq-cmb-choose]]). The construct is pronounced “n choose k”. No, I don't know why either.
[eq id="eq-cmb-choose"] \begin{array}{} \text{Binomial: }& { n \choose k}= { n! \over k!\,(n-k)! } \\ \text{Multinomial: }& { n \choose k_1, k_2, \,...\,, k_m }= { n! \over k_1!\,k_2!\,\cdots\,k_m! } \end{array} [/eq] [h3 id="ssec-binom]Back to the binomial[/h3]Now that both parts of the division has been covered, lets go back to the binomial distribution and see how they fit together. For convenience, here's what we were talking about again.
[eq id="eq-binom-ex"] \begin{array}{} P(W=k \text{ in } n) &=& {n\choose k} p^k (1-p)^{n-k} \\ \text{or} \\ P(W=k, F=n-k) &=& {n\choose k} p_w^k \, p_f^{n-k} \end{array} [/eq]- The base chance of getting a specific result that satisfies your conditions.
- The number of results that satisfy the conditions (each with the same base probability).
The thing about these two parts is that they actually represent two of the rules of probability, namely those describing how probabilities combine. Part 1, the base chance, is a joint condition: an AND case. The second part covers all the mutually exclusive ways that the joint condition can be satisfied.
For example with 2 as and 1 b, the ways the cases combine are aab, aba or baa. If you were to write it out completely, you'd get
[eq] P(A=2,B=1) = \begin{array}{} (\,P(A) \text{ and } P(A) \text{ and } P(B)\,) &\text{ or } \\ (\,P(A) \text{ and } P(B) \text{ and } P(A)\,) &\text{ or } \\ (\,P(B) \text{ and } P(A) \text{ and } P(A)\,) & \end{array} [/eq]Obviously, you're not going to write this out all the time, especially not when the numbers get large, but this is really what you're looking for. The base chance and the combinations are ways of short-circuiting this. As said, the base chance is the probability of the string of ANDs, which will be the same for each case. Because of this, the addition of all the (X)ORs simply becomes a multiplication by the number of (X)ORs that you have, which is given by the binomial, or the multinomial if you have more than two options.
[br]Another interesting point is that the binomial and multinomial coefficients themselves can be read as probabilities as well. It's just that because we're certain kj items will exist if we're just arranging items, the probability for each group is 1, so the whole probability part is unnecessary.
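To tie the two parts together in code, here's a general multinomial probability function. It's a minimal sketch of my own (multinomial_prob is not a library function), with the binomial as the two-group special case:
[code]
from math import factorial, prod

def multinomial_prob(ks, ps):
    # Probability of exactly ks[j] results of class j in sum(ks) samples,
    # where ps[j] is the single-trial probability of class j.
    coeff = factorial(sum(ks))
    for k in ks:
        coeff //= factorial(k)        # divide out the overcounts per group
    return coeff * prod(p**k for p, k in zip(ps, ks))

# binomial special case: exactly 3 sixes in 7d6 (wins and fails as two groups)
print(multinomial_prob([3, 4], [1/6, 5/6]))   # 0.0781..., same as before
[/code]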
[h2 id="sec-eg"]Examples[/h2]Because I know all of this can be a little mesmerizing at first, I thought I'd give a few examples as well.
[h3 id="ssec-eg-binom"]Beastly dice[/h3]First, simple example of the binomial distribution. Say we have five dice (5d6) and we're looking at the different ways of getting exactly three sixes. Note that this also means the remaining cases should be a non-six!
First, the probability of a six is p6[=]1/6, and of a non-six p¬6[=]5/6. As for the group-counts, we have 2 groups with k[=]k6[=]3 and k¬6[=]2, and n[=]5. Putting this together results in [$EQ][[ref:eq-eg-666]]. Note the base chance of getting any triple-six result, and the number of ways you can vary this over five dice.
[eq id="eq-eg-666"] P(3 \times 6 \text{ in } 5) = P(3 \times 6 \,\&\, 2 \times \not 6) = { 5 \choose 3} p_6^3 p_{\not6}^2 = 10\cdot(\frac{1}{6})^3 (\frac{5}{6})^2 = 3.21% [/eq]There's also another way of getting to this result, namely by direct use of the definition of probability, [$EQ][[ref:eq-prob-def]]. To do this, we need know the total number of possibilities, and how many will result in 3 sixes. With five dice, the total is T[=]65. There are 5C3[=]10 ways of getting exactly 3 sixes in five dice and 5 non-six options for the remaining two dice, leading to a total of valid cases of 10·52[=]250.
[eq] P(3 \times 6 \text{ in } 5) = \dfrac{ \#(3\times 6 \text{ in } 5) }{ \#\text{total} } = \dfrac{ 10 \cdot 5^2 }{ 6^5 } = \dfrac{250}{7776} = 3.21% [/eq] [h3 id="ssec-eg-full"]Yahtzee : full house[/h3]More dice, this time looking at the game of Yahtzee. A full house is three of a kind plus a pair, like 66655, 11122 or 22333, and any variation thereof. In this particular case I want to look at getting a full house in one go, because taking all three turns into account is way too hairy.
The case is actually similar to the previous one in that you have 3 of one group and 2 of another, but the probabilities are a little different. For a specific full house, say 11122, the base chance is 1/6⁵. However, we're not looking for a specific one, but any full house. This means that there are 6 options for the first group, and 5 for the second. The final probability is
[eq id="eq-eg-full"] P(\text{any full house}) = 6 \cdot 5 \times P(\text{full house}) = 30\cdot{ 5 \choose 3 } \times (\frac{1}{6})^5 = 30\cdot10 \times \frac{1}{7776} = 3.86% [/eq]An alternative approach is to consider the group-determining dice to be “free”, giving the probability 6/6 and 5/6 for the first and second group, respectively. After that, the remaining dice have to follow suit to give (1/6)3.
[h3 id="ssec-eg-wheel"]Multinomial : spinning wheels[/h3] [img id="img-wheel" type="cptfr" src="[$IMGBASE]wheel.png" cap="10-lettered wheel; na[=]2, nb[=]3, nc[=]5." /]And now for an application of the multinomial. Consider the wheel from [$FIG][[ref:img-wheel]]. It has 2 as, 3 bs and 5 cs: na[=]2, nb[=]3, nc[=]5. The probabilities for each letter follow from the probability definition, resulting in pa[=]2/10[=]20%, pb[=]3/10[=]30% and pc[=]5/10[=]50%.
The question is: what's the chance of getting 3 as, 2 cs in 9 spins?
[br]Well, first you have to note that 3+2 is less than 9, so there's a hidden condition: the remaining 4 spins should come up b. This transforms the question into 3 as, 4 bs and 2 cs, or ka[=]3, kb[=]4, kc[=]2. Also note that these counts have nothing to do with how many of each letter appear on the wheel.
[eq id="eq-eg-wheel"] \begin{array}{} P(A=3,C=2 \text{ in } 9) &=& P(A=3,B=4,C=2) \\[10pt] &=& \dfrac{ (k_a + k_b + k_c)! }{ k_a!\,k_b!\,k_c! } p_a^{k_a}\,p_b^{k_b}\,p_c^{k_c}\, \\[10pt] &=& \dfrac{ (3+4+2)! }{ 3! 4! 2! } (\frac{2}{10})^3 \, (\frac{3}{10})^4 \, (\frac{5}{10})^2\, \\[10pt] &=& 1260 \cdot \frac{8 \cdot 81 \cdot 25 }{10^9} = 2.04 % \end{array} [/eq] [h2 id="sec-summary"]Summary[/h2]As this post covers quite a bit, I think a summary might be in order.
- Rules of probability:
  - Definition: the probability of event X is the number of occurrences over the total number of events. Do not forget this rule, as it's the basis for all the others. [eq] P(X) = \frac{\# \text{Xs}}{\# \text{All events}} [/eq]
  - Complementary rule (NOT) : the probability of not getting X is the rest of the cases. [eq] P(\neg X) = 1 - P(X) [/eq]
  - Disjoint/parallel condition ((X)OR) : union of mutually exclusive cases; probabilities add. [eq] P(X \oplus Y) = P(X) + P(Y) [/eq]
  - Joint/serial condition (AND) : intersection of cases; probabilities multiply. [eq] P(X \& Y) = P(X) \cdot P(Y) [/eq]
- Compound-condition probabilities usually involve both AND and OR conditions, making up the two parts of a binomial/multinomial probability.
  - The AND condition results in the probability of one solution, which multiplies the probabilities of each sub-sample.
  - The OR condition counts how many solutions there are, given by the bi-/multinomial coefficient.
- Factorial ( n! ). The factorial represents the number of ways you can arrange n unique/distinct items. [eq] n! = 1\cdot2\cdot \,...\, \cdot(n-1)\cdot n [/eq]
- Permutation ( nPr ). Permutation refers to how many ways you can distribute r unique/distinct items over n slots, or when the order of the elements matters. [eq] ^nP_r = n_{(r)} = { n! \over (n-r)! } [/eq]
- Combination ( nCr ). Combination refers to how many ways you can distribute r non-unique/indistinct items over n slots, or when the order of the elements does not matter. [eq] ^nC_r = {^nP_r}/r! = { n! \over r!(n-r)! } [/eq]
- Binomial / multinomial coefficient ( n choose k1, k2, … km ). The binomial is the number of ways you can arrange 2 groups with k and n[-]k non-unique elements over n bins; this is basically the same thing as the combination. The multinomial is the extension to m groups, with kj elements in each group. In both cases you take the base n! for the unique variations, and divide by kj! for the overcounts of all the groups. [eq] \begin{array}{} \text{Binomial: }& { n \choose k}= { n! \over k!\,(n-k)! } \\ \text{Multinomial: }& { n \choose k_1, k_2, \,...,\, k_m }= { n! \over k_1!\,k_2!\,\cdots\,k_m! } \end{array} [/eq]
And lastly,
- Bi-/multinomial probability. When you have m groups with probability pj for each group and you want to know the chance of drawing exactly kj items in n[=]Σj[ ]kj samples, the probability is: [eq id="eq-prob-mnom"] \begin{array}{} P(X_1=k_1\,...\,X_m=k_m) &=& { n \choose k_1,\,...,\,k_m } &\cdot& \prod_{j=1}^m p_j^{k_j} \\ &=& { n! \over k_1!\cdot\,...\, \cdot k_m! } &\cdot& p_1^{k_1}\cdot\,...\,\cdot p_m^{k_m} \end{array} [/eq]
And remember that this holds for just two groups (the binomial case) as well, except that often they'll just pretend that the second group isn't really there. Which is unfortunate, as that's where the (1[-]p) and (n[-]k) terms come from.
hum, your conclusion misses the most important thing....
don't play lotto...
You could have talked about “or” for non-disjoint probabilities: P(A or B) = P(A) + P(B) - P(A&B)
Still, very good. If only I'd read this before my test last Friday...
@ elwing
I think the phrase is "Gambling is a tax for people that are bad at math."
@ Tyr
Oh, disjoint! So that's what it's called. I'd looked for a different term than "mutually exclusive", but couldn't find it.
About non-disjoint cases, I did cover this in an earlier draft, but then discovered that it gets complicated really fast and decided against including it. For two conditions, it is indeed
[eq] P(A \cup B) = P(A) + P(B) - P(A \,\&\, B). [/eq]
The minus is to get rid of the intersection that was counted twice by using P(A)+P(B). The problem is that this doesn't extend well into more conditions. For three, you get
[eq] P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \& B) - P(A \& C) - P(B \& C) + P(A \& B \& C). [/eq]
Three subtractions to correct for the overlaps, and then an additional addition to correct for the correction, yiich. I don't even know what it'd be for 4 or more cases. I think in these cases it'll be easier to break up the OR cases into pure disjoint (XOR) and joint (AND) cases. Or even use the complement rule if possible.
Sorry :\. I'd hoped I'd gotten it out in time for any upcoming exams, but I guess not.
Fantastic article. I hope I got the details right. Miss one small thing and your probability calculations give answers that are just plain wrong.
I think the conclusion is play lotto, but make sure you are the house. =) [Details relating public law, politics, and economics omitted.]