Section 1

The Language of Probability (BH Chapter 1)

Like the English language, the language of probability has its own nouns, verbs, and adjectives. Confusing these parts of speech will result in "category errors." Let's familiarize ourselves with these parts of speech.

  • Probability: A probability is a number between 0 and 1 inclusive that expresses how likely a particular event is to occur, relative to all possible outcomes in the sample space. Moreover, $P(\Omega) = 1$.

  • Experiment: An experiment is a process that produces an outcome of an uncertain phenomenon.

  • Sample space ($\Omega$): The sample space is the set of all possible outcomes of the experiment.

  • Event: An event is a subset of the sample space.

  • Naive Definition: If all outcomes are equally likely, the probability of event $A$ occurring is

$$P_{\text{naive}}(A) = \frac{\text{number of outcomes favorable to } A}{\text{total number of outcomes}}$$
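
A minimal sketch of the naive definition in Python, using a fair six-sided die as the experiment; the die, the event "the roll is even," and the variable names are illustrative choices, not from the text.

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}

# Event A: the roll is even.
A = {s for s in omega if s % 2 == 0}

# Naive probability: favorable outcomes / total outcomes.
# This is valid here because all six outcomes are equally likely.
p_naive = Fraction(len(A), len(omega))
print(p_naive)  # 1/2
```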

Set Theory

Intersection - Given two events A and B, A ∩ B means "both A and B occur".

Union - Given two events A and B, A ∪ B means "A or B (or both) occurs".

Complement - Given an event A, $A^c$ is called A’s complement and means "A does not occur": it contains everything in Ω that is not in A.

Subevent - Given two events A and B, A ⊆ B means "B includes everything in A": if A occurs, then B also occurs. Every valid event A is a subevent of the total sample space Ω: A ⊆ Ω.
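
A small sketch of these set operations using Python's built-in set type; the sample space (one die roll) and the particular events A and B are made-up examples.

```python
# Outcomes are die rolls; A = "roll is even", B = "roll is at least 4".
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {4, 5, 6}

print(A & B)        # intersection A ∩ B: {4, 6}
print(A | B)        # union A ∪ B: {2, 4, 5, 6}
print(omega - A)    # complement of A within omega: {1, 3, 5}
print({4, 6} <= A)  # subevent check {4, 6} ⊆ A: True
print(A <= omega)   # every event is a subevent of the sample space: True
```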

De Morgan’s Laws - A useful pair of identities that can make calculations easier by relating unions to intersections. Analogous results hold with more than two sets.

$$(A \cup B)^{c} = A^{c} \cap B^{c}$$
$$(A \cap B)^{c} = A^{c} \cup B^{c}$$
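
A quick numerical check of both laws; the finite sample space and the events A and B below are arbitrary illustrations.

```python
omega = set(range(1, 11))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(S):
    """Complement relative to the sample space omega."""
    return omega - S

# (A ∪ B)^c == A^c ∩ B^c
assert complement(A | B) == complement(A) & complement(B)
# (A ∩ B)^c == A^c ∪ B^c
assert complement(A & B) == complement(A) | complement(B)
print("De Morgan's laws hold for this example.")
```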

Principle of Inclusion-Exclusion - For any events $A_1, \dots, A_n$,

$$P\left(\bigcup_{i=1}^{n} A_{i}\right)=\sum_{i} P\left(A_{i}\right)-\sum_{i<j} P\left(A_{i} \cap A_{j}\right)+\cdots+(-1)^{n+1} P\left(A_{1} \cap \cdots \cap A_{n}\right)$$
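
A brute-force check of inclusion-exclusion under the naive definition; the 20-outcome sample space and the three divisibility events are arbitrary choices for illustration.

```python
from fractions import Fraction
from itertools import combinations

omega = set(range(1, 21))               # 20 equally likely outcomes
events = [
    {n for n in omega if n % 2 == 0},   # multiples of 2
    {n for n in omega if n % 3 == 0},   # multiples of 3
    {n for n in omega if n % 5 == 0},   # multiples of 5
]

def p(S):
    """Naive probability of an event S."""
    return Fraction(len(S), len(omega))

# Right-hand side: alternating sum over all non-empty collections of events.
rhs = Fraction(0)
for r in range(1, len(events) + 1):
    for combo in combinations(events, r):
        rhs += (-1) ** (r + 1) * p(set.intersection(*combo))

# Left-hand side: probability of the union, computed directly.
lhs = p(set.union(*events))
assert lhs == rhs
print(lhs)  # 7/10
```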

Counting

Multiplication Rule - If we have $n$ decisions to make and the $j$-th decision has $r_j$ possible outcomes, then the total number of potential outcomes is $r_1 \cdot r_2 \cdots r_{n-1} \cdot r_n$.
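
A tiny sketch of the multiplication rule; the three hypothetical decisions (4 appetizers, 3 mains, 2 desserts) are invented for illustration.

```python
from itertools import product
from math import prod

# Hypothetical decisions: choose one of 4 appetizers, 3 mains, 2 desserts.
appetizers, mains, desserts = range(4), range(3), range(2)

n_meals = prod([4, 3, 2])  # multiplication rule: 4 * 3 * 2
assert n_meals == len(list(product(appetizers, mains, desserts)))
print(n_meals)  # 24
```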

Factorial - The number of ways to order $n$ distinct objects is $n! = n \cdot (n-1) \cdots 2 \cdot 1$.
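
A quick check that $n!$ counts orderings, enumerating the permutations of four made-up items.

```python
from itertools import permutations
from math import factorial

items = ["a", "b", "c", "d"]

# n! counts the orderings of n distinct objects; verify by enumeration.
assert factorial(len(items)) == len(list(permutations(items)))
print(factorial(len(items)))  # 24
```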

Binomial Coefficient Formula - For $k \le n$, we have

$$\binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!} = \frac{n!}{k!\,(n-k)!}$$

$$\binom{n}{k} = \binom{n}{n-k}$$
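
A small sketch computing the binomial coefficient three ways (closed form, enumeration, and the symmetry identity); the values $n = 6$, $k = 2$ are arbitrary.

```python
from itertools import combinations
from math import comb, factorial

n, k = 6, 2

# Three routes to the same number.
assert comb(n, k) == factorial(n) // (factorial(k) * factorial(n - k))
assert comb(n, k) == len(list(combinations(range(n), k)))
assert comb(n, k) == comb(n, n - k)   # symmetry identity
print(comb(n, k))  # 15
```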

Binomial Theorem - $(x+y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}$
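
A numeric spot-check of the binomial theorem for arbitrarily chosen integer values of $x$, $y$, and $n$.

```python
from math import comb

x, y, n = 2, 5, 4

# Expand (x + y)^n term by term and compare with direct evaluation.
expansion = sum(comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
assert expansion == (x + y) ** n
print(expansion)  # 2401
```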

The sampling table gives the number of possible samples of size $k$ out of a population of size $n$, under various assumptions about how the sample is collected.

$$\begin{array}{c|cc} & \text{Order Matters} & \text{Order Doesn't Matter} \\ \hline \text{With Replacement} & n^{k} & \binom{n+k-1}{k} \\ \text{Without Replacement} & \frac{n!}{(n-k)!} & \binom{n}{k} \end{array}$$
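
A sketch that evaluates all four entries of the sampling table for an arbitrary choice of $n = 5$ and $k = 3$, cross-checking each formula by enumerating the samples with itertools.

```python
from itertools import (combinations, combinations_with_replacement,
                       permutations, product)
from math import comb, factorial

n, k = 5, 3          # population size n, sample size k
pop = range(n)

counts = {
    ("with replacement", "order matters"): n**k,
    ("with replacement", "order doesn't matter"): comb(n + k - 1, k),
    ("without replacement", "order matters"): factorial(n) // factorial(n - k),
    ("without replacement", "order doesn't matter"): comb(n, k),
}

# Each formula should match a direct enumeration of the samples.
assert counts[("with replacement", "order matters")] == len(list(product(pop, repeat=k)))
assert counts[("with replacement", "order doesn't matter")] == len(list(combinations_with_replacement(pop, k)))
assert counts[("without replacement", "order matters")] == len(list(permutations(pop, k)))
assert counts[("without replacement", "order doesn't matter")] == len(list(combinations(pop, k)))

for cell, value in counts.items():
    print(cell, value)  # e.g. ('with replacement', 'order matters') 125
```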
