Section 7

Multivariate Distributions, Covariance & Correlation (BH Chapter 7)

Multinomial (Multivariate Discrete)

Let us say that the vector $\vec{\textbf{X}} = (X_1, X_2, X_3, \dots, X_k) \sim Mult_k(n, \vec{p})$, where $\vec{p} = (p_1, p_2, \dots, p_k)$.

Story - We have $n$ items, and they independently fall into any one of the $k$ buckets with probabilities $\vec{p} = (p_1, p_2, \dots, p_k)$.

Example - Let us assume that every year, 100 students in the Harry Potter Universe are randomly and independently sorted into one of four houses with equal probability. The number of people in each of the houses is distributed $Mult_4(100, \vec{p})$, where $\vec{p} = (.25, .25, .25, .25)$. Note that $X_1 + X_2 + \cdots + X_4 = 100$, and they are dependent.
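To see the story concretely, here is a minimal simulation sketch (assuming numpy is available; the seed is arbitrary) that draws one year of house counts and confirms the components sum to 100:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # arbitrary seed, for reproducibility

# One year of sorting: 100 students, 4 equally likely houses
counts = rng.multinomial(100, [0.25, 0.25, 0.25, 0.25])
print(counts)        # one draw from Mult_4(100, p)
print(counts.sum())  # always 100 -- the counts are constrained, hence dependent
```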

Multinomial Coefficient - The number of permutations of $n$ objects where you have $n_1, n_2, \dots, n_k$ of each of the $k$ different variants is the multinomial coefficient

$${n \choose n_1 n_2 \dots n_k} = \frac{n!}{n_1! n_2! \dots n_k!}$$

Joint PMF - For $n_1, n_2, \dots, n_k$ with $n_1 + n_2 + \cdots + n_k = n$,

$$P(X_1 = n_1, X_2 = n_2, \dots, X_k = n_k) = {n \choose n_1 n_2 \dots n_k} p_1^{n_1} p_2^{n_2} \dots p_k^{n_k}$$
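As a sanity check on the formula, the sketch below (assuming scipy is available; the counts chosen are arbitrary) evaluates the joint PMF directly from the multinomial coefficient and compares it against scipy.stats.multinomial:

```python
from math import factorial

from scipy.stats import multinomial

def multinomial_pmf(ns, p):
    """Evaluate the joint PMF directly from the coefficient formula above."""
    n = sum(ns)
    coef = factorial(n)
    for ni in ns:
        coef //= factorial(ni)  # exact: each partial quotient is an integer
    prob = float(coef)
    for ni, pi in zip(ns, p):
        prob *= pi ** ni
    return prob

p = [0.25, 0.25, 0.25, 0.25]
ns = [30, 25, 25, 20]  # arbitrary counts summing to n = 100
print(multinomial_pmf(ns, p))       # direct evaluation of the formula
print(multinomial(100, p).pmf(ns))  # scipy agrees, up to floating point
```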

Lumping - If you lump together multiple categories in a multinomial, then it is still multinomial. A multinomial with two categories (success, failure) is a binomial distribution.

Marginal PMF and Lumping

$$X_i \sim Bin(n, p_i)$$
$$X_i + X_j \sim Bin(n, p_i + p_j)$$
$$X_1, X_2, X_3 \sim Mult_3(n, (p_1, p_2, p_3)) \Longrightarrow X_1, X_2 + X_3 \sim Mult_2(n, (p_1, p_2 + p_3))$$
$$X_1, X_2, \dots, X_{k-1} \mid X_k = n_k \sim Mult_{k-1}\left(n - n_k, \left(\frac{p_1}{1 - p_k}, \frac{p_2}{1 - p_k}, \dots, \frac{p_{k-1}}{1 - p_k}\right)\right)$$
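The first line is really just lumping in disguise: category $i$ is "success" and everything else is "failure". A small numerical check, assuming scipy (with the house-sorting numbers from the example above), sums the joint PMF over all splits of the non-lumped categories and recovers the binomial marginal:

```python
from scipy.stats import binom, multinomial

n, p = 100, [0.25, 0.25, 0.25, 0.25]
x1 = 30  # arbitrary value for X_1

# Marginal P(X_1 = 30) from the lumped/binomial story:
print(binom(n, p[0]).pmf(x1))

# Same probability by brute force: sum the joint PMF over every way the
# remaining 70 students can split across houses 2-4.
mult = multinomial(n, p)
rest = n - x1
total = sum(
    mult.pmf([x1, a, b, rest - a - b])
    for a in range(rest + 1)
    for b in range(rest - a + 1)
)
print(total)  # matches the binomial PMF up to floating-point error
```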

Covariance and Correlation

Covariance is the two-random-variable equivalent of Variance, defined by the following:

$$Cov(X, Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y)$$
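Both expressions can be checked exactly on a tiny joint distribution. The sketch below uses a made-up joint PMF on $\{0, 1\} \times \{0, 1\}$ (the probabilities are purely illustrative):

```python
# Hypothetical joint PMF: P(X = x, Y = y) for x, y in {0, 1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

EX  = sum(x * p for (x, y), p in joint.items())
EY  = sum(y * p for (x, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())

cov_definition = sum((x - EX) * (y - EY) * p for (x, y), p in joint.items())
cov_shortcut   = EXY - EX * EY
print(cov_definition, cov_shortcut)  # both 0.1, up to floating point
```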

Correlation is a rescaled variant of Covariance that is always between -1 and 1.

$$Corr(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X)Var(Y)}} = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}$$

Covariance and Independence - If two random variables are independent, then they are uncorrelated. The converse is not necessarily true.
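The classic counterexample is worth keeping in mind: take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $Y$ is a deterministic function of $X$, yet their covariance is zero. A few lines verify this exactly:

```python
# X uniform on {-1, 0, 1}, Y = X^2: dependent but uncorrelated
support = [-1, 0, 1]
px = 1 / 3

EX  = sum(x * px for x in support)         # 0
EY  = sum(x**2 * px for x in support)      # 2/3
EXY = sum(x * x**2 * px for x in support)  # E(X^3) = 0
print(EXY - EX * EY)                       # Cov(X, Y) = 0

# Yet P(Y = 0 | X = 0) = 1 while P(Y = 0) = 1/3, so X and Y are dependent.
```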

Covariance and Variance - Covariance of a variable with itself is just the variance!

$$Cov(X, X) = E(XX) - E(X)E(X) = E(X^2) - [E(X)]^2 = Var(X)$$
$$Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)$$
$$Var(X_1 + X_2 + \dots + X_n) = \sum_{i=1}^{n} Var(X_i) + 2\sum_{i < j} Cov(X_i, X_j)$$
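These identities are easy to confirm by simulation. The sketch below (assuming numpy; the seed and the dependence structure are arbitrary) builds two correlated samples and checks the two-variable identity, using the same 1/n normalization on both sides so they match exactly:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)  # correlated with x by construction

lhs = np.var(x + y)
# bias=True gives the 1/n-normalized covariance, matching np.var's default
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)  # equal up to floating-point rounding
```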

Covariance and Linearity - For random variables W, X, Y, Z and constants b, c:

$$Cov(X + b, Y + c) = Cov(X, Y)$$
$$Cov(2X, 3Y) = 6Cov(X, Y)$$
$$Cov(W + X, Y + Z) = Cov(W, Y) + Cov(W, Z) + Cov(X, Y) + Cov(X, Z)$$
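A quick numerical check of all three properties, in the same spirit (assuming numpy; the seed and constants are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=1)
w, x, y, z = rng.normal(size=(4, 100_000))
y = y + 0.3 * x  # introduce some correlation so the checks are non-trivial
z = z + 0.2 * w

def cov(a, b):
    # population-style (1/n) covariance; enough for verifying identities
    return np.mean(a * b) - np.mean(a) * np.mean(b)

print(cov(x + 5, y - 2), cov(x, y))      # shifting drops out
print(cov(2 * x, 3 * y), 6 * cov(x, y))  # constants factor out
print(cov(w + x, y + z),
      cov(w, y) + cov(w, z) + cov(x, y) + cov(x, z))  # bilinearity
```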

Covariance and Invariance - Correlation, Covariance, and Variance are addition-invariant, which means that adding a constant to the term(s) does not change the value. Let b and c be constants:

$$Cov(X + b, Y + c) = Cov(X, Y)$$

In addition to addition-invariance, Correlation is scale-invariant, which means that multiplying the terms by any positive constant does not affect the value (a negative constant flips the sign of the correlation). Covariance and Variance are not scale-invariant.

$$Corr(2X, 3Y) = \frac{Cov(2X, 3Y)}{\sqrt{Var(2X)Var(3Y)}} = \frac{6Cov(X, Y)}{\sqrt{36 Var(X)Var(Y)}} = \frac{Cov(X, Y)}{\sqrt{Var(X)Var(Y)}} = Corr(X, Y)$$
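The same cancellation shows up numerically: np.corrcoef is unchanged by a positive rescaling, while the variance scales by the square of the constant (assuming numpy; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
x = rng.normal(size=100_000)
y = 0.7 * x + rng.normal(size=100_000)

print(np.corrcoef(x, y)[0, 1])          # Corr(X, Y)
print(np.corrcoef(2 * x, 3 * y)[0, 1])  # identical: the scaling cancels

# Variance, by contrast, is not scale-invariant: Var(2X) = 4 Var(X)
print(np.var(2 * x) / np.var(x))        # 4.0, up to floating-point rounding
```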
