Section 6
MGF and Joint, Conditional, and Marginal Distributions (BH Chapter 6)
Moment - Moments describe the shape of a distribution. The first three moments are related to the Mean, Variance, and Skewness of a distribution. The $n$th moment of a random variable $X$ is $\mu_n = E(X^n)$.
The Mean, Variance, and higher-order quantities such as Skewness and Kurtosis can all be expressed in terms of the moments of a random variable.
Mean: $E(X) = \mu_1$, the first moment.
Variance: $\text{Var}(X) = E(X^2) - (E(X))^2 = \mu_2 - \mu_1^2$.
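As a quick illustration (a standard example added here, not a quote from BH): for $X \sim \text{Bern}(p)$ we have $X^n = X$ for every $n \geq 1$, so every moment equals $p$, and

$$E(X) = \mu_1 = p, \qquad \text{Var}(X) = \mu_2 - \mu_1^2 = p - p^2 = p(1-p).$$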
Moment Generating Functions
MGF - For any random variable $X$, the moment generating function (MGF) of $X$ is $M_X(t) = E(e^{tX})$, provided this expectation is finite for all $t$ in some interval $(-a, a)$ centered around 0 (otherwise the MGF does not exist). The MGF is a function of a dummy variable $t$.
Why is it called the Moment Generating Function? Because the $n$th derivative of the moment generating function, evaluated at 0, is the $n$th moment of $X$: $M_X^{(n)}(0) = E(X^n) = \mu_n$.
Why does this relationship hold? By differentiation under the integral sign and then plugging in $t = 0$:

$$M_X^{(n)}(t) = \frac{d^n}{dt^n} E(e^{tX}) = E\!\left(\frac{d^n}{dt^n} e^{tX}\right) = E(X^n e^{tX}), \qquad M_X^{(n)}(0) = E(X^n) = \mu_n.$$
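To see this machinery in action, here is a standard worked example (added to these notes for concreteness). For $X \sim \text{Expo}(\lambda)$:

$$M_X(t) = E(e^{tX}) = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda,$$

$$M_X'(0) = \frac{\lambda}{(\lambda - t)^2}\bigg|_{t=0} = \frac{1}{\lambda} = E(X), \qquad M_X''(0) = \frac{2\lambda}{(\lambda - t)^3}\bigg|_{t=0} = \frac{2}{\lambda^2} = E(X^2),$$

so $\text{Var}(X) = E(X^2) - (E(X))^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$.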
MGF Properties
MGF of a linear combination of X: If we have $Y = aX + b$, then $M_Y(t) = E(e^{t(aX+b)}) = e^{bt} E(e^{(at)X}) = e^{bt} M_X(at)$.
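For example (a standard application, added for illustration): if $Z \sim N(0,1)$ with $M_Z(t) = e^{t^2/2}$, then $X = \mu + \sigma Z$ has

$$M_X(t) = e^{\mu t} M_Z(\sigma t) = e^{\mu t + \sigma^2 t^2/2},$$

which is the MGF of the $N(\mu, \sigma^2)$ distribution.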
Uniqueness of the MGF: If it exists, the MGF uniquely determines the distribution. This means that for any two random variables X and Y, they are distributed the same (their CDFs/PDFs are equal) if and only if their MGFs are equal. Two random variables with the same MGF cannot have different distributions.
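This is what lets us identify a distribution from its MGF. For instance (an added illustration): if we compute $M_X(t) = e^{t^2/2}$ for some random variable $X$, uniqueness tells us $X \sim N(0,1)$, since that is the MGF of the standard Normal.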
Summing Independent R.V.s by Multiplying MGFs:
If $X$ and $Y$ are independent, then the MGF of their sum is the product of their MGFs:

$$M_{X+Y}(t) = E(e^{t(X+Y)}) = E(e^{tX})E(e^{tY}) = M_X(t)M_Y(t),$$

where the middle equality uses independence.
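A standard use of this (added example): if $X \sim \text{Pois}(\lambda_1)$ and $Y \sim \text{Pois}(\lambda_2)$ are independent, then

$$M_{X+Y}(t) = e^{\lambda_1(e^t - 1)}\, e^{\lambda_2(e^t - 1)} = e^{(\lambda_1 + \lambda_2)(e^t - 1)},$$

which, by uniqueness of the MGF, shows that $X + Y \sim \text{Pois}(\lambda_1 + \lambda_2)$.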
Joint Distributions
Sometimes we have more than one random variable of interest, and we want to study probabilities associated with all of the random variables. Instead of studying the distributions of $X_1, X_2, \ldots, X_n$ separately, we can study the distribution of the multivariate vector $(X_1, X_2, \ldots, X_n)$. Joint PDFs and CDFs are the multivariate analogues of univariate PDFs and CDFs. Usually joint PDFs and PMFs carry more information than the marginal ones do, because they account for the interactions between the various random variables. If, however, the random variables are independent, then the joint PMF/PDF is just the product of the marginals, and we get no extra information by studying them jointly rather than marginally.
Both the Joint PMF and Joint PDF must be non-negative and sum/integrate to 1: $\sum_x \sum_y P(X=x, Y=y) = 1$ and $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx\,dy = 1$. As in the univariate case, you sum/integrate the PMF/PDF to get the CDF.
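A small concrete example (added for illustration): flip two independent fair coins, and let $X$ and $Y$ be the indicators of heads on the first and second flip. The joint PMF is $P(X=x, Y=y) = 1/4$ for each of the four pairs $(x,y) \in \{0,1\}^2$; it is non-negative, sums to 1, and factors as the product of the marginals $P(X=x)P(Y=y) = (1/2)(1/2)$, reflecting independence.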
Conditional Distributions
By Bayes' Rule, $P(A \mid B) = \dfrac{P(B \mid A)P(A)}{P(B)}$. Similar relationships hold for conditional distributions of random variables.
For discrete random variables: $P(Y=y \mid X=x) = \dfrac{P(X=x, Y=y)}{P(X=x)} = \dfrac{P(X=x \mid Y=y)\,P(Y=y)}{P(X=x)}$
For continuous random variables: $f_{Y|X}(y \mid x) = \dfrac{f_{X,Y}(x,y)}{f_X(x)} = \dfrac{f_{X|Y}(x \mid y)\,f_Y(y)}{f_X(x)}$
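To make the continuous case concrete (a standard example, added here): let $(X, Y)$ be uniform on the triangle $\{0 < x < y < 1\}$, so $f_{X,Y}(x,y) = 2$ there. The marginal of $X$ is $f_X(x) = \int_x^1 2\,dy = 2(1-x)$ for $0 < x < 1$, and the conditional density is

$$f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x,y)}{f_X(x)} = \frac{2}{2(1-x)} = \frac{1}{1-x}, \qquad x < y < 1,$$

i.e., given $X = x$, $Y$ is Uniform on $(x, 1)$.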