Section 5
Continuous Probability and Poisson Process (BH Chapter 5)
What is a Continuous Random Variable (CRV)? A continuous random variable can take on any possible value within a certain interval (for example, $[0, 1]$), whereas a discrete random variable can only take on values in a countable list (for example, all the integers, or the values $1, \tfrac{1}{2}, \tfrac{1}{4}, \tfrac{1}{8}, \dots$).
PMFs vs. PDFs: Discrete r.v.s have Probability Mass Functions (PMFs), while continuous r.v.s have Probability Density Functions (PDFs). We can visualize a PDF as a graph whose x-axis is the support of the random variable and whose height at $x$ is the density $f(x)$.
Intuitively, what do the values $f(x)$ represent? It doesn't make sense to talk about $P(X = x)$ for a continuous r.v., because $P(X = x) = 0$ for all $x$. Instead, think of $f(x)$ as the relative frequency of landing near $x$: for small $\epsilon$, $P\left(x - \tfrac{\epsilon}{2} < X < x + \tfrac{\epsilon}{2}\right) \approx \epsilon f(x)$.
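As a quick numerical illustration of this $\epsilon f(x)$ intuition, here is a minimal Python sketch (the standard Normal and scipy.stats are assumed choices for the example, not anything from BH):

```python
import numpy as np
from scipy import stats

# Sketch: for small eps, P(x - eps/2 < X < x + eps/2) is roughly eps * f(x).
X = stats.norm()  # standard Normal, an assumed example CRV
x, eps = 1.0, 1e-3

exact = X.cdf(x + eps / 2) - X.cdf(x - eps / 2)  # exact interval probability
approx = eps * X.pdf(x)                          # density-based approximation
print(exact, approx)                             # agree to many decimal places
```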
What is the Cumulative Distribution Function (CDF)? It is the following function of $x$: $F(x) = P(X \le x)$. Any valid CDF $F$ satisfies three properties:
1) $F$ is increasing.
2) $F$ is right-continuous.
3) $F(x) \to 0$ as $x \to -\infty$, and $F(x) \to 1$ as $x \to \infty$.
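A minimal sketch checking properties 1) and 3) numerically, assuming an Expo(1) example and scipy.stats (property 2) holds automatically here, since the CDF of a continuous r.v. is continuous everywhere):

```python
import numpy as np
from scipy import stats

F = stats.expon().cdf            # CDF of Expo(1), an assumed example
xs = np.linspace(-5, 50, 2001)

print(np.all(np.diff(F(xs)) >= 0))  # 1) F is increasing (never decreases)
print(F(-1e9), F(1e9))              # 3) F -> 0 as x -> -inf, F -> 1 as x -> inf
```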
What is the Probability Density Function (PDF)? The PDF, $f(x)$, is the derivative of the CDF: $f(x) = F'(x)$.
Thus, to find the probability that a CRV takes on a value in an interval, you can integrate the PDF over that interval, thus finding the area under the density curve: $P(a \le X \le b) = \int_a^b f(x)\,dx = F(b) - F(a)$.
Two additional properties of a PDF:
1) The PDF must always be nonnegative: $f(x) \ge 0$.
2) The PDF must integrate to 1, because the probability that a CRV falls in the interval $(-\infty, \infty)$ is 1: $\int_{-\infty}^{\infty} f(x)\,dx = 1$.
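A minimal sketch of both properties, plus the interval probability $P(a \le X \le b) = F(b) - F(a)$, assuming an Expo(1) example:

```python
import numpy as np
from scipy import integrate, stats

X = stats.expon()                     # Expo(1), an assumed example
xs = np.linspace(0, 20, 1001)
assert (X.pdf(xs) >= 0).all()         # 1) the PDF is nonnegative

total, _ = integrate.quad(X.pdf, 0, np.inf)  # support of Expo(1) is [0, inf)
print(total)                          # 2) the PDF integrates to ~1.0

a, b = 0.5, 2.0                       # P(a <= X <= b) two ways:
area, _ = integrate.quad(X.pdf, a, b)
print(area, X.cdf(b) - X.cdf(a))      # area under f equals F(b) - F(a)
```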
How do I find the expected value of a CRV? Where in discrete cases you sum over the probabilities, in continuous cases you integrate over the densities: $E(X) = \sum_x x\,P(X = x)$ becomes $E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$.
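A minimal sketch of both computations, with Pois(3) and Expo(1) as assumed examples:

```python
import numpy as np
from scipy import integrate, stats

# Discrete: sum x * P(X = x). Pois(3) is an assumed example; true mean is 3.
k = np.arange(200)
print(np.sum(k * stats.poisson(3.0).pmf(k)))   # ~3.0

# Continuous: integrate x * f(x). Expo(1) is an assumed example; true mean is 1.
X = stats.expon()
mean, _ = integrate.quad(lambda x: x * X.pdf(x), 0, np.inf)
print(mean)                                    # ~1.0
```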
Universality of the Uniform: When you plug any continuous random variable into its own CDF, you get a Uniform[0, 1] random variable. When you plug a Uniform[0, 1] random variable into an inverse CDF, you get a random variable with that CDF. For example, let's say that a random variable $X$ has CDF $F(x) = 1 - e^{-x}$, for $x > 0$.
By the Universality of the Uniform, if we plug $X$ into this function then we get a uniformly distributed random variable: $F(X) = 1 - e^{-X} \sim \text{Unif}(0, 1)$. Similarly, since $F(X) \sim \text{Unif}(0, 1)$, if $U \sim \text{Unif}(0, 1)$ then $F^{-1}(U) = -\ln(1 - U)$ has the same distribution as $X$. The key point is that for any continuous random variable $X$, we can transform it into a Uniform random variable and back by using its CDF.
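A simulation sketch of both directions of this transformation, assuming $X \sim \text{Expo}(1)$ so that $F(x) = 1 - e^{-x}$ and $F^{-1}(u) = -\ln(1 - u)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Direction 1: plug X ~ Expo(1) into its own CDF F(x) = 1 - e^(-x);
# F(X) should behave like Unif(0, 1).
x = rng.exponential(size=100_000)
u = 1 - np.exp(-x)
print(u.mean(), u.var())    # ~0.5 and ~1/12, matching Unif(0, 1)

# Direction 2: plug U ~ Unif(0, 1) into the inverse CDF F^-1(u) = -ln(1 - u);
# the result should behave like Expo(1).
u2 = rng.uniform(size=100_000)
x2 = -np.log(1 - u2)
print(x2.mean(), x2.var())  # ~1 and ~1, matching Expo(1)
```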
LOTUS (Law of the Unconscious Statistician): To find $E(g(X))$, the expected value of a function of a random variable, you can work directly with the distribution of $X$; there is no need to first find the distribution of $g(X)$. In one dimension, we have the following.
For discrete random variables: $E(g(X)) = \sum_x g(x)\,P(X = x)$. For continuous random variables: $E(g(X)) = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx$.
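A minimal LOTUS sketch, with $g(x) = x^2$ and an Expo(1) example assumed:

```python
import numpy as np
from scipy import integrate, stats

# LOTUS, continuous case: E(g(X)) = integral of g(x) * f(x) dx.
# g(x) = x^2 and X ~ Expo(1) are assumed choices; E(X^2) = 2 for Expo(1).
X = stats.expon()
lotus, _ = integrate.quad(lambda x: x**2 * X.pdf(x), 0, np.inf)
print(lotus)  # ~2.0

# Sanity check by simulation: average g over draws of X.
rng = np.random.default_rng(0)
print((rng.exponential(size=1_000_000) ** 2).mean())  # ~2.0
```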
The Poisson process gives a story that links the Exponential distribution with the Poisson distribution. A Poisson process with rate $\lambda$ has the following properties:
1) The number of arrivals that occur in an interval of length $t$ is distributed $\text{Pois}(\lambda t)$.
2) The numbers of arrivals that occur in disjoint intervals are independent of each other.
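A simulation sketch of property 1), building the process from Expo($\lambda$) interarrival times (the rate $\lambda = 2$ and horizon $t = 3$ are arbitrary assumed choices):

```python
import numpy as np

# Build a Poisson process from Expo(lam) interarrival times, then count
# arrivals in [0, t]; by property 1) the counts should be Pois(lam * t).
rng = np.random.default_rng(0)
lam, t, trials = 2.0, 3.0, 100_000

interarrivals = rng.exponential(1 / lam, size=(trials, 50))  # 50 is plenty here
arrival_times = np.cumsum(interarrivals, axis=1)
counts = (arrival_times <= t).sum(axis=1)

# For a Pois(lam * t) count, mean and variance are both lam * t = 6.
print(counts.mean(), counts.var())
```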
Count-Time Duality: Instead of asking how many events occur within some amount of time, we can flip the question around and ask how long it takes until some number of events occur. Let $T_n$ be the amount of time it takes until the $n$-th event occurs, and let $N_t$ be the number of events that occur within time $t$. What relationship do we have between $T_n$ and $N_t$? We have $P(T_n > t) = P(N_t < n)$.
To reason about this in words, the event that the $n$-th arrival time is greater than $t$ is equivalent to the event that the number of arrivals by time $t$ is less than $n$.
Using count-time duality, we can discern the distribution of $T_n$. Let us first look at the distribution of the first arrival time, $T_1$.
For the first arrival, we have $P(T_1 > t) = P(N_t < 1) = P(N_t = 0) = e^{-\lambda t}$, so $P(T_1 \le t) = 1 - e^{-\lambda t}$. This is the CDF of $\text{Expo}(\lambda)$, so $T_1 \sim \text{Expo}(\lambda)$!
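A quick simulation check of this derivation, with $\lambda = 2$ and $t = 1.5$ as assumed choices:

```python
import numpy as np

# Count-time duality check: P(T_1 > t) = P(N_t = 0) = e^(-lam * t).
rng = np.random.default_rng(0)
lam, t = 2.0, 1.5

n_t = rng.poisson(lam * t, size=1_000_000)  # number of arrivals by time t
print((n_t == 0).mean(), np.exp(-lam * t))  # both ~0.0498
```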
Moments: Moments describe the shape of a distribution. The $n$-th moment of a random variable $X$ is $\mu_n = E(X^n)$. The first three moments are related to the Mean, Variance, and Skewness of a distribution; indeed, the Mean, Variance, and other shape summaries (Skewness, Kurtosis, etc.) can all be expressed in terms of the moments of a random variable.
Mean: $E(X) = \mu_1$, the first moment.
Variance: $\text{Var}(X) = E(X^2) - (E(X))^2 = \mu_2 - \mu_1^2$, a function of the first two moments.
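A minimal sketch estimating the first two moments by simulation, with Expo(1) assumed:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=1_000_000)  # Expo(1): E(X) = 1, E(X^2) = 2

m1 = x.mean()        # first moment,  E(X)
m2 = (x**2).mean()   # second moment, E(X^2)

print(m1)            # mean = first moment, ~1.0
print(m2 - m1**2)    # variance = E(X^2) - (E(X))^2, ~1.0
```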
MGF: For any random variable $X$, the moment generating function (MGF) of $X$ is the expected value of an exponential function of $X$, $M_X(t) = E(e^{tX})$, provided this exists for a finitely-sized interval centered around 0. The MGF is just a function of a dummy variable $t$. Why is it called the Moment Generating Function? Because the $n$-th derivative of the moment generating function, evaluated at 0, is the $n$-th moment of $X$: $M_X^{(n)}(0) = E(X^n)$.
Why does this relationship hold? By differentiation under the integral sign and then plugging in $t = 0$: $M_X'(t) = \frac{d}{dt} E(e^{tX}) = E\left(\frac{d}{dt} e^{tX}\right) = E(X e^{tX})$, so $M_X'(0) = E(X)$. Differentiating $n$ times before plugging in $t = 0$ gives $M_X^{(n)}(0) = E(X^n)$.
Uniqueness of the MGF: If it exists, the MGF uniquely defines the distribution. This means that for any two random variables $X$ and $Y$, they are distributed the same (their CDFs/PDFs are equal) if and only if their MGFs are equal.
Summing Independent R.V.s by Multiplying MGFs: If $X$ and $Y$ are independent, then the MGF of the sum of two random variables is the product of the MGFs of those two random variables: $M_{X+Y}(t) = E(e^{t(X+Y)}) = E(e^{tX})E(e^{tY}) = M_X(t)\,M_Y(t)$.
MGF of a linear combination of $X$: If we have $Y = aX + b$, then $M_Y(t) = E(e^{t(aX + b)}) = e^{bt} E(e^{atX}) = e^{bt} M_X(at)$.
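A simulation sketch using the empirical MGF $\hat{M}(t) = \frac{1}{n}\sum_i e^{t x_i}$, checking the exact Expo(1) MGF $M_X(t) = \frac{1}{1 - t}$ (valid for $t < 1$) and the product rule for sums; Expo(1) and $t = 0.3$ are assumed choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=1_000_000)  # X ~ Expo(1): M_X(t) = 1/(1 - t), t < 1
y = rng.exponential(size=1_000_000)  # Y ~ Expo(1), independent of X

def mgf(samples, t):
    # Empirical MGF: average of e^(t * sample).
    return np.exp(t * samples).mean()

t = 0.3
print(mgf(x, t), 1 / (1 - t))                 # empirical vs. exact MGF of X
print(mgf(x + y, t), mgf(x, t) * mgf(y, t))   # MGF of a sum = product of MGFs
```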