# Procrastinators

Three students are working independently on their Stat 110 problem set. All three start at 1pm on the day the pset is due, and each takes an Exponential time with mean 6 hours to complete the homework.

Q
A
Q

What is the earliest time when all three students will have completed the homework, on average?

A

Label the students, and let $X_i$ be the amount of time it takes student $i$ to finish the pset. Let $T = \max(X_1, X_2, X_3)$ be the time until all three students have finished, and write $T = T_1 + T_2 + T_3$, where $T_1 = \min(X_1, X_2, X_3)$ is the time until the first completion, $T_2$ is the additional time until the second completion, and $T_3$ is the additional time until the third. As shown on the previous problem set, $T_1 \sim Expo(\frac{3}{6})$; by the memoryless property, $T_2 \sim Expo(\frac{2}{6})$ and $T_3 \sim Expo(\frac{1}{6})$. Therefore,

$E[T] = 2 + 3 + 6 = \boxed{11}$

so on average all three students finish 11 hours after 1pm, i.e., at midnight.
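This can be checked with a quick Monte Carlo sketch (pure-stdlib Python; the trial count is an arbitrary choice):

```python
import random

random.seed(42)
trials = 100_000
total = 0.0
for _ in range(trials):
    # Each student's finishing time is Exponential with mean 6 hours
    times = [random.expovariate(1 / 6) for _ in range(3)]
    # All three are done when the slowest student finishes
    total += max(times)

estimate = total / trials  # should be close to 11 hours
```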

# Counting Cars

Cars pass by a certain point on a road according to a Poisson process with rate $\lambda$ cars/minute. Let $N_t \sim Pois(\lambda t)$ be the number of cars that pass by that point in the time interval $[0, t]$, with $t$ measured in minutes.

A certain device is able to count cars as they pass by, but it does not record the arrival times. At time 0, the counter on the device is reset to 0. At time 3 minutes, the device is observed and it is found that exactly 1 car had passed by.

Q
A
Q

Given this information, find the conditional CDF of when that car arrived. Also describe in words what the result says.

A

Let $T_1$ be the arrival time of the first car to arrive after time 0. Unconditionally, $T_1 \sim Expo(\lambda)$. Given $N_3 = 1$, for $0 \leq t \leq 3$ we have $P(T_1 \leq t | N_3 = 1) = P(N_t \geq 1 | N_3 = 1) = \frac{P(N_t \geq 1, N_3 = 1)}{P(N_3 = 1)} = \frac{P(N_t = 1, N_3 = 1)}{P(N_3 = 1)},$ where the last equality holds because $N_t \leq N_3$, so the events $N_t \geq 1$ and $N_3 = 1$ together force $N_t = 1$. By definition of the Poisson process, the numerator is

\begin{aligned} P(N_t = 1, N_3 = 1) &= P\left({N_{[0, t]} = 1, N_{(t, 3]} = 0}\right) \\ &= P\left({N_{[0, t]} = 1}\right)P\left({N_{(t, 3]} = 0}\right)\\ &= e^{-\lambda t}\lambda t e^{-\lambda(3 - t)}\\ &= \lambda t e^{-3 \lambda} \end{aligned}

and the denominator is $e^{-3\lambda} 3 \lambda$. Hence,

$P(T_1 \leq t | N_3 = 1) = \frac{\lambda t e^{-3 \lambda}}{e^{-3\lambda} 3 \lambda} = \frac{t}{3}$

for $0 \leq t \leq 3$ (and 0 for $t < 0$ and 1 for $t > 3$). This says that the conditional distribution of the first arrival time, given that there was exactly one arrival in $[0, 3]$, is $Unif(0, 3)$.
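A simulation makes the $Unif(0, 3)$ conclusion concrete: generate Poisson arrivals, keep only the runs with exactly one arrival in $[0, 3]$, and look at that arrival's time (the rate $\lambda = 1$ below is an arbitrary choice; the conditional distribution does not depend on it):

```python
import random

random.seed(0)
lam = 1.0  # arbitrary rate; the answer is the same for any lam > 0
first_arrivals = []
for _ in range(200_000):
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(lam)  # Expo(lam) inter-arrival times
        if t > 3:
            break
        arrivals.append(t)
    if len(arrivals) == 1:  # condition on N_3 = 1
        first_arrivals.append(arrivals[0])

mean_t1 = sum(first_arrivals) / len(first_arrivals)  # Unif(0, 3) has mean 1.5
```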

In the late afternoon, you are counting blue cars. Each car that passes by is blue with probability $b$, independently of all other cars.

Q
A
Q

Find the marginal PMFs of the number of blue cars and the number of non-blue cars that pass by the point in 10 minutes.

A

Let $X$ and $Y$ be the number of blue and non-blue cars, respectively, that pass by in those 10 minutes, and let $N = X + Y$. Then $N \sim Pois(10\lambda)$ and $X|N \sim Bin(N, b)$. By the chicken-egg story, $X$ and $Y$ are independent with

$X \sim Pois(10 \lambda b)$
$Y \sim Pois(10 \lambda (1 - b))$
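The chicken-egg story can be sketched numerically by simulating the process and thinning it (the values $\lambda = 0.8$ and $b = 0.3$ are arbitrary illustrative choices):

```python
import random

random.seed(1)
lam, b, minutes = 0.8, 0.3, 10  # hypothetical rate and blue-car probability

xs, ys = [], []
for _ in range(100_000):
    # Count cars in [0, minutes] via Expo(lam) inter-arrival times
    n, s = 0, 0.0
    while True:
        s += random.expovariate(lam)
        if s > minutes:
            break
        n += 1
    # Thin: each car is blue with probability b, independently
    x = sum(random.random() < b for _ in range(n))
    xs.append(x)
    ys.append(n - x)

mean_x = sum(xs) / len(xs)  # should be near 10*lam*b = 2.4
mean_y = sum(ys) / len(ys)  # should be near 10*lam*(1-b) = 5.6
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / len(xs)
# cov should be near 0, consistent with X and Y being independent
```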
Q
A
Q

Find the joint PMF of the number of blue cars and number of non-blue cars that pass by the point in 10 minutes.

A

The joint PMF is the product of the marginal PMFs:

$P(X = x, Y = y) = \frac{e^{-10 \lambda b}(10 \lambda b)^x}{x!}\frac{e^{-10 \lambda (1 - b)}(10 \lambda (1 - b))^y}{y!}$

for all nonnegative integers $x, y$.

# Normal Moments

Let $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$.

Q
A
Q

Use the MGF to show that for any values of $a, b \not= 0$, $Y = a X_1 + b X_2$ is also Normal. (You may use the fact that the MGF of a $N(\mu, \sigma^2)$ distribution is $e^{\mu t + \frac{1}{2}\sigma^2 t^2}$.)

A

We know that the MGF of the sum of two independent random variables is the product of their individual MGFs. We also know that for a scalar $c$, $M_{cX}(t) = M_X(ct)$: scaling a random variable by $c$ amounts to substituting $ct$ for $t$ in its MGF. Hence,

\begin{aligned} M_Y(t) &= \exp \left(\mu_1 (at) + \frac{1}{2} \sigma_1^2 (at)^2\right) \cdot \exp\left(\mu_2 (bt) + \frac{1}{2} \sigma_2^2 (bt)^2\right) \\ &= \exp\left( \mu_1 (at) + \frac{1}{2} \sigma_1^2 (at)^2 + \mu_2 (bt) + \frac{1}{2} \sigma_2^2 (bt)^2\right) \\ &= \exp\left( (a\mu_1 + b \mu_2)t + {\frac{(a \sigma_1)^2 + (b \sigma_2)^2}{2} } t^2\right) \end{aligned}

This is the $N\left({a \mu_1 + b \mu_2, a^2 \sigma_1^2 + b^2 \sigma_2^2}\right)$ MGF.
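A quick sampling check of this result (the parameters and coefficients below are arbitrary): draw $Y = aX_1 + bX_2$ many times and compare the empirical mean and variance against $a\mu_1 + b\mu_2$ and $a^2\sigma_1^2 + b^2\sigma_2^2$.

```python
import random

random.seed(2)
mu1, s1 = 1.0, 2.0    # hypothetical parameters for X1 ~ N(mu1, s1^2)
mu2, s2 = -0.5, 1.5   # hypothetical parameters for X2 ~ N(mu2, s2^2)
a, b = 3.0, -2.0      # arbitrary nonzero coefficients

ys = [a * random.gauss(mu1, s1) + b * random.gauss(mu2, s2)
      for _ in range(200_000)]
m = sum(ys) / len(ys)
v = sum((y - m) ** 2 for y in ys) / len(ys)
# Predicted: mean = a*mu1 + b*mu2 = 4.0, variance = a^2*s1^2 + b^2*s2^2 = 45.0
```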

Let $Z \sim N(0, 1)$ and $Y = e^Z$. Then $Y$ has a LogNormal distribution (because its log is Normal!).

Q
A
Q

Find all the moments of $Y$. That is, find $E(Y^n)$ for all $n$.

A

Using exponent rules,

$E\left({Y^n}\right) = E\left({\left({e^Z}\right)^n}\right) = E\left({e^{nZ}}\right)$

We may be tempted to use LOTUS at this point, but it is easier to recognize that $E\left({e^{nZ}}\right)$ is the MGF of $Z$, evaluated at $t = n$. In other words, $E\left({e^{nZ}}\right) = M_Z(n)$. Since $M_Z(t) = e^{t^2/2}$, we conclude that $E(Y^n) = M_Z(n) = e^{n^2/2}$ for all $n$.
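The formula $E(Y^n) = e^{n^2/2}$ can be spot-checked by Monte Carlo for small $n$ (sample size and tolerances are arbitrary choices; the LogNormal's heavy right tail makes larger $n$ hard to estimate this way):

```python
import math
import random

random.seed(3)
zs = [random.gauss(0, 1) for _ in range(500_000)]

mc, exact = {}, {}
for n in (1, 2):
    mc[n] = sum(math.exp(n * z) for z in zs) / len(zs)  # Monte Carlo E(e^{nZ})
    exact[n] = math.exp(n * n / 2)                      # M_Z(n) = e^{n^2/2}
```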

Q
A
Q

Show that the MGF of $Y$ does not exist, by arguing that the integral defining $E\left({e^{tY}}\right)$ diverges for $t > 0$.

A

The key is to remember that the MGF is an expectation, which requires LOTUS and integration:

$E\left({e^{tY}}\right) = E\left({e^{te^Z}}\right) = \int_{-\infty}^\infty e^{te^z} \frac{1}{\sqrt{2 \pi}} e^{-z^2/2} \textrm{d} z = \int_{-\infty}^\infty \frac{1}{\sqrt{2 \pi}} e^{te^z -z^2/2} \textrm{d} z$

Suppose $t > 0$. Then as $z$ grows larger, $te^z$ will grow at a faster rate than $z^2 / 2$. Hence, $te^z - z^2 / 2$ will explode. Therefore, the integral diverges for all $t > 0$. Since $E\left({e^{tY}}\right)$ is not finite on an open interval around 0, the MGF of $Y$ does not exist.
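To see the divergence numerically, tabulate the exponent $te^z - z^2/2$ for a small positive $t$: it starts out negative, but the $te^z$ term eventually dominates (the choice $t = 0.01$ is arbitrary):

```python
import math

t = 0.01  # even a tiny positive t makes the exponent explode eventually
zs = (5, 10, 15, 20)
exponents = [t * math.exp(z) - z * z / 2 for z in zs]
# At z = 5 the exponent is still negative, but by z = 20 it is in the millions,
# so the integrand e^{t e^z - z^2/2} blows up and the integral diverges.
```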