STAT110: Fall 2020
Section 6
Questions

  • Procrastinators
  • Counting Cars
  • Normal Moments

Procrastinators

Three students are working independently on their Stat 110 problem set. All three start at 1pm on the day the pset is due, and each takes an Exponential time with mean 6 hours to complete the homework.

What is the earliest time when all three students will have completed the homework, on average?

Label the students, and let $X_i$ be the amount of time it takes student $i$ to finish the pset. Let $T$ be the total amount of time it takes for all three students to finish, and write $T = T_1 + T_2 + T_3$, where $T_1 = \min(X_1, X_2, X_3)$ is the time until the first student finishes, $T_2$ is the additional time until the second finishes, and $T_3$ is the additional time until the last finishes. As we showed on the previous problem set, the minimum of independent Exponentials is Exponential with the rates added, so $T_1 \sim Expo(\frac{3}{6})$; by the memoryless property, $T_2 \sim Expo(\frac{2}{6})$ and $T_3 \sim Expo(\frac{1}{6})$. Therefore,

$$E[T] = E[T_1] + E[T_2] + E[T_3] = 2 + 3 + 6 = \boxed{11} \text{ hours},$$

so on average the last student finishes 11 hours after 1pm, i.e. at midnight.
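
As a quick sanity check (not part of the official solution), here is a small simulation sketch, assuming NumPy is available, confirming that the expected finishing time of the slowest of three independent Exponential(mean 6) students is about 11 hours:

```python
import numpy as np

rng = np.random.default_rng(110)
n_sims = 1_000_000

# Each student's completion time is Exponential with mean 6 hours.
times = rng.exponential(scale=6, size=(n_sims, 3))

# All three are done once the slowest student finishes.
finish_time = times.max(axis=1)

print(finish_time.mean())  # should be close to 2 + 3 + 6 = 11
```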

Counting Cars

Cars pass by a certain point on a road according to a Poisson process with rate $\lambda$ cars/minute. Let $N_t \sim Pois(\lambda t)$ be the number of cars that pass by that point in the time interval $[0, t]$, with $t$ measured in minutes.

A certain device is able to count cars as they pass by, but it does not record the arrival times. At time 0, the counter on the device is reset to 0. At time 3 minutes, the device is observed and it is found that exactly 1 car had passed by.

Given this information, find the conditional CDF of when that car arrived. Also describe in words what the result says.

Let $T_1$ be the arrival time of the first car to arrive after time 0. Unconditionally, $T_1 \sim Expo(\lambda)$. Given $N_3 = 1$, for $0 \leq t \leq 3$, we have

$$P(T_1 \leq t \mid N_3 = 1) = P(N_t \geq 1 \mid N_3 = 1) = \frac{P(N_t \geq 1, N_3 = 1)}{P(N_3 = 1)} = \frac{P(N_t = 1, N_3 = 1)}{P(N_3 = 1)}.$$

By definition of the Poisson process, the numerator is

$$\begin{aligned} P(N_t = 1, N_3 = 1) &= P\left(N_{[0, t]} = 1, N_{(t, 3]} = 0\right) \\ &= P\left(N_{[0, t]} = 1\right)P\left(N_{(t, 3]} = 0\right) \\ &= e^{-\lambda t}\lambda t \cdot e^{-\lambda(3 - t)} \\ &= \lambda t e^{-3 \lambda}, \end{aligned}$$

and the denominator is $e^{-3\lambda} \cdot 3\lambda$. Hence,

$$P(T_1 \leq t \mid N_3 = 1) = \frac{\lambda t e^{-3 \lambda}}{3 \lambda e^{-3\lambda}} = \frac{t}{3}$$

for $0 \leq t \leq 3$ (and 0 for $t < 0$, 1 for $t > 3$). This says that the conditional distribution of the first arrival time, given that there was exactly one arrival in $[0, 3]$, is $Unif(0, 3)$.
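
To see the $Unif(0, 3)$ answer empirically, here is a small simulation sketch (assuming NumPy; the rate $\lambda = 0.5$ is an arbitrary choice, and the result should not depend on it): build the arrivals on $[0, 3]$ from Exponential interarrival times, keep only the runs with exactly one arrival, and look at where that arrival fell.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5          # rate in cars per minute (arbitrary illustrative choice)
n_sims = 100_000

first_arrivals = []
for _ in range(n_sims):
    # Build the arrivals in [0, 3] from Expo(lam) interarrival times.
    t, arrivals = 0.0, []
    while True:
        t += rng.exponential(scale=1 / lam)
        if t > 3:
            break
        arrivals.append(t)
    # Condition on exactly one car having passed by time 3.
    if len(arrivals) == 1:
        first_arrivals.append(arrivals[0])

first_arrivals = np.array(first_arrivals)
print(first_arrivals.mean(), first_arrivals.var())  # ~1.5 and ~0.75, matching Unif(0, 3)
```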

In the late afternoon, you are counting blue cars. Each car that passes by is blue with probability $b$, independently of all other cars.

Find the marginal PMF of the number of blue cars and number of non-blue cars that pass by the point in 10 minutes.

Let $X$ and $Y$ be the number of blue and non-blue cars that pass by in those 10 minutes, respectively, and let $N = X + Y$. Then $N \sim Pois(10\lambda)$ and $X \mid N \sim Bin(N, b)$. By the chicken-egg story, $X$ and $Y$ are independent, with

$$X \sim Pois(10 \lambda b), \qquad Y \sim Pois(10 \lambda (1 - b)).$$
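
As an illustration of the chicken-egg story (not part of the solution), here is a short simulation sketch, assuming NumPy and using the arbitrary values $\lambda = 1.5$ and $b = 0.3$, that checks the claimed Poisson marginals (for a Poisson, the mean and variance should agree):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, b = 1.5, 0.3            # arbitrary rate (cars/min) and blue probability
n_sims = 500_000

n_cars = rng.poisson(10 * lam, size=n_sims)   # total cars in 10 minutes
blue = rng.binomial(n_cars, b)                # each car is blue with probability b
non_blue = n_cars - blue

# Marginals: X ~ Pois(10*lam*b), Y ~ Pois(10*lam*(1-b))
print(blue.mean(), blue.var())          # both should be near 10*lam*b = 4.5
print(non_blue.mean(), non_blue.var())  # both should be near 10*lam*(1-b) = 10.5
```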

Find the joint PMF of the number of blue cars and number of non-blue cars that pass by the point in 10 minutes.

The joint PMF is the product of the marginal PMFs:

$$P(X = x, Y = y) = \frac{e^{-10 \lambda b}(10 \lambda b)^x}{x!} \cdot \frac{e^{-10 \lambda (1 - b)}(10 \lambda (1 - b))^y}{y!}$$

for all nonnegative integers $x, y$.
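
If it helps, the joint PMF formula can also be checked against a simulation; the sketch below (assuming NumPy, with the same arbitrary $\lambda$ and $b$ as above and an arbitrary point $(x, y) = (4, 10)$) compares the empirical joint frequency with the formula:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(2)
lam, b = 1.5, 0.3                               # arbitrary illustrative values
n_cars = rng.poisson(10 * lam, size=1_000_000)  # total cars in 10 minutes
blue = rng.binomial(n_cars, b)                  # thin into blue cars
non_blue = n_cars - blue

def joint_pmf(x, y):
    """P(X = x, Y = y) as the product of the two Poisson PMFs."""
    mu_x, mu_y = 10 * lam * b, 10 * lam * (1 - b)
    return (exp(-mu_x) * mu_x**x / factorial(x)) * (exp(-mu_y) * mu_y**y / factorial(y))

x, y = 4, 10
print(np.mean((blue == x) & (non_blue == y)), joint_pmf(x, y))  # should roughly match
```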

Normal Moments

Let $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$ be independent.

Use the MGF to show that for any values of $a, b \neq 0$, $Y = aX_1 + bX_2$ is also Normal. (You may use the fact that the MGF of a $N(\mu, \sigma^2)$ distribution is $e^{\mu t + \frac{1}{2}\sigma^2 t^2}$.)

We know that the MGF of the sum of two independent random variables is the product of their individual MGFs. We also know that the MGF of a scalar $c$ times a random variable is that random variable's MGF with $ct$ substituted for $t$, since $M_{cX}(t) = E(e^{tcX}) = M_X(ct)$. Therefore,

$$\begin{aligned} M_Y(t) &= \exp\left(\mu_1 (at) + \frac{1}{2} \sigma_1^2 (at)^2\right) \cdot \exp\left(\mu_2 (bt) + \frac{1}{2} \sigma_2^2 (bt)^2\right) \\ &= \exp\left( \mu_1 (at) + \frac{1}{2} \sigma_1^2 (at)^2 + \mu_2 (bt) + \frac{1}{2} \sigma_2^2 (bt)^2\right) \\ &= \exp\left( (a\mu_1 + b \mu_2)t + \frac{(a \sigma_1)^2 + (b \sigma_2)^2}{2} t^2\right). \end{aligned}$$

This is the MGF of a $N\left(a \mu_1 + b \mu_2, a^2 \sigma_1^2 + b^2 \sigma_2^2\right)$ distribution, so $Y$ is Normal with that mean and variance (the MGF determines the distribution).
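
Here is a quick numerical sketch (assuming NumPy, with arbitrary illustrative parameters) comparing simulated draws of $Y = aX_1 + bX_2$ with the claimed Normal distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
mu1, sig1, mu2, sig2 = 1.0, 2.0, -0.5, 1.5   # arbitrary illustrative parameters
a, b = 3.0, -2.0
n_sims = 1_000_000

y = a * rng.normal(mu1, sig1, n_sims) + b * rng.normal(mu2, sig2, n_sims)

# Claimed distribution: N(a*mu1 + b*mu2, a^2*sig1^2 + b^2*sig2^2)
mean = a * mu1 + b * mu2
var = a**2 * sig1**2 + b**2 * sig2**2
print(y.mean(), mean)        # sample mean vs. claimed mean
print(y.var(), var)          # sample variance vs. claimed variance

# One standard deviation above the mean should cover about 84.1% of draws
# if Y really is Normal (Phi(1) is roughly 0.8413).
print(np.mean(y <= mean + np.sqrt(var)))
```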

Let $Z \sim N(0, 1)$ and $Y = e^Z$. Then $Y$ has a LogNormal distribution (because its log is Normal!).

Find all the moments of $Y$. That is, find $E(Y^n)$ for all $n$.

Using exponent rules,

$$E\left(Y^n\right) = E\left(\left(e^Z\right)^n\right) = E\left(e^{nZ}\right).$$

We may be tempted to use LOTUS at this point, but it is easier to recognize that $E(e^{nZ})$ is the MGF of $Z$, evaluated at $t = n$. In other words, $E(e^{nZ}) = M_Z(n)$. Since $M_Z(t) = e^{t^2/2}$, we conclude that $E(Y^n) = M_Z(n) = e^{n^2/2}$.
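
As a sanity check (a sketch assuming NumPy), simulated moments of $Y = e^Z$ line up with $e^{n^2/2}$ for small $n$; higher moments would need enormous samples because the LogNormal is heavy-tailed.

```python
import numpy as np

rng = np.random.default_rng(4)
z = rng.standard_normal(2_000_000)
y = np.exp(z)

for n in range(1, 4):
    print(n, (y**n).mean(), np.exp(n**2 / 2))  # simulated moment vs. e^{n^2/2}
```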

Show that the MGF of $Y$ does not exist, by arguing that the integral $E\left(e^{tY}\right)$ diverges for $t > 0$.

The key is to remember that the MGF is an expectation, which requires LOTUS and integration:

$$E\left(e^{tY}\right) = E\left(e^{te^Z}\right) = \int_{-\infty}^\infty e^{te^z} \frac{1}{\sqrt{2 \pi}} e^{-z^2/2} \, \textrm{d} z = \int_{-\infty}^\infty \frac{1}{\sqrt{2 \pi}} e^{te^z - z^2/2} \, \textrm{d} z.$$

Suppose $t > 0$. Then as $z$ grows larger, $te^z$ grows at a faster rate than $z^2/2$, so the exponent $te^z - z^2/2$ explodes and the integrand does not even go to 0 as $z \to \infty$. Therefore, the integral diverges for all $t > 0$. Since $E\left(e^{tY}\right)$ is not finite on an open interval around 0, the MGF of $Y$ does not exist.
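
A small numerical sketch (assuming NumPy, with the arbitrary choice $t = 0.1$) makes the divergence visible: truncating the LOTUS integral at larger and larger upper limits gives values that keep exploding rather than settling down.

```python
import numpy as np

t = 0.1  # any positive t works; chosen arbitrarily for illustration

def truncated_integral(upper, n_points=200_000):
    """Riemann-sum approximation of the LOTUS integral for E(e^{tY}) over [-10, upper]."""
    z = np.linspace(-10, upper, n_points)
    integrand = np.exp(t * np.exp(z) - z**2 / 2) / np.sqrt(2 * np.pi)
    return integrand.sum() * (z[1] - z[0])

for upper in [4, 5, 6, 7, 8]:
    print(upper, truncated_integral(upper))  # keeps exploding instead of converging
```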
