Questions

LLN

Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables with mean $\mu$ and variance $\sigma^2$.

Q

Give a value of $n$ (a specific number) that will ensure that there is at least a 99% chance that the sample mean will be within 2 standard deviations of the true mean $\mu$.

A

Let $\bar{X}_n = \frac{X_1 + \cdots + X_n}{n}$ be the sample mean. Saying that there is at least a 99% chance that $|\bar{X}_n - \mu| < 2\sigma$ is the same as saying that there is at most a 1% chance that $|\bar{X}_n - \mu| \geq 2\sigma$. Thus, we want to find an $n$ such that:

$$P\left( |\bar{X}_n - \mu| \geq 2\sigma \right) \leq 0.01$$

Applying Chebyshev's inequality, we get the following:

$$P(|\bar{X}_n - \mu| \geq 2\sigma) \leq \frac{\operatorname{Var}(\bar{X}_n)}{(2\sigma)^2} = \frac{\sigma^2/n}{4\sigma^2} = \frac{1}{4n}$$

We need $\frac{1}{4n} \leq 0.01$, i.e. $n \geq 25$. Therefore, choosing $n = 25$ gives the desired guarantee.
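As a sanity check, the bound can be tested by simulation. The sketch below is illustrative, not part of the proof: it assumes i.i.d. Exponential(1) draws (so $\mu = \sigma = 1$); the choice of distribution and the number of trials are assumptions made purely for the example.

```python
import random

# Monte Carlo check: with n = 25 and i.i.d. Exponential(1) draws
# (mu = 1, sigma = 1), the sample mean should fall within 2 standard
# deviations of mu in at least 99% of trials.
random.seed(0)

n = 25
trials = 10_000
mu, sigma = 1.0, 1.0

within = 0
for _ in range(trials):
    sample_mean = sum(random.expovariate(1.0) for _ in range(n)) / n
    if abs(sample_mean - mu) < 2 * sigma:
        within += 1

coverage = within / trials
print(coverage)  # empirically well above 0.99 (Chebyshev is conservative)
```

In practice the empirical coverage is far above 99%, since Chebyshev's inequality holds for every distribution and is therefore quite loose for any particular one.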

Gamma CLT

Q

Explain why a Gamma random variable with parameters $(n, \lambda)$ is approximately Normal when $n$ is large.

A

Let $X_n = Y_1 + Y_2 + \cdots + Y_n$, where the $Y_i$ are i.i.d. $\text{Expo}(\lambda)$; then $X_n \sim \text{Gamma}(n, \lambda)$. By the central limit theorem, since $X_n$ is a sum of i.i.d. random variables with finite mean and variance, its standardized version converges to a Normal distribution as $n \to \infty$, so $X_n$ is approximately Normal for large $n$.

Q

Let $X_n \sim \text{Gamma}(n, \lambda)$. Determine $a$ and $b$ such that $\frac{X_n - a}{b} \to N(0,1)$ in distribution as $n \to \infty$.

A

Since $E(X_n) = \frac{n}{\lambda}$ and $\operatorname{Var}(X_n) = \frac{n}{\lambda^2}$, the central limit theorem tells us that for large $n$,

$$X_n \approx N\left(\frac{n}{\lambda}, \frac{n}{\lambda^2}\right)$$

In order to convert this Normal distribution to a standard Normal ($N(0,1)$), all we need to do is subtract the mean and divide by the standard deviation, so $a = \frac{n}{\lambda}$ and $b = \frac{\sqrt{n}}{\lambda}$. Thus:

$$\frac{X_n - \frac{n}{\lambda}}{\frac{\sqrt{n}}{\lambda}} \to N(0,1)$$

as $n \to \infty$.
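This standardization can be checked numerically. The sketch below assumes illustrative values $n = 200$, $\lambda = 2$, and 20,000 draws; note that `random.gammavariate` takes a shape and a *scale* parameter, so the scale is $1/\lambda$.

```python
import random
import statistics

# Standardize Gamma(n, lambda) draws by a = n/lambda and b = sqrt(n)/lambda;
# for large n the results should look like N(0, 1).
random.seed(0)

n, lam = 200, 2.0
a = n / lam                # mean of Gamma(n, lambda)
b = (n ** 0.5) / lam       # standard deviation of Gamma(n, lambda)

# random.gammavariate(shape, scale) with scale = 1/lambda
draws = [(random.gammavariate(n, 1 / lam) - a) / b for _ in range(20_000)]

print(statistics.mean(draws))   # close to 0
print(statistics.stdev(draws))  # close to 1
```

The sample mean and standard deviation of the standardized draws land near 0 and 1, as the limit predicts.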

Markov Chain

Suppose $X_n$ is a two-state Markov chain with transition matrix

$$Q = \begin{pmatrix} 1-\alpha & \alpha \\ \beta & 1-\beta \end{pmatrix}$$

The rows and columns are indexed $0, 1$ such that $q_{0,0} = 1-\alpha$, $q_{0,1} = \alpha$, $q_{1,0} = \beta$, $q_{1,1} = 1-\beta$.

Q

Find the stationary distribution $\vec{s} = (s_0, s_1)$ of $X_n$ by solving $\vec{s} Q = \vec{s}$.

A

By solving $\vec{s} Q = \vec{s}$, we have that

$$s_0 = s_0 (1 - \alpha) + s_1 \beta \quad \textrm{and} \quad s_1 = s_0 \alpha + s_1 (1 - \beta)$$

Both equations reduce to $s_0 \alpha = s_1 \beta$. Combining this with the normalization condition $s_0 + s_1 = 1$, it follows that

$$\vec{s} = \left( \frac{\beta}{\alpha + \beta}, \frac{\alpha}{\alpha + \beta} \right)$$
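A quick numeric check, using illustrative values $\alpha = 0.3$, $\beta = 0.6$ (any $\alpha, \beta \in (0, 1)$ would do): iterating the update $\vec{s} \leftarrow \vec{s} Q$ converges to the closed-form answer.

```python
# Power iteration on the distribution: s <- s Q converges to the
# stationary distribution (beta/(alpha+beta), alpha/(alpha+beta)).
alpha, beta = 0.3, 0.6
Q = [[1 - alpha, alpha],
     [beta, 1 - beta]]

s = [0.5, 0.5]  # any starting distribution works for this chain
for _ in range(1000):
    s = [s[0] * Q[0][0] + s[1] * Q[1][0],
         s[0] * Q[0][1] + s[1] * Q[1][1]]

expected = [beta / (alpha + beta), alpha / (alpha + beta)]
print(s, expected)  # both approximately [2/3, 1/3]
```

Convergence here is geometric at rate $|1 - \alpha - \beta|$, so 1000 iterations is far more than enough.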
Q

Show that this Markov Chain is reversible under the stationary distribution found in the previous question.

A

A chain is reversible with respect to $\vec{s}$ if $s_i q_{ij} = s_j q_{ji}$ for all states $i, j$. For a two-state chain the only nontrivial case is $i = 0, j = 1$, so it suffices to show that $s_0 q_{01} = s_1 q_{10}$. We have that

$$s_0 q_{01} = \frac{\beta}{\alpha + \beta} \cdot \alpha = \frac{\alpha\beta}{\alpha + \beta} = \frac{\alpha}{\alpha + \beta} \cdot \beta = s_1 q_{10}$$

which satisfies the reversibility condition. Since reversibility with respect to a distribution implies that the distribution is stationary, this also verifies our answer from the previous question.
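The detailed-balance identity can be spot-checked numerically; the values $\alpha = 0.2$, $\beta = 0.5$ below are illustrative.

```python
# Spot-check of detailed balance s0*q01 = s1*q10 for a two-state chain.
alpha, beta = 0.2, 0.5
s0, s1 = beta / (alpha + beta), alpha / (alpha + beta)

lhs = s0 * alpha   # s0 * q01
rhs = s1 * beta    # s1 * q10
print(lhs, rhs)    # both equal alpha*beta/(alpha+beta)
```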

Q

Let $Z_n = (X_{n-1}, X_n)$. Is $Z_n$ a Markov chain? If so, what are the states and transition matrix?

A

Yes, $Z_n$ is a Markov chain because, conditional on $Z_n$, $Z_{n+1}$ is independent of $Z_{n-1}$. This holds because $Z_{n+1} = (X_n, X_{n+1})$: its first component is determined by $Z_n$, and its second component depends only on $X_n$ (not on any earlier history), since $X_n$ is a Markov chain.

The states are $\{ (0, 0), (0, 1), (1, 0), (1, 1) \}$. From state $(i, j)$, the chain moves to $(j, k)$ with probability $q_{jk}$, so the transition matrix is

$$Q' = \begin{array}{c|cccc} & (0,0) & (0,1) & (1,0) & (1,1) \\ \hline (0,0) & 1-\alpha & \alpha & 0 & 0 \\ (0,1) & 0 & 0 & \beta & 1-\beta \\ (1,0) & 1-\alpha & \alpha & 0 & 0 \\ (1,1) & 0 & 0 & \beta & 1-\beta \end{array}$$
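The rule "from $(i, j)$, move to $(j, k)$ with probability $q_{jk}$" can be turned directly into code. The sketch below builds the $4 \times 4$ matrix from the two-state matrix, using illustrative values $\alpha = 0.3$, $\beta = 0.6$.

```python
# Build the transition matrix of Z_n = (X_{n-1}, X_n) from the two-state
# matrix q: from state (i, j), move to (j2, k) with probability q[j][k]
# when j2 == j, and probability 0 otherwise.
alpha, beta = 0.3, 0.6
q = [[1 - alpha, alpha],
     [beta, 1 - beta]]

states = [(0, 0), (0, 1), (1, 0), (1, 1)]
Q2 = [[q[j][k] if j2 == j else 0.0
       for (j2, k) in states]
      for (i, j) in states]

for row in Q2:
    print(row)
# rows for (0,0) and (1,0): [1-alpha, alpha, 0, 0]
# rows for (0,1) and (1,1): [0, 0, beta, 1-beta]
```

Each row sums to 1, and the rows match the matrix above: rows $(0,0)$ and $(1,0)$ agree, as do rows $(0,1)$ and $(1,1)$, since only the second component of the current state matters.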