Questions

Box-Muller Transformation

Suppose we are told that $U \sim Unif(0, 1)$ and $V \sim Expo\left(\frac{1}{2}\right)$ independently.

Q

Find the density function and thus the joint distribution of

$$\begin{aligned} X &= \sqrt{V} \sin(2 \pi U) \\ Y &= \sqrt{V} \cos(2 \pi U) \end{aligned}$$
A

To do this, we can obtain the Jacobian and perform a change of variables. Note that it is easier to find the Jacobian of the transformation going from $(U, V)$ to $(X, Y)$, since $X$ and $Y$ are already expressed in terms of $U$ and $V$. The Jacobian is given as follows.

$$J = \frac{\partial(x, y)}{\partial(u, v)} = \begin{pmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{pmatrix} = \begin{pmatrix} 2 \pi \sqrt{v} \cos(2 \pi u) & \frac{1}{2 \sqrt{v}} \sin(2 \pi u) \\ -2 \pi \sqrt{v} \sin(2 \pi u) & \frac{1}{2 \sqrt{v}} \cos(2 \pi u) \end{pmatrix}$$

The Jacobian factor $|J|$ is the absolute value of the determinant of $J$ above.

$$\begin{aligned} |J| &= \left| 2 \pi \sqrt{v} \cos(2 \pi u) \cdot \frac{1}{2 \sqrt{v}} \cos(2 \pi u) - \frac{1}{2 \sqrt{v}} \sin(2 \pi u) \cdot \left( -2 \pi \sqrt{v} \sin(2 \pi u) \right) \right| \\ &= \left( \cos^2(2 \pi u) + \sin^2(2 \pi u) \right) \pi \\ &= \pi \end{aligned}$$
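As a quick sanity check on this determinant, here is a minimal symbolic computation, assuming SymPy is available (variable names mirror the derivation above):

```python
# Symbolic check that det J = pi (a sketch, assuming SymPy is installed).
import sympy as sp

u, v = sp.symbols("u v", positive=True)
x = sp.sqrt(v) * sp.sin(2 * sp.pi * u)
y = sp.sqrt(v) * sp.cos(2 * sp.pi * u)

# Jacobian of (x, y) with respect to (u, v), then its determinant.
J = sp.Matrix([x, y]).jacobian([u, v])
print(sp.simplify(J.det()))  # pi
```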

Thus, our transformation can be specified as follows. Note that $X^2 + Y^2 = V$, so $v = x^2 + y^2$ below.

$$\begin{aligned} f_{U, V}(u, v) &= f_{X, Y}(x, y)\, |J| \\ f_{X, Y}(x, y) &= \frac{1}{|J|} f_{U, V}(u, v) \\ &= \frac{1}{|J|} f_{U}(u)\, f_{V}(v) \\ &= \frac{1}{\pi} \cdot 1 \cdot \frac{1}{2}\exp\left(-\frac{v}{2}\right) \\ &= \frac{1}{2\pi}\exp\left(-\frac{x^2}{2} - \frac{y^2}{2}\right) \\ &= \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right) \cdot \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{y^2}{2}\right) \end{aligned}$$

Hence, because the joint PDF factors into a term involving only $x$ and a term involving only $y$, those terms are in fact the marginal densities of $X$ and $Y$ (after checking that the normalizing constants go to the right places). We recognize the marginal densities as standard Normal densities, and thus it follows that $X$ and $Y$ are i.i.d. $N(0, 1)$.
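A minimal simulation sketch of this conclusion, assuming NumPy and SciPy are available (the seed and sample size are arbitrary choices):

```python
# Simulate (X, Y) from (U, V) and check that each looks standard normal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 100_000
u = rng.uniform(0, 1, size=n)          # U ~ Unif(0, 1)
v = rng.exponential(scale=2, size=n)   # V ~ Expo(1/2), i.e. mean 2

x = np.sqrt(v) * np.sin(2 * np.pi * u)
y = np.sqrt(v) * np.cos(2 * np.pi * u)

# Large p-values are consistent with N(0, 1); correlation should be near 0.
print(stats.kstest(x, "norm").pvalue, stats.kstest(y, "norm").pvalue)
print(np.corrcoef(x, y)[0, 1])
```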

Beta, Gamma, and Binomial

Let $B \sim Beta(\alpha, \beta)$. Find the distribution of $1 - B$ in the following two ways.

Q

Find the distribution of $1 - B$ using a change of variables.

A

Let $W = 1 - B$. Then we have $B = 1 - W$, and so $\left| \frac{db}{dw} \right| = |-1| = 1$. Hence, the PDF of $W$ is

$$f_W(w) = f_B(b) \left| \frac{db}{dw} \right| = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, b^{\alpha - 1}(1 - b)^{\beta - 1} = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, (1 - w)^{\alpha - 1} w^{\beta - 1}$$

for $0 < w < 1$. We recognize this PDF as that of the $Beta(\beta, \alpha)$ distribution, and so $W \sim Beta(\beta, \alpha)$.
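Here is a quick numerical check of this conclusion, assuming NumPy and SciPy (the parameter values are arbitrary):

```python
# Check that 1 - B matches the Beta(beta, alpha) distribution.
import numpy as np
from scipy import stats

alpha, beta = 2.0, 5.0
rng = np.random.default_rng(0)
b = rng.beta(alpha, beta, size=100_000)  # B ~ Beta(alpha, beta)

# Kolmogorov-Smirnov test of 1 - B against the Beta(beta, alpha) CDF.
print(stats.kstest(1 - b, stats.beta(beta, alpha).cdf).pvalue)
```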

Q

Find the distribution of $1 - B$ using a story proof related to the Gamma distribution.

A

Using the bank-post office story, we can represent $B = \frac{X}{X + Y}$ with $X \sim Gamma(\alpha, 1)$ and $Y \sim Gamma(\beta, 1)$ independent. Then $1 - B = \frac{Y}{X + Y} \sim Beta(\beta, \alpha)$ by the same story.
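The story itself is easy to simulate; a sketch assuming NumPy and SciPy, with arbitrary parameter values:

```python
# Bank-post office story: Y / (X + Y) should be Beta(beta, alpha).
import numpy as np
from scipy import stats

alpha, beta = 3.0, 4.0
rng = np.random.default_rng(0)
x = rng.gamma(shape=alpha, scale=1.0, size=100_000)  # X ~ Gamma(alpha, 1)
y = rng.gamma(shape=beta, scale=1.0, size=100_000)   # Y ~ Gamma(beta, 1)

print(stats.kstest(y / (x + y), stats.beta(beta, alpha).cdf).pvalue)
```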

Q

How do $B$ and $1 - B$ relate to the Binomial distribution?

A

If we use $Beta(\alpha, \beta)$ as the prior distribution for the probability $p$ of success in a Binomial problem, interpreting $\alpha$ as the number of prior successes and $\beta$ as the number of prior failures, then $1 - p$ is the probability of failure. Interchanging the roles of "success" and "failure" interchanges $\alpha$ and $\beta$, so it makes sense that $1 - p \sim Beta(\beta, \alpha)$.
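A small sketch of this symmetry in the conjugate update, assuming SciPy (the prior parameters and data are illustrative made-up numbers):

```python
# Posterior of p after k successes in n trials is Beta(alpha + k, beta + n - k);
# swapping "success" and "failure" gives the posterior of 1 - p.
from scipy import stats

alpha, beta = 3, 7   # prior: p ~ Beta(3, 7)
n, k = 10, 6         # observe 6 successes in 10 trials

posterior_p = stats.beta(alpha + k, beta + n - k)    # p | data
posterior_q = stats.beta(beta + (n - k), alpha + k)  # (1 - p) | data

# Density of 1 - p at w equals density of p at 1 - w.
w = 0.25
print(posterior_q.pdf(w), posterior_p.pdf(1 - w))  # equal
```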

Order Statistics

Q

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Find the unconditional distribution of $U_{(n - 1)}$, and the conditional distribution of $U_{(n - 1)}$ given $U_{(n)} = c$.

A

Unconditionally, $U_{(n - 1)} \sim Beta(n - 1, 2)$, using what we know about Uniform order statistics. For the conditional distribution, note that given $U_{(n)} = c$, the remaining $n - 1$ points are i.i.d. $Unif(0, c)$, so $U_{(n - 1)}$ is distributed as the maximum of $n - 1$ i.i.d. $Unif(0, c)$ r.v.s. Therefore

$$P\left(U_{(n - 1)} \leq x \mid U_{(n)} = c\right) = \left(\frac{x}{c}\right)^{n - 1}$$

for $0 < x < c$. Hence, the PDF is $f_{U_{(n - 1)} \mid U_{(n)}}(x \mid c) = (n - 1)\left(\frac{x}{c}\right)^{n - 2} \cdot \frac{1}{c}$. This is the distribution of $cX$ where $X \sim Beta(n - 1, 1)$, which can be easily verified with a change of variables. Hence, the conditional distribution of $U_{(n - 1)}$ is that of a scaled Beta!
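Both claims are easy to check by simulation; a sketch assuming NumPy and SciPy, with arbitrary choices of $n$ and $c$:

```python
# Check the unconditional and conditional distributions of U_(n-1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, trials = 5, 100_000
u = np.sort(rng.uniform(size=(trials, n)), axis=1)

# Unconditional: the second-largest of n uniforms is Beta(n - 1, 2).
print(stats.kstest(u[:, -2], stats.beta(n - 1, 2).cdf).pvalue)

# Conditional: given U_(n) = c, U_(n-1) is the max of n - 1 i.i.d.
# Unif(0, c) draws, which is c times a Beta(n - 1, 1).
c = 0.8
m = rng.uniform(0, c, size=(trials, n - 1)).max(axis=1)
print(stats.kstest(m, stats.beta(n - 1, 1, scale=c).cdf).pvalue)
```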

Q

Let $X \sim Bin(n, p)$ and $B \sim Beta(j, n - j + 1)$, where $n$ is a positive integer and $j$ is a positive integer with $j \leq n$. Show, using a story about order statistics, that

$$P(X \geq j) = P(B \leq p).$$

This shows that the CDF of the continuous r.v. $B$ is closely related to the CDF of the discrete r.v. $X$, and is another connection between the Beta and Binomial.

A

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Think of these as Bernoulli trials, where $U_j$ is defined to be "successful" if $U_j < p$ (so the probability of success is $p$ for each trial). Let $X$ be the number of successes. Then $X \geq j$ is the same event as $U_{(j)} \leq p$: at least $j$ of the trials succeed if and only if the $j$th smallest value is less than $p$. Since $U_{(j)} \sim Beta(j, n - j + 1)$, it follows that $P(X \geq j) = P(U_{(j)} \leq p) = P(B \leq p)$.
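This identity is easy to verify numerically, assuming SciPy (the values of $n$, $j$, and $p$ are arbitrary):

```python
# P(X >= j) for X ~ Bin(n, p) should equal P(B <= p) for B ~ Beta(j, n - j + 1).
from scipy import stats

n, j, p = 10, 4, 0.3
lhs = stats.binom.sf(j - 1, n, p)      # P(X >= j) = P(X > j - 1)
rhs = stats.beta.cdf(p, j, n - j + 1)  # P(B <= p)
print(lhs, rhs)  # these agree
```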