# Box-Muller Transformation

Suppose we are told that $U \sim Unif\left({0, 1}\right)$ and $V \sim Expo\left({\frac{1}{2}}\right)$ independently.

Q

Find the joint density function, and thus the joint distribution, of

\begin{aligned} X &= \sqrt{V} \sin\left({2 \pi U}\right) \\ Y &= \sqrt{V} \cos\left({2 \pi U}\right)\end{aligned}
A

To do this, we can obtain the Jacobian and perform a change of variables. Note that it is easier to find the Jacobian of the transformation going from $\left({U, V}\right) \rightarrow \left({X, Y}\right)$ since $X$ and $Y$ are already expressed in terms of $U$ and $V$. The Jacobian is given as follows.

$J = \frac{\partial(x, y)}{\partial(u, v)} = \begin{pmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \\ \end{pmatrix} = \begin{pmatrix} \sqrt{v} \cos\left({2 \pi u}\right) \left({2 \pi}\right) & \frac{1}{2 \sqrt{v}} \sin\left({2 \pi u}\right) \\ -\sqrt{v} \sin\left({2 \pi u}\right) \left({2 \pi}\right) & \frac{1}{2 \sqrt{v}} \cos\left({2 \pi u}\right) \\ \end{pmatrix}$

The Jacobian determinant $|J|$ is the absolute value of the determinant of the matrix above.

\begin{aligned} |J| &= \left|{\sqrt{v} \cos\left({2 \pi u}\right) \left({2 \pi}\right) \frac{1}{2 \sqrt{v}} \cos\left({2 \pi u}\right) - \frac{1}{2 \sqrt{v}} \sin\left({2 \pi u}\right) \left({-\sqrt{v} \sin\left({2 \pi u}\right) \left({2 \pi}\right)}\right)}\right| \\ &= \left({\cos^2\left({2 \pi u}\right) + \sin^2\left({2 \pi u}\right)}\right) \pi \\ &= \pi\end{aligned}
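As a sanity check on this computation, here is a short numerical sketch (pure Python; the helper names `transform` and `jacobian_det` are my own, not from the text) that approximates the four partial derivatives by central finite differences and confirms that the determinant is $\pi$ at an arbitrary point:

```python
import math

# The transformation from the text: x = sqrt(v) sin(2*pi*u), y = sqrt(v) cos(2*pi*u)
def transform(u, v):
    return (math.sqrt(v) * math.sin(2 * math.pi * u),
            math.sqrt(v) * math.cos(2 * math.pi * u))

def jacobian_det(u, v, h=1e-6):
    # Central finite differences for the four partials, then the 2x2 determinant
    x_u = (transform(u + h, v)[0] - transform(u - h, v)[0]) / (2 * h)
    x_v = (transform(u, v + h)[0] - transform(u, v - h)[0]) / (2 * h)
    y_u = (transform(u + h, v)[1] - transform(u - h, v)[1]) / (2 * h)
    y_v = (transform(u, v + h)[1] - transform(u, v - h)[1]) / (2 * h)
    return x_u * y_v - x_v * y_u

# |det J| should be pi at any point (u, v) with v > 0
print(abs(jacobian_det(0.3, 1.7)))  # ≈ 3.14159...
```

The determinant comes out the same at every point, reflecting that $|J| = \pi$ does not depend on $(u, v)$.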

Thus, the change of variables can be carried out as follows. Note that $X^2 + Y^2 = V$.

\begin{aligned} f_{U, V}(u, v) &= f_{X, Y}(x, y) |J| \\ f_{X, Y}(x, y) &= \frac{1}{|J|} f_{U, V}(u, v) \\ &= \frac{1}{|J|} f_{U}(u) f_{V}(v) \\ &= \frac{1}{\pi}\frac{1}{2}\exp\left({- \frac{1}{2}v}\right) \\ &= \frac{1}{2\pi}\exp\left({- \frac{x^2}{2} - \frac{y^2}{2}}\right) \\ &= \frac{1}{\sqrt{2\pi}}\exp\left({-\frac{x^2}{2}}\right)\frac{1}{\sqrt{2\pi}}\exp\left({-\frac{y^2}{2}}\right)\end{aligned}

Hence, because the joint PDF factors into a function of $x$ alone times a function of $y$ alone, those factors are in fact the marginal densities of $X$ and $Y$ (taking care that the normalizing constants land in the right places). We recognize each factor as the standard normal density. It follows that $X$ and $Y$ are i.i.d. $N(0, 1)$.
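The whole derivation can be exercised with a small simulation. The sketch below (the name `box_muller` and the sample-moment checks are my own, not from the text) generates $V \sim Expo(1/2)$ by inverse transform and checks that the resulting $X$ samples have mean $\approx 0$ and variance $\approx 1$:

```python
import math
import random

random.seed(42)

def box_muller():
    # U ~ Unif(0, 1); V ~ Expo(1/2) via inverse transform: V = -2 log(Unif)
    u = random.random()
    v = -2.0 * math.log(random.random())
    return (math.sqrt(v) * math.sin(2 * math.pi * u),
            math.sqrt(v) * math.cos(2 * math.pi * u))

samples = [box_muller() for _ in range(100_000)]
xs = [s[0] for s in samples]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(mean, var)  # mean ≈ 0, variance ≈ 1
```

The same check applied to the second coordinate gives the same result, consistent with $X$ and $Y$ being i.i.d. $N(0, 1)$.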

# Beta, Gamma, and Binomial

Let $B\sim Beta(\alpha, \beta)$. Find the distribution of $1 - B$ in the following two ways.

Q

Find the distribution of $1 - B$ using a change of variables.

A

Let $W = 1 - B$. Then we have $B = 1 - W$, and so $\left| \frac{db}{dw} \right| = |-1| = 1$. Hence, the PDF of $W$ is

$f_W(w) = f_B(b) \left| \frac{db}{dw} \right| = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}b^{\alpha - 1}(1 - b)^{\beta - 1} = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}(1 - w)^{\alpha - 1}w^{\beta - 1}$

for $0 < w < 1$. We recognize this PDF as the $Beta(\beta, \alpha)$ distribution, and so $W \sim Beta(\beta, \alpha)$.
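A quick simulation is consistent with this result. The sketch below (a minimal check, using Python's stdlib `random.betavariate`; the parameter values are arbitrary) compares the sample mean of $1 - B$ against $E[W] = \beta/(\alpha + \beta)$ for $W \sim Beta(\beta, \alpha)$:

```python
import random

random.seed(0)
alpha, beta = 2.0, 5.0
n = 200_000

# Sample W = 1 - B with B ~ Beta(alpha, beta)
w = [1.0 - random.betavariate(alpha, beta) for _ in range(n)]

# If W ~ Beta(beta, alpha), then E[W] = beta / (alpha + beta) = 5/7
mean_w = sum(w) / n
print(mean_w)  # ≈ 0.714
```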

Q

Find the distribution of $1 - B$ using a story proof related to the Gamma distribution.

A

Using the bank-post office story, we can represent $B = \frac{X}{X + Y}$ with $X \sim Gamma(\alpha, 1)$ and $Y \sim Gamma(\beta, 1)$ independent. Then $1 - B = \frac{Y}{X + Y} \sim Beta(\beta, \alpha)$ by the same story.
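The story itself can be simulated directly. This sketch (variable names and parameter values are my own) builds $1 - B = Y/(X + Y)$ from independent Gammas via stdlib `random.gammavariate` and checks the sample mean against $\beta/(\alpha + \beta)$:

```python
import random

random.seed(1)
alpha, beta = 3.0, 2.0
n = 100_000

props = []
for _ in range(n):
    x = random.gammavariate(alpha, 1.0)  # X ~ Gamma(alpha, 1)
    y = random.gammavariate(beta, 1.0)   # Y ~ Gamma(beta, 1), independent of X
    props.append(y / (x + y))            # 1 - B = Y / (X + Y)

# By the story, Y/(X+Y) ~ Beta(beta, alpha), so the mean should be
# beta / (alpha + beta) = 0.4
mean_prop = sum(props) / n
print(mean_prop)  # ≈ 0.4
```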

Q

How do $B$ and $1 - B$ relate to the Binomial distribution?

A

If we use $Beta(\alpha, \beta)$ as the prior distribution for the probability $p$ of success in a Binomial problem, interpreting $\alpha$ as the number of prior successes and $\beta$ as the number of prior failures, then $1 - p$ is the probability of failure. Interchanging the roles of "success" and "failure," it then makes sense that $1 - p \sim Beta(\beta, \alpha)$.

# Order Statistics

Q

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Find the unconditional distribution of $U_{(n - 1)}$, and the conditional distribution of $U_{(n - 1)}$ given $U_{(n)} = c$.

A

Unconditionally, $U_{(n - 1)} \sim Beta(n - 1, 2)$, using what we know about Uniform order statistics. For the conditional distribution, note that given $U_{(n)} = c$, the remaining order statistics are distributed as the order statistics of $n - 1$ i.i.d. $Unif(0, c)$ random variables. Then $U_{(n - 1)}$ is the maximum of these $n - 1$ values, so its conditional CDF is

$P\left({U_{(n - 1)} \leq x \mid U_{(n)} = c}\right) = \left({\frac{x}{c}}\right)^{n - 1}$

for $0 < x < c$. Hence, the PDF is $f_{U_{(n - 1)} | U_{(n)}}(x | c) = (n - 1)\left({\frac{x}{c}}\right)^{n - 2} \frac{1}{c}$. This is the distribution of $cX$ where $X \sim Beta(n - 1, 1)$, which can be verified with a change of variables. Hence, the conditional distribution of $U_{(n - 1)}$ is that of a scaled Beta!
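One consequence we can check by simulation: since the conditional distribution of $U_{(n-1)}/c$ given $U_{(n)} = c$ is $Beta(n - 1, 1)$ for every $c$, the ratio $U_{(n-1)}/U_{(n)}$ is unconditionally $Beta(n - 1, 1)$. A sketch (pure Python; variable names and the choice $n = 5$ are mine):

```python
import random

random.seed(2)
n, trials = 5, 100_000

ratios = []
for _ in range(trials):
    u = sorted(random.random() for _ in range(n))
    ratios.append(u[-2] / u[-1])  # U_(n-1) / U_(n)

# If U_(n-1) given U_(n) = c is c times a Beta(n-1, 1), the ratio is
# Beta(n-1, 1) regardless of c, with mean (n-1)/n = 0.8 for n = 5
mean_ratio = sum(ratios) / trials
print(mean_ratio)  # ≈ 0.8
```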

Q

Let $X \sim Bin(n, p)$ and $B \sim Beta(j, n - j + 1)$, where $n$ is a positive integer and $j$ is a positive integer with $j \leq n$. Show, using a story about order statistics, that $P(X \geq j) = P(B \leq p)$. This shows that the CDF of the continuous r.v. $B$ is closely related to the CDF of the discrete r.v. $X$, and is another connection between the Beta and Binomial.

A

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Think of these as Bernoulli trials, where $U_j$ is defined to be "successful" if $U_j < p$ (so the probability of success is $p$ for each trial). Let $X$ be the number of successes. Then $X \geq j$ is the same event as $U_{(j)} \leq p$, so $P(X \geq j) = P(U_{(j)} \leq p)$. Since $U_{(j)} \sim Beta(j, n - j + 1)$ has the same distribution as $B$, it follows that $P(X \geq j) = P(B \leq p)$, as desired.
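The identity can also be verified numerically. The sketch below (variable names and the choice $n = 10$, $j = 4$, $p = 0.3$ are mine) computes $P(X \geq j)$ by direct summation of the Binomial PMF, and $P(B \leq p)$ by midpoint-rule integration of the $Beta(j, n - j + 1)$ density:

```python
import math

n, j, p = 10, 4, 0.3

# P(X >= j) for X ~ Bin(n, p), by direct summation of the PMF
binom_tail = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
                 for k in range(j, n + 1))

# P(B <= p) for B ~ Beta(j, n - j + 1), by midpoint-rule integration
# of the density Gamma(n+1)/(Gamma(j) Gamma(n-j+1)) x^(j-1) (1-x)^(n-j)
const = math.gamma(n + 1) / (math.gamma(j) * math.gamma(n - j + 1))
m = 100_000
h = p / m
beta_cdf = sum(const * ((i + 0.5) * h)**(j - 1) * (1 - (i + 0.5) * h)**(n - j) * h
               for i in range(m))

print(binom_tail, beta_cdf)  # the two probabilities agree
```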