STAT110: Fall 2020

Section 8

Questions

Box-Muller Transformation

Suppose we are told that $U \sim Unif(0, 1)$ and $V \sim Expo\left(\frac{1}{2}\right)$ independently.

Find the joint density function, and thus the joint distribution, of

$$\begin{aligned} X &= \sqrt{V} \sin(2 \pi U) \\ Y &= \sqrt{V} \cos(2 \pi U) \end{aligned}$$

To do this, we can obtain the Jacobian and perform a change of variables. Note that it is easier to find the Jacobian of the transformation going from $(U, V) \rightarrow (X, Y)$, since $X$ and $Y$ are already expressed in terms of $U$ and $V$. The Jacobian is given as follows.

$$J = \frac{\partial(x, y)}{\partial(u, v)} = \begin{pmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{pmatrix} = \begin{pmatrix} 2 \pi \sqrt{v} \cos(2 \pi u) & \frac{1}{2 \sqrt{v}} \sin(2 \pi u) \\ -2 \pi \sqrt{v} \sin(2 \pi u) & \frac{1}{2 \sqrt{v}} \cos(2 \pi u) \end{pmatrix}$$

The Jacobian $|J|$ is the absolute value of the determinant of $J$ above.

$$\begin{aligned} |J| &= \left| 2 \pi \sqrt{v} \cos(2 \pi u) \cdot \frac{1}{2 \sqrt{v}} \cos(2 \pi u) - \frac{1}{2 \sqrt{v}} \sin(2 \pi u) \cdot \left(-2 \pi \sqrt{v} \sin(2 \pi u)\right) \right| \\ &= \pi \left(\cos^2(2 \pi u) + \sin^2(2 \pi u)\right) \\ &= \pi \end{aligned}$$

Thus, our transformation can be specified as follows. Note that $X^2 + Y^2 = V$.

$$\begin{aligned} f_{U, V}(u, v) &= f_{X, Y}(x, y) \, |J| \\ f_{X, Y}(x, y) &= \frac{1}{|J|} f_{U, V}(u, v) \\ &= \frac{1}{|J|} f_U(u) f_V(v) \\ &= \frac{1}{\pi} \cdot \frac{1}{2} \exp\left(-\frac{1}{2} v\right) \\ &= \frac{1}{2 \pi} \exp\left(-\frac{x^2}{2} - \frac{y^2}{2}\right) \\ &= \frac{1}{\sqrt{2 \pi}} \exp\left(-\frac{x^2}{2}\right) \frac{1}{\sqrt{2 \pi}} \exp\left(-\frac{y^2}{2}\right) \end{aligned}$$

Hence, because the pdf factors nicely into a term involving $x$ and a term involving $y$, those terms are in fact the marginal densities of $X$ and $Y$ (taking care that the normalizing constants go to the right places). We recognize the marginal densities as standard normal densities, and it follows that $X$ and $Y$ are i.i.d. $N(0, 1)$.
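As a quick sanity check, here is a minimal simulation sketch in Python (NumPy assumed available; the sample size and seed are arbitrary) that generates $(X, Y)$ pairs via this transformation and confirms that the sample moments look standard normal:

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # arbitrary seed for reproducibility
n = 100_000

# U ~ Unif(0, 1) and V ~ Expo(1/2), drawn independently.
# Expo(1/2) has rate 1/2 (mean 2), so scale=2 in NumPy's parameterization.
u = rng.uniform(0.0, 1.0, size=n)
v = rng.exponential(scale=2.0, size=n)

# The Box-Muller transformation.
x = np.sqrt(v) * np.sin(2 * np.pi * u)
y = np.sqrt(v) * np.cos(2 * np.pi * u)

# If X, Y are i.i.d. N(0, 1): means ~ 0, variances ~ 1, correlation ~ 0.
print(x.mean(), x.var(), y.mean(), y.var(), np.corrcoef(x, y)[0, 1])
```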

Beta, Gamma, and Binomial

Let $B \sim Beta(\alpha, \beta)$. Find the distribution of $1 - B$ in the following two ways.

Find the distribution of $1 - B$ using a change of variables.

Let $W = 1 - B$. Then we have $B = 1 - W$, and so $\left|\frac{db}{dw}\right| = |-1| = 1$. Hence, the PDF of $W$ is

$$f_W(w) = f_B(b) \left|\frac{db}{dw}\right| = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} b^{\alpha - 1} (1 - b)^{\beta - 1} = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)} (1 - w)^{\alpha - 1} w^{\beta - 1}$$

for $0 < w < 1$. We recognize this PDF as that of the $Beta(\beta, \alpha)$ distribution, and so $W \sim Beta(\beta, \alpha)$.

Find the distribution of $1 - B$ using a story proof related to the Gamma distribution.

Using the bank-post office story, we can represent $B = \frac{X}{X + Y}$ with $X \sim Gamma(\alpha, 1)$ and $Y \sim Gamma(\beta, 1)$ independent. Then $1 - B = \frac{Y}{X + Y} \sim Beta(\beta, \alpha)$ by the same story.
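The story can be checked numerically with a short sketch (Python, with NumPy and SciPy assumed available; $\alpha = 2$, $\beta = 5$ are arbitrary illustration values): draw the two Gammas, form $B$, and compare $1 - B$ against the $Beta(\beta, \alpha)$ CDF.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
alpha, beta, n = 2.0, 5.0, 100_000  # arbitrary illustration values

# Bank-post office story: B = X / (X + Y) with independent Gammas.
x = rng.gamma(shape=alpha, scale=1.0, size=n)
y = rng.gamma(shape=beta, scale=1.0, size=n)
b = x / (x + y)

# 1 - B should follow Beta(beta, alpha); a KS test should not reject.
print(stats.kstest(1 - b, stats.beta(beta, alpha).cdf))
```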

How do $B$ and $1 - B$ relate to the Binomial distribution?

If we use $Beta(\alpha, \beta)$ as the prior distribution for the probability $p$ of success in a Binomial problem, interpreting $\alpha$ as the number of prior successes and $\beta$ as the number of prior failures, then $1 - p$ is the probability of failure. Interchanging the roles of "success" and "failure," it makes sense that $1 - p \sim Beta(\beta, \alpha)$.
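To make the symmetry concrete, here is a small sketch (Python with SciPy assumed; the prior parameters and data are hypothetical illustration values). With a $Beta(\alpha, \beta)$ prior on $p$, observing $k$ successes in $m$ trials gives a $Beta(\alpha + k, \beta + m - k)$ posterior on $p$, and the posterior of $1 - p$ is the same Beta with its parameters swapped:

```python
from scipy import stats

alpha, beta = 3, 7   # hypothetical prior successes and failures
k, m = 4, 10         # hypothetical data: k successes in m trials

# Conjugate update: posterior of p is Beta(alpha + k, beta + (m - k)).
post_p = stats.beta(alpha + k, beta + (m - k))
# Swapping "success" and "failure" swaps the parameters for 1 - p.
post_q = stats.beta(beta + (m - k), alpha + k)

# Consistency check: P(p <= t) should equal P(1 - p >= 1 - t).
t = 0.3
print(post_p.cdf(t), 1 - post_q.cdf(1 - t))  # should agree
```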

Order Statistics

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Find the unconditional distribution of $U_{(n-1)}$, and the conditional distribution of $U_{(n-1)}$ given $U_{(n)} = c$.

Unconditionally, $U_{(n-1)} \sim Beta(n - 1, 2)$, using what we know about Uniform order statistics. For the conditional distribution,

$$\begin{aligned} P\left(U_{(n-1)} \leq x \mid U_{(n)} = c\right) &= P\left(\text{remaining } n - 1 \text{ Uniforms} \leq x \mid U_{(1)} < c, \ldots, U_{(n-1)} < c, U_{(n)} = c\right) \\ &= \left(\frac{x}{c}\right)^{n-1} \end{aligned}$$

for $0 < x < c$. Hence, the PDF is $f_{U_{(n-1)} \mid U_{(n)}}(x \mid c) = (n - 1)\left(\frac{x}{c}\right)^{n-2} \cdot \frac{1}{c}$. This is the distribution of $cX$ where $X \sim Beta(n - 1, 1)$, which can easily be verified with a change of variables. Hence, the conditional distribution of $U_{(n-1)}$ is that of a scaled Beta!
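A simulation sketch of this result (Python with NumPy assumed; $n$, $c$, and the seed are arbitrary): rather than conditioning on $U_{(n)} = c$ directly, it uses the fact from the derivation that, given the maximum, the remaining $n - 1$ points are i.i.d. $Unif(0, c)$, and compares their maximum with $c$ times a $Beta(n - 1, 1)$ draw.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n, c, reps = 5, 0.8, 100_000  # arbitrary illustration values

# Given U_(n) = c, the other n - 1 Uniforms are i.i.d. Unif(0, c);
# their maximum is the conditional U_(n-1).
rest = rng.uniform(0.0, c, size=(reps, n - 1))
cond_second_max = rest.max(axis=1)

# Compare with the claimed scaled Beta: c * Beta(n - 1, 1).
scaled_beta = c * rng.beta(n - 1, 1, size=reps)

# Both means should be close to c * (n - 1) / n.
print(cond_second_max.mean(), scaled_beta.mean())
```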

Let $X \sim Bin(n, p)$ and $B \sim Beta(j, n - j + 1)$, where $n$ is a positive integer and $j$ is a positive integer with $j \leq n$. Show, using a story about order statistics, that $P(X \geq j) = P(B \leq p)$. This shows that the CDF of the continuous r.v. $B$ is closely related to the CDF of the discrete r.v. $X$, and is another connection between the Beta and Binomial.

Let $U_1, \ldots, U_n$ be i.i.d. $Unif(0, 1)$. Think of these as Bernoulli trials, where $U_j$ is defined to be "successful" if $U_j < p$ (so the probability of success is $p$ for each trial). Let $X$ be the number of successes. Then $X \geq j$ is the same event as $U_{(j)} \leq p$, so $P(X \geq j) = P(U_{(j)} \leq p)$. Since $U_{(j)} \sim Beta(j, n - j + 1)$ has the same distribution as $B$, this gives $P(X \geq j) = P(B \leq p)$.
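The identity is easy to confirm numerically; here is a brief check (Python with SciPy assumed; $n$, $j$, $p$ are arbitrary illustration values):

```python
from scipy import stats

n, j, p = 10, 4, 0.35  # arbitrary illustration values

# P(X >= j) for X ~ Bin(n, p), via the survival function P(X > j - 1).
lhs = stats.binom(n, p).sf(j - 1)
# P(B <= p) for B ~ Beta(j, n - j + 1).
rhs = stats.beta(j, n - j + 1).cdf(p)

print(lhs, rhs)  # equal up to floating-point error
```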
