Suppose we are told that U∼Unif(0,1) and V∼Expo(1/2) independently.
Find the joint density function, and thus the joint distribution, of
\[
X = \sqrt{V}\sin(2\pi U), \qquad Y = \sqrt{V}\cos(2\pi U).
\]
To do this, we can obtain the Jacobian and perform a change of variables. Note that it is easier to find the Jacobian of the transformation going from (U,V) to (X,Y), since X and Y are already expressed in terms of U and V. The Jacobian is given as follows:
\[
\frac{\partial(x,y)}{\partial(u,v)}
= \begin{vmatrix}
2\pi\sqrt{v}\cos(2\pi u) & \dfrac{\sin(2\pi u)}{2\sqrt{v}} \\[4pt]
-2\pi\sqrt{v}\sin(2\pi u) & \dfrac{\cos(2\pi u)}{2\sqrt{v}}
\end{vmatrix}
= \pi\cos^2(2\pi u) + \pi\sin^2(2\pi u) = \pi.
\]
By the change of variables formula, and using v = x² + y²,
\[
f_{X,Y}(x,y) = f_{U,V}(u,v)\left|\frac{\partial(u,v)}{\partial(x,y)}\right|
= 1\cdot\frac{1}{2}e^{-v/2}\cdot\frac{1}{\pi}
= \frac{1}{2\pi}e^{-(x^2+y^2)/2}.
\]
Hence, since the joint PDF factors into a term involving only x and a term involving only y, those terms are in fact the marginal densities of X and Y (taking care that the normalizing constants go to the right places). We recognize the marginal densities as standard Normal densities, and thus X and Y are i.i.d. N(0,1).
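Writing the factorization out explicitly, each factor is exactly the N(0,1) density, with no normalizing constant left over:
\[
\frac{1}{2\pi}e^{-(x^2+y^2)/2}
= \left(\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\right)\left(\frac{1}{\sqrt{2\pi}}e^{-y^2/2}\right),
\qquad x,y \in \mathbb{R}.
\]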
Beta, Gamma, and Binomial
Let B∼Beta(α,β). Find the distribution of 1−B in the following two ways.
Find the distribution of 1−B using a change of variables
Let W = 1−B. Then we have B = 1−W, and so |db/dw| = |−1| = 1. Hence, the PDF of W is
\[
f_W(w) = f_B(1-w)\left|\frac{db}{dw}\right| = \frac{1}{\beta(\alpha,\beta)}\,(1-w)^{\alpha-1}w^{\beta-1}
\]
for 0<w<1. We recognize this as the Beta(β,α) PDF, and so W∼Beta(β,α).
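As a check on the normalizing constant, the beta function is symmetric in its arguments: substituting w = 1−b,
\[
\beta(\alpha,\beta) = \int_0^1 b^{\alpha-1}(1-b)^{\beta-1}\,db
= \int_0^1 (1-w)^{\alpha-1}w^{\beta-1}\,dw = \beta(\beta,\alpha),
\]
so the constant 1/β(α,β) appearing above is exactly the Beta(β,α) normalizing constant.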
Find the distribution of 1−B using a story proof related to the Gamma distribution.
Using the bank-post office story, we can represent B = X/(X+Y) with X∼Gamma(α,1) and Y∼Gamma(β,1) independent. Then 1−B = Y/(X+Y) ∼ Beta(β,α) by the same story, with the roles of X and Y swapped.
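The algebra behind the last step, spelled out:
\[
1 - B = 1 - \frac{X}{X+Y} = \frac{(X+Y)-X}{X+Y} = \frac{Y}{X+Y},
\]
which is a Gamma(β,1) r.v. divided by the sum of itself and an independent Gamma(α,1) r.v., so the bank-post office story gives Y/(X+Y) ∼ Beta(β,α).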
How do B and 1−B relate to the Binomial distribution?
If we use Beta(α,β) as the prior distribution for the probability p of success in a Binomial problem, interpreting α as the number of prior successes and β as the number of prior failures, then 1−p is the probability of failure. Interchanging the roles of "success" and "failure" interchanges α and β, so it makes sense that 1−p ∼ Beta(β,α).
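To make this concrete, suppose p ∼ Beta(α,β) and we then observe k successes in n Binomial trials. The standard Beta-Binomial conjugacy update gives
\[
p \mid \text{data} \sim \text{Beta}(\alpha + k,\; \beta + n - k),
\qquad
1-p \mid \text{data} \sim \text{Beta}(\beta + n - k,\; \alpha + k),
\]
where the second statement follows from the first together with the result above: both the prior and the posterior of 1−p are obtained by swapping the "success" and "failure" counts.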
Order Statistics
Let U1,…,Un be i.i.d. Unif(0,1). Find the unconditional distribution of U(n−1), and the conditional distribution of U(n−1) given U(n)=c.
Unconditionally, U(n−1)∼Beta(n−1,2), using what we know about Uniform order statistics. For the conditional distribution, note that given U(n)=c, the remaining order statistics are distributed as the order statistics of n−1 i.i.d. Unif(0,c) r.v.s, so U(n−1) is conditionally the maximum of n−1 i.i.d. Unif(0,c) r.v.s. Its conditional CDF is therefore
\[
P\left(U_{(n-1)} \le x \mid U_{(n)} = c\right) = \left(\frac{x}{c}\right)^{n-1}
\]
for 0<x<c. Hence, the PDF is
\[
f_{U_{(n-1)} \mid U_{(n)}}(x \mid c) = (n-1)\left(\frac{x}{c}\right)^{n-2}\cdot\frac{1}{c}
\]
for 0<x<c. This is the distribution of cX where X∼Beta(n−1,1), which can be easily verified with a change of variables. Hence, the conditional distribution of U(n−1) is that of a scaled Beta!
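The change-of-variables verification mentioned above is short. If X∼Beta(n−1,1) and Y = cX, then
\[
f_X(t) = (n-1)t^{n-2} \quad (0<t<1),
\qquad
f_Y(y) = f_X\!\left(\frac{y}{c}\right)\cdot\frac{1}{c} = (n-1)\left(\frac{y}{c}\right)^{n-2}\cdot\frac{1}{c} \quad (0<y<c),
\]
which matches the conditional PDF of U(n−1) given U(n)=c found above.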
Let X∼Bin(n,p) and B∼Beta(j,n−j+1), where n is a positive integer and j is a positive integer with j≤n. Show, using a story about order statistics, that
\[
P(X \ge j) = P(B \le p).
\]
This shows that the CDF of the continuous r.v. B is closely related to the CDF of the discrete r.v. X, and is another connection between the Beta and Binomial.
Let U1,…,Un be i.i.d. Unif(0,1). Think of these as Bernoulli trials, where Uj is defined to be "successful" if Uj<p (so the probability of success is p for each trial). Let X be the number of successes. Then X≥j is the same event as U(j)≤p, so P(X≥j)=P(U(j)≤p). Since U(j)∼Beta(j,n−j+1), which is exactly the distribution of B, this gives P(X≥j)=P(B≤p), as desired.
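Written out with the two CDFs explicitly, the identity just proved says
\[
\sum_{k=j}^{n}\binom{n}{k}p^{k}(1-p)^{n-k}
= \int_{0}^{p}\frac{n!}{(j-1)!\,(n-j)!}\,x^{j-1}(1-x)^{n-j}\,dx,
\]
since the left side is P(X≥j) for X∼Bin(n,p) and the integrand on the right is the Beta(j,n−j+1) density.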