42 Multivariate Transformations
Given i.i.d. data \(X_1, \dots, X_n\), we know how to determine the distributions of statistics such as
- the sample mean \(\bar X\), and more generally, functions of means \(g(\bar X)\), and
- order statistics \(X_{(k)}\), such as the sample median.
A general statistic has the form \(g(X_1, \dots, X_n)\). In this chapter, we will learn a general strategy for deriving the distribution of such a statistic when the random variables are continuous and \(g\) is differentiable.
42.1 Univariate Transformations
As a warm-up, consider the case of a single continuous random variable \(X\) with PDF \(f_X(x)\). What is the PDF of \(Y = g(X)\)?
In Chapter 20, we presented a general strategy for deriving the PDF of \(Y\). First, find the CDF of \(Y\), then differentiate to obtain the PDF. However, when \(g(x)\) is a differentiable function that is strictly increasing (or decreasing), there is a simple formula.
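For reference, the standard univariate change-of-variables formula (which we take to be the content of Theorem 42.1, though the notation there may differ) is: if \(g\) is differentiable and strictly monotone on the support of \(X\), then \(Y = g(X)\) has PDF
\[ f_Y(y) = f_X\big(g^{-1}(y)\big) \left| \frac{d}{dy} g^{-1}(y) \right|. \]
The absolute value handles the increasing and decreasing cases at once.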
One natural application of Theorem 42.1 is to location-scale transformations.
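As a quick illustration of this application: for a location-scale transformation \(Y = \mu + \sigma X\) with \(\sigma > 0\), the inverse is \(g^{-1}(y) = (y - \mu)/\sigma\), so the formula above gives
\[ f_Y(y) = \frac{1}{\sigma} \, f_X\!\left( \frac{y - \mu}{\sigma} \right). \]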
42.2 Multivariate Transformations
Now, we consider the case where we have \(n\) random variables \(X_1, \dots, X_n\), and \(g\) is a differentiable one-to-one function from \(A \subset \mathbb{R}^n\) to \(B \subset \mathbb{R}^n\). That is, \[ (Y_1, \dots, Y_n) = g(X_1, \dots, X_n). \] Then, the natural generalization of Theorem 42.1 is the following.
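In its standard form, this multivariate change-of-variables formula (which we take to be the content of Theorem 42.2) reads: writing \(h = g^{-1}\) for the inverse transformation, for \((y_1, \dots, y_n) \in B\),
\[ f_{Y_1, \dots, Y_n}(y_1, \dots, y_n) = f_{X_1, \dots, X_n}\big(h(y_1, \dots, y_n)\big) \, \left| \det \frac{\partial h}{\partial (y_1, \dots, y_n)} \right|, \]
where the last factor is the absolute value of the Jacobian determinant of \(h\). Note that it reduces to the univariate formula when \(n = 1\).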
Let's apply Theorem 42.2 to some examples.
42.2.1 Simulating normal random variables
Theorem 42.2 provides a basis for simulating normal random variables. We begin by discussing how it can be used to simulate a single normal random variable, which is harder than it sounds.
With the Box-Muller transform, we get two independent normal random variables for the price of one. In Example 25.4, we discussed how to transform independent random variables into correlated ones. We can now derive the joint PDF of these correlated normal random variables using Theorem 42.2.
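As a concrete illustration, here is a minimal sketch of both steps in Python. The function names and the correlation parameter `rho` are our own, not from the text; the correlating step follows the standard construction \(X = Z_1\), \(Y = \rho Z_1 + \sqrt{1-\rho^2}\,Z_2\) alluded to above.

```python
import math
import random

def box_muller(u1, u2):
    """Map two independent Uniform(0, 1) draws to two independent
    standard normal draws via the Box-Muller transform."""
    r = math.sqrt(-2.0 * math.log(u1))   # radius: R^2 / 2 ~ Exponential(1)
    theta = 2.0 * math.pi * u2           # angle: Theta ~ Uniform(0, 2*pi)
    return r * math.cos(theta), r * math.sin(theta)

def correlated_pair(z1, z2, rho):
    """Combine two independent standard normals into a bivariate
    normal pair with correlation rho."""
    return z1, rho * z1 + math.sqrt(1.0 - rho**2) * z2

# Sanity check: the first coordinate should look standard normal.
random.seed(0)
samples = [box_muller(1.0 - random.random(), random.random())  # 1 - U avoids log(0)
           for _ in range(50_000)]
z1s = [z1 for z1, _ in samples]
mean = sum(z1s) / len(z1s)
var = sum(z * z for z in z1s) / len(z1s) - mean**2
print(round(mean, 2), round(var, 2))  # should be close to 0 and 1
```

Because the transform consumes two uniforms and produces two independent normals, a simulator typically caches the second output rather than discarding it.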
42.3 Exercises
Exercise 42.1 Let \(X\) and \(Y\) denote the coordinates of a point chosen uniformly in the unit circle. That is, the joint density is \[ f_{X,Y}(x,y) = \frac{1}{\pi}, \qquad x^2 + y^2 \leq 1. \] Find the joint density of the polar coordinates \(R = \sqrt{X^2+Y^2}\) and \(\Theta = \tan^{-1}(Y/X)\).
Exercise 42.2 Let \((X,Y)\) denote a random point in the plane, and assume the rectangular coordinates \(X\) and \(Y\) are independent standard normal random variables. Find the joint distribution of \(R\) and \(\Theta\), the polar coordinates.
Exercise 42.3 Let \(X_1, \dots, X_n\) be i.i.d. \(\text{Exponential}(\lambda)\). Define \[ Y_k = X_1 + \cdots + X_k \] for \(1 \leq k \leq n\).
- (a) Find the joint density of \(Y_1, \dots, Y_n\).
- (b) Use part (a) to find the density of \(Y_n\). Does your result make sense?
- (c) Find the conditional density of \(Y_1, \dots, Y_{n-1}\) given \(Y_n = t\).
Exercise 42.4 Let \(Z_1\) and \(Z_2\) be i.i.d. \(\text{Normal}(0,1)\). Let \(X = Z_1 + Z_2\) and \(Y = Z_1 - Z_2\); we know both are \(\text{Normal}(0,2)\). Are they independent?
Exercise 42.5 Let \(X\) and \(Y\) be i.i.d. \(\text{Exponential}(\lambda)\). Let \(T = X + Y\) and \(\displaystyle W = \frac{X}{Y}\). Are \(T\) and \(W\) independent? Find the marginal densities of \(T\) and \(W\).