42 Multivariate Transformations
Given i.i.d. data \(X_1, \dots, X_n\), we know how to determine the distributions of:
- the sample mean \(\bar X\), and more generally, functions of means \(g(\bar X)\), and
- order statistics \(X_{(k)}\).
How do we determine the distribution of a general statistic of the form \(g(X_1, \dots, X_n)\)? That is the topic of this chapter. Throughout this chapter, we will assume that the random variables are continuous.
42.1 Univariate Transformations
As a warm-up, we consider the case where we have a single continuous random variable \(X\) with PDF \(f_X(x)\). What is the PDF of \(Y = g(X)\)?
In Chapter 20, we discussed a general strategy for deriving the PDF of \(Y\). First, we find the CDF of \(Y\), then we differentiate to obtain the PDF. However, when \(g(x)\) is a differentiable function that is strictly increasing (or decreasing), there is a simple formula.
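For reference, the formula in question (presumably the content of Theorem 42.1) is the standard change-of-variables formula: if \(g\) is differentiable and strictly increasing (or decreasing) on the support of \(X\), then \[ f_Y(y) = f_X\big(g^{-1}(y)\big) \left| \frac{d}{dy} g^{-1}(y) \right|. \]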
One natural application of Theorem 42.1 is to location-scale transformations.
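As a quick illustration of the location-scale case: if \(Y = a + bX\) with \(b \neq 0\), then \(g^{-1}(y) = (y - a)/b\), so the formula above gives \[ f_Y(y) = \frac{1}{|b|}\, f_X\!\left(\frac{y - a}{b}\right). \]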
42.2 Multivariate Transformations
Now, we consider the case where we have \(n\) random variables \(X_1, \dots, X_n\), and \(g\) is a differentiable one-to-one function from \(A \subset \mathbb{R}^n\) to \(B \subset \mathbb{R}^n\). We define \[ (Y_1, \dots, Y_n) = g(X_1, \dots, X_n). \] Then, the natural generalization of Theorem 42.1 is the following.
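Concretely, writing \(h = g^{-1}\) for the inverse transformation, the multivariate change-of-variables formula (presumably the statement of Theorem 42.2) is \[ f_{Y_1, \dots, Y_n}(y_1, \dots, y_n) = f_{X_1, \dots, X_n}\big(h(y_1, \dots, y_n)\big) \left| \det J_h(y_1, \dots, y_n) \right|, \qquad (y_1, \dots, y_n) \in B, \] where \(J_h\) is the \(n \times n\) Jacobian matrix whose \((i, j)\) entry is \(\partial x_i / \partial y_j\).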
Let's apply Theorem 42.2 to some examples.
42.2.1 Simulating normal random variables
Theorem 42.2 provides a basis for simulating normal random variables. We begin by discussing how it helps us simulate a single normal random variable, which is no simple task because the normal CDF has no closed-form inverse that we could apply to uniform random numbers.
With the Box-Muller transform, we get two independent normal random variables for the price of one. In Example 25.4, we discussed how to transform independent random variables into correlated ones. We can derive the joint PDF of these correlated normal random variables using Theorem 42.2.
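To make the Box-Muller transform concrete, here is a minimal Python sketch (the function name `box_muller` and the use of NumPy are illustrative choices, not taken from the text). It maps pairs of independent \(\text{Uniform}(0, 1)\) draws \((U_1, U_2)\) to pairs of independent standard normals via \(Z_1 = \sqrt{-2 \ln U_1} \cos(2\pi U_2)\) and \(Z_2 = \sqrt{-2 \ln U_1} \sin(2\pi U_2)\).

```python
import numpy as np

def box_muller(n, rng=None):
    """Generate n pairs of independent standard normals via Box-Muller.

    Each pair (Z1, Z2) comes from two independent Uniform(0, 1) draws:
        Z1 = sqrt(-2 ln U1) * cos(2 pi U2),
        Z2 = sqrt(-2 ln U1) * sin(2 pi U2).
    """
    rng = np.random.default_rng() if rng is None else rng
    u1 = 1.0 - rng.uniform(size=n)   # in (0, 1], avoids log(0)
    u2 = rng.uniform(size=n)
    r = np.sqrt(-2 * np.log(u1))     # radius
    theta = 2 * np.pi * u2           # angle
    return r * np.cos(theta), r * np.sin(theta)

# Sanity check: means near 0, variances near 1, correlation near 0.
z1, z2 = box_muller(100_000)
print(z1.mean(), z1.var(), z2.mean(), z2.var(), np.corrcoef(z1, z2)[0, 1])
```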