Now we want to explore multiple variables and their interaction. Here we focus on bivariate jointly continuous RVs.
Jointly Continuous
$X, Y$ are jointly continuous if there exists a function $f_{X,Y} : \mathbb{R}^2 \to [0, \infty)$, called the joint density function, s.t.
$$P((X,Y) \in A) = \iint_A f_{X,Y}(x,y) \, dx \, dy$$
for every measurable $A \subseteq \mathbb{R}^2$.
So $P(X \le x, Y \le y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u,v) \, dv \, du$.
A joint density function satisfies
$f_{X,Y}(x,y) \ge 0$ for all $(x,y)$;
$\iint_{\mathbb{R}^2} f_{X,Y}(x,y) \, dx \, dy = 1$.
Example: Uniform
Let $D \subseteq \mathbb{R}^2$ with $0 < \mathrm{Area}(D) < \infty$. We can choose a point uniformly in $D$: $f_{X,Y}(x,y) = \frac{1}{\mathrm{Area}(D)} \mathbf{1}_{\{(x,y) \in D\}}$.
Example: Two independent standard normals
If $X, Y$ are independent standard normals, then $f_{X,Y}(x,y) = \frac{1}{2\pi} e^{-(x^2 + y^2)/2}$.
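Since a joint density must integrate to 1 over the plane, the two-normals density can be sanity-checked numerically. A sketch using a midpoint Riemann sum, truncating to $[-6,6]^2$ (grid size and truncation bounds are illustrative choices; the tail mass outside is negligible):

```python
import math

# Joint density of two independent standard normals.
def f(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# Midpoint Riemann sum over [-6, 6]^2.
n, lo, hi = 400, -6.0, 6.0
h = (hi - lo) / n
total = sum(
    f(lo + (i + 0.5) * h, lo + (j + 0.5) * h) * h * h
    for i in range(n)
    for j in range(n)
)
assert abs(total - 1.0) < 1e-3  # integrates to (approximately) 1
```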
In the one-dimensional case, for a continuous RV, we have the approximation $P(x \le X \le x + \varepsilon) \approx f_X(x) \, \varepsilon$ for small $\varepsilon$.
Similarly, in the two-dimensional case, take a small neighborhood $B$ containing $(x,y)$; then we have $P((X,Y) \in B) \approx f_{X,Y}(x,y) \, \mathrm{Area}(B)$.
How to recover the marginal density $f_X$ or $f_Y$ given $f_{X,Y}$?
Fact
$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dy$, and similarly $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dx$.
Proof
We only prove the formula for $f_X$. Note that $P(X \le x) = P(X \le x, Y \in \mathbb{R}) = \int_{-\infty}^{x} \left( \int_{-\infty}^{\infty} f_{X,Y}(u,y) \, dy \right) du$; differentiate in $x$ on both sides.
Marginal continuity will not lead to joint continuity: let $X \sim N(0,1)$, and let $Y = X$. Then $X$ and $Y$ are each continuous. Suppose $(X,Y)$ had a joint density $f$. The diagonal $\{(x,y) : x = y\}$ has zero area, so we would get $P(X = Y) = \iint_{\{x = y\}} f(x,y) \, dx \, dy = 0$. However, we always have $Y = X$, so $\{X = Y\}$ has probability $1 > 0$, a contradiction.
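The marginal-density fact can be checked numerically on the two-normals example: integrating the joint density over $y$ should recover the one-dimensional standard normal density. A sketch (truncation bounds, grid size, and test points chosen for illustration):

```python
import math

# Joint density of two independent standard normals; integrating out y
# should recover the one-dimensional standard normal density f_X.
def f_joint(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

def f_X(x):  # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

n, lo, hi = 2000, -8.0, 8.0
h = (hi - lo) / n
for x in (-1.0, 0.0, 0.7):
    # Midpoint rule approximation of \int f_{X,Y}(x, y) dy.
    marginal = sum(f_joint(x, lo + (j + 0.5) * h) * h for j in range(n))
    assert abs(marginal - f_X(x)) < 1e-4
```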
1.1 Independence
$X, Y$ are independent if $f_{X,Y}(x,y) = f_X(x) f_Y(y)$ for all $x, y$.
By independence, the value of $X$ will not give us any information about $Y$.
Example
Let $(X,Y)$ be uniform on the unit disk $D = \{(x,y) : x^2 + y^2 \le 1\}$, so $f_{X,Y}(x,y) = \frac{1}{\pi} \mathbf{1}_{\{x^2 + y^2 \le 1\}}$. Calculate the marginal distribution: $f_X(x) = \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi} \, dy = \frac{2}{\pi} \sqrt{1 - x^2}$ for $|x| \le 1$.
For $f_Y$, it is similar. Then $f_X(x) f_Y(y) \ne f_{X,Y}(x,y)$, so $X, Y$ are not independent.
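A concrete numeric check, taking $(X,Y)$ uniform on the unit disk as the example: the product of the marginals at the origin disagrees with the joint density there, so the pair cannot be independent.

```python
import math

# Uniform distribution on the unit disk D = {x^2 + y^2 <= 1}:
# f_{X,Y} = 1/pi on D, and the marginal is f_X(x) = (2/pi) sqrt(1 - x^2).
def f_joint(x, y):
    return 1 / math.pi if x * x + y * y <= 1 else 0.0

def f_marginal(t):
    return (2 / math.pi) * math.sqrt(1 - t * t) if abs(t) <= 1 else 0.0

# Check the marginal formula numerically at x = 0 (midpoint rule over y).
n = 4000
h = 2.0 / n
num = sum(f_joint(0.0, -1.0 + (j + 0.5) * h) * h for j in range(n))
assert abs(num - f_marginal(0.0)) < 1e-3

# The product criterion fails at the origin: 4/pi^2 != 1/pi.
product = f_marginal(0.0) * f_marginal(0.0)  # = 4/pi^2, about 0.405
joint = f_joint(0.0, 0.0)                    # = 1/pi,   about 0.318
assert abs(product - joint) > 0.05           # densities disagree: not independent
```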
2 Bivariate Transformation
Transformation of random variables: $(U,V) = T(X,Y)$. Polar coordinates: $R = \sqrt{X^2 + Y^2}$ and $\Theta$ the angle of $(X,Y)$, so that $X = R\cos\Theta$, $Y = R\sin\Theta$.
Fact (Polar Coordinates)
If $X, Y$ are independent standard normals, then $R$ and $\Theta$ are independent, $\Theta \sim \mathrm{Unif}[0, 2\pi)$, and $f_R(r) = r e^{-r^2/2}$ for $r \ge 0$.
Proof
On one hand, $P(R \le r, \Theta \le \theta) = \int_0^{\theta} \int_0^{r} f_{R,\Theta}(s, t) \, ds \, dt$.
On the other, by describing the event using $(X,Y)$ and integrating in polar coordinates, we have $P(R \le r, \Theta \le \theta) = \iint \frac{1}{2\pi} e^{-(x^2+y^2)/2} \, dx \, dy = \int_0^{\theta} \int_0^{r} \frac{1}{2\pi} e^{-s^2/2} \, s \, ds \, dt = \frac{\theta}{2\pi} \left( 1 - e^{-r^2/2} \right)$.
Putting together, we conclude $f_{R,\Theta}(r, \theta) = \frac{1}{2\pi} \cdot r e^{-r^2/2}$, which factors, so $R$ and $\Theta$ are independent with the stated densities.
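The fact can be sanity-checked by Monte Carlo: for independent standard normals, $P(R \le r) = 1 - e^{-r^2/2}$, and $\Theta$ should be uniform on $[0, 2\pi)$. Seed and sample size are arbitrary choices.

```python
import math
import random

# Monte Carlo check of the polar-coordinates fact for independent
# standard normals: P(R <= 1) = 1 - e^{-1/2}, and P(Theta <= pi) = 1/2.
random.seed(0)
N = 200_000
hit_r = hit_theta = 0
for _ in range(N):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    r = math.hypot(x, y)
    theta = math.atan2(y, x) % (2 * math.pi)  # angle in [0, 2*pi)
    hit_r += r <= 1.0
    hit_theta += theta <= math.pi
assert abs(hit_r / N - (1 - math.exp(-0.5))) < 0.01
assert abs(hit_theta / N - 0.5) < 0.01
```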
Linear transformation
$T : \mathbb{R}^2 \to \mathbb{R}^2$ is a linear transformation if $T(x) = Ax$ for some $2 \times 2$ matrix $A$.
Some properties:
$T$ is invertible if and only if $A$ is invertible;
if $D$ is a parallelogram, then $T(D)$ is also one;
$\mathrm{Area}(T(D)) = |\det A| \cdot \mathrm{Area}(D)$.
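The area-scaling property can be verified directly for a concrete matrix (chosen for illustration): map the corners of the unit square and compute the area of the image parallelogram with the shoelace formula.

```python
# Area scaling under a linear map: the unit square maps to a parallelogram
# whose area is |det A|. Example matrix A = [[2, 1], [1, 3]], det A = 5.
a, b, c, d = 2.0, 1.0, 1.0, 3.0
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
image = [(a * x + b * y, c * x + d * y) for x, y in square]

# Shoelace formula for the area of the image polygon.
area = 0.0
for i in range(4):
    x1, y1 = image[i]
    x2, y2 = image[(i + 1) % 4]
    area += x1 * y2 - x2 * y1
area = abs(area) / 2

assert abs(area - abs(a * d - b * c)) < 1e-12  # Area(T(D)) = |det A| * Area(D)
```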
Let $T$ be an invertible linear transformation with inverse $S = T^{-1}$, and let $(U,V) = T(X,Y)$. Given the joint p.d.f. of $(X,Y)$, what's the joint p.d.f. of $(U,V)$?
On one hand, for a small region $B$ containing $(u,v)$, $P((U,V) \in B) \approx f_{U,V}(u,v) \, \mathrm{Area}(B)$.
Similarly, $P((U,V) \in B) = P((X,Y) \in S(B)) \approx f_{X,Y}(S(u,v)) \, \mathrm{Area}(S(B)) = f_{X,Y}(S(u,v)) \, \frac{\mathrm{Area}(B)}{|\det A|}$.
We conclude that for an invertible linear transformation $T$, $f_{U,V}(u,v) = \frac{1}{|\det A|} f_{X,Y}(S(u,v))$.
Rotations
A rotation by angle $\theta$ in the positive direction has matrix $A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}$, with $\det A = 1$. Then $f_{U,V}(u,v) = f_{X,Y}(S(u,v))$: the density is simply rotated.
Sum & difference
$(U,V) = (X+Y, X-Y)$, i.e. $A = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ with $|\det A| = 2$. This is a rotation of $(X,Y)$ composed with a reflection, scaled by $\sqrt{2}$.
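A Monte Carlo sketch of the sum-and-difference transformation: for independent standard normals, $X+Y$ and $X-Y$ are independent (indeed $\mathrm{Cov}(X+Y, X-Y) = \mathrm{Var}\,X - \mathrm{Var}\,Y = 0$), so the sample covariance should be near 0. Seed and sample size are arbitrary choices.

```python
import random

# For independent standard normals X, Y, the pair (U, V) = (X + Y, X - Y)
# should be uncorrelated (in fact independent).
random.seed(1)
N = 100_000
us, vs = [], []
for _ in range(N):
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    us.append(x + y)
    vs.append(x - y)
mu_u = sum(us) / N
mu_v = sum(vs) / N
cov = sum((u - mu_u) * (v - mu_v) for u, v in zip(us, vs)) / N
assert abs(cov) < 0.05  # sample covariance close to 0
```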
Orthogonal transformation
$T$ is an orthogonal transformation if it preserves the inner product: $\langle T(x), T(y) \rangle = \langle x, y \rangle$ for all $x, y$. I.e.: $T(x) = Ax$ where $A$ is an orthogonal matrix, $A^{\mathsf T} A = I$, so $|\det A| = 1$.
They preserve angles, lengths, areas, etc.
Fact
For $\mathbb{R}^2$, all orthogonal transformations are rotations, reflections, and compositions of the two.
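A small deterministic check that a rotation matrix is orthogonal: it preserves inner products and its determinant has absolute value 1. The angle and test vectors are arbitrary choices.

```python
import math

# A rotation matrix A satisfies A^T A = I and |det A| = 1,
# so the map x -> Ax preserves inner products.
theta = 0.6
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def apply(M, v):
    return (M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1])

def inner(u, v):
    return u[0] * v[0] + u[1] * v[1]

x, y = (1.0, 2.0), (-3.0, 0.5)
Tx, Ty = apply(A, x), apply(A, y)
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]

assert abs(inner(Tx, Ty) - inner(x, y)) < 1e-12  # <Tx, Ty> = <x, y>
assert abs(abs(det) - 1) < 1e-12                 # |det A| = 1
```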
3 Invertible Affine Transformation
Define the affine transformation $T(x) = Ax + b$, where $A$ is an invertible matrix and $b$ is a vector. Then $T$ has inverse $S = T^{-1}$ given by $S(y) = A^{-1}(y - b)$.
Since translation does not change areas, we still have $\mathrm{Area}(T(D)) = |\det A| \cdot \mathrm{Area}(D)$.
Since $P((U,V) \in B) = P((X,Y) \in S(B)) \approx f_{X,Y}(S(u,v)) \, \frac{\mathrm{Area}(B)}{|\det A|}$, we have $f_{U,V}(u,v) = \frac{1}{|\det A|} f_{X,Y}(S(u,v))$.
Example
If $(U,V) = (X + b_1, Y + b_2)$ is a translation, then $f_{U,V}(u,v) = f_{X,Y}(u - b_1, v - b_2)$.
Let $B$ be a small region containing $(u,v)$. Then $P((U,V) \in B) = P((X,Y) \in B - b) \approx f_{X,Y}(u - b_1, v - b_2) \, \mathrm{Area}(B)$.
Algebraically, $S(u,v) = (u - b_1, v - b_2)$ and $|\det A| = |\det I| = 1$, and so we also have the same conclusion from the formula above.
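The affine change-of-variables formula can be checked by Monte Carlo: the probability of a small box around a point should be approximately density times area. The specific map $(U,V) = (2X+1, Y-3)$, the target point, and the box size are illustrative choices.

```python
import math
import random

# Check the affine formula for (U, V) = (2X + 1, Y - 3), X, Y independent
# standard normals. Here A = diag(2, 1), |det A| = 2, S(u, v) = ((u-1)/2, v+3),
# so f_{U,V}(u, v) = (1/2) * phi((u - 1) / 2) * phi(v + 3).
phi = lambda t: math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
f_UV = lambda u, v: 0.5 * phi((u - 1) / 2) * phi(v + 3)

random.seed(2)
N, h = 200_000, 0.25
hits = 0
for _ in range(N):
    u = 2 * random.gauss(0, 1) + 1
    v = random.gauss(0, 1) - 3
    hits += abs(u - 1) <= h and abs(v + 3) <= h

box_prob = f_UV(1.0, -3.0) * (2 * h) ** 2  # density * Area of the small box
assert abs(hits / N - box_prob) < 0.003
```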
4 General Invertible Transformations
$(U,V) = T(X,Y)$. Assume $T$ is differentiable, but not necessarily affine. Also assume $T$ is invertible with $S = T^{-1}$, so $(X,Y) = S(U,V)$. So $P((U,V) \in B) = P((X,Y) \in S(B))$.
We want to know $f_{U,V}$.
If $T$ is affine and $D$ is a parallelogram, then $T(D)$ is a parallelogram. For a general $T$, if $D$ is small, then $T(D)$ can be approximated by a parallelogram, since $T$ can be approximated by an affine transformation on $D$.
Let $T(x,y) = (g(x,y), h(x,y))$ where $g, h$ are differentiable functions. Then for any point $(x,y)$ near $(x_0, y_0)$, the first-order Taylor expansion gives
$$g(x,y) \approx g(x_0,y_0) + \partial_x g(x_0,y_0)(x - x_0) + \partial_y g(x_0,y_0)(y - y_0),$$
$$h(x,y) \approx h(x_0,y_0) + \partial_x h(x_0,y_0)(x - x_0) + \partial_y h(x_0,y_0)(y - y_0).$$
In matrix notation, $T(x,y) \approx T(x_0,y_0) + \begin{pmatrix} \partial_x g & \partial_y g \\ \partial_x h & \partial_y h \end{pmatrix} \begin{pmatrix} x - x_0 \\ y - y_0 \end{pmatrix}$. This is an affine transformation. Denote the matrix by $J_T(x_0, y_0)$, which is the Jacobian matrix of $T$ at $(x_0, y_0)$.
Hence, near $(x_0, y_0)$, $T$ scales areas by $|\det J_T(x_0, y_0)|$, so
$$f_{U,V}(u,v) = \frac{1}{|\det J_T(S(u,v))|} f_{X,Y}(S(u,v)) = |\det J_S(u,v)| \, f_{X,Y}(S(u,v)). \tag{4.1}$$
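The change-of-variables formula (4.1), $f_{U,V}(u,v) = |\det J_S(u,v)| \, f_{X,Y}(S(u,v))$, can be checked numerically for a genuinely nonlinear map. Here $(U,V) = (e^X, e^Y)$ with $X, Y$ independent standard normals, so $S(u,v) = (\ln u, \ln v)$ and $\det J_S(u,v) = 1/(uv)$; the map, test point, and box size are illustrative choices.

```python
import math
import random

# Check (4.1) for (U, V) = (e^X, e^Y), X, Y independent standard normals:
# S(u, v) = (ln u, ln v), det J_S = 1/(u v), so
# f_{U,V}(u, v) = phi(ln u) * phi(ln v) / (u * v).
phi = lambda t: math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
f_UV = lambda u, v: phi(math.log(u)) * phi(math.log(v)) / (u * v)

random.seed(3)
N, h = 200_000, 0.1
u0 = v0 = 1.0
hits = 0
for _ in range(N):
    u, v = math.exp(random.gauss(0, 1)), math.exp(random.gauss(0, 1))
    hits += abs(u - u0) <= h and abs(v - v0) <= h

box_prob = f_UV(u0, v0) * (2 * h) ** 2  # density * Area of the small box
assert abs(hits / N - box_prob) < 0.002
```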
Example
Let $X, Y$ be independent standard normals, and let $(R, \Theta)$ be the polar coordinates of $(X,Y)$. The inverse map is $S(r, \theta) = (r\cos\theta, r\sin\theta)$. Then
$$J_S(r, \theta) = \begin{pmatrix} \cos\theta & -r\sin\theta \\ \sin\theta & r\cos\theta \end{pmatrix}.$$
So $\det J_S(r, \theta) = r\cos^2\theta + r\sin^2\theta = r$.
Plug in (4.1):
$$f_{R,\Theta}(r, \theta) = r \cdot \frac{1}{2\pi} e^{-((r\cos\theta)^2 + (r\sin\theta)^2)/2} = \frac{1}{2\pi} \cdot r e^{-r^2/2}.$$
So $f_{R,\Theta}$ factors, and we recover the fact that $R$ and $\Theta$ are independent with $\Theta \sim \mathrm{Unif}[0, 2\pi)$ and $f_R(r) = r e^{-r^2/2}$.
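The density $f_R(r) = r e^{-r^2/2}$ implies $P(a \le R \le b) = e^{-a^2/2} - e^{-b^2/2}$, which a short Monte Carlo run can confirm. The interval, seed, and sample size are arbitrary choices.

```python
import math
import random

# For independent standard normals, R = sqrt(X^2 + Y^2) has density
# r * e^{-r^2/2}, so P(1 <= R <= 2) = e^{-1/2} - e^{-2}.
random.seed(4)
N = 200_000
hits = 0
for _ in range(N):
    r = math.hypot(random.gauss(0, 1), random.gauss(0, 1))
    hits += 1.0 <= r <= 2.0
exact = math.exp(-0.5) - math.exp(-2.0)  # approximately 0.4712
assert abs(hits / N - exact) < 0.01
```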