Ch. 4 - Joint Distribution of Random Variables
Outline:
- Joint Distribution
- Conditional Distribution
- Conditional Expectation
- Law of Iterated Expectation
- Conditional Variance
- Law of Iterated Variance
Joint Distribution
Consider two rvs X and Y with probability functions f_X and f_Y.
We can compute P(3 ≤ X ≤ 7) or P(1 ≤ Y ≤ 3).
- In general, however, we need to know how they behave jointly, e.g. P(3 ≤ X ≤ 7, 1 ≤ Y ≤ 3).
- This is an intersection of two events. In order to compute such probabilities, we need the joint probability distribution of the bivariate random variable (X, Y).
- Knowledge of the individual (marginal) distributions alone is not enough; you need the joint distribution of the pair of random variables.
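As a small sketch (with hypothetical numbers, not from the text), two pairs of random variables can have identical marginal distributions yet different joint distributions, which is why the marginals alone cannot determine joint probabilities:

```python
# Two joint PMFs on {0,1} x {0,1} with identical marginals (hypothetical numbers).
# Independent case: P(X=x, Y=y) = P(X=x) * P(Y=y).
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
# Dependent case: X = Y with probability 1.
dep = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

def marginal_x(pmf, x):
    # P(X = x): sum the joint PMF over all y values.
    return sum(p for (xi, _), p in pmf.items() if xi == x)

def marginal_y(pmf, y):
    # P(Y = y): sum the joint PMF over all x values.
    return sum(p for (_, yi), p in pmf.items() if yi == y)

# Both joint PMFs give P(X=0) = P(Y=0) = 0.5 ...
print(marginal_x(indep, 0), marginal_x(dep, 0))  # 0.5 0.5
# ... but their joint probabilities differ:
print(indep[(0, 0)], dep[(0, 0)])                # 0.25 0.5
```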
Note the cases:
- X, Y both discrete
- X, Y both continuous
- one of X, Y discrete, the other continuous
Joint Distribution - Discrete Case
The joint (or bivariate) probability mass function (PMF) of a pair of discrete random variables (X, Y) is

  f_{X,Y}(x, y) = P(X = x, Y = y), (x, y) ∈ S,

- where S denotes the sample space of (X, Y) values.
- Observe that the joint PMF is nothing but a probability.

The joint PMF f_{X,Y} must satisfy the following properties:
- f_{X,Y}(x, y) ≥ 0 for all (x, y) ∈ S
- Σ_{(x,y) ∈ S} f_{X,Y}(x, y) = 1

Here, any probabilities of the form P((X, Y) ∈ A) can be computed as
- P((X, Y) ∈ A) = Σ_{(x,y) ∈ A} f_{X,Y}(x, y)
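These properties can be checked numerically. A minimal sketch, assuming the joint PMF of two independent fair dice (an example not in the text, where f(x, y) = 1/36 for x, y ∈ {1, …, 6}):

```python
from fractions import Fraction

# Joint PMF of two independent fair dice: f(x, y) = 1/36 for x, y in 1..6.
pmf = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

# Property 1: non-negativity.
assert all(p >= 0 for p in pmf.values())
# Property 2: the probabilities sum to 1 over the sample space S.
assert sum(pmf.values()) == 1

# P((X, Y) in A) for the region A = {(x, y) : x + y = 7}:
# sum the joint PMF over the pairs in A.
p_sum7 = sum(p for (x, y), p in pmf.items() if x + y == 7)
print(p_sum7)  # 1/6
```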
Joint Distribution - Continuous Case
The joint probability density function (PDF) of a pair of continuous random variables (X, Y) is a non-negative function f_{X,Y}(x, y), (x, y) ∈ S (the sample space of (X, Y) values), such that
-
the total volume of the region under the surface f_{X,Y} and above the x-y plane is 1:

  ∫∫_S f_{X,Y}(x, y) dx dy = 1

-
the probability that (X, Y) takes its values in a region A of the x-y plane is the volume under the surface f_{X,Y} and above the region A:

  P((X, Y) ∈ A) = ∫∫_A f_{X,Y}(x, y) dx dy
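Both properties can be verified numerically with a midpoint Riemann sum. A sketch, assuming the illustrative density f(x, y) = x + y on the unit square [0, 1] × [0, 1] (not from the text); for this density, P(X + Y ≤ 1) = 1/3 by direct integration:

```python
# Illustrative joint density (an assumed example): f(x, y) = x + y on [0,1] x [0,1].
def f(x, y):
    return x + y

def double_midpoint(region, n=400):
    """Midpoint Riemann sum of f over the points of [0,1]^2 where region(x, y) is True."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            if region(x, y):
                total += f(x, y) * h * h
    return total

# Total volume under the surface and above the x-y plane is 1.
p_total = double_midpoint(lambda x, y: True)
# P((X, Y) in A) for A = {(x, y) : x + y <= 1}; the exact value is 1/3.
p_A = double_midpoint(lambda x, y: x + y <= 1)
print(p_total)  # 1.0 (up to floating point)
print(p_A)      # close to 1/3
```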