# Independent Random Variables and Conditional Distribution

## Independent Random Variables

Two random variables X and Y are independent if, for every pair of sets A and B,

P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B)

Two random variables X and Y with joint pdf f_{X,Y} are independent if and only if f_{X,Y}(x, y) = f_{X}(x) f_{Y}(y) for all values of x and y.

Note that f_{X}(x) and f_{Y}(y) above are the marginal distributions of X and Y.

Consider the following example of a bivariate distribution with joint pmf f(0, 0) = 1/9, f(0, 1) = 2/9, f(1, 0) = 2/9 and f(1, 1) = 4/9, so the marginals are

f_{X}(0) = 1/3, f_{X}(1) = 2/3, f_{Y}(0) = 1/3 and f_{Y}(1) = 2/3. Here X and Y are independent because f_{X}(0) f_{Y}(0) = 1/9 = f(0, 0), f_{X}(1) f_{Y}(0) = 2/9 = f(1, 0), f_{X}(0) f_{Y}(1) = 2/9 = f(0, 1) and f_{X}(1) f_{Y}(1) = 4/9 = f(1, 1).
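The factorization check above can be sketched in code. This is a minimal illustration, assuming the joint pmf is stored as a dictionary keyed by (x, y) pairs; the helper name `is_independent` is not from the original text.

```python
from itertools import product

# Marginal pmfs from the example above.
f_X = {0: 1/3, 1: 2/3}
f_Y = {0: 1/3, 1: 2/3}

# Joint pmf built as the product of the marginals, so independence
# holds by construction (this mirrors the worked example).
joint = {(x, y): f_X[x] * f_Y[y] for x, y in product(f_X, f_Y)}

def is_independent(joint, f_X, f_Y, tol=1e-12):
    """Check f(x, y) == f_X(x) * f_Y(y) for every (x, y) pair."""
    return all(abs(joint[(x, y)] - f_X[x] * f_Y[y]) <= tol
               for x, y in joint)

print(is_independent(joint, f_X, f_Y))  # True
```

For a dependent pair, at least one joint probability would differ from the product of its marginals, and the same check would return False.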

## Conditional Distribution

Let X and Y be two **discrete** random variables. The conditional distribution of X given that we observe Y = y is expressed as

P(X = x| Y = y) = P(X = x, Y = y)/P(Y = y).

The conditional probability mass function is f_{X|Y}(x|y) = f_{X,Y}(x, y)/f_{Y}(y), provided that f_{Y}(y) > 0.
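As a sketch, the conditional pmf of X given Y = y can be computed by dividing each joint probability by the marginal f_{Y}(y). The joint values and the helper name `conditional_pmf` are assumptions for illustration, matching the independent example above.

```python
# Joint pmf of the earlier example (product of the marginals).
joint = {(0, 0): 1/9, (0, 1): 2/9, (1, 0): 2/9, (1, 1): 4/9}

def conditional_pmf(joint, y):
    """Return the pmf of X given Y = y, requiring f_Y(y) > 0."""
    # Marginal f_Y(y): sum the joint pmf over all x values.
    f_Y_y = sum(p for (x, yy), p in joint.items() if yy == y)
    if f_Y_y == 0:
        raise ValueError("f_Y(y) must be positive")
    return {x: p / f_Y_y for (x, yy), p in joint.items() if yy == y}

print(conditional_pmf(joint, 0))
```

Because X and Y are independent here, conditioning on Y = 0 returns the marginal of X: P(X = 0 | Y = 0) = 1/3 and P(X = 1 | Y = 0) = 2/3.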

For continuous random variables, the conditional probability density function is f_{X|Y}(x|y) = f_{X,Y}(x, y)/f_{Y}(y), assuming that f_{Y}(y) > 0.
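The continuous case can be illustrated with an assumed joint density, f(x, y) = x + y on the unit square, which is not from the original text. Its marginal is f_{Y}(y) = 1/2 + y, so the conditional density is f_{X|Y}(x|y) = (x + y)/(1/2 + y); the code checks numerically that it integrates to 1 over x.

```python
def f_joint(x, y):
    # Assumed joint density on [0, 1] x [0, 1]; integrates to 1.
    return x + y

def f_Y(y):
    # Marginal of Y: integral of (x + y) over x in [0, 1].
    return 0.5 + y

def f_X_given_Y(x, y):
    # Conditional density f(x, y) / f_Y(y), requiring f_Y(y) > 0.
    return f_joint(x, y) / f_Y(y)

# Midpoint-rule check that the conditional density integrates to 1.
n = 100_000
y = 0.3
total = sum(f_X_given_Y((i + 0.5) / n, y) for i in range(n)) / n
print(round(total, 6))  # ≈ 1.0
```

The integral check is the continuous analogue of a conditional pmf summing to 1 over x for a fixed y.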