Independence of random variables

• Definition

Random variables X and Y are independent if their joint distribution function factors into the product of their marginal distribution functions, i.e. for all x, y

F_{X,Y}(x, y) = F_X(x) F_Y(y)

• Theorem

Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if, given any densities f_X for X and f_Y for Y, their product is a joint density for the pair (X, Y), i.e.

f_{X,Y}(x, y) = f_X(x) f_Y(y)

Proof:

• If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent. A numerical illustration is sketched below.
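As a quick numerical sketch (not part of the original notes; the distributions below are arbitrary choices for illustration): for independent samples, the empirical joint CDF should match the product of the empirical marginal CDFs, and the same factorization should persist after applying functions g and h.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Arbitrary illustrative choices: independent X ~ N(0,1), Y ~ Exp(1).
x = rng.normal(size=n)
y = rng.exponential(size=n)

# Check F_{X,Y}(a, b) vs F_X(a) F_Y(b) at one point (a, b).
a, b = 0.5, 1.0
joint = np.mean((x <= a) & (y <= b))
product = np.mean(x <= a) * np.mean(y <= b)
print(joint, product)  # should agree up to Monte Carlo error

# The factorization persists for Z = g(X), W = h(Y), e.g. g(x) = x^2, h(y) = sqrt(y).
z, w = x**2, np.sqrt(y)
print(np.mean((z <= 0.25) & (w <= 1.0)),
      np.mean(z <= 0.25) * np.mean(w <= 1.0))
```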

Example

• Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is

p_{X,Y}(x, y) = \frac{1}{x! \, y!} \lambda^x \mu^y e^{-(\lambda + \mu)}, \qquad x, y = 0, 1, 2, \ldots

• Are X and Y independent? What are their marginal distributions?

• Factorization is enough to establish independence, but we need to be careful with the constant terms in order for the factors to be genuine marginal probability functions; see the sketch below.
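As a worked sketch (assuming the joint probability function above, reconstructed with parameters λ, μ > 0): the constant e^{-(λ+μ)} can be split so that each factor is itself a Poisson probability function,

p_{X,Y}(x, y) = \left( \frac{\lambda^x e^{-\lambda}}{x!} \right) \left( \frac{\mu^y e^{-\mu}}{y!} \right)

Each factor sums to 1 over its own variable, so the marginals are X ~ Poisson(λ) and Y ~ Poisson(μ), and X, Y are independent.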

Example and Important Comment

• The joint density for X, Y is given by

f_{X,Y}(x, y) = \begin{cases} 4(x + y^2) & x, y > 0, \; x + y \le 1 \\ 0 & \text{otherwise} \end{cases}

• Are X, Y independent?

• Independence requires that the set of points where the joint density is positive be the Cartesian product of the sets where the marginal densities are positive, i.e. the set of points where f_{X,Y}(x, y) > 0 must be a (possibly infinite) rectangle.
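A sketch of the argument for this example (assuming the density as reconstructed above): the marginal of X is

f_X(x) = \int_0^{1-x} 4(x + y^2) \, dy = 4x(1 - x) + \frac{4}{3}(1 - x)^3, \qquad 0 < x < 1,

and similarly f_Y(y) > 0 for 0 < y < 1. Hence f_X(x) f_Y(y) > 0 on the whole unit square, while f_{X,Y}(x, y) = 0 whenever x + y > 1, so the product cannot equal the joint density and X, Y are not independent. The support here is a triangle, not a rectangle.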

Conditional densities

• If X, Y are jointly distributed continuous random variables, the conditional density function of Y given X is defined to be

f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}

if f_X(x) > 0, and 0 otherwise.

• If X, Y are independent, then f_{Y|X}(y \mid x) = f_Y(y).

• Also,

f_{X,Y}(x, y) = f_{Y|X}(y \mid x) f_X(x)

Integrating both sides over x we get

f_Y(y) = \int_{-\infty}^{\infty} f_{Y|X}(y \mid x) f_X(x) \, dx

• This is a useful continuous analogue of the law of total probability; a numerical sketch is given below.
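As a minimal numerical sketch (not from the notes; the distributions are arbitrary assumptions for illustration): if X ~ N(0, 1) and Y | X = x ~ N(x, 1), then integrating f_{Y|X}(y | x) f_X(x) over x should recover the marginal density of Y, which here is N(0, 2) since Y = X + Z with Z ~ N(0, 1) independent.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Assumed setup for illustration: X ~ N(0,1), Y | X = x ~ N(x,1).

def f_Y(y):
    # f_Y(y) = integral of f_{Y|X}(y|x) f_X(x) over x
    integrand = lambda x: stats.norm.pdf(y, loc=x, scale=1.0) * stats.norm.pdf(x)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

for y in (-1.0, 0.0, 2.5):
    # Compare the integral with the exact N(0, 2) density.
    print(y, f_Y(y), stats.norm.pdf(y, scale=np.sqrt(2.0)))
```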

Example

• Consider the joint density

f_{X,Y}(x, y) = \begin{cases} \lambda^2 e^{-\lambda y} & 0 \le x \le y \\ 0 & \text{otherwise} \end{cases}

• Find the conditional density of X given Y and the conditional density of Y given X.
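A worked sketch (assuming the density as reconstructed above, with rate parameter λ > 0): the marginals are

f_X(x) = \int_x^{\infty} \lambda^2 e^{-\lambda y} \, dy = \lambda e^{-\lambda x}, \quad x \ge 0, \qquad f_Y(y) = \int_0^{y} \lambda^2 e^{-\lambda y} \, dx = \lambda^2 y e^{-\lambda y}, \quad y \ge 0,

so

f_{X|Y}(x \mid y) = \frac{\lambda^2 e^{-\lambda y}}{\lambda^2 y e^{-\lambda y}} = \frac{1}{y}, \quad 0 \le x \le y, \qquad f_{Y|X}(y \mid x) = \frac{\lambda^2 e^{-\lambda y}}{\lambda e^{-\lambda x}} = \lambda e^{-\lambda (y - x)}, \quad y \ge x.

That is, given Y = y, X is uniform on [0, y]; given X = x, Y − x is exponential with rate λ.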

Properties of Expectations Involving Joint Distributions

• For random variables X, Y and constants a, b ∈ ℝ,

E(aX + bY) = aE(X) + bE(Y)

Proof:

• For independent random variables X, Y,

E(XY) = E(X)E(Y)

whenever these expectations exist.

Proof:
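A quick Monte Carlo sanity check of both properties (not part of the notes; the distributions below are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Arbitrary illustrative choices: independent X ~ Exp(1), Y ~ Uniform(0,1).
x = rng.exponential(1.0, n)
y = rng.uniform(0.0, 1.0, n)
a, b = 2.0, -3.0

# Linearity of expectation (holds with or without independence).
print((a * x + b * y).mean(), a * x.mean() + b * y.mean())

# Product rule, valid here because X and Y are independent.
print((x * y).mean(), x.mean() * y.mean())
```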

Covariance

• Recall: Var(X + Y) = Var(X) + Var(Y) + 2E[(X − E(X))(Y − E(Y))]
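A one-line sketch of where this identity comes from (a standard expansion, not spelled out in the notes): with \tilde{X} = X − E(X) and \tilde{Y} = Y − E(Y),

Var(X + Y) = E[(\tilde{X} + \tilde{Y})^2] = E[\tilde{X}^2] + E[\tilde{Y}^2] + 2E[\tilde{X}\tilde{Y}],

and the first two terms are Var(X) and Var(Y).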

• Definition: For random variables X, Y with E(X), E(Y) < ∞, the covariance of X and Y is

Cov(X, Y) = E[(X − E(X))(Y − E(Y))]

• Covariance measures whether X − E(X) and Y − E(Y) tend to have the same sign.

• Claim:

Cov(X, Y) = E(XY) − E(X)E(Y)

Proof:
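A sketch of the expansion behind the claim, using linearity of expectation and the fact that E(X), E(Y) are constants:

Cov(X, Y) = E[XY − X E(Y) − Y E(X) + E(X)E(Y)] = E(XY) − E(X)E(Y) − E(X)E(Y) + E(X)E(Y) = E(XY) − E(X)E(Y)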

• Note: If X, Y are independent, then E(XY) = E(X)E(Y), and so Cov(X, Y) = 0.

Example

• Suppose X, Y are discrete random variables with probability function given by

             y = −1   y = 0   y = 1  |  pX(x)
    x = −1    1/8      1/8     1/8   |
    x =  0    1/8       0      1/8   |
    x =  1    1/8      1/8     1/8   |
    pY(y)

• Find Cov(X, Y). Are X, Y independent? A numerical check is sketched below.
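A quick numerical check of this example (not part of the notes), computing the covariance and testing the factorization directly from the table:

```python
import numpy as np

# Joint pmf from the table: rows are x = -1, 0, 1; columns are y = -1, 0, 1.
p = np.array([[1/8, 1/8, 1/8],
              [1/8, 0.0, 1/8],
              [1/8, 1/8, 1/8]])
vals = np.array([-1.0, 0.0, 1.0])

px = p.sum(axis=1)               # marginal pmf of X
py = p.sum(axis=0)               # marginal pmf of Y
ex, ey = vals @ px, vals @ py    # E(X), E(Y)
exy = vals @ p @ vals            # E(XY) = sum over x, y of x*y*p(x, y)

print("Cov(X,Y) =", exy - ex * ey)                        # 0.0
print("independent?", np.allclose(p, np.outer(px, py)))   # False: p(0,0)=0 but pX(0)pY(0)>0
```

So Cov(X, Y) = 0 even though X and Y are not independent: zero covariance does not imply independence.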
