Lecture 17: Joint Distributions

Statistics 104

Colin Rundel

March 26, 2012

Section 5.1 Joint Distributions of Discrete RVs

Joint Distribution - Example

Draw two socks at random, without replacement, from a drawer full of twelve colored socks: 6 black, 4 white, 2 purple.

Let B be the number of black socks and W the number of white socks drawn. Then the distributions of B and W are given by:

k           0                        1                         2
P(B = k)    (6/12)(5/11) = 15/66     2 (6/12)(6/11) = 36/66    (6/12)(5/11) = 15/66
P(W = k)    (8/12)(7/11) = 28/66     2 (4/12)(8/11) = 32/66    (4/12)(3/11) = 6/66

Note that B \sim HyperGeo(12, 6, 2), with P(B = k) = \binom{6}{k}\binom{6}{2-k} \big/ \binom{12}{2}, and W \sim HyperGeo(12, 4, 2), with P(W = k) = \binom{4}{k}\binom{8}{2-k} \big/ \binom{12}{2}.

Joint Distribution - Example, cont.

The joint distribution of B and W is given by:

            W = 0     W = 1     W = 2     P(B = b)
B = 0       1/66      8/66      6/66      15/66
B = 1       12/66     24/66     0         36/66
B = 2       15/66     0         0         15/66
P(W = w)    28/66     32/66     6/66      66/66

Written out as a pmf,

P(B = b, W = w) =
  1/66    if b = 0, w = 0
  8/66    if b = 0, w = 1
  6/66    if b = 0, w = 2
  12/66   if b = 1, w = 0
  24/66   if b = 1, w = 1
  0/66    if b = 1, w = 2
  15/66   if b = 2, w = 0
  0/66    if b = 2, w = 1
  0/66    if b = 2, w = 2

or, more compactly,

P(B = b, W = w) = \binom{6}{b}\binom{4}{w}\binom{2}{2-b-w} \Big/ \binom{12}{2}, for 0 \le b, w \le 2 and b + w \le 2.
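As a quick sanity check on the table above, the sketch below tabulates the same joint pmf from the counting formula and recovers the marginals as row and column sums. It is only a sketch: it assumes Python 3.8+ for math.comb, and the helper name joint_pmf is mine.

    from math import comb
    from fractions import Fraction

    # P(B = b, W = w) = C(6, b) C(4, w) C(2, 2 - b - w) / C(12, 2)
    def joint_pmf(b, w):
        if b < 0 or w < 0 or b + w > 2:
            return Fraction(0)
        return Fraction(comb(6, b) * comb(4, w) * comb(2, 2 - b - w), comb(12, 2))

    table = {(b, w): joint_pmf(b, w) for b in range(3) for w in range(3)}
    print(table[(0, 0)], table[(1, 0)], table[(1, 1)])  # 1/66 2/11 4/11  (= 1/66, 12/66, 24/66)

    # row sums give the marginal of B, column sums the marginal of W
    pB = [sum(table[(b, w)] for w in range(3)) for b in range(3)]
    pW = [sum(table[(b, w)] for b in range(3)) for w in range(3)]
    print([str(p) for p in pB])  # ['5/22', '6/11', '5/22']   = 15/66, 36/66, 15/66
    print([str(p) for p in pW])  # ['14/33', '16/33', '1/11'] = 28/66, 32/66, 6/66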

Marginal Distributions

Note that the row and column sums of the joint table are the distributions of B and W respectively:

P(B = b) = P(B = b, W = 0) + P(B = b, W = 1) + P(B = b, W = 2)
P(W = w) = P(B = 0, W = w) + P(B = 1, W = w) + P(B = 2, W = w)

These are the marginal distributions of B and W. In general,

P(X = x) = \sum_y P(X = x, Y = y) = \sum_y P(X = x \mid Y = y) P(Y = y)

Conditional Distribution

Conditional distributions are defined as we have seen previously:

P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{\text{joint pmf}}{\text{marginal pmf}}

Therefore the pmf for white socks given that no black socks were drawn is

P(W = w \mid B = 0) = \frac{P(W = w, B = 0)}{P(B = 0)} =
  (1/66) / (15/66) = 1/15   if W = 0
  (8/66) / (15/66) = 8/15   if W = 1
  (6/66) / (15/66) = 6/15   if W = 2
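Mechanically, this is just renormalizing the B = 0 row of the joint table. A minimal sketch (the dictionary names are mine):

    from fractions import Fraction

    # joint pmf values along the B = 0 row of the table
    row_B0 = {0: Fraction(1, 66), 1: Fraction(8, 66), 2: Fraction(6, 66)}
    p_B0 = sum(row_B0.values())                       # 15/66, the marginal P(B = 0)

    cond = {w: p / p_B0 for w, p in row_B0.items()}   # P(W = w | B = 0)
    print(cond)  # {0: Fraction(1, 15), 1: Fraction(8, 15), 2: Fraction(2, 5)}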

Section 5.1 Joint Distributions of Continuous RVs

Joint CDF

F(x, y) = P[X \le x, Y \le y] = P[(X, Y) \text{ lies south-west of the point } (x, y)]

[Figure: a point (x, y) in the (X, Y) plane, with the region to its south-west shaded.]

Joint CDF, cont.

The joint cumulative distribution function follows the same rules as the univariate CDF.

Univariate definition:

F(x) = P(X \le x) = \int_{-\infty}^{x} f(z) \, dz

\lim_{x \to -\infty} F(x) = 0

\lim_{x \to \infty} F(x) = 1

x \le y \implies F(x) \le F(y)

Bivariate definition:

F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(x, y) \, dx \, dy

\lim_{x, y \to -\infty} F(x, y) = 0

\lim_{x, y \to \infty} F(x, y) = 1

x \le x', y \le y' \implies F(x, y) \le F(x', y')

Marginal Distributions

We can define marginal distributions based on the CDF by setting one of the arguments to infinity:

F(x, \infty) = P(X \le x, Y \le \infty) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f(x, y) \, dy \, dx = P(X \le x) = F_X(x)

F(\infty, y) = P(X \le \infty, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} f(x, y) \, dx \, dy = P(Y \le y) = F_Y(y)

Joint pdf

Similar to the CDF, the probability density function follows the same general rules, except in two dimensions.

Univariate definition:

f(x) \ge 0 for all x

f(x) = \frac{d}{dx} F(x)

\int_{-\infty}^{\infty} f(x) \, dx = 1

Bivariate definition:

f(x, y) \ge 0 for all (x, y)

f(x, y) = \frac{\partial^2}{\partial x \, \partial y} F(x, y)

\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y) \, dx \, dy = 1

Probability and Expectation

Univariate definition:

P(X \in A) = \int_A f(x) \, dx

E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) \, dx

Bivariate definition:

P(X \in A, Y \in B) = \int_B \int_A f(x, y) \, dx \, dy

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y) \, dx \, dy

Marginal pdfs

Marginal probability density functions are defined in terms of "integrating out" one of the random variables:

f_X(x) = \int_{-\infty}^{\infty} f(x, y) \, dy

f_Y(y) = \int_{-\infty}^{\infty} f(x, y) \, dx

Previously we connected independence to expectations: if X and Y are independent then E(XY) = E(X)E(Y). In the joint setting, X and Y are independent if and only if the joint density factors, f(x, y) = f_X(x) f_Y(y).

Example 1 - Joint Uniforms

Let X, Y \sim Unif(0, 1). It is straightforward to see graphically that

F(x, y) =
  0    if x < 0 or y < 0
  xy   if 0 < x < 1 and 0 < y < 1
  x    if 0 < x < 1 and y > 1
  y    if x > 1 and 0 < y < 1
  1    if x > 1 and y > 1

Example 1, cont.

Based on the CDF we can calculate the pdf by taking the 2nd partial derivative with respect to x and y:

f(x, y) = \frac{\partial^2}{\partial x \, \partial y} F(x, y) =
  0   if x < 0 or y < 0
  1   if 0 < x < 1 and 0 < y < 1
  0   if 0 < x < 1 and y > 1
  0   if x > 1 and 0 < y < 1
  0   if x > 1 and y > 1

which is simply 1 if 0 < x < 1 and 0 < y < 1, and 0 otherwise.

Example 1, cont.

Based on the pdf we can calculate the marginal densities. Starting from

f(x, y) = 1 if 0 < x < 1 and 0 < y < 1, and 0 otherwise,

f_X(x) = \int_{-\infty}^{\infty} f(x, y) \, dy = \int_0^1 1 \, dy = 1 for 0 < x < 1, and 0 otherwise.

By the same argument, f_Y(y) = 1 for 0 < y < 1, and 0 otherwise.

Example 1, cont.

Expectation is also straightforward:

E(X) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y) \, dx \, dy
     = \int_0^1 \int_0^1 x \, dx \, dy
     = \int_0^1 \left[ \frac{x^2}{2} \right]_0^1 dy
     = \int_0^1 \frac{1}{2} \, dy = \left[ \frac{y}{2} \right]_0^1 = 1/2

E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y) \, dx \, dy
     = \int_0^1 \int_0^1 y \, dx \, dy
     = \int_0^1 \left[ xy \right]_{x=0}^{1} dy
     = \int_0^1 y \, dy = \left[ \frac{y^2}{2} \right]_0^1 = 1/2

Which should not be surprising...

Example 1, cont.

E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy f(x, y) \, dx \, dy
      = \int_0^1 \int_0^1 xy \, dx \, dy
      = \int_0^1 \left[ \frac{x^2 y}{2} \right]_{x=0}^{1} dy
      = \int_0^1 \frac{y}{2} \, dy = \left[ \frac{y^2}{4} \right]_0^1 = 1/4

Note that E(XY) = E(X)E(Y); what does this tell us about X and Y?
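A quick Monte Carlo check of these three expectations, as a sketch. It assumes we may simulate the pair by drawing X and Y as independent Unif(0, 1) values, which is what the factorized density f(x, y) = f_X(x) f_Y(y) = 1 describes.

    import random

    random.seed(1)
    n = 1_000_000
    xs = [random.random() for _ in range(n)]
    ys = [random.random() for _ in range(n)]

    ex = sum(xs) / n                                # ~ 0.5
    ey = sum(ys) / n                                # ~ 0.5
    exy = sum(x * y for x, y in zip(xs, ys)) / n    # ~ 0.25 = E(X) E(Y)
    print(ex, ey, exy)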

Example 1, another way

If we did not feel comfortable coming up with the graphical argument for F(x, y), we could also use the fact that the pdf is constant on (0, 1) \times (0, 1) to derive the same distribution / density. Let f(x, y) = c, then

1 = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y) \, dx \, dy
  = \int_0^1 \int_0^1 c \, dx \, dy
  = \int_0^1 \left[ cx \right]_0^1 dy = \int_0^1 c \, dy
  = \left[ cy \right]_0^1 = c

so c = 1.

Example 2

Let X and Y be drawn uniformly from the triangle below.

[Figure: the triangular region with vertices (0, 0), (3, 0), and (0, 3) in the (X, Y) plane.]

Find the joint pdf, cdf, and marginals.

Example 2, cont.

Since the joint density is constant, and the triangle has area 9/2,

f(x, y) = c = 2/9, for x \ge 0, y \ge 0, x + y \le 3.

We do need to be careful about how the range is written down. Because the supports of X and Y depend on each other, the range of X can be given in terms of Y, or the range of Y in terms of X:

f(x, y) = 2/9 if 0 < y < 3 and 0 < x < 3 - y, and 0 otherwise
        = 2/9 if 0 < x < 3 and 0 < y < 3 - x, and 0 otherwise

Example 2, cont.

Depending on which marginal we want, one of the two range definitions makes the calculation easier:

f_X(x) = \int_{-\infty}^{\infty} f(x, y) \, dy = \int_0^{3-x} \frac{2}{9} \, dy = \frac{2}{9}(3 - x), for 0 < x < 3

f_Y(y) = \int_{-\infty}^{\infty} f(x, y) \, dx = \int_0^{3-y} \frac{2}{9} \, dx = \frac{2}{9}(3 - y), for 0 < y < 3
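These marginals are easy to double-check numerically by integrating the constant joint density over y. A sketch, assuming scipy is available (the variable names are mine):

    from scipy.integrate import quad

    f = lambda x, y: 2 / 9 if (x >= 0 and y >= 0 and x + y <= 3) else 0.0

    # integrate out y at a few x values and compare to 2/9 * (3 - x)
    for x in (0.5, 1.5, 2.5):
        fx_num, _ = quad(lambda y: f(x, y), 0, 3, points=[3 - x])
        print(x, fx_num, 2 / 9 * (3 - x))       # the last two columns should agree

    # f_X should also integrate to 1 over (0, 3)
    total, _ = quad(lambda x: 2 / 9 * (3 - x), 0, 3)
    print(total)                                # ~ 1.0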

Example 2, cont.

Finding the CDF with calculus is hard in this case; it is still a pain with a graphical approach, but easier:

F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(x, y) \, dx \, dy =
  0                                                                       if x < 0 or y < 0
  \frac{2}{9} xy                                                          if 0 < x < 3, 0 < y < 3, x + y < 3
  \frac{2}{9} \left[ xy - \frac{(y - (3 - x))(x - (3 - y))}{2} \right]    if 0 < x < 3, 0 < y < 3, 3 < x + y < 6
  \frac{2}{9} \left( 3x - \frac{x^2}{2} \right)                           if 0 < x < 3, y > 3
  \frac{2}{9} \left( 3y - \frac{y^2}{2} \right)                           if x > 3, 0 < y < 3
  1                                                                       if x > 3 and y > 3
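Because the middle cases are easy to get wrong, here is a sketch that spot-checks the piecewise CDF against a direct two-dimensional numerical integral of the density (it assumes scipy; the function names are mine):

    from scipy.integrate import dblquad

    density = lambda y, x: 2 / 9 if (x >= 0 and y >= 0 and x + y <= 3) else 0.0

    def F(x, y):
        # piecewise CDF from above, for 0 < x < 3 and 0 < y < 3
        if x + y <= 3:
            return 2 / 9 * x * y
        return 2 / 9 * (x * y - (y - (3 - x)) * (x - (3 - y)) / 2)

    for x, y in [(1.0, 1.5), (2.0, 2.5), (2.9, 2.9)]:
        num, _ = dblquad(density, 0, x, 0, y)   # integral of f over [0, x] x [0, y]
        print(num, F(x, y))                     # the two numbers should agree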

Example 3

Let f(x, y) = c x^2 y for x^2 \le y \le 1.

Find: a) c, b) P(X \ge Y), c) f_X(x) and f_Y(y)

Example 3 - Range

[Figure: the support of f(x, y), the region between the parabola y = x^2 and the line y = 1 for -1 < x < 1.]

Example 3.a

Since f(x, y) = c x^2 y for x^2 \le y \le 1, we can rewrite the bounds of the support as 0 \le y \le 1, -\sqrt{y} \le x \le \sqrt{y}. Then

1 = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y) \, dx \, dy
  = \int_0^1 \int_{-\sqrt{y}}^{\sqrt{y}} c x^2 y \, dx \, dy
  = \int_0^1 c y \left[ \frac{x^3}{3} \right]_{x = -\sqrt{y}}^{\sqrt{y}} dy
  = \int_0^1 \left( \frac{c y^{5/2}}{3} + \frac{c y^{5/2}}{3} \right) dy
  = \left[ \frac{4 c y^{7/2}}{21} \right]_{y=0}^{1}
  = \frac{4c}{21}

so c = 21/4.
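A numerical check of the normalizing constant, as a sketch (it assumes scipy and numpy; dblquad here integrates x out first, with the y-dependent limits at plus and minus the square root of y):

    import numpy as np
    from scipy.integrate import dblquad

    # integral of x^2 * y over { 0 <= y <= 1, -sqrt(y) <= x <= sqrt(y) }
    val, _ = dblquad(lambda x, y: x**2 * y, 0, 1,
                     lambda y: -np.sqrt(y), lambda y: np.sqrt(y))
    print(val, 4 / 21)     # both ~ 0.190476
    print(21 / 4 * val)    # ~ 1.0, so c = 21/4 makes the density integrate to 1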

Example 3 - pdf

[Figure: perspective plot of the joint density f(x, y) = (21/4) x^2 y on its support, and 0 elsewhere.]

Example 3.b

We need to integrate over the region where x^2 \le y \le 1 and x \ge y, which is indicated in red below. Note that y \ge x^2 \ge 0 on the support, so the event X \ge Y can only occur for 0 < x < 1, where it corresponds to the sliver x^2 \le y \le x.

[Figure: the support of f(x, y) (left) and the region of integration shaded in red (right).]

Example 3.b, cont.

P(X \ge Y) = \int_0^1 \int_{x^2}^{x} \frac{21}{4} x^2 y \, dy \, dx
           = \frac{21}{4} \int_0^1 \left[ \frac{x^2 y^2}{2} \right]_{y = x^2}^{x} dx
           = \frac{21}{4} \int_0^1 \left( \frac{x^4}{2} - \frac{x^6}{2} \right) dx
           = \frac{21}{8} \left[ \frac{x^5}{5} - \frac{x^7}{7} \right]_0^1
           = \frac{21}{8} \left( \frac{1}{5} - \frac{1}{7} \right) = \frac{21}{8} \cdot \frac{2}{35} = \frac{3}{20} = 0.15
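Since the integration region is the part that is easy to mis-read, here is a numerical cross-check of part (b), as a sketch (it assumes scipy, and that the event of interest is X >= Y):

    from scipy.integrate import dblquad

    # P(X >= Y): integrate (21/4) x^2 y over { 0 < x < 1, x^2 <= y <= x }
    p, _ = dblquad(lambda y, x: 21 / 4 * x**2 * y, 0, 1,
                   lambda x: x**2, lambda x: x)
    print(p)   # ~ 0.15 = 3/20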

Example 3.c

f_X(x) = \int_{x^2}^{1} \frac{21}{4} x^2 y \, dy
       = \frac{21}{4} \left[ \frac{x^2 y^2}{2} \right]_{y = x^2}^{1}
       = \frac{21}{8} \left( x^2 - x^6 \right), for -1 < x < 1

f_Y(y) = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{21}{4} x^2 y \, dx
       = \frac{21}{4} \left[ \frac{x^3 y}{3} \right]_{x = -\sqrt{y}}^{\sqrt{y}}
       = \frac{21}{4} \cdot \frac{2 y^{5/2}}{3}
       = \frac{7}{2} y^{5/2}, for 0 < y < 1

Example 3.c, cont.

It is always a good idea to check that the marginals are proper densities:

\int_{-1}^{1} f_X(x) \, dx = \int_{-1}^{1} \frac{21}{8} \left( x^2 - x^6 \right) dx
                           = \frac{21}{8} \left[ \frac{x^3}{3} - \frac{x^7}{7} \right]_{-1}^{1}
                           = \frac{21}{4} \left( \frac{1}{3} - \frac{1}{7} \right) = 1

\int_0^1 f_Y(y) \, dy = \int_0^1 \frac{7}{2} y^{5/2} \, dy
                      = \frac{7}{2} \cdot \frac{2}{7} \left[ y^{7/2} \right]_0^1 = 1
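The same check can be done numerically, as a sketch (assuming scipy):

    from scipy.integrate import quad

    fX = lambda x: 21 / 8 * (x**2 - x**6)
    fY = lambda y: 7 / 2 * y**2.5

    print(quad(fX, -1, 1)[0])   # ~ 1.0
    print(quad(fY, 0, 1)[0])    # ~ 1.0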

Example 4

Let Y be the rate of calls at a help desk, and X the number of calls between 2 pm and 4 pm one day. Let's say that

f(x, y) = \frac{(2y)^x e^{-3y}}{x!}, for y > 0 and x = 0, 1, 2, \ldots

Find: a) P(X = 0), b) P(Y > 2), c) P(X = x) for all x

Example 4.a

f(x, y) = \frac{(2y)^x e^{-3y}}{x!}, for y > 0 and x = 0, 1, 2, \ldots

P(X = 0) = \int_0^{\infty} f(0, y) \, dy
         = \int_0^{\infty} e^{-3y} \, dy
         = \left[ -\frac{1}{3} e^{-3y} \right]_0^{\infty}
         = 1/3
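A quick numerical confirmation of part (a), as a sketch (assuming scipy and numpy; quad handles the infinite upper limit directly):

    import numpy as np
    from scipy.integrate import quad

    # P(X = 0) = integral of f(0, y) = e^{-3y} over y > 0
    p0, _ = quad(lambda y: np.exp(-3 * y), 0, np.inf)
    print(p0)   # ~ 0.3333 = 1/3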

Example 4.b

f(x, y) = \frac{(2y)^x e^{-3y}}{x!}, for y > 0 and x = 0, 1, 2, \ldots

P(Y > 2) = \int_2^{\infty} \sum_{x=0}^{\infty} f(x, y) \, dy
         = \int_2^{\infty} \sum_{x=0}^{\infty} \frac{(2y)^x e^{-3y}}{x!} \, dy
         = \int_2^{\infty} e^{-3y} \sum_{x=0}^{\infty} \frac{(2y)^x}{x!} \, dy
         = \int_2^{\infty} e^{-3y} \left( 1 + \frac{2y}{1!} + \frac{(2y)^2}{2!} + \cdots \right) dy
         = \int_2^{\infty} e^{-3y} e^{2y} \, dy = \int_2^{\infty} e^{-y} \, dy
         = \left[ -e^{-y} \right]_2^{\infty} = e^{-2} \approx 0.13534
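The same answer can be checked numerically by truncating the sum over x before integrating, as a sketch (assuming scipy and numpy; the helper name marginal_y and the truncation at 80 terms are mine, and the integration is cut off at y = 30 since the integrand is negligible beyond that):

    import numpy as np
    from math import factorial
    from scipy.integrate import quad

    def marginal_y(y, terms=80):
        # sum over x of (2y)^x e^{-3y} / x!  (truncated series; the exact sum is e^{-y})
        return sum((2 * y) ** x * np.exp(-3 * y) / factorial(x) for x in range(terms))

    p, _ = quad(marginal_y, 2, 30)
    print(p, np.exp(-2))   # both ~ 0.1353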
