Chapter 5. Multivariate Probability Distributions

5.1 Introduction

5.2 Bivariate and Multivariate probability distributions

5.3 Marginal and Conditional probability distributions

5.4 Independent random variables

5.5 The expected value of a function of random variables

5.6 Special theorems

5.7 The Covariance of two random variables

5.8 The Moments of linear combinations of random variables

5.9 The Multinomial probability distribution

5.10 The Bivariate normal distribution

5.11 Conditional expectations


5.1 Introduction

Suppose that Y1, Y2, . . . , Yn denote the outcomes of n successive trials of an experiment. A specific set of outcomes, or sample measurements, may be expressed in terms of the intersection of n events

(Y1 = y1), (Y2 = y2), . . . , (Yn = yn),

which we will denote as

(Y1 = y1, Y2 = y2, . . . , Yn = yn)

or, more compactly, as

(y1, y2, . . . , yn).

Calculation of the probability of this intersection is essential in making inferences about the population from which the sample was drawn, and it is a major reason for studying multivariate probability distributions.


5.2 Bivariate and Multivariate probability distributions

Many random variables can be defined over the same sample space.

(Example) Tossing a pair of dice. The sample space contains 36 sample points. Let Y1 be the number of dots appearing on die 1, and let Y2 be the sum of the dots on the two dice. We would like to obtain the probability of (Y1 = y1, Y2 = y2) for all possible values of y1 and y2; that is, the joint distribution of Y1 and Y2, which the sketch below tabulates.
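To make the example concrete, here is a minimal Python sketch (my illustration, not part of the original notes; the name pmf is mine) that tabulates the joint distribution by enumerating the 36 equally likely sample points:

    from fractions import Fraction
    from collections import defaultdict

    # Y1 = number of dots on die 1, Y2 = sum of the dots on both dice.
    # Each of the 36 (die1, die2) outcomes is equally likely.
    pmf = defaultdict(Fraction)
    for die1 in range(1, 7):
        for die2 in range(1, 7):
            pmf[(die1, die1 + die2)] += Fraction(1, 36)

    print(pmf[(1, 2)])   # 1/36: only the outcome (die1, die2) = (1, 1)
    print(pmf[(3, 5)])   # 1/36: only the outcome (3, 2)
    print(pmf[(2, 9)])   # 0: impossible, since die 2 would have to show 7

Since each attainable pair (y1, y2) corresponds to exactly one (die1, die2) outcome, every nonzero value of this joint probability function equals 1/36.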

(Def 5.2) For any r.v. Y1 and Y2, the joint (bivariate) distribution function F(y1, y2) is given by

F(y1, y2) = P(Y1 ≤ y1, Y2 ≤ y2)

for −∞ < y1 < ∞ and −∞ < y2 < ∞.
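For instance, in the dice example above, F(2, 3) = P(Y1 ≤ 2, Y2 ≤ 3) = p(1, 2) + p(1, 3) + p(2, 3) = 3/36 = 1/12, since exactly three of the 36 equally likely outcomes, namely (1, 1), (1, 2), and (2, 1), satisfy both conditions.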


(Theorem 5.2) If Y1 and Y2 are r.v. with joint distribution function F(y1, y2), then

1. F(−∞, −∞) = F(−∞, y2) = F(y1, −∞) = 0.

2. F(∞, ∞) = 1.

3. If a1* ≥ a1 and b2* ≥ b2, then

F(a1*, b2*) − F(a1*, b2) − F(a1, b2*) + F(a1, b2) = P(a1 < Y1 ≤ a1*, b2 < Y2 ≤ b2*) ≥ 0.
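As a quick numerical check of property 3, here is a sketch of mine using the dice example (the function name F and the chosen endpoints are illustrative):

    from fractions import Fraction

    # Joint pmf of (Y1, Y2) for the dice example: Y1 = die 1, Y2 = sum.
    pmf = {(d1, d1 + d2): Fraction(1, 36)
           for d1 in range(1, 7) for d2 in range(1, 7)}

    def F(y1, y2):
        # Joint CDF: F(y1, y2) = P(Y1 <= y1, Y2 <= y2).
        return sum(p for (t1, t2), p in pmf.items() if t1 <= y1 and t2 <= y2)

    a1, a1s, b2, b2s = 2, 4, 5, 8        # a1* = 4 >= a1 = 2, b2* = 8 >= b2 = 5
    rect = F(a1s, b2s) - F(a1s, b2) - F(a1, b2s) + F(a1, b2)
    direct = sum(p for (t1, t2), p in pmf.items()
                 if a1 < t1 <= a1s and b2 < t2 <= b2s)
    print(rect, direct)                  # both 1/6, and rect >= 0 as claimed

The four-term alternating sum is exactly the probability mass in the half-open rectangle (a1, a1*] × (b2, b2*], which is why it can never be negative.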

(1) Discrete variables:

(Def 5.1) Let Y1 and Y2 be discrete r.v. The joint probability distribution for Y1 and Y2 is given by

p(y1, y2) = P(Y1 = y1, Y2 = y2)

for −∞ < y1 < ∞ and −∞ < y2 < ∞. The function p(y1, y2) will be referred to as the joint probability function.


Note that if Y1 and Y2 are discrete r.v. with joint probability function p(y1, y2), the corresponding CDF is

F(y1, y2) = P(Y1 ≤ y1, Y2 ≤ y2) = Σ_{t1 ≤ y1} Σ_{t2 ≤ y2} p(t1, t2).
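This double sum can be evaluated directly; a small sketch of mine, reusing the dice pmf from above:

    from fractions import Fraction

    # Dice example: Y1 = die 1, Y2 = sum of both dice.
    pmf = {(d1, d1 + d2): Fraction(1, 36)
           for d1 in range(1, 7) for d2 in range(1, 7)}

    def F(y1, y2):
        # CDF as the double sum over support points with t1 <= y1, t2 <= y2.
        return sum(pmf.get((t1, t2), Fraction(0))
                   for t1 in range(1, 7) for t2 in range(2, 13)
                   if t1 <= y1 and t2 <= y2)

    print(F(2, 3))    # 1/12, matching the worked example after Def 5.2
    print(F(6, 12))   # 1, since (6, 12) bounds the entire support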

(Theorem 5.1) If Y1 and Y2 are discrete r.v. with joint probability function p(y1, y2), then

1. p(y1, y2) ≥ 0 for all y1, y2.

2. Σ_{y1, y2} p(y1, y2) = 1, where the sum is over all values (y1, y2) that are assigned nonzero probabilities.

3. P[(y1, y2) ∈ A] = Σ_{(y1, y2) ∈ A} p(y1, y2) for A ⊆ S. So,

P(a1 ≤ Y1 ≤ a2, b1 ≤ Y2 ≤ b2) = Σ_{t1 = a1}^{a2} Σ_{t2 = b1}^{b2} p(t1, t2).
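Continuing the dice example, all three properties are easy to verify numerically (again a sketch of mine; the rectangle endpoints are arbitrary):

    from fractions import Fraction

    # Dice example: Y1 = die 1, Y2 = sum of both dice.
    pmf = {(d1, d1 + d2): Fraction(1, 36)
           for d1 in range(1, 7) for d2 in range(1, 7)}

    # 1. Nonnegativity.
    assert all(p >= 0 for p in pmf.values())

    # 2. The probabilities over the support sum to 1.
    assert sum(pmf.values()) == 1

    # 3. A rectangle probability as a double sum: P(1 <= Y1 <= 3, 4 <= Y2 <= 6).
    a1, a2, b1, b2 = 1, 3, 4, 6
    prob = sum(pmf.get((t1, t2), Fraction(0))
               for t1 in range(a1, a2 + 1) for t2 in range(b1, b2 + 1))
    print(prob)   # 9/36 = 1/4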

