Chapter 2: Simple Linear Regression - Purdue University


1 The model

The simple linear regression model for n observations can be written as

    yi = β0 + β1 xi + ei,   i = 1, 2, …, n.   (1)

The designation *simple* indicates that there is only one predictor variable x, and *linear* means that the model is linear in β0 and β1. The intercept β0 and the slope β1 are unknown constants, and both are called regression coefficients; the ei's are random errors. For model (1), we have the following assumptions:

1. E(ei) = 0 for i = 1, 2, …, n, or, equivalently, E(yi) = β0 + β1 xi.

2. var(ei) = σ² for i = 1, 2, …, n, or, equivalently, var(yi) = σ².

3. cov(ei, ej) = 0 for all i ≠ j, or, equivalently, cov(yi, yj) = 0.
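As an illustration, model (1) and its three assumptions can be simulated directly. This is a sketch, not part of the notes: the parameter values β0 = 2.0, β1 = 0.5, σ = 1.0 and the uniform design for x are arbitrary choices for the example.

```python
import numpy as np

# Simulate model (1): yi = beta0 + beta1 * xi + ei, with errors that have
# mean 0, constant variance sigma^2, and no correlation across observations
# (independent draws satisfy assumption 3).
rng = np.random.default_rng(0)
n = 100
beta0, beta1, sigma = 2.0, 0.5, 1.0   # illustrative values, not from the notes

x = rng.uniform(0, 10, size=n)        # predictor values
e = rng.normal(0, sigma, size=n)      # errors: E(ei) = 0, var(ei) = sigma^2
y = beta0 + beta1 * x + e             # responses from model (1)
```

Under these assumptions, the sample mean of the errors should be near 0 and their sample variance near σ², in line with assumptions 1 and 2.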

2 Ordinary Least Squares Estimation

The method of least squares estimates β0 and β1 so that the sum of the squared differences between the observations yi and the straight line is minimized, i.e., it minimizes

    S(β0, β1) = Σ_{i=1}^{n} (yi − β0 − β1 xi)².
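The minimization of S(β0, β1) has a well-known closed form, obtained by setting the partial derivatives of S with respect to β0 and β1 to zero. As a sketch (the helper name `ols_fit` is illustrative, not from the notes):

```python
import numpy as np

def ols_fit(x, y):
    """Minimize S(b0, b1) = sum_i (y_i - b0 - b1*x_i)^2.

    Setting dS/db0 = 0 and dS/db1 = 0 gives the standard closed-form
    minimizer: b1 = Sxy / Sxx and b0 = ybar - b1 * xbar.
    """
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)          # Sxx: sum of squares of x about its mean
    sxy = np.sum((x - xbar) * (y - ybar))  # Sxy: cross-products of x and y
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

# Example: points that lie exactly on the line y = 2 + 0.5 x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 0.5 * x
b0, b1 = ols_fit(x, y)   # recovers b0 = 2.0, b1 = 0.5
```

Because the example points fall exactly on a line, S(β0, β1) attains its minimum value of zero at the fitted coefficients.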

[Figure 1: Equation of a straight line E(Y | X = x) = β0 + β1 x, plotted against Predictor = X, showing β0 as the intercept and β1 as the slope.]
