Chapter 2: Simple Linear Regression

1 The model

The simple linear regression model for $n$ observations can be written as

$$y_i = \beta_0 + \beta_1 x_i + e_i, \quad i = 1, 2, \ldots, n. \qquad (1)$$

The designation simple indicates that there is only one predictor variable $x$, and linear means that the model is linear in $\beta_0$ and $\beta_1$. The intercept $\beta_0$ and the slope $\beta_1$ are unknown constants, and both are called regression coefficients; the $e_i$'s are random errors. For model (1), we have the following assumptions:

1. $E(e_i) = 0$ for $i = 1, 2, \ldots, n$, or, equivalently, $E(y_i) = \beta_0 + \beta_1 x_i$.

2. $\mathrm{var}(e_i) = \sigma^2$ for $i = 1, 2, \ldots, n$, or, equivalently, $\mathrm{var}(y_i) = \sigma^2$.

3. $\mathrm{cov}(e_i, e_j) = 0$ for all $i \neq j$, or, equivalently, $\mathrm{cov}(y_i, y_j) = 0$.
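
As a minimal sketch, these assumptions can be illustrated by simulating data from model (1) with independent, mean-zero, constant-variance errors. The particular values of $\beta_0$, $\beta_1$, $\sigma$, and $n$ below are arbitrary choices for illustration, not values taken from the text.

```python
import numpy as np

# Illustrative simulation of model (1): y_i = beta0 + beta1 * x_i + e_i.
# Parameter values are arbitrary and chosen only for demonstration.
rng = np.random.default_rng(0)

n = 50                       # number of observations
beta0, beta1 = 2.0, 0.5      # true intercept and slope (unknown in practice)
sigma = 1.0                  # error standard deviation, so var(e_i) = sigma^2

x = rng.uniform(0, 10, size=n)      # predictor values
e = rng.normal(0, sigma, size=n)    # independent errors with E(e_i) = 0
y = beta0 + beta1 * x + e           # responses, so E(y_i) = beta0 + beta1 * x_i
```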

2 Ordinary Least Squares Estimation

The method of least squares estimates $\beta_0$ and $\beta_1$ so that the sum of squared differences between the observations $y_i$ and the straight line is a minimum, i.e., it minimizes

$$S(\beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2.$$
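
As a sketch of this minimization, the criterion $S(\beta_0, \beta_1)$ can be minimized numerically over the two coefficients; the simulated data, starting point, and cross-check below are illustrative assumptions, not part of the text.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data as in the previous sketch (values are arbitrary).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, size=50)

# The least squares criterion S(beta0, beta1) from the display above.
def S(beta):
    b0, b1 = beta
    return np.sum((y - b0 - b1 * x) ** 2)

# Numerically minimize S over (beta0, beta1); the starting point is arbitrary.
res = minimize(S, x0=[0.0, 0.0])
b0_hat, b1_hat = res.x
print(f"least squares estimates: intercept = {b0_hat:.3f}, slope = {b1_hat:.3f}")

# Cross-check: np.polyfit(x, y, 1) returns [slope, intercept] for the same line fit.
b1_check, b0_check = np.polyfit(x, y, deg=1)
```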

[Figure 1: Equation of a straight line $E(Y \mid X = x) = \beta_0 + \beta_1 x$, with intercept $\beta_0$, slope $\beta_1$, and the predictor $X$ on the horizontal axis.]
