Linear Regression - University of Pennsylvania

Linear Regression

These slides were assembled by Eric Eaton, with grateful acknowledgement of the many others who made their course materials freely available online. Feel free to reuse or adapt these slides for your own academic purposes, provided that you include proper attribution. Please send comments and corrections to Eric.

Robot Image Credit: Viktoriya Sukhanova

Regression

Given:

• Data X = {x^(1), ..., x^(n)} where x^(i) ∈ R^d
• Corresponding labels y = {y^(1), ..., y^(n)} where y^(i) ∈ R
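This notation maps directly onto array shapes: the n examples stack as rows of an n × d matrix, and the labels form a length-n vector. A minimal sketch with synthetic data (the values are illustrative, not from any dataset in the slides):

```python
import numpy as np

# n examples, each x^(i) in R^d, stacked as rows of X; labels y^(i) in R.
n, d = 5, 3
rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))   # X = {x^(1), ..., x^(n)}, one example per row
y = rng.normal(size=n)        # y = {y^(1), ..., y^(n)}

print(X.shape, y.shape)  # (5, 3) (5,)
```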

[Figure: September Arctic Sea Ice Extent (1,000,000 sq km) vs. Year, 1975–2015, with linear and quadratic regression fits]

Data from G. Witt. Journal of Statistics Education, Volume 21, Number 1 (2013)
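Fitting both trend lines in the figure amounts to least-squares polynomial regression of extent on year. A hedged sketch using hypothetical year/extent pairs with the same qualitative downward trend — NOT the actual Witt (2013) data:

```python
import numpy as np

# Hypothetical year / extent pairs (made up for illustration only).
years = np.array([1980.0, 1990.0, 2000.0, 2010.0])
extent = np.array([7.8, 7.2, 6.3, 4.9])

t = years - 1995.0  # center the years for better numerical conditioning

linear = np.polyfit(t, extent, deg=1)  # degree-1 fit (linear regression)
quad = np.polyfit(t, extent, deg=2)    # degree-2 fit (quadratic regression)

# Extrapolate both models to 2015; the quadratic captures the
# accelerating decline, so it predicts a lower extent.
pred_lin = np.polyval(linear, 2015.0 - 1995.0)
pred_quad = np.polyval(quad, 2015.0 - 1995.0)
print(pred_lin, pred_quad)
```

Centering the years before fitting is a standard trick: raw values near 2000 make the Vandermonde matrix poorly conditioned.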

Prostate Cancer Dataset

• 97 samples, partitioned into 67 train / 30 test
• Eight predictors (features):
  • 6 continuous (4 log transforms), 1 binary, 1 ordinal
• Continuous outcome variable:
  • lpsa: log(prostate-specific antigen level)

Based on slide by Jeff Howbert

Linear Regression

• Hypothesis:

  y = θ_0 + θ_1 x_1 + θ_2 x_2 + ... + θ_d x_d = Σ_{j=0}^{d} θ_j x_j    (assume x_0 = 1)

• Fit model by minimizing sum of squared errors

[Figure: 1-D data with a fitted regression line; vertical offsets from the line are the errors. Figures are courtesy of Greg Shakhnarovich]
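The x_0 = 1 convention folds the intercept θ_0 into the same sum as the other terms. A minimal sketch of the hypothesis (the parameter and input values are illustrative, not from the slides):

```python
import numpy as np

def h(theta, x):
    """Hypothesis h_theta(x) = sum_{j=0}^{d} theta_j * x_j, with x_0 = 1."""
    x_aug = np.concatenate(([1.0], x))  # prepend x_0 = 1
    return float(theta @ x_aug)

# Illustrative parameters and input:
theta = np.array([0.5, 2.0, -1.0])  # [theta_0, theta_1, theta_2]
x = np.array([3.0, 1.0])
print(h(theta, x))  # 0.5 + 2*3 - 1*1 = 5.5
```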

Least Squares Linear Regression

• Cost function:

  J(θ) = (1 / 2n) Σ_{i=1}^{n} (h_θ(x^(i)) − y^(i))²

• Fit by solving min_θ J(θ)
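The cost function above can be computed in a few lines. A sketch assuming the x_0 = 1 convention from the previous slide (the toy data here are hypothetical, generated from y = 2 + 3x):

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2n) * sum_i (h_theta(x^(i)) - y^(i))^2."""
    n = X.shape[0]
    X_aug = np.hstack([np.ones((n, 1)), X])  # x_0 = 1 for every example
    residuals = X_aug @ theta - y
    return float(residuals @ residuals) / (2 * n)

# Sanity check on toy data generated from y = 2 + 3x:
X = np.array([[1.0], [2.0], [3.0]])
y = 2.0 + 3.0 * X[:, 0]
print(cost(np.array([2.0, 3.0]), X, y))  # exact fit -> 0.0
print(cost(np.array([0.0, 0.0]), X, y))  # worse fit -> larger J
```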

Intuition Behind Cost Function

J(θ) = (1 / 2n) Σ_{i=1}^{n} (h_θ(x^(i)) − y^(i))²

For insight on J(θ), let's assume x ∈ R so θ = [θ_0, θ_1]

Based on example by Andrew Ng

Intuition Behind Cost Function

J(θ) = (1 / 2n) Σ_{i=1}^{n} (h_θ(x^(i)) − y^(i))²

For insight on J(θ), let's assume x ∈ R so θ = [θ_0, θ_1]

[Left figure: for fixed θ, h(x) plotted as a function of x. Right figure: J plotted as a function of the parameter θ_1]

Based on example by Andrew Ng

Intuition Behind Cost Function

J(θ) = (1 / 2n) Σ_{i=1}^{n} (h_θ(x^(i)) − y^(i))²

For insight on J(θ), let's assume x ∈ R so θ = [θ_0, θ_1]

[Left figure: data points (1, 1), (2, 2), (3, 3) with the line h(x) = 0.5x. Right figure: J plotted as a function of θ_1, with the value at θ_1 = 0.5 marked]

J([0, 0.5]) = (1 / (2 · 3)) [ (0.5 − 1)² + (1 − 2)² + (1.5 − 3)² ] ≈ 0.58

Based on example by Andrew Ng
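The arithmetic in this worked example can be checked directly by evaluating the cost formula on the slide's three data points:

```python
# The slide's example: data points (1, 1), (2, 2), (3, 3) and the line
# h(x) = 0 + 0.5 * x, i.e. theta = [0, 0.5].
xs = [1.0, 2.0, 3.0]
ys = [1.0, 2.0, 3.0]
theta0, theta1 = 0.0, 0.5

n = len(xs)
J = sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * n)
print(round(J, 2))  # 0.58
```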
