
Lecture 13 Estimation and hypothesis testing for

logistic regression

BIOST 515 February 19, 2004


Outline

- Review of maximum likelihood estimation
- Maximum likelihood estimation for logistic regression
- Testing in logistic regression


Maximum likelihood estimation

Let's begin with an illustration from a simple Bernoulli case.

In this case, we observe independent binary responses, and we wish to draw inferences about the probability of an event in the population. Sound familiar?

- Suppose in a population from which we are sampling, each individual has the same probability, p, that an event occurs.

- For each individual in our sample of size n, Y_i = 1 indicates that an event occurs for the ith subject; otherwise, Y_i = 0.

- The observed data are Y_1, . . . , Y_n.


The joint probability of the data (the likelihood) is given by

L = \prod_{i=1}^{n} p^{Y_i} (1 - p)^{1 - Y_i} = p^{\sum_{i=1}^{n} Y_i} (1 - p)^{n - \sum_{i=1}^{n} Y_i}.

For estimation, we will work with the log-likelihood

l = \log(L) = \sum_{i=1}^{n} Y_i \log(p) + \left( n - \sum_{i=1}^{n} Y_i \right) \log(1 - p).

The maximum likelihood estimate (MLE) of p is the value of p that maximizes l (equivalently, that maximizes L).
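As a quick numerical illustration of this idea, the sketch below evaluates the Bernoulli log-likelihood over a grid of candidate values of p and picks the maximizer. The data vector `y` is hypothetical (7 events in 10 trials), chosen only to make the example concrete; it is not from the lecture.

```python
import numpy as np

def log_likelihood(p, y):
    """Bernoulli log-likelihood: sum(y)*log(p) + (n - sum(y))*log(1 - p)."""
    n, s = len(y), sum(y)
    return s * np.log(p) + (n - s) * np.log(1 - p)

# Hypothetical sample: 7 events out of 10 trials.
y = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]

# Evaluate l on a grid of candidate p values and take the maximizer.
grid = np.linspace(0.01, 0.99, 981)   # step size 0.001
ll = [log_likelihood(p, y) for p in grid]
p_hat = grid[int(np.argmax(ll))]
print(round(p_hat, 3))  # grid maximizer: 0.7, i.e. sum(y)/n
```

The grid search is only for intuition; the next slides derive the maximizer in closed form by setting the derivative of l to zero.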


The first derivative of l with respect to p is

U(p) = \frac{\partial l}{\partial p} = \sum_{i=1}^{n} Y_i / p - \left( n - \sum_{i=1}^{n} Y_i \right) / (1 - p)

and is referred to as the score function. To calculate the MLE of p, we set the score function U(p) equal to 0 and solve for p. In this case, we get an MLE of p that is

\hat{p} = \sum_{i=1}^{n} Y_i / n.

