Notes 15
Linear regression
$$y_i = \beta_0 + \beta_1 x_i + \epsilon_i$$
where
$$\epsilon_i \sim N(0, \sigma^2)$$
In particular, the response $y_i$ has been continuous throughout the course.
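As a quick illustration (not from the notes; the parameter values and variable names below are assumptions), one can simulate data from this model and recover the coefficients by ordinary least squares:

```python
# A minimal sketch (not from the notes): simulate the linear model above and
# recover beta0 and beta1 by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 200
beta0, beta1, sigma = 1.0, 2.0, 0.5      # illustrative true values (assumed)

x = rng.uniform(-3, 3, size=n)
eps = rng.normal(0.0, sigma, size=n)     # eps_i ~ N(0, sigma^2)
y = beta0 + beta1 * x + eps              # continuous response y_i

X = np.column_stack([np.ones(n), x])     # design matrix with intercept column
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                          # estimates close to (1.0, 2.0)
```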
Binary responses
Linear regression for binary outcome
[Figure: binary outcome $y$ (0 or 1) plotted against $x$ over roughly $-10$ to $5$, with a straight-line regression fit.]
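To see the problem concretely, a small simulation sketch (assumed data, not the notes' example) fits an ordinary least-squares line to 0/1 outcomes; the fitted values typically fall outside $[0, 1]$, which is why a plain linear model is unsatisfactory here:

```python
# A small sketch (simulated data, not the notes' example): fit an ordinary
# least-squares line to a 0/1 response and inspect the fitted values.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-10, 5, 200)
p = 1 / (1 + np.exp(-(1.0 + 0.8 * x)))       # assumed true success probabilities
y = rng.binomial(1, p)                        # binary outcomes in {0, 1}

X = np.column_stack([np.ones_like(x), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]      # straight-line (OLS) fit
y_hat = X @ b
print(y_hat.min(), y_hat.max())               # typically < 0 and > 1: not valid probabilities
```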
What we need for binary outcomes
Link functions
Logistic regression
Model is now
$$E(y_i \mid x_i) = p_i$$
$$g(p_i) = \log\frac{p_i}{1 - p_i} = \beta_0 + \beta_1 x_i$$
$$p_i = g^{-1}(\beta_0 + \beta_1 x_i) = \frac{\exp(\beta_0 + \beta_1 x_i)}{1 + \exp(\beta_0 + \beta_1 x_i)}$$
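As a numeric sanity check (the coefficient values below are assumptions for illustration), the logit link and its inverse can be coded directly to confirm that $g^{-1}$ maps the linear predictor into $(0, 1)$ and that $g(p_i)$ recovers $\beta_0 + \beta_1 x_i$:

```python
# A quick numeric check (coefficients assumed for illustration) of the logit
# link g and its inverse g^{-1}.
import numpy as np

def logit(p):
    return np.log(p / (1 - p))                    # g(p) = log(p / (1 - p))

def inv_logit(eta):
    return np.exp(eta) / (1 + np.exp(eta))        # g^{-1}(eta)

beta0, beta1 = -1.0, 0.5                          # assumed values
x = np.array([-4.0, 0.0, 4.0])
eta = beta0 + beta1 * x                           # linear predictor beta0 + beta1 * x_i
p = inv_logit(eta)
print(p)                                          # approx [0.047, 0.269, 0.731], all in (0, 1)
print(np.allclose(logit(p), eta))                 # True: g(p_i) returns the linear predictor
```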
Parameter interpretation
Parameter estimation
ML for logistic regression
♠ Likelihood Functions
• Parameters: $\boldsymbol{\beta} = (\beta_0, \beta_1, \ldots, \beta_k)'$.
• Binomial Logistic Regression Models
Let $y_i$ be a binomial response taking values $0, 1, \ldots, n_i$. Then the likelihood function is given by
$$L(\boldsymbol{\beta}) = \prod_{i=1}^{n} \binom{n_i}{y_i} p_i^{y_i} (1 - p_i)^{n_i - y_i} = \prod_{i=1}^{n} \binom{n_i}{y_i} \frac{\exp(y_i \mathbf{x}_i'\boldsymbol{\beta})}{\left[1 + \exp(\mathbf{x}_i'\boldsymbol{\beta})\right]^{n_i}}$$
and the log-likelihood function is
$$\ell(\boldsymbol{\beta}) = \log L(\boldsymbol{\beta}) = \sum_{i=1}^{n} \left\{ \log\binom{n_i}{y_i} + y_i \mathbf{x}_i'\boldsymbol{\beta} - n_i \log\left[1 + \exp(\mathbf{x}_i'\boldsymbol{\beta})\right] \right\}.$$
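A minimal sketch of how this log-likelihood could be maximized numerically (the simulated data, trial counts $n_i$, and the use of scipy's BFGS optimizer are all assumptions, not the notes' method):

```python
# A minimal sketch (simulated data and optimizer choice are assumptions):
# code the binomial logistic log-likelihood above and maximize it by
# minimizing its negative with scipy.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(2)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])         # rows are x_i' = (1, x_i)
n_trials = np.full(n, 10)                    # assumed n_i = 10 trials per observation
beta_true = np.array([0.5, -1.0])            # assumed true coefficients
p = 1 / (1 + np.exp(-X @ beta_true))
y = rng.binomial(n_trials, p)                # y_i in {0, 1, ..., n_i}

def neg_loglik(beta):
    eta = X @ beta
    # log C(n_i, y_i) via log-gamma functions
    log_comb = gammaln(n_trials + 1) - gammaln(y + 1) - gammaln(n_trials - y + 1)
    # l(beta) = sum[ log C(n_i, y_i) + y_i x_i'beta - n_i log(1 + exp(x_i'beta)) ]
    return -np.sum(log_comb + y * eta - n_trials * np.log1p(np.exp(eta)))

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(fit.x)                                 # ML estimates, close to beta_true
```

In practice these estimates would come from a GLM routine (typically iteratively reweighted least squares) rather than a generic optimizer; the sketch is only meant to make the likelihood concrete.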