but now we are performing a z-test, as the test statistic is approximated by a
standard normal distribution, provided we have a large enough sample. (The
t-test for ordinary linear regression, assuming the assumptions were correct, had
an exact distribution for any sample size.)
We’ll skip some of the exact details of the calculations, as R will obtain the
standard error for us. The use of this test will be extremely similar to the t-test
for ordinary linear regression. Essentially the only thing that changes is the
distribution of the test statistic.
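For instance, here is a minimal sketch of where R reports these values. (The model and data, a logistic regression of transmission type on `mpg` and `hp` from the built-in `mtcars` data, are an illustrative choice, not an example from this text.)

```r
# illustrative fit: logistic regression of transmission type (am) on mpg and hp
fit_glm = glm(am ~ mpg + hp, data = mtcars, family = binomial)

# the "z value" and "Pr(>|z|)" columns of the coefficients table are the
# Wald test statistics and their normal-approximation p-values
coef(summary(fit_glm))
```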
17.3.3 Likelihood-Ratio Test
Consider the following full model,
$$
\log\left(\frac{p(\mathbf{x}_i)}{1 - p(\mathbf{x}_i)}\right) = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_{(p-1)} x_{i(p-1)}
$$

This model has $p - 1$ predictors, for a total of $p$ $\beta$-parameters. We will denote
the MLE of these $\beta$-parameters as $\hat{\beta}_{\text{Full}}$.
Now consider a null (or reduced) model,
$$
\log\left(\frac{p(\mathbf{x}_i)}{1 - p(\mathbf{x}_i)}\right) = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_{(q-1)} x_{i(q-1)}
$$

where $q < p$. This model has $q - 1$ predictors, for a total of $q$ $\beta$-parameters. We
will denote the MLE of these $\beta$-parameters as $\hat{\beta}_{\text{Null}}$.
The difference between these two models can be codified by the null hypothesis
of a test.
$$
H_0: \beta_q = \beta_{q+1} = \cdots = \beta_{p-1} = 0.
$$
This implies that the reduced model is nested inside the full model.
We then define a test statistic, $D$,

$$
D = -2 \log \left( \frac{L(\hat{\beta}_{\text{Null}})}{L(\hat{\beta}_{\text{Full}})} \right) = 2 \log \left( \frac{L(\hat{\beta}_{\text{Full}})}{L(\hat{\beta}_{\text{Null}})} \right) = 2 \left( \ell(\hat{\beta}_{\text{Full}}) - \ell(\hat{\beta}_{\text{Null}}) \right)
$$
where $L$ denotes a likelihood and $\ell$ denotes a log-likelihood. For a large enough
sample, this test statistic has an approximate Chi-square distribution

$$
D \overset{\text{approx}}{\sim} \chi^2_k
$$

where $k = p - q$, the difference in the number of parameters of the two models.
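As a sketch, $D$ and its p-value could be computed directly from the fitted log-likelihoods, or via `anova()`. (The models below reuse the illustrative `mtcars` fits from above; the full model has one more parameter than the null model, so the test has one degree of freedom.)

```r
fit_full = glm(am ~ mpg + hp, data = mtcars, family = binomial)
fit_null = glm(am ~ mpg,      data = mtcars, family = binomial)

# D = 2 * (log-likelihood of full model - log-likelihood of null model)
D = as.numeric(2 * (logLik(fit_full) - logLik(fit_null)))

# p-value from the chi-square approximation; df = difference in parameter counts
pchisq(D, df = 1, lower.tail = FALSE)

# the same test, performed by anova()
anova(fit_null, fit_full, test = "LRT")
```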