
two methods for regression, two different results

2 messages · John Sorkin, Jari Oksanen

On Tue, 2005-04-05 at 22:54 -0400, John Sorkin wrote:
John,

Bill Venables already told you that the two methods don't give the same
result, because x and z are not orthogonal. Here is a simpler way of
getting the same result as he suggested for the coefficient of z (but
only for z):
Call:
lm(formula = y ~ x + z)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  2.96436    0.06070  48.836  < 2e-16 ***
x            0.96272    0.11576   8.317 5.67e-13 ***
z            1.08922    0.06711  16.229  < 2e-16 ***
---
Residual standard error: 0.2978 on 97 degrees of freedom
Call:
lm(formula = residuals(lm(y ~ x)) ~ x + z)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) -0.15731    0.06070  -2.592   0.0110 *
x           -0.84459    0.11576  -7.296 8.13e-11 ***
z            1.08922    0.06711  16.229  < 2e-16 ***
---
Residual standard error: 0.2978 on 97 degrees of freedom

You can omit x from the outer lm only if x and z are orthogonal, even
though you have already removed the effect of x. In the orthogonal case
the coefficient for x would be 0.
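To see the orthogonal case concretely, here is a small sketch in Python/NumPy rather than R, so it stands alone (the data and variable names are made up for illustration): with z constructed to be exactly orthogonal to x and the intercept, regressing the residuals of y ~ x on x and z gives a zero coefficient for x and the same z coefficient as the full fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

x = rng.normal(size=n)
x -= x.mean()
z = rng.normal(size=n)
# Make z exactly orthogonal to x and to the intercept column
z -= (z @ x) / (x @ x) * x
z -= z.mean()

y = 3 + 1.0 * x + 1.1 * z + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), x, z])   # design for y ~ 1 + x + z
Xx = np.column_stack([np.ones(n), x])     # design for y ~ 1 + x

# Full fit: y ~ x + z
b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residuals of y ~ x, then regress them on x + z
r = y - Xx @ np.linalg.lstsq(Xx, y, rcond=None)[0]
b_res, *_ = np.linalg.lstsq(X, r, rcond=None)

print(b_res[1])             # ~0: effect of x already removed, and x is orthogonal to z
print(b_full[2], b_res[2])  # z coefficients agree
```

Because the design is block-orthogonal, the normal equations for (1, x) and for z decouple, which is why the x coefficient collapses to zero here but not in the correlated case above.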

The residuals are equal in these two models (the differences are zero to
machine precision):
[1] -2.797242e-17  5.551115e-17

But, of course, the fitted values are not equal, since mod2 is fitted to
the residuals after removing the effect of x...
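The two facts above (equal residuals, unequal fitted values, equal z coefficient) can be checked numerically; here is a hedged sketch in Python/NumPy rather than R, with invented data and deliberately correlated x and z:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
z = 0.5 * x + rng.normal(size=n)          # correlated with x on purpose
y = 3 + x + 1.1 * z + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), x, z])   # design for y ~ 1 + x + z
Xx = np.column_stack([np.ones(n), x])     # design for y ~ 1 + x

# mod1: y ~ x + z
b1, *_ = np.linalg.lstsq(X, y, rcond=None)
fit1 = X @ b1
res1 = y - fit1

# mod2: residuals(y ~ x) ~ x + z
r = y - Xx @ np.linalg.lstsq(Xx, y, rcond=None)[0]
b2, *_ = np.linalg.lstsq(X, r, rcond=None)
fit2 = X @ b2
res2 = r - fit2

print(np.allclose(res1, res2))  # True: residuals identical
print(np.allclose(fit1, fit2))  # False: fitted values differ
print(b1[2], b2[2])             # z coefficients agree
```

The residuals agree because the column space of (1, x) is contained in that of (1, x, z), so projecting out the larger space after the smaller one leaves the same residual either way; the intercept and x coefficients absorb the difference in the fitted values.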

cheers, jari oksanen