
MANOVA polynomial contrasts

10 messages · Greg Snow, Manzoni, GianMauro, John Fox +1 more

1 day later
Dear Gian,

How contrasts are created by default is controlled by the "contrasts" option:

> getOption("contrasts")
        unordered           ordered 
"contr.treatment"      "contr.poly" 

So, unless you've changed this option, contr.poly() will be used to generate orthogonal polynomial contrasts for an ordered factor, and you therefore need do nothing special to get this result. For example, for an ordered factor f:

> f
 [1] c c a a c c b c a c
Levels: a < b < c
> contrasts(f)
         .L      .Q
[1,] -0.7071  0.4082
[2,]  0.0000 -0.8165
[3,]  0.7071  0.4082
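As a quick check (a sketch of my own, not from the thread; the variable names are made up), you can verify that the contr.poly() columns really are orthonormal, and assign the same contrasts to an unordered factor without touching the global option:

```r
## contr.poly() returns orthonormal columns, so crossprod() gives the
## identity matrix.
cp <- contr.poly(3)
round(crossprod(cp), 7)     # 2 x 2 identity: .L and .Q are orthonormal

## Polynomial contrasts for an unordered factor, assigned explicitly
## rather than via the global "contrasts" option:
f <- factor(c("low", "mid", "high"), levels = c("low", "mid", "high"))
contrasts(f) <- contr.poly(3)
contrasts(f)                # same .L and .Q columns as for an ordered factor
```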

For more information, see section 11 on statistical models in the manual "An Introduction to R," which is part of the standard R distribution, and in particular sections 11.1 and 11.1.1.

I hope that this clarifies the issue.

Best,
 John

------------------------------------------------
John Fox
Sen. William McMaster Prof. of Social Statistics
Department of Sociology
McMaster University
Hamilton, Ontario, Canada
http://socserv.mcmaster.ca/jfox/

On Wed, 25 Jul 2012 11:58:30 +0200
"Manzoni, GianMauro" <gm.manzoni at auxologico.it> wrote:
4 days later
Dear Gian Mauro,

On Mon, 30 Jul 2012 14:44:44 +0200
"Manzoni, GianMauro" <gm.manzoni at auxologico.it> wrote:
Here's a contrived example, which uses the Anova() and linearHypothesis() functions in the car package:

----- snip ------
Call:
lm(formula = Y ~ f)

Coefficients:
             y1        y2        y3      
(Intercept)   0.06514  -0.01683  -0.13787
f.L          -0.37837   0.18309   0.29736
f.Q          -0.02102  -0.39894   0.08455
f.C           0.05898   0.09358  -0.17634

Type II MANOVA Tests: Pillai test statistic
  Df test stat approx F num Df den Df Pr(>F)
f  3   0.11395   1.2634      9    288 0.2566

Sum of squares and products for the hypothesis:
          y1        y2        y3
y1  3.607260 -1.745560 -2.834953
y2 -1.745560  0.844680  1.371839
y3 -2.834953  1.371839  2.227995

Sum of squares and products for error:
          y1        y2        y3
y1 86.343376 -8.054928 -3.711756
y2 -8.054928 95.473020  2.429151
y3 -3.711756  2.429151 89.593163

Multivariate Tests: 
                 Df test stat approx F num Df den Df   Pr(>F)  
Pillai            1 0.0648520 2.172951      3     94 0.096362 .
Wilks             1 0.9351480 2.172951      3     94 0.096362 .
Hotelling-Lawley  1 0.0693495 2.172951      3     94 0.096362 .
Roy               1 0.0693495 2.172951      3     94 0.096362 .
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1 

----- snip ------

You could do similar tests for the quadratic and cubic contrasts.
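Since the data-generating commands were snipped above, here is a self-contained sketch along the same lines (the data and object names below are simulated and made up for illustration; only the Anova() and linearHypothesis() calls follow the example):

```r
## Hedged sketch: simulated responses and a 4-level ordered factor stand in
## for the snipped setup of John's example.
library(car)

set.seed(123)
Y <- matrix(rnorm(300), 100, 3,
            dimnames = list(NULL, c("y1", "y2", "y3")))
f <- ordered(sample(letters[1:4], 100, replace = TRUE))
mod <- lm(Y ~ f)          # polynomial contrasts f.L, f.Q, f.C by default

Anova(mod)                     # type-II MANOVA test for f
linearHypothesis(mod, "f.L")   # multivariate test of the linear contrast
linearHypothesis(mod, "f.Q")   # quadratic contrast
linearHypothesis(mod, "f.C")   # cubic contrast
```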

I hope this helps,
 John

Dear GMM,

Yes, but wouldn't it have been faster simply to try it? Also see
?linearHypothesis.

There are discussions of contrasts and of linear hypotheses about
coefficients, though not in the context of *multivariate* linear models;
that's the subject of an on-line appendix, at
<http://socserv.socsci.mcmaster.ca/jfox/Books/Companion/appendix/Appendix-Multivariate-Linear-Models.pdf>.

Best,
 John
Dear Mauro,

I believe that I've answered a version of this question three times this
month alone, so I'll be brief.

Are you aware that, even if you are careful to employ contrasts (such as
orthogonal polynomial contrasts) that are orthogonal for different terms in
the row basis of the design, type-III tests will nevertheless test for
differences where the covariates are both 0? If that's not sensible, then
why not use type-II tests?
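To see the difference concretely, a hedged sketch with simulated data (the variable names and data are my own, purely for illustration):

```r
## Type-II vs type-III MANOVA tests with car::Anova(); with covariates in
## the model, the type-III test of the factor is a test of group
## differences at covariate values of 0.
library(car)

set.seed(321)
x1 <- rnorm(100)                       # two made-up covariates
x2 <- rnorm(100)
f  <- ordered(sample(c("a", "b", "c"), 100, replace = TRUE))
Y  <- matrix(rnorm(300), 100, 3,
             dimnames = list(NULL, c("y1", "y2", "y3")))
mod <- lm(Y ~ (x1 + x2) * f)

Anova(mod)                 # type-II tests (the default)
Anova(mod, type = "III")   # type-III: the f test is evaluated at
                           # x1 = 0 and x2 = 0
```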

(As an aside, I've been experiencing a problem with my ISP that's causing my
return email address to be given incorrectly; please don't reply to an
address other than jfox at mcmaster.ca -- and, of course,
r-help at r-project.org.)

John