
matrix power

On 10-Aug-09 22:36:03, cindy Guo wrote:
Cindy,
If that -5.338634e-17 is typical of the "lot of negative eigenvalues",
then what you are seeing is R's attempt to compute eigenvalues that are
exactly zero, defeated by the inevitable rounding errors.
In other words, your covariance matrix is singular, and the variables
involved are not linearly independent.

The only thing that is guaranteed about a covariance matrix is that
it is positive semi-definite (not positive definite); in other words,
mathematically all its eigenvalues are positive or zero.

For example, if Y=X, var(X) = var(Y) = 1, then
  cov(X,Y) =  1  1
              1  1
which is singular (eigenvalues = 2, 0).

The result of attempting to compute them is subject to rounding errors,
which (for zero eigenvalues) can be slightly negative.
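To see the numbers, here is a small sketch (in Python/NumPy purely for
illustration; the same check can be done in R with eigen()) of both the
exact 2x2 example above and a sampled covariance of linearly dependent
variables:

```python
import numpy as np

# Exact version of the example above: Y = X, var(X) = var(Y) = 1.
C = np.array([[1.0, 1.0],
              [1.0, 1.0]])
vals = np.linalg.eigvalsh(C)   # symmetric eigensolver, ascending order
# mathematically the eigenvalues are 0 and 2

# A sampled covariance of linearly dependent variables shows the rounding:
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
S = np.cov(np.column_stack([x, 2.0 * x - 1.0]), rowvar=False)
svals = np.linalg.eigvalsh(S)
# svals[0] is the "zero" eigenvalue; rounding can leave it slightly
# negative, on the order of 1e-16 -- the same effect as -5.338634e-17.
```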

So the covariance matrix in your case would not have an inverse,
still less a negative square root!
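When such a matrix really is (numerically) positive semi-definite, one
common workaround before taking a square root is to clamp the tiny
negative eigenvalues to zero. A sketch in NumPy (the R analogue would
use eigen() in the same way):

```python
import numpy as np

def psd_sqrt(C, tol=0.0):
    """Symmetric square root of a positive semi-definite matrix,
    clamping tiny negative eigenvalues (rounding artifacts) to zero."""
    vals, vecs = np.linalg.eigh(C)
    vals = np.clip(vals, tol, None)      # treat -5e-17-style values as 0
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

C = np.array([[1.0, 1.0],
              [1.0, 1.0]])
R = psd_sqrt(C)
# R @ R recovers C up to rounding, even though C is singular
```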

The basic problem is that you have linear dependence between the
variables. To make progress, you would need to find a maximal linearly
independent set (or possibly find the principal components with
nonzero weights).
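One way to carry that out (sketched in Python/NumPy for illustration;
in R one would use eigen() or prcomp()) is to keep only the principal
components whose eigenvalues exceed a rounding-level tolerance:

```python
import numpy as np

rng = np.random.default_rng(42)
x1 = rng.standard_normal(200)
x2 = rng.standard_normal(200)
X = np.column_stack([x1, x2, x1 + x2])   # third column is dependent

C = np.cov(X, rowvar=False)
vals, vecs = np.linalg.eigh(C)           # ascending eigenvalues

# Tolerance scaled to the largest eigenvalue, as in rank computations:
tol = max(C.shape) * np.finfo(float).eps * vals.max()
keep = vals > tol                        # components with nonzero weight
# keep.sum() is the effective rank (2 here, not 3)

# Project onto the retained components: a full-rank representation
# whose covariance matrix is nonsingular.
scores = (X - X.mean(axis=0)) @ vecs[:, keep]
```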

Ted.

--------------------------------------------------------------------
E-Mail: (Ted Harding) <Ted.Harding at manchester.ac.uk>
Fax-to-email: +44 (0)870 094 0861
Date: 10-Aug-09                                       Time: 23:58:00
------------------------------ XFMail ------------------------------