Dear R experts, sorry, second question of the day. I want to match some
moments. I am writing my own code: I have exactly as many moment
conditions as parameters, and I am leery of having to learn the magic
of GMM weighting matrices (if I were to introduce more). The process
sounds easy conceptually. (Seen it in seminars many times, so how hard
could it possibly be?... methinks.) This is my first time trying it.

Some of my moments are standard deviations. Easy, methinks: just use
exp(my.sigma.parameter) inside the objective instead of
my.sigma.parameter itself. That way nlm() can throw negative values at
my objective function and I will be fine.
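
Concretely, the idea was something like this (a toy one-parameter
version; the target moment is made up):

    target.sd <- 2.5                       # hypothetical data moment

    obj <- function(log.sigma) {
        sigma <- exp(log.sigma)            # always positive, whatever nlm() tries
        (sigma - target.sd)^2              # squared distance, model vs. data moment
    }

    nlm(obj, p = 0)                        # start at log.sigma = 0, i.e. sigma = 1
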
This is about the time to start laughing, of course.
So, nlm() computes a gradient that is huge at my initial starting
value. It then decides it wants to take a step out to 20.59, at which
point exp(20.59) sends everything in my function haywire and it wants
to return NA. Now nlm() barfs... and I am seriously considering
grid-searching, which does not strike me as particularly intelligent.
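
The obvious band-aid I can think of is to trap non-finite values
myself and hand the optimizer box constraints, along these lines (a
minimal sketch; the toy objective, penalty value, and bounds are all
arbitrary):

    obj <- function(log.sigma) (exp(log.sigma) - 2.5)^2   # toy objective from above

    safe.obj <- function(p) {
        val <- obj(p)
        if (is.finite(val)) val else 1e10   # large penalty instead of NA/NaN
    }

    ## nlminb() accepts box constraints, so the search never wanders
    ## off to where exp() overflows; optim(..., method = "L-BFGS-B")
    ## takes lower/upper bounds as well.
    nlminb(start = 0, objective = safe.obj, lower = -10, upper = 10)
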
But are there any intelligent optimizers that understand domains
and/or will "backstep" gracefully when they encounter an NA? Are there
better ways to deal with matching second moments?
Advice appreciated.
regards,
/iaw
PS: You probably don't want to know this, but I have a dynamic panel
data set, and my goal is to test whether a constant autoregressive
coefficient across units can describe the data. That is, I want to
find out whether

    x(i,t) = a + b(i) + c * x(i,t-1)

is better replaced by

    x(i,t) = a + b(i) + c(i) * x(i,t-1).
Right now, I am running N unit-by-unit OLS time-series regressions of
x on lagged x, and picking off mean(c), sd(c), mean(sigma_i), and
sd(sigma_i). If there is a procedure in R that already tests for
heterogeneous autocorrelation coefficients in a more intelligent
fashion, please, please point me to it. Even if it exists, though, I
think I still need to figure out how to find a more graceful optimizer
anyway.
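
In case it helps, the loop I am running is roughly this (on a
hypothetical long-format data.frame d with columns id, t, and x; all
names invented):

    fits <- lapply(split(d, d$id), function(di) {
        di <- di[order(di$t), ]                  # sort within unit by time
        di$lag.x <- c(NA, di$x[-nrow(di)])       # within-unit lag of x
        lm(x ~ lag.x, data = di)                 # per-unit OLS of x on its lag
    })

    c.hat   <- sapply(fits, function(f) coef(f)["lag.x"])
    sig.hat <- sapply(fits, function(f) summary(f)$sigma)

    c(mean(c.hat), sd(c.hat), mean(sig.hat), sd(sig.hat))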