
Poor performance of "Optim"

With respect, your statement that R's optim does not give you a reliable
estimator is bogus. As pointed out before, the behavior depends on when optim
decides the fit is good enough and stops optimizing. In particular, if you
stretch out x, it is plausible that the likelihood surface becomes flat
enough "earlier," so the numerical optimization stops sooner (i.e., optim
"thinks" the gradient of the log-likelihood is close enough to zero and
terminates earlier than it would for more condensed data). After all,
numerically maximizing the likelihood is an iterative approximation, and I
would venture to say that what you describe lies in the nature of the
method. You could also follow the good advice given earlier and increase the
maximum number of iterations or tighten the convergence tolerance via
optim's control argument, as sketched below.
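For instance (a sketch only; it uses the LL function defined in the example
further down, and the maxit and reltol values are illustrative, not
recommendations):

#Tighten optim's stopping criteria through the control list
fit<-optim(c(0,0),LL,control=list(maxit=5000,reltol=1e-12))
fit$convergence  #0 indicates successful convergence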

However, check the example below: for all practical purposes the result is
close enough, and the discrepancy has nothing to do with optim being
"unreliable."

#Simulate probit data: latent y=0.5*x+noise, observed z=1 if y>0
n<-1000
x<-rnorm(n)
y<-0.5*x+rnorm(n)
z<-ifelse(y>0,1,0)

X<-cbind(1,x)  #design matrix with intercept
b<-matrix(c(0,0),nrow=2)  #coefficient placeholder (optim supplies its own start values)

#Probit fit via glm, for reference
reg<-glm(z~x,family=binomial("probit"))

#Optim reproducing the probit fit (with minor deviations due to the
#difference in method)
LL<-function(b){-sum(z*log(pnorm(X%*%b))+(1-z)*log(1-pnorm(X%*%b)))}
optim(c(0,0),LL)
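As an aside (my addition, not in the original code): pnorm's log.p and
lower.tail arguments give a numerically safer version of the same
log-likelihood, since they avoid taking log() of probabilities that
underflow to zero for extreme linear predictors:

#Numerically safer variant of LL using pnorm's log.p argument
LL2<-function(b){
  eta<-X%*%b
  -sum(z*pnorm(eta,log.p=TRUE)+(1-z)*pnorm(eta,lower.tail=FALSE,log.p=TRUE))
}
optim(c(0,0),LL2)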

#Multiply x by 2 and repeat optim; the slope estimate should roughly halve
X[,2]<-2*X[,2]
optim(c(0,0),LL)
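To confirm the rescaled fit is the same model in disguise (again a sketch;
fit2 and the comparison are mine): since X[,2] was doubled, multiplying the
new slope by 2 should recover glm's probit estimate on the original scale:

#Store the rescaled fit and map the slope back to the original scale of x
fit2<-optim(c(0,0),LL)
rbind(glm=coef(reg),optim=fit2$par*c(1,2))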

HTH,
Daniel