
convergence=0 in optim and nlminb is real?

As indicated, if optimizers checked the Hessian on every occasion, R
users would enrich all the computer manufacturers. In this case the
problem is not too large, so the check is worth doing.

However, for this problem the Hessian is being evaluated by numerical
approximation of the second partial derivatives, so it may bear little
resemblance to the analytic Hessian. I've seen plenty of Hessian
approximations that were not positive definite even when the answers
were OK.
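To illustrate (a base-R sketch using stats::optimHess; the flat quartic
objective here is a made-up example, not the problem under discussion):

```r
## A finite-difference Hessian can be near-singular or even indefinite
## at a perfectly good minimum.  The analytic Hessian of this quartic
## is exactly zero at the solution x = (1, 1).
f <- function(x) sum((x - 1)^4)
fit <- optim(c(0, 0), f, method = "BFGS")
H <- optimHess(fit$par, f)           # numerical second-derivative approximation
eigen(H, symmetric = TRUE)$values    # tiny eigenvalues, sign at the mercy of rounding
```

The eigenvalues here are dominated by truncation and rounding error in
the finite differences, so a "not positive definite" verdict from such a
matrix does not by itself condemn the answer.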

That Inf is allowed does not mean that it is recommended. R is very
tolerant of many things that are not generally good ideas; that can be
helpful for some computations but still cause trouble. It does not seem
to be the problem here.

I did not look at all the results for this problem from optimx, but it
appeared that several were lower than the optim(BFGS) one. Are any of
the optimx results acceptable? Note that optimx DOES offer to check the
KKT conditions, and does so by default unless the problem is large. That
check was included precisely because the optimizers themselves generally
avoid this very expensive computation. But given the range of results
from the optimx "all methods" run, I'd still want to do a lot of testing
of the answers.
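Roughly like this (a sketch from memory of the optimx interface; the
control names all.methods and kkt, and the kkt1/kkt2 result columns,
should be checked against ?optimx, and the quartic objective is again
just a stand-in):

```r
## Run every available method and request the KKT checks.  The kkt1
## column reports the gradient (first-order) test, kkt2 the Hessian
## (second-order) test; compare the 'value' column across methods
## before trusting any single answer.
f <- function(x) sum((x - 1)^4)
if (requireNamespace("optimx", quietly = TRUE)) {
  ans <- optimx::optimx(c(0, 0), f,
                        control = list(all.methods = TRUE, kkt = TRUE))
  print(summary(ans, order = "value"))
}
```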

This may be a useful case to point out that nonlinear optimization is
not a calculation that should be taken for granted. It is much less
reliable than most users think. I rarely find ANY problem for which all
the optimx methods return the same answer. You really do need to look at
the answers and make sure that they are meaningful.
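The point is easy to see even in base R (the Rosenbrock function below
is the usual stand-in test problem, not the one from this thread):

```r
## Run the same problem through several optim() methods and compare.
## Disagreement in the 'value' row, or a nonzero 'conv' code, is a
## warning sign that the answer needs more scrutiny.
rosen <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
fits <- lapply(c("Nelder-Mead", "BFGS", "CG"),
               function(m) optim(c(-1.2, 1), rosen, method = m))
sapply(fits, function(f) c(value = f$value, conv = f$convergence))
```

With default settings the methods typically do not stop at the same
point, which is exactly why the answers need checking.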

JN
On 13-12-17 11:32 AM, Adelchi Azzalini wrote: