optim
nwew wrote:
Dear R-helpers, The function optim implements algorithms that I would like to use. I have a function implemented in R which, given the parameters over which minimization is to take place, returns a scalar value as well as the gradient. Unfortunately optim requires two functions, _fn_ and _gr_, where fn returns the function value and gr the gradient. Splitting my function in two would be easy, but I am wondering whether evaluating both would double the already very high computational cost: most of the computationally intensive operations are identical when computing the function value and the gradient. Question: is there a way to tweak optim so that only one function evaluation is necessary? Are there other implementations of these algorithms which assume that the function to be minimized returns both the function value and the gradient?
I don't know the answer to your question, but here's a different
approach. Write a function that effectively splits your single function
into two:
splitfn <- function(f) {
  lastx <- NA
  lastfn <- NA
  lastgr <- NA
  doeval <- function(x) {
    # If x matches the cached point, reuse the stored function value
    if (identical(all.equal(x, lastx), TRUE)) return(lastfn)
    lastx <<- x
    both <- f(x)            # one call computes both value and gradient
    lastfn <<- both$fnval
    lastgr <<- both$grval
    return(lastfn)
  }
  fn <- function(x) doeval(x)
  gr <- function(x) {
    doeval(x)               # refresh the cache for this x if necessary
    lastgr
  }
  list(fn = fn, gr = gr)
}
I haven't tested this, but the idea is that it sets up a local
environment where the last x value and last function and gradient values
are stored. If the next call asks for the same x, then the cached
values are returned. I don't know if it will actually improve
efficiency: that depends on whether optim evaluates the gradient and
function values at the same points or at different points.
You would use this as follows, assuming your function is called f:
f2 <- splitfn(f)
optim(par, f2$fn, f2$gr, ...)
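To make this concrete, here is a self-contained sketch of the same caching idea applied to a simple quadratic whose value and gradient share their work. The field names fnval/grval and the evaluation counter are illustrative choices, not anything optim itself requires:

```r
## Caching wrapper in the spirit of splitfn, with a counter so we can
## see how many times the combined function is actually evaluated.
make_split <- function(f) {
  lastx <- NULL
  lastfn <- NULL
  lastgr <- NULL
  nevals <- 0
  doeval <- function(x) {
    if (!identical(all.equal(x, lastx), TRUE)) {
      lastx <<- x
      both <- f(x)          # one call yields both value and gradient
      nevals <<- nevals + 1
      lastfn <<- both$fnval
      lastgr <<- both$grval
    }
    invisible(NULL)
  }
  list(fn = function(x) { doeval(x); lastfn },
       gr = function(x) { doeval(x); lastgr },
       nevals = function() nevals)
}

## f returns both the value and gradient of (x1 - 1)^2 + (x2 + 2)^2,
## which is minimized at c(1, -2).
f <- function(x) {
  d <- x - c(1, -2)
  list(fnval = sum(d^2), grval = 2 * d)
}

f2 <- make_split(f)
res <- optim(c(0, 0), f2$fn, f2$gr, method = "BFGS")
res$par        # close to c(1, -2)
res$counts     # fn and gr call counts as seen by optim
f2$nevals()    # actual evaluations of f; at most sum(res$counts)
```

Whenever BFGS asks for the gradient at the same point where it just asked for the value, the cache saves a full evaluation, so f2$nevals() can be smaller than the combined call counts optim reports.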
Duncan Murdoch