nls problem: singular gradient
On 07/12/2012 01:39 AM, Duncan Murdoch wrote:
On 12-07-11 2:34 PM, Jonas Stein wrote:
Take a look at the predicted values at your starting fit: there's a discontinuity at 0.4, which sure makes it look as though overflow is occurring. I'd recommend expanding tanh() in terms of exponentials and rewriting the prediction in a way that won't overflow. Duncan Murdoch
Hi Duncan,
Thank you for your suggestion. I wrote a function "mytanh" and
nls terminates a bit later with another error message:
Error in nls(data = dd, y ~ 1/2 * (1 - mytanh((x - ttt)/1e-04) *
exp(-x/tau2)), :
number of iterations exceeded maximum of 50
How can I fix that?
Kind regards,
Jonas
============================ R CODE STARTS HERE =======
# Taylor expansion of tanh() about 0
mytanh <- function(x) {
  return(x - x^3/3 + 2*x^5/15 - 17*x^7/315)
}
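For what it's worth, a truncated Taylor series only works near 0 and grows without bound for large arguments. One way to follow Duncan's suggestion of expanding tanh() in exponentials is to write it in terms of exp(-2|x|), which stays in (0, 1] and cannot overflow (a sketch, not tested against your data; note that base R's tanh() already saturates cleanly to +/-1 for large arguments):

```r
# tanh(x) = sign(x) * (1 - exp(-2|x|)) / (1 + exp(-2|x|))
# exp(-2|x|) is always in (0, 1], so nothing overflows,
# and large |x| saturates smoothly to +/-1.
stable_tanh <- function(x) {
  a <- exp(-2 * abs(x))
  sign(x) * (1 - a) / (1 + a)
}
```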
That looks like it would overflow as soon as abs(x-ttt) got large, just like the original. You might be able to fix it by following the advice I gave last time, or maybe you need to rescale the parameters. In most cases optimizers work best when the uncertainty in the parameters is all on the same scale, typically around 1.
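To illustrate what "rescale the parameters" can mean in practice, here is a made-up toy example (an exponential decay, not the step model from this thread): instead of asking nls to estimate a parameter near 2000 directly, estimate a dimensionless parameter near 1 and put the typical scale inside the formula.

```r
set.seed(1)
x <- seq(0, 10000, length.out = 200)
# simulated data with true tau = 2000 plus a little noise
# (nls is documented to misbehave on exact zero-residual data)
y <- exp(-x / 2000) + rnorm(length(x), sd = 0.01)
d <- data.frame(x, y)

# rescaled: fit s on a scale of ~1, with the typical magnitude
# (1000, an assumed ballpark) folded into the formula
fit <- nls(y ~ exp(-x / (1000 * s)), data = d, start = list(s = 1))
coef(fit)  # s close to 2, i.e. tau close to 2000
```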
I am not sure what you mean by "rescale parameters", but I changed ttt and tau2 to 1 and nls still fails. Do you mean I can only use functions with tau2 and ttt close to 1? Is there a better fit function than nls for R? Even "origin" can find the parameters without any problems.

nlsfit <- nls(data = dd,
              y ~ 1/2 * (1 - mytanh((x - ttt)/0.0001) * exp(-x/tau2)),
              start = list(ttt = 1, tau2 = 1),
              trace = TRUE,
              control = list(maxiter = 100))
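A possible reason for both failures (singular gradient, then hitting maxiter), independent of overflow: with the step width hard-coded at 1e-4, the model is numerically a perfect step, so its derivative with respect to ttt is exactly zero unless the optimizer happens to land within about 1e-4 of the true position. A quick check (0.5 and 0.4 are made-up values for the true and trial positions):

```r
# half of the model: the tanh step with width 1e-4, as a function of ttt
f <- function(ttt) 1/2 * (1 - tanh((0.5 - ttt) / 1e-4))

# numerical derivative with respect to ttt at a trial value 0.1 away
# from the step: tanh() is fully saturated on both sides, so the
# derivative is exactly zero and nls has nothing to follow
(f(0.4 + 1e-6) - f(0.4 - 1e-6)) / 2e-6
```

If that is the cause, it may help to treat the width as a free parameter (starting it large so the step is smooth, letting the fit shrink it), rather than fixing it at 1e-4.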
Jonas Stein <news at jonasstein.de>