
fitting lognormal censored data

On 31-08-2012, at 03:54, Salma Wafi wrote:

            
Your code is a mess and unreadable.

Assuming that the function ml is what you want to minimize, I'm quite sure that the derivatives you are calculating are incorrect.
Did you check them?
Assuming that:

- function F gives the gradient of ml
- function J gives the Hessian of ml

we can check these with package numDeriv.

To make the following reproducible I inserted

set.seed(123)

at the start of your code and at the end of your code I inserted

library(numDeriv)
ml.grad <- function(par) grad(ml,par)    
ml.hessian <- function(par) hessian(ml,par)

pstart <- c(0.5,.5)
ml(pstart)
F(pstart)
ml.grad(pstart)
J(pstart)
ml.hessian(pstart)

Running the code in R gives

> ml(pstart)
[1] 570.405
> F(pstart)
             [,1]
[1,] 1.770876e+15
[2,] 7.247892e+15
> ml.grad(pstart)
[1]  -459.6117 -1423.2860
> J(pstart)
              [,1]          [,2]
[1,] -2.634033e+01 -2.567008e+31
[2,] -2.567008e+31 -2.101266e+32
> ml.hessian(pstart)
          [,1]     [,2]
[1,]  326.3977 1826.317
[2,] 1826.3172 9187.053

Functions F and J are most likely not returning correct values (I have no reason to question the results of numDeriv).
So it is not surprising that your attempt at Newton fails miserably.
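For reference, the same kind of derivative check can be sketched on a fully self-contained toy problem. Everything below (the data, nll, and its analytic gradient) is my own toy example, not your ml or F, and plain central differences stand in for numDeriv so it runs in base R:

```r
# Toy example of verifying an analytic gradient numerically
# (numDeriv::grad does this more carefully; central differences
# keep the sketch base-R only).
set.seed(123)
x <- rnorm(20, mean = 1, sd = 2)

nll <- function(p) {                 # p = c(mu, log_sigma)
  -sum(dnorm(x, p[1], exp(p[2]), log = TRUE))
}

nll.grad <- function(p) {            # analytic gradient of nll
  mu <- p[1]; s <- exp(p[2]); n <- length(x)
  c(-sum(x - mu) / s^2,              # d nll / d mu
    n - sum((x - mu)^2) / s^2)       # d nll / d log_sigma
}

num.grad <- function(f, p, h = 1e-6) # central differences
  sapply(seq_along(p), function(i) {
    e <- replace(numeric(length(p)), i, h)
    (f(p + e) - f(p - e)) / (2 * h)
  })

p <- c(0.5, 0.5)
max(abs(nll.grad(p) - num.grad(nll, p)))   # tiny => gradient is right
```

If the analytic and numerical gradients disagree by more than roundoff, the analytic formula is wrong; this is exactly the comparison the numDeriv check above performs on your F and J.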

You can also run this:

(p1.opt <- optim(pstart,ml))
(p2.opt <- optim(pstart,ml, gr=ml.grad, method="BFGS"))
(p3.opt <- optim(pstart,ml, gr=ml.grad, method="CG"))
(p4.opt <- optim(pstart,ml, gr=ml.grad, method="L-BFGS-B"))
ml.grad(p1.opt$par)
ml.grad(p2.opt$par)
ml.grad(p3.opt$par)
ml.grad(p4.opt$par)

which will show you that optim is perfectly capable of minimizing function ml.
I cannot judge whether the result is compatible with the result of survreg. That's up to you.
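The same pattern on a self-contained toy likelihood (my own nll and nll.grad, standing in for your ml and a correct gradient):

```r
# optim minimizing a toy negative log-likelihood, with and without
# an analytic gradient (toy functions, not the poster's ml/F/J).
set.seed(123)
x <- rnorm(50, mean = 1, sd = 2)
nll <- function(p) -sum(dnorm(x, p[1], exp(p[2]), log = TRUE))
nll.grad <- function(p) {
  mu <- p[1]; s <- exp(p[2])
  c(-sum(x - mu) / s^2, length(x) - sum((x - mu)^2) / s^2)
}

f1 <- optim(c(0.5, 0.5), nll)                                 # Nelder-Mead
f2 <- optim(c(0.5, 0.5), nll, gr = nll.grad, method = "BFGS")
f2$par                    # near c(mean(x), log of the ML sigma)
nll.grad(f2$par)          # near c(0, 0) at the minimum
```

Checking that the gradient is (numerically) zero at optim's answer, as in the ml.grad(p*.opt$par) calls above, is a cheap sanity check that a genuine minimum was found.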

Finally, it is often not a good idea to locate a minimum by using an equation solver to find a root of the gradient: a root of the gradient can just as well be a maximum or a saddle point, and a root finder has no preference for descent.
If you insist on doing that, have a look at the packages nleqslv and BB.
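For illustration, here is a bare Newton iteration on gradient = 0 in base R, the kind of iteration those packages perform with proper safeguards (line search, trust regions). The toy gradient g below is mine, not your F:

```r
# Newton iteration on grad = 0 with a finite-difference Jacobian,
# sketched in base R (nleqslv/BB add safeguards this sketch lacks).
set.seed(123)
x <- rnorm(50, mean = 1, sd = 2)
g <- function(p) {                  # gradient of a toy normal nll
  mu <- p[1]; s <- exp(p[2])
  c(-sum(x - mu) / s^2, length(x) - sum((x - mu)^2) / s^2)
}

p <- c(0.5, 0.5); h <- 1e-6
for (k in 1:25) {
  Jm <- sapply(1:2, function(i) {   # numerical Jacobian of g
    e <- replace(numeric(2), i, h)
    (g(p + e) - g(p - e)) / (2 * h)
  })
  p <- p - solve(Jm, g(p))          # raw Newton step, no safeguard
}
p
g(p)    # ~ c(0, 0), but nothing here guarantees it is a minimum
</imports></imports>
```

It happens to converge here because the start is close to the solution; from a poor start the raw step can shoot off exactly the way your Newton attempt did, which is why the dedicated solvers add safeguards.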


regards,

Berend