
Solving A System of Equations

The best method I know is as follows:

1.  Write a function to be minimized, e.g., the sum of squares of the 
errors in the equations.  If your functions are continuously 
differentiable, then the sum of squares is better than the sum of 
absolute values, because the sum of squares is smooth (differentiable) 
at the minimum while the sum of absolute values is not.
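As an illustration of point 1, here is a minimal sketch in Python/scipy 
terms (the post itself is about R's "optim"/"nls"; the 2x2 system below 
is a made-up example, not from the original):

```python
import numpy as np

# Hypothetical 2x2 nonlinear system (illustrative only):
#   x^2 + y^2 = 4
#   exp(x) + y = 1
def residuals(p):
    x, y = p
    return np.array([x**2 + y**2 - 4.0,
                     np.exp(x) + y - 1.0])

def sum_sq(p):
    # Sum of squared residuals: smooth at a root of the system,
    # unlike the sum of absolute values, which has a kink there.
    r = residuals(p)
    return float(r @ r)
```

At a point satisfying the first equation exactly, only the second 
equation contributes, e.g. sum_sq((0, 2)) = (1 + 2 - 1)^2 = 4.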

2.  If there is any question that there might be multiple local minima, 
then I would implement one of Prof. Blackwell's suggestions, namely 
testing it over an appropriate grid of points.  With only one unknown, I 
make a plot.  With two, I make a contour plot.  With three or more, I 
might try some kind of grid or Monte Carlo.
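The grid idea in point 2 can be sketched as follows (again in 
Python/numpy rather than R; the objective and grid bounds are 
assumptions for illustration):

```python
import numpy as np

def sum_sq(p):
    # Same made-up 2x2 system as above, written as one expression.
    x, y = p
    return (x**2 + y**2 - 4.0)**2 + (np.exp(x) + y - 1.0)**2

# Evaluate the objective on a coarse grid to spot candidate basins
# (the 2-D analogue of eyeballing a contour plot) before optimizing.
xs = np.linspace(-3.0, 3.0, 61)
ys = np.linspace(-3.0, 3.0, 61)
Z = np.array([[sum_sq((x, y)) for x in xs] for y in ys])

# Best grid point serves as a starting value for the local optimizer.
i, j = np.unravel_index(np.argmin(Z), Z.shape)
start = (xs[j], ys[i])
```

With three or more unknowns the same loop works over a coarser grid, 
or the grid points can be replaced by random (Monte Carlo) draws.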

3.  With some appropriate starting value(s), I then pass the function to 
"optim".  If I want confidence intervals from nonlinear least squares, I 
may pass the output of "optim" to "nls".  If my objective function is a 
log(likelihood), "optim" will output the Hessian, which is the negative 
of the observed information; the inverse of the observed information is 
the approximate covariance matrix.
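The optimize-then-invert-the-Hessian step can be sketched like this 
(a scipy translation of the idea, not the author's R code; the 
normal-with-unknown-mean likelihood and simulated data are assumptions 
for illustration — scipy minimizes, so the objective is the negative 
log-likelihood and the inverse Hessian approximates the covariance 
directly):

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: 200 draws from Normal(mu=5, sd=1), unknown mean mu.
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=1.0, size=200)

def negloglik(theta):
    # Negative log-likelihood of Normal(mu, 1), up to a constant.
    mu = theta[0]
    return 0.5 * np.sum((data - mu)**2)

res = minimize(negloglik, x0=[0.0], method="BFGS")

# For a negative log-likelihood, the Hessian at the optimum is the
# observed information; its inverse approximates the covariance of
# the estimate.  BFGS carries an inverse-Hessian approximation.
se = float(np.sqrt(res.hess_inv[0, 0]))
```

Here the maximum-likelihood estimate is the sample mean, and the 
standard error should come out near 1/sqrt(200).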

Comments?
Spencer Graves
Thomas W Blackwell wrote: