
Automatically fitting empirical data

Hi,
I'm using R to find explicit functions that fit sets of data.
Each data set contains about 30 points, and the points have different
weights (the number of cases represented by each point).
I plot the points, choose "by eye" a function built from exp, arctan, or
a polynomial, and use nlm to minimize the weighted sum of squared
errors.

For example:

Err <- function(p)
    # weighted sum of squared errors (weights applied once, not squared)
    sum(weight * (y - (p[1] + p[2]*atan(p[3] + p[4]*x)))^2)
out <- nlm(Err, p = c(1, 1, 1, 1), hessian = TRUE)
fp <- function(z)
    out$estimate[1] + out$estimate[2]*atan(out$estimate[3] + out$estimate[4]*z)

The problem is that I have about 40 different data sets (variables),
and for each one I obtain about 5 different samples, depending on a
parameter, so I have about 200 different functions to fit. I can't do
this work by hand for each function :(
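One way to avoid doing this by hand is to wrap the fit in a function and
loop over the data sets. A minimal sketch, assuming each data set is a
data frame with columns x, y, and weight (the helper names fit_one and
make_toy are illustrative, not from your code):

```r
# Hypothetical helper: fit one data set with the arctan model and
# return the fitted parameters plus a prediction function.
fit_one <- function(d) {
  err <- function(p)
    sum(d$weight * (d$y - (p[1] + p[2] * atan(p[3] + p[4] * d$x)))^2)
  out <- nlm(err, p = c(1, 1, 1, 1))
  list(par = out$estimate,
       predict = function(z)
         out$estimate[1] + out$estimate[2] *
           atan(out$estimate[3] + out$estimate[4] * z))
}

# Toy data in place of your ~200 real samples.
make_toy <- function(seed) {
  set.seed(seed)
  x <- seq(-5, 5, length.out = 30)
  data.frame(x = x,
             y = 1 + 2 * atan(0.5 + 0.8 * x) + rnorm(30, sd = 0.1),
             weight = sample(1:10, 30, replace = TRUE))
}

datasets <- list(a = make_toy(1), b = make_toy(2))
fits <- lapply(datasets, fit_one)  # one fit per data set
fits$a$par         # fitted parameters for data set "a"
fits$a$predict(0)  # prediction at x = 0
```

With the data sets stored in a named list, lapply gives you all 200 fits
in one call, and you can inspect or plot any of them afterwards.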

1) Is there a family of functions that is general and robust enough to
fit most of my data?
(Most of my functions look very much like arctan, exp, or a parabola,
so the fits are not bad :)

2) How can I use nls to do the same thing? (I hope nls is more robust
than my nlm-based implementation, or is it not?)

Thank you very much!