Linear Model with curve fitting parameter?

2 messages · Steven McKinney, stephen sefick

#
Okay, using this notation, this appears to be the original
model you queried about.  So for this model, as I showed
before,

Let Z = log(Q) - log(A)

E(Z) = b0     + b2*log(R) + b3*log(S)
     = log(K) + b2*log(R) + b3*log(S)

Fitting the model  lm(Z ~ log(R) + log(S))
will yield parameter estimates b_hat_0, b_hat_2, b_hat_3
where
b_hat_0 (the fitted model intercept) is an estimate of b0 (which is log(K)),
b_hat_2 is an estimate of b2,
b_hat_3 is an estimate of b3.

So in answer to your previous question, b0 is
log(K) itself, not ( log(Qintercept)+log(K) ),
so an estimate for K is exp(b_hat_0).
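This recipe is easy to check on simulated data.  The sketch below is
not from the original thread; the parameter values (K = 2.5, b2, b3)
and variable names are made up for illustration:

```r
# Simulate from Q = K * A * R^b2 * S^b3 with multiplicative noise,
# so that Z = log(Q) - log(A) = log(K) + b2*log(R) + b3*log(S) + error
set.seed(42)
n  <- 200
K  <- 2.5; b2 <- 0.7; b3 <- -0.3
A  <- runif(n, 1, 10)
R  <- runif(n, 1, 10)
S  <- runif(n, 1, 10)
Q  <- K * A * R^b2 * S^b3 * exp(rnorm(n, sd = 0.05))

Z   <- log(Q) - log(A)
fit <- lm(Z ~ log(R) + log(S))

coef(fit)          # b_hat_0 near log(K), b_hat_2 near b2, b_hat_3 near b3
exp(coef(fit)[1])  # estimate of K, should be close to 2.5
```

With a small noise standard deviation the recovered coefficients land
very close to the values used in the simulation.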
The Dingman model notation is ambiguous.  Is the last
term  S^(b3*log(S))  or  (S^b3)*log(S) ?

Previous email showed

   > dingman
   > log(Q)=log(b0)+log(K)+a*log(A)+r*log(R)+s*(log(S))^2

which implies (if I ignore the log(b0) term)
  Q = K*(A^a)*(R^r)*(exp(log(S)*log(S))^s)
    = K*(A^a)*(R^r)*(S^(log(S)*s))

This is linearizable as

log(Q) = log(K) + a*log(A) + r*log(R) + s*(log(S))^2
       = b0     + b1*log(A) + b2*log(R) + b3*(log(S)^2)

Fitting lm(log(Q) ~ log(A) + log(R) + I(log(S)^2) ... )
will yield estimates b_hat_0, b_hat_1, b_hat_2 and b_hat_3
where b_hat_0 is an estimate of b0 = log(K) so an estimate of K is exp(b_hat_0),
b_hat_1 is an estimate of b1 = a,
b_hat_2 is an estimate of b2 = r,
b_hat_3 is an estimate of b3 = s
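The I(log(S)^2) term can also be verified by simulation.  Again, this
sketch is not from the original thread, and the parameter values are
invented:

```r
# Simulate from Q = K * A^a * R^r * S^(s*log(S)), i.e. the reading in
# which the last term of the linearized model is s*(log(S))^2
set.seed(1)
n <- 500
K <- 3; a <- 0.8; r <- 0.5; s <- -0.2
A <- runif(n, 1, 20); R <- runif(n, 1, 20); S <- runif(n, 1, 20)
Q <- K * A^a * R^r * S^(s * log(S)) * exp(rnorm(n, sd = 0.05))

# I() protects the arithmetic ^ inside the model formula
fit <- lm(log(Q) ~ log(A) + log(R) + I(log(S)^2))

coef(fit)          # estimates of log(K), a, r, s
exp(coef(fit)[1])  # estimate of K, should be close to 3
```

Note the I() wrapper is required: inside a formula, ^ would otherwise
be interpreted as the interaction operator rather than squaring.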

Under the alternative reading, Q = K*(A^a)*(R^r)*(S^s),
fitting lm(log(Q) ~ log(A) + log(R) + log(S) ... )
will yield estimates b_hat_0, b_hat_1, b_hat_2 and b_hat_3
where b_hat_0 is an estimate of b0 = log(K) so an estimate of K is exp(b_hat_0),
b_hat_1 is an estimate of b1 = a,
b_hat_2 is an estimate of b2 = r,
b_hat_3 is an estimate of b3 = s, here the plain exponent of S.
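And the same kind of simulated check for the plain power-law fit
(parameter values invented, not from the thread):

```r
# Simulate from the plain power-law reading: Q = K * A^a * R^r * S^s
set.seed(7)
n <- 300
K <- 1.8; a <- 1.2; r <- 0.4; s <- 0.9
A <- runif(n, 1, 15); R <- runif(n, 1, 15); S <- runif(n, 1, 15)
Q <- K * A^a * R^r * S^s * exp(rnorm(n, sd = 0.05))

fit <- lm(log(Q) ~ log(A) + log(R) + log(S))

exp(coef(fit)[1])  # estimate of K, should be close to 1.8
coef(fit)[4]       # estimate of s, should be close to 0.9
```

Simulating under each reading and fitting the matching formula is also
a practical way to confirm which parameterization you actually have.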


Best

Steve McKinney
#
Thank you very much for all of your help.
On Mon, Apr 4, 2011 at 6:10 PM, Steven McKinney <smckinney at bccrc.ca> wrote: