
Message-ID: <801C3C37-7C6F-46A6-B8E7-B09C619595FF@comcast.net>
Date: 2009-01-15T16:53:30Z
From: David Winsemius
Subject: interpolation to abscissa
In-Reply-To: <298a6f60901150831k28609906q3ec2060963e95384@mail.gmail.com>

On Jan 15, 2009, at 11:31 AM, e-letter wrote:

>> Perhaps a coding error on my part (or on your part). Perhaps  
>> different
>> methods (none of which you describe)?
>
>>
>> I suspect that my method only used the first two points (I just
>> checked by plotting, and -2.7 is closer to the paper-and-pen result I
>> get than is -3.28). Perhaps you made an extrapolation from a linear
>> fit of a dataset that is not co-linear?
>>
>>> lm(c(0,5) ~ c(16,45))
>>
>> Call:
>> lm(formula = c(0, 5) ~ c(16, 45))
>>
>> Coefficients:
>> (Intercept)    c(16, 45)
>>    -2.7586       0.1724
>>
>> It's not that "R is different"; it is merely that I used it
>> differently than you used your other tools.
>>
>> Here's another method ( using all points and again reversing the  
>> roles
>> of x and y) :
>>> lm(c(0,5,10,15,20) ~ c(16,45,77,101,125))
>>
>> Call:
>> lm(formula = c(0, 5, 10, 15, 20) ~ c(16, 45, 77, 101, 125))
>>
>> Coefficients:
>>            (Intercept)  c(16, 45, 77, 101, 125)
>>                -3.2332                   0.1818
> My understanding from the gnuplot manual is that a Marquardt-Levenberg
> algorithm is used, which I applied to the data to perform a
> least-squares best-fit linear curve. Gnuplot returns values for the
> intercept and gradient, which I then apply to solve the linear
> equation y=mx+c. Similarly with Scilab, where the regress(ion?)
> function was applied. QtiPlot performed non-weighted linear regression
> to output values similar to those from gnuplot.
>
> Why reverse the roles of x and y in your method?

I accidentally switched x and y and then realized I could get an
intercept value without the labor of solving by hand.
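To sketch that with the first two points from this thread: regressing
the 0/5 values on the 16/45 values makes the fitted intercept itself
the value at abscissa zero, so no back-solving is needed.

```r
# Two-point fit with the roles reversed (0/5 values as the response);
# the intercept is the fitted value at abscissa 0, read off directly.
fit2 <- lm(c(0, 5) ~ c(16, 45))
coef(fit2)[[1]]   # about -2.7586
```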

> Although your revised value is closer to those from other programs,  
> how do I understand and
> explain the discrepancy?

The regression line for x ~ y is *not* the same as the regression line  
for y ~ x.
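A quick sketch with the five data points quoted above shows the two
fits are not algebraic inverses of one another:

```r
# Fit both directions on the same five points from this thread.
x <- c(16, 45, 77, 101, 125)
y <- c(0, 5, 10, 15, 20)
coef(lm(y ~ x))   # intercept about -3.233, slope about 0.1818
coef(lm(x ~ y))   # intercept 18.00, slope 5.48
# Inverting the second fit gives slope 1/5.48, about 0.1825, which
# differs from 0.1818 because the data are not exactly co-linear.
```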

If you want to check that the numbers from R agree with your other  
solutions, then take the regression equation from
 > lmmod

Call:
lm(formula = c(16, 45, 77, 101, 125) ~ c(0, 5, 10, 15, 20))

Coefficients:
        (Intercept)  c(0, 5, 10, 15, 20)
              18.00                 5.48

... and then solve for x = 0, as you apparently did with the other
systems.
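That check can be sketched directly in R: take the coefficients of the
fit above and solve 0 = intercept + slope * y for y.

```r
# Solve 0 = a + b*y for y using the coefficients of the x ~ y fit.
ab <- coef(lm(c(16, 45, 77, 101, 125) ~ c(0, 5, 10, 15, 20)))
-ab[[1]] / ab[[2]]   # about -3.2847, close to the other programs' value
```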

-- 
David Winsemius