
Cook's distance for least absolute deviation (LAD) regressions

Dear Kelly and Jim,
On 2022-03-20 9:40 p.m., Jim Lemon wrote:
cookd() in the car package has been defunct for some time.

To address the original question: One can compute Cook's distances for 
*any* regression model by brute force, omitting each case i in turn and 
computing the Wald F or chi-square test statistic for the "hypothesis" 
that the deleted estimate of the regression coefficients b_{-i} is equal 
to the estimate b for all of the data. In a linear model, D can be 
computed much more efficiently from the hatvalues, etc., without 
having to refit the model n times, but that's not generally the case 
unless the model can be linearized (as for a GLM fit by IWLS).
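To illustrate the linear-model shortcut: for a model fit by lm(), the Cook's distances follow directly from the residuals and hatvalues with no refitting, via D_i = e_i^2 h_i / (p s^2 (1 - h_i)^2). A minimal sketch (the mtcars model is purely illustrative):

```r
## Cook's distances for a linear model computed from the hat values alone,
## checked against the built-in cooks.distance().
m  <- lm(mpg ~ wt + hp, data = mtcars)  # illustrative model
e  <- residuals(m)                      # ordinary residuals
h  <- hatvalues(m)                      # leverages
p  <- length(coef(m))                   # number of coefficients
s2 <- summary(m)$sigma^2                # full-data error variance estimate
D  <- e^2 * h / (p * s2 * (1 - h)^2)
all.equal(D, cooks.distance(m))  # TRUE
```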

I'm insufficiently familiar with the computational details of LAD 
regression (or quantile regression more generally) to know whether a 
more efficient computation is possible there. But unless the data set is 
very large (in which case the influence of individual cases is highly 
unlikely to be an issue), the brute-force approach should be feasible 
and very easy to program.
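The brute-force loop is only a few lines for any model with coef(), vcov(), and update() methods. A sketch (bruteCooks is just an illustrative name; for LAD one would substitute, e.g., a quantreg::rq() fit for the lm() used here, which serves only so the result can be checked against cooks.distance()):

```r
## Generalized Cook's distances by brute force: refit the model with each
## case omitted and compute the Wald statistic comparing b_{-i} to b.
bruteCooks <- function(model, data) {
  b    <- coef(model)            # full-data coefficient estimates
  p    <- length(b)
  Vinv <- solve(vcov(model))     # inverse of full-data coefficient covariance
  n    <- nrow(data)
  D    <- numeric(n)
  for (i in seq_len(n)) {
    bi   <- coef(update(model, data = data[-i, ]))  # delete case i and refit
    d    <- bi - b
    D[i] <- as.vector(t(d) %*% Vinv %*% d) / p      # Wald "F" statistic
  }
  D
}

m <- lm(mpg ~ wt + hp, data = mtcars)  # lm() used only to permit the check
all.equal(bruteCooks(m, mtcars), unname(cooks.distance(m)))  # TRUE
```

For a linear model this reproduces the standard Cook's distances exactly; the same loop applies unchanged to any fit that supplies coef(), vcov(), and update().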

I hope this helps,
  John
Message-ID: <ae75f4b5-2d8f-8e24-b263-ebfd147b358f@mcmaster.ca>
In-Reply-To: <4944_1647826886_22L1fQCt025768_CA+8X3fXzZj2HUrQdvsmE0CDiVBejXGCF-pxEOrGVdkfDPhfU3g@mail.gmail.com>