
Assigning NULL to large variables is much faster than rm() - any reason why I should still use rm()?

On Sat, May 25, 2013 at 4:38 PM, Simon Urbanek
<simon.urbanek at r-project.org> wrote:
Thanks for this one.  This is useful - I did try to follow where
.Internal(remove, ...) leads, but got lost in the internal structures.

Of course, I'd love to see such a function in 'base' itself.  Having
such a well-defined and narrow function for removing a variable in the
current environment may also be useful for 'codetools'/'R CMD check',
so that code inspection can detect uses of variables that were defined
but have since been removed.  Technically rm() allows for that too,
but I can see how such a task quickly gets complicated when the
arguments 'list', 'envir' and 'inherits' are involved.
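
To illustrate what such a narrow function might look like - this is a
hypothetical sketch, not the function Simon posted - a single-variable
remover restricted to the calling environment could be as simple as:

```r
## Hypothetical narrow wrapper (not in 'base'): removes exactly one
## named variable from the caller's environment, never searching
## enclosing frames.  The fixed scope is what would make static
## analysis by 'codetools' tractable.
removeVar <- function(name) {
  rm(list = name, envir = parent.frame(), inherits = FALSE)
}

x <- 1:10
removeVar("x")
exists("x", inherits = FALSE)  # FALSE
```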
I didn't mention it, but another reason I use rm() a lot is actually
so R can catch my programming mistakes (I'm maintaining 100,000+ lines
of code), i.e. the opposite of being error prone.  For instance, by
doing rm(tmp) as soon as possible, R will give me the run-time error
"Error: object 'tmp' not found" in case I use it by mistake later on.
As said above, codetools/'R CMD check' could potentially detect this
already at check time.  With tmp <- NULL I'll lose a bit of this
protection, although another run-time error is likely to occur a bit
later.
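
A small sketch of the difference (object sizes chosen arbitrarily for
illustration):

```r
tmp <- numeric(1e8)   # large temporary object
sum(tmp)              # use it
rm(tmp)               # release memory *and* guard against reuse
sum(tmp)              # Error: object 'tmp' not found -- caught at once

tmp <- numeric(1e8)
tmp <- NULL           # faster to "clear", but reuse fails quietly:
sum(tmp)              # returns 0 -- the mistake may go unnoticed
```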

Using local() or local functions is obviously an alternative to the above.
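
For example, with local() the temporary never leaks into the calling
environment in the first place, so neither rm() nor a NULL assignment
is needed:

```r
y <- local({
  tmp <- numeric(1e8)  # exists only inside this local scope
  sum(tmp)             # the result is all that escapes
})
exists("tmp", inherits = FALSE)  # FALSE -- 'tmp' was never created here
```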

Thanks both (and sorry about the game - though it was an entertaining one)

/Henrik