
Problem in SVM model generation

On Oct 15, 2015, at 4:51 AM, Bhawana Sahu wrote:

The size of _contiguous_ memory is what matters, and that will depend on the efficiency of your OS's memory management (with Windows being accused of inferiority here in the past), as well as on how many other programs you have loaded and the degree of memory fragmentation. I'm not sure you can resize memory once you're already deep into a session; I thought it needed to be done early in the startup process, but I'm not currently using Windows, so I'm only reporting what I've read on R-help:

?'Memory-limits'
?Startup 
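As a sketch of what those help pages cover: on Windows builds of R at that time, memory.size() and memory.limit() reported and (sometimes) adjusted the limit. These calls are Windows-only, are environment-dependent, and were removed from later R releases, so treat this as illustrative rather than something to run everywhere:

```r
## Windows-only; on other platforms these return Inf with a warning,
## and in recent R versions (>= 4.2.0) they are defunct.
memory.size()            # memory currently in use by R, in MB
memory.size(max = TRUE)  # maximum memory obtained from the OS so far
memory.limit()           # current limit, in MB
## Raising the limit mid-session may fail; setting it at startup
## (e.g. with the --max-mem-size command-line flag) is more reliable.
```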

Delete any (possibly invisible) .RData file, then restart with a clean session of both your OS and R. You can list the sizes of the objects currently in the R workspace with this function, which I think I copied from one of Dirk Eddelbuettel's or Bill Dunlap's posts:


getsizes <- function(num = 10)  # increase 'num' to list more objects
{
    # size of every object in the global environment, in bytes
    z <- sapply(ls(envir = globalenv()), function(x) object.size(get(x)))
    # one-column matrix, largest objects first (capped at what exists)
    as.matrix(rev(sort(z))[seq_len(min(num, length(z)))])
}


getsizes()
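If you prefer human-readable sizes, note that the values returned by object.size() have a format() method with a units argument; this is plain base R, independent of the snippet above:

```r
# ~8 MB of doubles; "auto" picks a sensible unit (Kb, Mb, ...)
x <- numeric(1e6)
format(object.size(x), units = "auto")
```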
Packages are not usually the culprit; it's usually program bloat by the user. I am usually guilty of having too many images and web pages open at the same time.
David Winsemius
Alameda, CA, USA