ERROR : cannot allocate vector of size (in MB & GB)
Sure, get more RAM: 2GB is a tiny amount if you need to load 1GB files into R, and, as you've discovered, it won't work. You can try a few simpler things first, like making sure there's nothing loaded into R except what you absolutely need. It also looks like there's no reason to read the entire file into R at once for what you want to do, so you could load a chunk, process that, then move on to the next one. Sarah
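For instance, here is a minimal sketch of the chunk-at-a-time idea, reading from an open connection so that only ChunkSize rows are ever in memory at once (the function name, chunk size, and the processing step are placeholders to adapt):
-------------------------------
ProcessCSVInChunks <- function(DataMatrix, ChunkSize)
{
con <- file(DataMatrix, open = "r")
on.exit(close(con))
# read the header line once so every chunk keeps the column names
header <- strsplit(readLines(con, n = 1), ",")[[1]]
repeat
{
lines <- readLines(con, n = ChunkSize)
if (length(lines) == 0) break
tc <- textConnection(lines)
chunk <- read.csv(tc, header = FALSE, col.names = header)
close(tc)
# ... process 'chunk' here ...
}
}
-------------------------------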
On Tue, Jul 24, 2012 at 9:45 AM, Rantony <antony.akkara at ge.com> wrote:
Hi,
Here in R, I need to load a huge file (.csv); its size is 200MB [it may
sometimes be more than 1GB].
When I tried to load it into a variable it took too much time, and after
that, when I do cbind by groups,
I get an error like this:
" Error: cannot allocate vector of size 82.4 Mb "
My requirement is to split the data from the huge .csv file into a number
of small csv files.
Here I will give the number of lines to 'split by' as input.
I give my code below.
-------------------------------
SplitLargeCSVToMany <- function(DataMatrix, Destination, NoOfLineToGroup)
{
test <- read.csv(DataMatrix)  # read.csv already returns a data.frame
# assign each row to a group of NoOfLineToGroup consecutive rows;
# note rep(1:NROW(test), each=NoOfLineToGroup) would build a vector
# NoOfLineToGroup times too long, which breaks the cbind
group <- rep(seq_len(ceiling(nrow(test) / NoOfLineToGroup)),
             each = NoOfLineToGroup, length.out = nrow(test))
# now get indices to write out
indices <- split(seq_len(nrow(test)), group)
# now write out the files
for (i in names(indices))
{
write.csv(test[indices[[i]], ],
          file = paste(Destination, "data.", i, ".csv", sep = ""),
          row.names = FALSE)
}
}
-----------------------------------------------------
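Along the same lines, a sketch that streams the split directly with readLines(), so the whole file never has to fit in memory at once (the function name and output naming scheme are illustrative):
-------------------------------
SplitLargeCSVStreaming <- function(DataMatrix, Destination, NoOfLineToGroup)
{
con <- file(DataMatrix, open = "r")
on.exit(close(con))
# keep the header so each output file is a valid CSV on its own
header <- readLines(con, n = 1)
i <- 1
repeat
{
lines <- readLines(con, n = NoOfLineToGroup)
if (length(lines) == 0) break
writeLines(c(header, lines),
           paste(Destination, "data.", i, ".csv", sep = ""))
i <- i + 1
}
}
-------------------------------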
My system configuration is:
Intel Core 2 Duo
Speed: 3 GHz
RAM: 2 GB
OS: Windows XP [Service Pack 3]
---------------------------------------------------
Is there any hope of solving this issue?
Thanks in advance,
Antony.
--
Sarah Goslee http://www.functionaldiversity.org