Value Lookup from File without Slurping
Only the portion you extract is ever in R -- the file itself is read into a database without ever going through R, so your memory requirements correspond to what you extract, not to the size of the file.
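As a minimal sketch (the file name big.csv and the column id are made up for illustration), sqldf's read.csv.sql reads the whole file into a temporary SQLite database outside of R and returns only the rows the query selects:

    library(sqldf)
    # big.csv goes into a temporary SQLite database, not into R;
    # only rows matching the where clause come back as a data frame
    # (in read.csv.sql the input file is referred to as "file")
    hits <- read.csv.sql("big.csv",
        sql = "select * from file where id = '12345'")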
On Fri, Jan 16, 2009 at 10:49 AM, Gundala Viswanath <gundalav at gmail.com> wrote:
Hi Gabor,

Do you mean that storing data via "sqldf" doesn't take memory? For example, I have a 3GB data file. With a standard R object created by read.table(), the object size roughly doubles to ~6GB, and my current 4GB of RAM cannot handle that. Do you mean that with "sqldf" this is not an issue? Why is that? Sorry for my naive question.

- Gundala Viswanath
Jakarta - Indonesia

On Fri, Jan 16, 2009 at 9:09 PM, Gabor Grothendieck <ggrothendieck at gmail.com> wrote:
On Fri, Jan 16, 2009 at 5:52 AM, r at quantide.com <r at quantide.com> wrote:
I agree on the database solution. Databases are the right tool for this kind of problem. Just consider the start-up cost of setting up the database: it can be a very time-consuming task for someone who is not familiar with database technology.
Using sqldf, as mentioned previously in this thread, lets one use
the SQLite database with no setup at all. sqldf automatically creates
the database, generates the record layout, loads the file (outside of
R, so R does not slow it down), extracts the portion you want into R
by issuing the appropriate calls to RSQLite/DBI, and destroys the
database afterwards, all automatically. When you install sqldf it
automatically installs RSQLite and the SQLite database itself, so the
entire installation is just one line: install.packages("sqldf")
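To make that concrete, here is a rough sketch of the whole workflow using the file-connection idiom from the sqldf FAQ; big.csv and its columns id and score are hypothetical stand-ins for your own file:

    install.packages("sqldf")  # also pulls in RSQLite and SQLite itself
    library(sqldf)
    big <- file("big.csv")     # just a connection; nothing is read into R yet
    # sqldf loads big.csv into a temporary SQLite database, runs the
    # query there, returns only the result, then drops the database
    dat <- sqldf("select id, score from big where score > 0",
                 file.format = list(header = TRUE, sep = ","))

The point is that R only ever holds dat, the query result, so a 3GB file does not need 3GB (let alone 6GB) of RAM unless you select everything.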