"Fastest" way to merge 300+ .5MB dataframes?
My sincere apologies. Having read http://www.r-project.org/posting-guide.html , I had meant to post this one to R-help.

Grant Rettke | ACM, ASA, FSF, IEEE, SIAM
gcr at wisdomandwonder.com | http://www.wisdomandwonder.com/
"Wisdom begins in wonder." --Socrates
((λ (x) (x x)) (λ (x) (x x)))
"Life has become immeasurably better since I have been forced to stop taking it seriously." --Thompson
On Sun, Aug 10, 2014 at 1:28 PM, Joshua Ulrich <josh.m.ulrich at gmail.com> wrote:
The same comment Jeroen Ooms made about your last email also applies to this one: it is better suited to R-help.

--
Joshua Ulrich | about.me/joshuaulrich
FOSS Trading | www.fosstrading.com

On Sun, Aug 10, 2014 at 1:18 PM, Grant Rettke <gcr at wisdomandwonder.com> wrote:
Good afternoon,

Today I was working on a practice problem. It was simple, and perhaps even realistic. It looked like this:

- Get a list of all the data files in a directory
- Load each file into a dataframe
- Merge them into a single dataframe

Because all of the columns were the same, the simplest solution in my mind was to `Reduce' the list of dataframes with a call to `merge' (sketched below). That worked fine; I got what was expected. That is key, actually: it is literally a one-liner, and there will never be index or scoping errors with it.

Now, with that in mind, what is the idiomatic way? Do people usually do something else because it is /faster/ (by some definition)?

Kind regards,

Grant Rettke | ACM, ASA, FSF, IEEE, SIAM
gcr at wisdomandwonder.com | http://www.wisdomandwonder.com/
"Wisdom begins in wonder." --Socrates
((λ (x) (x x)) (λ (x) (x x)))
"Life has become immeasurably better since I have been forced to stop taking it seriously." --Thompson
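A minimal sketch of the pipeline described above, assuming the files are CSVs in a directory named "data" (both the format and the path are assumptions; adjust to the real layout):

    # List all the data files in a directory (assumed here: CSVs under "data").
    files <- list.files("data", pattern = "\\.csv$", full.names = TRUE)

    # Load each file into a dataframe.
    dfs <- lapply(files, read.csv)

    # Merge them into a single dataframe -- the one-liner from the post.
    combined <- Reduce(merge, dfs)

One caveat worth noting: with no `by' argument, `merge' joins on all shared column names, so over identically-shaped dataframes it behaves like an intersection of rows rather than a stack. If the goal is simply to stack all the rows, the usual faster idiom is row-binding:

    # Stack the rows of all dataframes; typically much faster than
    # repeated merges when every file has the same columns.
    combined <- do.call(rbind, dfs)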