Large data... yes, though how this can be done may vary. For scale: a
20-million-row table with 100 double-precision columns is roughly 16 GB in
memory (20e6 x 100 x 8 bytes), so it can fit entirely in RAM on a large
machine. I have used machines with 128 GB of RAM before with no special
big-data packages.
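As a rough sketch of what that looks like in practice (the file name, response
column, and tuning parameters below are illustrative assumptions, not a
recipe), data.table reads multi-gigabyte CSVs quickly and ranger fits random
forests in parallel on in-memory data:

```r
## Hypothetical sketch: random forest on a large in-memory table.
## Assumes a file "big.csv" with a numeric response column "y".
library(data.table)   # fast fread() and memory-efficient tables
library(ranger)       # fast random forest implementation

dt  <- fread("big.csv")              # reads multi-GB CSVs quickly
fit <- ranger(y ~ ., data = dt,
              num.trees   = 100,     # fewer trees keeps memory in check
              num.threads = 8)       # grow trees in parallel
print(fit$prediction.error)          # out-of-bag error estimate
```

If the data do not fit in RAM, packages like arrow or bigmemory let you work
on disk-backed data instead, at some cost in convenience.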
Making an executable... theoretically, yes, though there are significant
technical (and possibly legal) challenges that, should you try, will most
likely make you question whether it was worth the effort, particularly if
your intent is to obscure your code from the recipient. I (as a random user
and programmer on the Internet) would strongly discourage such efforts... it
will almost certainly be more practical to deliver your code as a script or
package.
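For example, one portable pattern is to ship a saved model plus a small
Rscript wrapper that the recipient runs with any R installation (all names
here are illustrative; it assumes a model whose predict() returns a vector,
such as an lm fit saved earlier with saveRDS(fit, "model.rds")):

```r
#!/usr/bin/env Rscript
## Illustrative scoring script, run as:  Rscript score.R input.csv output.csv
args    <- commandArgs(trailingOnly = TRUE)
model   <- readRDS("model.rds")          # model saved with saveRDS()
newdata <- read.csv(args[1])             # data to score
pred    <- predict(model, newdata)       # assumes predict() returns a vector
write.csv(data.frame(prediction = pred), args[2], row.names = FALSE)
```

This works the same on any OS where R is installed, which is usually what the
"runs anywhere" requirement actually needs.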
On May 6, 2020 2:20:47 PM PDT, Paul Bernal <paulbernal07 at gmail.com> wrote:
Dear R friends,
Hope you are doing well. I have two questions. The first one is: can I work
with very large datasets in R? That is, say I need to test several machine
learning algorithms (random forest, multiple linear regression, etc.) on
datasets having between 50 and 100 columns and 20 million observations; is
there any way that R can handle data that large?
The second question is: is there a way I can develop an R model and turn it
into an executable program that can work on any OS?
Any help and/or guidance will be greatly appreciated,
Best regards,
Paul