
Reduce memory peak when serializing to raw vectors

Thanks Simon, Michael.
Looking at the design more carefully, I think we can get away with serializing directly to a socket, or to a file in /dev/shm if we want to keep things in memory.
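A minimal sketch of that idea, assuming a Linux tmpfs mount at /dev/shm (the object and path here are illustrative; tempfile() is used so the snippet runs anywhere):

```r
# Sketch: serialize straight to a file connection instead of building
# a raw vector on the R heap. On Linux, a path under /dev/shm keeps
# the file in memory (tmpfs).
obj <- list(x = 1:1000, y = letters)
path <- tempfile(fileext = ".rds")   # e.g. "/dev/shm/obj.rds" on Linux
con <- file(path, "wb")
serialize(obj, con)                  # bytes are streamed out as produced
close(con)

# Reading it back for completeness:
con <- file(path, "rb")
restored <- unserialize(con)
close(con)
stopifnot(identical(restored, obj))
unlink(path)
```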

-Jorge

From: Simon Urbanek [mailto:simon.urbanek at r-project.org] 
Sent: Tuesday, March 17, 2015 3:13 PM
To: Michael Lawrence
Cc: Martinez de Salinas, Jorge; r-devel at r-project.org
Subject: Re: [Rd] Reduce memory peak when serializing to raw vectors

In principle, yes (that's what Rserve's serialization does), but AFAIR we don't have the infrastructure in place for that. But then you may as well serialize to a connection instead. To be honest, I don't see why you would serialize anything big to a vector - you can't really do anything useful with it that you couldn't do with the streaming version.
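For contrast, a hedged sketch of the two routes mentioned above (the object and file are purely illustrative):

```r
obj <- as.list(runif(100))

# Route 1: serialize to a raw vector -- the entire payload is
# materialized in memory, on top of the object itself.
bytes <- serialize(obj, NULL)

# Route 2: serialize to a connection -- bytes leave R as they are
# produced, so no second full in-memory copy of the payload is built.
path <- tempfile(fileext = ".rds.gz")
con <- gzfile(path, "wb")
serialize(obj, con)
close(con)
unlink(path)
```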

Sent from my iPhone
On Mar 17, 2015, at 17:48, Michael Lawrence <lawrence.michael at gene.com> wrote:
Presumably one could stream over the data twice: the first pass to get the size, without storing the data. Slower but more memory-efficient, unless I'm missing something.
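The two-pass idea could be sketched like this, using a scratch file as the first-pass sink (purely illustrative; a real implementation would count bytes without storing them):

```r
obj <- list(a = runif(5), b = "hello")

# Pass 1: stream the serialization to a sink just to learn its size.
tmp <- tempfile()
con <- file(tmp, "wb")
serialize(obj, con)
close(con)
n <- file.size(tmp)

# Pass 2: with n known, a raw buffer of exactly n bytes can be
# allocated up front (here we simply read the bytes back to show it).
buf <- readBin(tmp, what = "raw", n = n)
stopifnot(length(buf) == n, identical(unserialize(buf), obj))
unlink(tmp)
```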
Michael
On Tue, Mar 17, 2015 at 2:03 PM, Simon Urbanek <simon.urbanek at r-project.org> wrote:
Jorge,

what you propose is not possible because the size of the output is unknown in advance; that's why a dynamically growing PStream buffer is used - it cannot be pre-allocated.
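The size really isn't derivable from the object alone; it depends on the serialized representation. A small illustration (the exact byte counts are format- and version-dependent):

```r
x <- 1:10
# Same data, different representations, different serialized sizes:
n_vec  <- length(serialize(x, NULL))
n_list <- length(serialize(as.list(x), NULL))
# The list form carries per-element headers, so it serializes larger.
stopifnot(n_list > n_vec)
```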

Cheers,
Simon
______________________________________________
R-devel at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel