
Memory problem on a linux cluster using a large data set [Broadcast]

Section 8 of the R Installation and Administration guide says that on
64-bit architectures the 'size of a block of memory allocated is
limited to 2^34-1 (16 GB) bytes'.
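
If you want to check the details yourself, both documents can be
consulted from within R:

    RShowDoc("R-admin")       # opens the Installation and Administration manual
    help("Memory-limits")     # documents the per-object allocation limits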

The wording 'a block of memory' is important here, because this sets a
limit on a single allocation rather than on the total memory consumed
by an R session. The original poster's allocation was something like
300,000 SNPs x 1000 individuals x 8 bytes (depending on
representation, I guess) = about 2.4 GB, so there is still some room
for even larger data.
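
To make the arithmetic explicit, here is the same back-of-the-envelope
calculation in R (the dimensions are from the original post; 8 bytes
per entry assumes the genotypes are stored as doubles):

    n.snps <- 300000
    n.ind  <- 1000
    n.snps * n.ind * 8            # 2.4e9 bytes requested as a single block
    n.snps * n.ind * 8 / 2^30     # about 2.2 GiB, well under the 16 GB limit

A quick sanity check on a smaller object, e.g.
object.size(matrix(0, 1000, 1000)) (about 8 MB), scales linearly to
the same figure.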

Obviously it's important to think carefully about how the statistical
analysis of such a large volume of data will proceed, and how the
results will be interpreted.

Martin

Thomas Lumley <tlumley at u.washington.edu> writes:
