Message-ID: <2fa1c8e2-c692-1237-ea3c-572397038b26@statistik.tu-dortmund.de>
Date: 2016-05-05T09:11:35Z
From: Uwe Ligges
Subject: R process killed when allocating too large matrix (Mac OS X)
In-Reply-To: <CAM3-KjbDejim0PSj9s5ognw4wfA=_BvD2UmfVDdmixqww2MkoA@mail.gmail.com>

On 05.05.2016 04:25, Marius Hofert wrote:
> Hi Simon,
>
> ... all interesting (but quite a bit above my head). I only read
> 'Linux' and want to throw in that this problem does not appear on
> Linux (it seems). I talked about this with Martin Maechler and he
> reported that the same example (on one of his machines; with NA_real_
> instead of '0's in the matrix) gave:
>
>   Error: cannot allocate vector of size 70.8 Gb
>     Timing stopped at: 144.79 41.619 202.019
>
> ... but no killer around...

Well, with n=1. ;-)

Actually, this also happens under Linux: I have had my R processes killed 
more than once (and, much worse, other processes too, so that we 
essentially had to reboot a server). That's why we now use job scheduling 
for R on our servers ...
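
[Editorial note: the back-of-the-envelope arithmetic below is not from the thread; it is a sketch of how R's reported allocation size maps to matrix dimensions. It assumes R's convention of 8 bytes per double and binary units (1 Gb = 2^30 bytes) in its error messages; the 97,500 x 97,500 figure is an illustrative square matrix, not the dimensions Martin actually used.]

```python
# Rough arithmetic behind R's "cannot allocate vector of size 70.8 Gb":
# a numeric (double) matrix costs 8 bytes per element, and R reports
# sizes in binary units (1 Gb = 2**30 bytes in its error messages).

BYTES_PER_DOUBLE = 8
GIB = 2**30

def matrix_size_gib(nrow, ncol):
    """Memory needed for an nrow x ncol double matrix, in GiB."""
    return nrow * ncol * BYTES_PER_DOUBLE / GIB

# A 70.8 Gb allocation therefore implies roughly this many doubles:
n_elements = 70.8 * GIB / BYTES_PER_DOUBLE
print(f"{n_elements:.3g} doubles")  # about 9.5e9 elements

# For scale, a square double matrix of that size would be roughly
# 97,500 x 97,500 (hypothetical dimensions, for illustration only):
side = int(n_elements ** 0.5)
print(f"~{side} x {side}")
```

Whether such a request fails cleanly with "cannot allocate vector of size ..." or gets the process killed depends on how the OS handles overcommit, which is the Linux-vs-macOS difference being discussed above.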

Best,
Uwe

>
> Cheers,
> Marius
>
> ______________________________________________
> R-devel at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>