On Thu, 21 Oct 1999, Manuel wrote:
I hope that someone has had a similar problem and will be able to help us. We have installed the R package on a Digital workstation with 500 Mb of RAM, running under Unix. The package works fine, but when we try to start the program with more than 120 Mb (--vsize 120M) the workstation refuses to allocate this memory. The message that we get is:

Fatal error: Could not allocate memory for vector heap

Someone told us that the solution was an appropriate ulimit call, but when we do `ulimit -a` we get only the number 1048576. We figure that this number may be the data segment size. When we do

ulimit -d unlimited
ulimit -s unlimited
ulimit -m unlimited
ulimit -v unlimited

we get the following message: "Requested ulimit exceeds hard limit". We think this means that we have no limit on the amount of memory that can be allocated. We have installed the same version of the program under Linux (Red Hat 6.0) and were also unable to allocate more than 120 Mb. I would be very grateful if someone could give us some new advice to solve the problem.

Note: I don't know whether the R package is able to allocate more than 120 Mb. We need about 250 Mb of memory because we are currently dealing with high-dimensional problems.
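A sketch of how to diagnose this from a bash/ksh shell (the single number the poster saw is consistent with an old Bourne shell, where a bare `ulimit` reports only the file-size limit). Note that a soft limit can only be raised as far as the corresponding hard limit; raising the hard limit itself requires root, which is one reading of the "Requested ulimit exceeds hard limit" message above. The 250M figure is the poster's target, not a recommendation:

```shell
# Inspect limits before starting R (flag spellings vary by shell).
ulimit -S -a    # all current soft limits
ulimit -H -d    # hard limit on the data segment (kbytes, or "unlimited")
# If the hard limit allows it, raise the soft limit and start R:
#   ulimit -S -d unlimited
#   R --vsize 250M
```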
I have no problem allocating --vsize 250M using R 0.65.1 on either Debian
GNU/Linux or Solaris 2.7. In fact, I can allocate --vsize 1000M under
Solaris, which is substantially larger than physical memory:
wompom% ~/Rarchive/R --vsize 1000M

then

R> gc()
             free     total
Ncells     128269    250000
Vcells  131024950 131072000
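The totals are internally consistent, which suggests each Vcell here corresponds to 8 bytes (an assumption read off the numbers, not stated in the thread):

```shell
# --vsize 1000M is 1000 * 1024 * 1024 bytes; dividing by 8 bytes per
# Vcell reproduces the "total" figure in the gc() output above.
vsize_bytes=$((1000 * 1024 * 1024))
vcells=$((vsize_bytes / 8))
echo "$vcells"   # prints 131072000
```

By the same arithmetic, the requested 250 Mb would need 32768000 Vcells.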
Thomas Lumley
Assistant Professor, Biostatistics
University of Washington, Seattle
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._