On Jun 30, 2008, at 3:07 PM, Antonio P. Ramos wrote:
Thanks for the comments. What I'm doing is very simple: I'm fitting a one-dimensional item response model, similar to the ones used in psychology and educational testing, via Markov chain Monte Carlo methods:

model_m12 <- ideal(rollcall_m2, maxiter = 500000000, thin = 1000,
                   burnin = 5000, store.item = TRUE, normalize = TRUE,
                   priors = list(xp = 1e-12, xpv = 1e-12,
                                 bp = 1e-12, bpv = 1e-12),
                   verbose = TRUE)

# My data matrix is provided by the rollcall object, but it is only 155 x 17: rollcall_m2
# The number of stored iterations is maxiter/thin = 500,000.
# store.item = TRUE is the main source of the problem: it stores the
# discrimination parameters, which consume a large amount of memory.
# Unfortunately, I need this information.
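(A rough back-of-the-envelope sketch of what the stored draws alone cost. This assumes, hypothetically, that each kept iteration stores the 155 ideal points plus two item parameters per roll call, all as 8-byte doubles; the actual layout in ideal() may differ, and real allocations will be larger once R makes copies:)

```r
# Back-of-the-envelope estimate of storage for the kept MCMC draws.
# Assumed model size: 155 legislators, 17 roll calls, 1 dimension.
n <- 155; m <- 17; d <- 1
kept   <- 500000000 / 1000          # maxiter / thin = 500,000 stored draws
params <- n * d + m * (d + 1)       # 155 ideal points + 34 item parameters
gb     <- kept * params * 8 / 2^30  # 8-byte doubles, converted to GiB
round(gb, 2)                        # roughly 0.7 GB for the raw draws alone
```

Even under these conservative assumptions the stored draws run to hundreds of megabytes, before counting working copies made during collection.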
Looking at the code, ideal seems to do most of its work in C, so I'm assuming it's not wasting memory. The stored draws are not strictly needed at simulation time, so you should get away with the 64-bit version of R (see http://r.research.att.com/ ), which will swap out the storage as you go (the final collection of the result matrix will probably be very slow, since it will have to activate all that swapped memory, but given that it's a selective copy operation it should still be doable). It's kind of beyond me why you need 500m iterations, but that's another issue ;). Cheers, Simon
So, if R can access up to 3.5 GB, how can I fix the problem? I'm sure lots of Mac users will also be interested in increasing R's memory allocation capabilities. Cheers, Antonio. On Mon, Jun 30, 2008 at 11:38 AM, Kasper Daniel Hansen <khansen at stat.berkeley.edu> wrote:
Thanks for the clarification. How did you get that output? Kasper On Jun 30, 2008, at 10:23 AM, Simon Urbanek wrote:
On Jun 30, 2008, at 1:04 PM, Kasper Daniel Hansen wrote:
Like Sean is saying, you are most likely using _way_ more memory than 1.2 GB. However, if you are running 32-bit R (which is the case if you use the CRAN binary), R can only access 2 GB,
That's not true; a 32-bit process can use up to about 3.5 GB of RAM:

Virtual Memory Map of process 2849 (R)
Output report format:  2.2  -- 32-bit process
[...]
ReadOnly portion of Libraries: Total=72.9M resident=36.6M(50%) swapped_out_or_unallocated=36.3M(50%)
Writable regions: Total=3.4G written=3.4G(100%) resident=3.4G(99%) swapped_out=3352K(0%) unallocated=19.3M(1%)

so it should make no real difference for Antonio (unless he doesn't mind waiting while the machine swaps). Nonetheless, using 64-bit R is fine as well, especially on Leopard, albeit that doesn't fix incorrect use of memory by users :). Cheers, S
so you can squeeze a little more out of your machine by switching to a 64-bit version of R. You can check which version you have by typing .Machine at the R prompt and looking for sizeof.pointer: if it is 4 you are using 32-bit, if it is 8 you are using 64-bit. If you want the 64-bit version of R you can download a binary from Simon's page, r.research.att.com , but you also need to get the preview build of GCC 4.2, which is available from Apple's developer site (although hard to find these days). Kasper On Jun 30, 2008, at 3:23 AM, Sean Davis wrote:
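(The check described above can be run directly at the R prompt:)

```r
# Pointer size of this R build: 4 means 32-bit, 8 means 64-bit.
.Machine$sizeof.pointer
```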
On Sun, Jun 29, 2008 at 6:35 AM, Antonio P. Ramos <ramos.grad.student at gmail.com> wrote:
Hi everybody, I have a memory allocation problem while using R on my MacBook Pro, which runs the latest Leopard. I'm trying to run a Monte Carlo simulation with 500,000 iterations, but the machine fails:

Starting MCMC Iterations...
Error: cannot allocate vector of size 1.2 Gb
R(176,0xa0640fa0) malloc: *** mmap(size=1239990272) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
R(176,0xa0640fa0) malloc: *** mmap(size=1239990272) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

Since my machine has 4 GB of memory, and since I'm not running anything in addition to the simulation, I find this strange. This is my machine:

Model Identifier: MacBookPro3,1
Processor Name: Intel Core 2 Duo
Processor Speed: 2.4 GHz
Memory: 4 GB

Unfortunately, I couldn't figure out how to solve it. Any help?
The error message above means that R failed to allocate a vector of size 1.2 GB. That doesn't mean R was using only 1.2 GB; it was trying to allocate a new block of memory of that size in addition to the memory already in use. The system on the Mac uses a fair amount of memory, and R was probably using memory as well. In short, you either need more memory or need to be more clever about how you use the memory you have. Without more details about what you are doing, it is difficult to know how to do the latter. Sean
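(One way to see how much memory a session is already holding before it attempts a big allocation, using only base R; the vector below is just an illustrative example:)

```r
# gc() reports current and peak memory use for the R session:
gc()
# object.size() prices a vector before you commit to storing many of them;
# a million doubles is roughly 7.6 MB:
print(object.size(numeric(1e6)), units = "MB")
```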
_______________________________________________ R-SIG-Mac mailing list R-SIG-Mac at stat.math.ethz.ch https://stat.ethz.ch/mailman/listinfo/r-sig-mac