
about memory

6 messages · ronggui, jon butchar, Joerg van den Hoff +1 more

#
here is my system memory:
ronggui at 0[ronggui]$ free
             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

I want to cluster my data using hclust. My data has 3 variables and 10000 cases, but the clustering fails, saying there is not enough memory for the vector size. I read the help doc and used $ R --max-vsize=800M to start R 2.1.0beta under Debian Linux, but it still fails. So is my PC's memory not enough to carry out this analysis, or is my memory setting wrong?

thank you.
#
How much memory is free when R fails (e.g., what does "top" show while trying to run your clustering)?  If there's still a sizeable amount of free memory you may have to look into the system limits, maximum data segment size in particular.  Many Linux distros have it set to "unlimited" but default Debian may not.  If this turns out to be the problem, please do not, _do not_ raise it to "unlimited," but only to enough for R to work.

hth,

jon b



On Wed, 30 Mar 2005 18:36:37 +0800
ronggui <0034058 at fudan.edu.cn> wrote:

#
root at 2[ronggui]# ulimit -a
core file size        (blocks, -c) 0
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) unlimited
max memory size       (kbytes, -m) unlimited
open files                    (-n) 1024
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) unlimited
virtual memory        (kbytes, -v) unlimited

So it seems the data segment size is not limited. There is still some free memory (1000k or so) and free swap (100000k or so), and the error is (I translated it from Chinese into English, maybe not exactly, but I think the meaning is right):
Error: cannot allocate vector of size 390585 Kb.
(错误: 无法分配大小为390585 Kb的向量)



On Wed, 30 Mar 2005 07:34:13 -0500
jon butchar <butchar.2 at osu.edu> wrote:

#
Yes, you may need more memory unless you can free a good amount of RAM or find a more memory-efficient clustering method.  If I'm reading it correctly, R wanted to allocate about 382 MB of memory on top of what it had already taken, but your computer had only about 98 MB of swap plus about 1 MB of RAM left to give.
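The arithmetic behind that figure can be checked directly. Assuming the dissimilarities are computed by dist() (which hclust() requires as input), the lower-triangular distance matrix stores n*(n-1)/2 double-precision values. A quick sketch in R:

```r
# Back-of-the-envelope check of the failed allocation, assuming the
# dissimilarities come from dist(): it stores n*(n-1)/2 doubles.
n     <- 10000
cells <- n * (n - 1) / 2      # 49,995,000 pairwise distances
kb    <- cells * 8 / 1024     # 8 bytes per double, converted to Kb
round(kb)                     # 390586 Kb -- the size in the error message
```

So the failing allocation is the distance matrix itself, a single contiguous block of roughly 382 MB; with 256 MB of RAM it cannot fit no matter how --max-vsize is set.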


On Wed, 30 Mar 2005 22:02:04 +0800
ronggui <0034058 at fudan.edu.cn> wrote:

#
Is there a better way than:


par(mar=c(6,6,6,6))
plot(1:10,yaxt="n",ylab="")
axis(4)
text(12,5.5,'y-label',xpd=T,srt=90)


to get the y-ticks _and_ the y-label to the right-hand side of the plot? I did not 
find anything in the 'par', 'plot', 'axis' and 'title' manpages to 
solve the problem. (The above is ugly, because one needs to hardcode the 
text position or calculate it 'manually' from par('usr'). It 
would be much nicer if there were a flag to 'title' controlling where 
the labels occur.)

thanks
joerg
#
Hello,

Please check whether this works for you:

par(mar=c(6,6,6,6))
plot(1:10,yaxt="n",ylab="")
axis(4)
mtext('y-label',4,line=2)

Regards,
Carlos.
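For completeness, a self-contained variant of the suggestion above (the margin widths and the line = 3 value are illustrative choices, not required): mtext() places the label by margin line rather than user coordinates, so nothing needs to be hardcoded or derived from par('usr').

```r
# Ticks and label both on the right-hand side; mtext() positions the
# label relative to the margin, so no user coordinates are needed.
par(mar = c(5, 2, 4, 5))             # shrink the left margin, widen the right
plot(1:10, yaxt = "n", ylab = "")    # suppress the default left-side y-axis
axis(4)                              # draw the y-axis on the right (side 4)
mtext("y-label", side = 4, line = 3) # rotated y-label in the right margin
```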

On Thu, 31 Mar 2005 12:31:49 +0200, joerg van den hoff
<j.van_den_hoff at fz-rossendorf.de> wrote: