
about memory

Here is my system's memory:
ronggui at 0[ronggui]$ free
             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

I want to cluster my data using hclust. My data has 3 variables and 10000 cases, but the call fails, saying there is not enough memory for the vector size. I read the help documentation and started R 2.1.0beta under Debian Linux with $ R --max-vsize=800M, but it still cannot compute the solution. So, is my PC's memory simply not enough to carry out this analysis, or have I made a mistake in setting the memory?
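For context, a rough back-of-the-envelope estimate (my own sketch, not from the original post) shows why this fails: hclust() works on the output of dist(), which stores n*(n-1)/2 doubles, so for 10000 cases the distance object alone is around 380 MB, well beyond the machine's 256 MB of RAM regardless of --max-vsize. The subsampling workaround below is one common approach, not something the poster tried:

```r
## Estimate memory needed by dist()/hclust() for n cases.
## dist() stores n*(n-1)/2 doubles (8 bytes each), and hclust()
## needs additional working copies on top of that.
n <- 10000
dist_mb <- n * (n - 1) / 2 * 8 / 1024^2
dist_mb   # about 381 MB for the distance matrix alone

## Possible workaround (an assumption, not from the original post):
## cluster a random subsample, or use kmeans(), which only needs
## O(n) memory instead of O(n^2).
x   <- matrix(rnorm(3 * n), ncol = 3)   # stand-in for the real data
sub <- x[sample(n, 1000), ]             # 1000-case subsample: ~4 MB dist
hc  <- hclust(dist(sub))                # feasible on this machine
km  <- kmeans(x, centers = 5)           # full data, linear memory
```

Whether a subsample or a non-hierarchical method is acceptable depends, of course, on the analysis goal.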

Thank you.