about memory
Yes, you may need more memory unless you can somehow free a good amount of RAM or find a more memory-efficient method for clustering. If I'm reading it correctly, R wanted to allocate about 382 MB of memory on top of what it had already taken, but your computer had only about 98 MB of swap plus about 1 MB of RAM left to give.

On Wed, 30 Mar 2005 22:02:04 +0800
ronggui <0034058 at fudan.edu.cn> wrote:
root at 2[ronggui]# ulimit -a
core file size        (blocks, -c) 0
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
max locked memory     (kbytes, -l) unlimited
max memory size       (kbytes, -m) unlimited
open files                    (-n) 1024
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 8192
cpu time             (seconds, -t) unlimited
max user processes            (-u) unlimited
virtual memory        (kbytes, -v) unlimited

So it seems the data segment size is not limited, and there is still some free memory (1000 kB or so) and swap (100000 kB or so). The error is (I translated it from Chinese into English, maybe not exactly, but I think the meaning is right):

error: cannot allocate the vector size of 390585 kB.
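The arithmetic behind the failure can be sketched as below, using the figures quoted in this thread (the requested vector size from the error message, and the poster's rough free-RAM and swap readings; substitute your own numbers from `free -k`):

```shell
#!/bin/sh
# Rough check: does the allocation R requested fit in what the OS
# can still hand out? All figures in kB, taken from the thread.
requested=390585   # vector R tried to allocate (from the error message)
free_ram=1000      # free RAM reported ("1000k or so")
free_swap=100000   # free swap reported ("100000k or so")

shortfall=$(( requested - free_ram - free_swap ))
echo "shortfall: ${shortfall} kB"   # positive means the allocation cannot fit
```

With these numbers the shortfall is roughly 290 MB, which is why no ulimit setting helps here: the limits are all `unlimited`, but the machine simply does not have the memory.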