
heap size trouble

4 messages · karamian, Uli Flenker; Raum 704, Faheem Mitha

#
Hi,

I've run into trouble using R.

When I try to load a file containing 93 thousand rows and 22 columns
of data (essentially floats),
R shows this error message:

"heap size trouble"

Could anyone tell me which parameter I should set before
launching R so that I can load my big file?

Thanks a lot


#
Karamian,
have a look at "help(memory)" ...

You can enlarge the memory reserved for R by specifying the command line
arguments "--vsize" and "--nsize". When using ESS, type "C-u M-x R" and
ESS will prompt for arguments passed to R.
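A sketch of what such an invocation might look like (the values here are illustrative, not a recommendation from the thread; flag syntax as in R 1.x):

```sh
# Start R with a 35 Mb vector heap and one million cons cells
R --vsize=35M --nsize=1000k
```

--vsize sets the size of the vector heap and --nsize the number of cons cells; both can also be set via environment variables (R_VSIZE, R_NSIZE), as a later reply in this thread notes for R_VSIZE.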


        Uli Flenker
        Institute of Biochemistry
        German Sports University Cologne
        Carl-Diem-Weg 6

        50933 Cologne / Germany

        Phone 0049/0221/4982-493
On Tue, 30 May 2000, karamian wrote:

            
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
#
You need to change R_VSIZE in .Renviron, I believe, at least if you are
using Unix.

I.e.

R_VSIZE=somethingM

where "something" is a number larger than what you are currently using, and
M stands for megabytes, e.g.

R_VSIZE=35M
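As a back-of-envelope check in R itself (a sketch using the numbers from the original question, so 35M comfortably covers the data):

```r
rows <- 93000; cols <- 22      # size of the data set in the question
bytes <- rows * cols * 8       # a double takes 8 bytes
bytes / 2^20                   # about 15.6 megabytes of vector heap
```

so R_VSIZE=35M leaves room for the copies R makes while reading and converting the data.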

You can use gc() to see your current memory situation. Mine gives:

          free   total (Mb)
Ncells  839332 1024000 19.6
Vcells 3258006 4587520 35.0

(the "free" column is what is currently available).
This is all for Unix; it might differ on other platforms. In any case, look
at the section in the FAQ about memory, or do help(Memory).

I've found memory problems with large data sets in R to be a big pain. Sometimes
I just run out of memory and have to give up. I wish there were a magical
way around this, but there doesn't seem to be.

                                             Yours, Faheem.
On Tue, 30 May 2000, karamian wrote:

            
#
Oooops,
it's "help(Memory)", not "help(memory)"!

	Sorry for this ...


        Uli Flenker
On Tue, 30 May 2000, Uli Flenker; Raum 704 wrote:

            