
memory allocation error

8 messages · M. Edward (Ed) Borasky, Brian Ripley, Peter Dalgaard +3 more

#
Hi,

I have recently installed R-1.2.2 for Windows (16MB RAM, P-166), and I am
getting a memory error after processing my data (6 variables and
1200 observations); then the program closes.

With the previous version (1.1.1, I think) I didn't have this kind of problem.


 I've tried to increase memory through:

rgui.exe --vsize 16M --nsize 1000k

but nothing changed.

Thanks in advance,

Antonio
Antonio Rodríguez Verdugo
Huelva, Spain
rod.chav at hsoft.es
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
#
On Wed, 21 Mar 2001, Antonio Rodríguez Verdugo wrote:

I was *just* going to post this same thing myself! My environment is a 400
MHz P III running Windows 2000 Professional, and I get the error at 127 MB on
my 128 MB machine :-). And I get the same thing on my 192 MB machine at home.
I am doing a "read.csv" on a 15 MB "CSV" file. What's even more interesting is
that this consumes so much memory that "q()" stops working!! I have to kill it
with the Task Manager. I haven't checked to see how big my pagefile is yet,
but it's obviously checking how much RAM is installed and hitting some kind of
wall there. I have some bigger Linux boxes available that I can try this
one on, so I may be able to get it to completion without waiting for the
latest R or trimming my data set. I can't do much about the rows, but it's
easy for me to drop unneeded columns; the raw data is coming from Microsoft
Access.
---------------------------------------------------------------------------
R is a collaborative project with many contributors.
Type `contributors()' for more information.

Type `demo()' for some demos, `help()' for on-line help, or
`help.start()' for a HTML browser interface to help.
Type `q()' to quit R.

[Previously saved workspace restored]
 [1] "b"            "l"            "last.warning" "monsample"    "p3.500"
 [6] "p4.500"       "p4.667"       "reg"          "regP0"        "regP1"
[11] "regP2"        "regP3"
Error: vector memory exhausted (limit reached?)
In addition: Warning message:
Reached total allocation of 127Mb: see help(memory.size)
Lost warning messages
Error: vector memory exhausted (limit reached?)
In addition: Warning message:
Reached total allocation of 127Mb: see help(memory.size)
Lost warning messages
---------------------------------------------------------------------------
znmeb at aracnet.com (M. Edward Borasky) http://www.aracnet.com/~znmeb

"I'm not a saxophone, but I play one on TV!" -- Bill Clinton

#
On Wed, 21 Mar 2001, Antonio Rodríguez Verdugo wrote:

Try reading the rw-FAQ and/or the actual help file the message mentions!
Neither says anything about those flags.

Hint: when you are asked to consult a help file, the information is there.
Hint hint: the flag is called --max-mem-size.
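
On R >= 1.2.0 for Windows the limit is raised on the command line; as a
sketch (256M is just an example value, not a recommendation), the
invocation would look like:

```
rgui.exe --max-mem-size=256M
```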
#
"M. Edward (Ed) Borasky" <znmeb at aracnet.com> writes:
(See Brian's post on the memory options)

R does have a rather bad habit of expanding data when they are loaded
into memory, e.g. by storing them as 8-byte doubles. Some even worse
inflation takes place during read.table (et al.) processing. In your
case, you seem to be able to read one file but not the second, so you
might hope that both would actually fit once converted, and try
something like

x <- read.csv(...)       # read the first file
save(x, file = "filex")  # write it out in R's binary format
rm(x)                    # free the memory before reading the next file
y <- read.csv(...)
save(y, file = "filey")
rm(y)
...

load("filex")            # the saved objects reload without the
load("filey")            # read.table parsing overhead

A more interesting option is to set up an ODBC connection to the
Access database and use the RODBC package, which should allow you to
select on columns, etc. You can even do that to your CSV file
(slightly tricky, I have only tried it once and it took a while to
figure out that you need to set up the ODBC connection to be to the
*directory* containing the text file(s)).
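
A rough sketch of that approach (the data source name "access_dsn" and
the table/column names here are hypothetical; the RODBC package and a
configured ODBC data source are assumed):

```r
library(RODBC)

## "access_dsn" is a hypothetical ODBC data source name pointing at
## the Access database
channel <- odbcConnect("access_dsn")

## Select only the columns actually needed, rather than reading the
## whole table and dropping columns afterwards
x <- sqlQuery(channel, "SELECT col1, col2, col5 FROM mytable")

odbcClose(channel)
```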

[Anyone for implementing a columns= option for scan() and read.xxx() ??]
#
On 22 Mar 2001, Peter Dalgaard BSA wrote:

Nah ... that would give me an excuse not to learn how to use RODBC :-) I ended
up going back to Access and dropping a bunch of columns from the query before
I exported it to a CSV file. Another excuse not to learn RODBC :-).
--
znmeb at aracnet.com (M. Edward Borasky) http://www.aracnet.com/~znmeb

Coming soon to a theatre near you -- the remake of "War With The Newts"!

#
A way to read particular columns (if you have unix-like tools):
That would get columns 2, 5, and 7 of "filename" into a three-column matrix.
(Not so sure it qualifies as an excuse not to learn how to use RODBC)
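
A sketch of what such a pipeline could look like (the file contents
below are invented for illustration):

```shell
# Build a small whitespace-delimited example file (made-up data)
printf '%s\n' '1 2 3 4 5 6 7' '8 9 10 11 12 13 14' > filename

# Keep only columns 2, 5 and 7
awk '{print $2, $5, $7}' filename
```

From within R, scan() can read the filtered output directly through a
pipe connection, e.g.
matrix(scan(pipe("awk '{print $2, $5, $7}' filename")), ncol = 3, byrow = TRUE).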
 
  Cheers, Pierre
#
rgui.exe --vsize 16M --nsize 1000k

This is just one method I've found, and Mr. Maindonald's manual gives the
same information.
Yes, it mentions the size of my virtual memory, but then what? If I change
the settings, what goes wrong now?
Could you explain this to me a little, please?

R is the only program that asks me for memory space. I don't think I'm
managing a very big dataset (1200 observations, 6 variables). Even with the
base datasets I run into the same constraints.

Thanks in advance

Antonio
Antonio Rodríguez Verdugo
Huelva, Spain
rod.chav at hsoft.es
#
Antonio Rodríguez Verdugo wrote:
I do not remember if you have mentioned which version of R you are
running.
Assuming R >= 1.2.0, you can extend the default (the minimum of the
physically installed RAM and 256Mb) by setting the command line flag
--max-mem-size.

Your version of John Maindonald's paper seems to have been written for
R < 1.2.0.
Maybe you misunderstood Brian's hint: please read the FAQs and the help
file, especially
?Memory
The maximum amount of memory R is allowed to consume defaults to the
minimum of the physically installed RAM and 256Mb, as described in
?Memory. But that limit is also the only thing protecting you from
getting into trouble with endless swapping.

Uwe Ligges