
Memory in R.

3 messages · Liviu Andronic, R RR, Gavin Simpson

Hello Amadou,
On Mon, Mar 31, 2008 at 4:53 PM, R RR <rstat.diallo at gmail.com> wrote:
You will find this recent thread [1] interesting. You'd also want to
check packages filehash, ff and sqldf.
Regards,
Liviu

[1] http://www.nabble.com/How-to-read-HUGE-data-sets--tt15729830.html
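The common thread in that discussion and in packages like ff and sqldf is avoiding loading the whole data set into RAM at once. A minimal base-R sketch of the same idea, processing a CSV file in fixed-size chunks over a connection (the file here is a small self-contained stand-in, not the original data):

```r
# Create a small example CSV so the sketch is self-contained
path <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:100, y = rnorm(100)), path,
          row.names = FALSE, quote = FALSE)

con <- file(path, open = "r")
hdr <- strsplit(readLines(con, n = 1), ",")[[1]]  # read the header line once
total <- 0
repeat {
  # Read the next 25 rows from the connection's current position;
  # read.csv() errors when no lines remain, which we catch to stop.
  chunk <- tryCatch(read.csv(con, header = FALSE, nrows = 25,
                             col.names = hdr),
                    error = function(e) NULL)
  if (is.null(chunk) || nrow(chunk) == 0) break
  total <- total + sum(chunk$x)  # process each chunk, then let it be discarded
}
close(con)
total  # 5050, the sum of 1:100 accumulated chunk by chunk
```

Only one chunk is in memory at a time; ff, filehash and sqldf generalise this by keeping the data on disk (or in a database) and pulling in just what a computation needs.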
1 day later
Dear R users,
Many thanks for your answers.
I've made much progress since my last posting.

I now have the following problem. I have run the GAM model

mygam <- gam(Y ~ factor(year)
+ m1.q02 + m1.q05y + m1.q05y2 + m1.q06 + m4b.q05 + m4b.q052 + m5a.q01
+ depratio + depratio2 + residence10y + urbrur + factor(prefect)
+ m1.q02_ps
+ m1.q05y_ps
+ m1.q05y2_ps
+ m1.q06_ps
+ m4b.q05_ps
+ m4b.q052_ps
+ m5a.q01_ps
+ depratio_ps
+ depratio2_ps
+ residence10y_ps
+ urbrur_ps
+ factor(hhid), data=cwp2)

and obtained the following error message:

Erreur : impossible d'allouer un vecteur de taille 236.3 Mo
(in English: cannot allocate a vector of size 236.3 MB).

I have 7237 observations in my data.

Is there any way to increase the memory available to R so that the model can be fitted?
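As a rough check on where an allocation of that size can come from: with 7237 observations, a single dense double-precision model-matrix of 236.3 MB corresponds to roughly 4280 columns, which is plausible if factor(hhid) is a household identifier with thousands of levels (an assumption about hhid, not stated in the original post). A back-of-the-envelope calculation in R:

```r
n <- 7237                 # observations in the data set
bytes <- 236.3 * 1024^2   # size of the failed allocation, in bytes
# Each double takes 8 bytes, so a dense n-row matrix of this size
# would have approximately bytes / (8 * n) columns:
p <- bytes / (8 * n)
round(p)  # about 4280 columns
```

If hhid indeed contributes thousands of dummy columns, reducing the memory footprint of the model matrix (or dropping factor(hhid)) is likely to help more than raising the memory limit.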

Best regards.

Amadou DIALLO



2008/4/1, Liviu Andronic <landronimirc at gmail.com>:
On Wed, 2008-04-02 at 16:46 +0200, R RR wrote:
No idea if this will help, but seeing as there don't appear to be any
smooth terms in that model (s() or lo(), depending on whether this is
mgcv::gam or gam::gam), and you are fitting a Gaussian model (no
family specified, so the default is used), why not fit this using lm? You
might find it takes less memory than the way gam constructs the model matrix.
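A minimal sketch of this suggestion: when a gam formula contains no s() or lo() terms and uses the default gaussian family, it is a purely parametric model, so lm() fits the same thing. The data and variable names below are simulated stand-ins, not the original survey data:

```r
# Simulated data in place of cwp2, with one continuous covariate and
# one factor, mimicking the purely parametric structure of the formula
set.seed(1)
d <- data.frame(y    = rnorm(200),
                x1   = rnorm(200),
                year = factor(rep(2000:2003, each = 50)))

# The same formula passed to gam() without smooth terms would give
# identical coefficient estimates; lm() just builds the fit more cheaply.
fit <- lm(y ~ x1 + factor(year), data = d)
coef(fit)
```

The intercept, x1, and three year contrasts give five coefficients here; for the original model, the large factor(hhid) term would still dominate the model matrix, so a sparse-matrix approach (e.g. via the Matrix package) might be needed if lm() also runs out of memory.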

HTH

G