
Memory limit problems in R / import of maps

Dylan,

Thanks for your note.

A student of mine would like to run habitat suitability analysis using the adehabitat package
(http://dx.doi.org/10.1890%2F0012-9658%282002%29083%5B2027%3AENFAHT%5D2.0.CO%3B2). I encouraged him
to use R, for many reasons.

At the moment, he is thinking of doing the whole thing in Matlab (or using the original Biomapper
software), because we do not want to give up the original resolution (250 m).

As a GIS person, I definitely do not see ~20 million pixels as a huge data set.
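As a rough sketch of why R nonetheless struggles here: R holds grids in memory as double-precision vectors, so each full-resolution layer of ~20 million cells costs on the order of 150 MB (the layer count of 10 below is only an illustrative assumption, not from the thread):

```r
# Back-of-the-envelope memory estimate for a ~20-million-pixel grid in R.
# R stores numeric values as 8-byte doubles, so one full-resolution layer
# held in memory costs roughly:
n_pixels  <- 20e6                        # ~20 million cells at 250 m
bytes_per <- 8                           # double precision
mb_per_layer <- n_pixels * bytes_per / 2^20
mb_per_layer                             # about 152.6 MB per layer

# A multi-layer analysis (say 10 environmental predictors) plus working
# copies of the data frame can therefore approach the memory ceiling of
# a 32-bit R session (roughly 2 GB on Windows, per the rw-FAQ).
10 * mb_per_layer                        # about 1526 MB for 10 layers
```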

cheers,

Tom Hengl



-----Original Message-----
From: Dylan Beaudette [mailto:dylan.beaudette at gmail.com] 
Sent: dinsdag 22 april 2008 17:22
To: Tomislav Hengl
Cc: r-sig-geo at stat.math.ethz.ch; Michalis Vardakis
Subject: Re: [R-sig-Geo] Memory limit problems in R / import of maps
On Tue, Apr 22, 2008 at 6:49 AM, Tomislav Hengl <hengl at science.uva.nl> wrote:
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-us
Hi,

What exactly were you hoping to do with such a massive data frame once
you overcame the initial memory problems associated with loading the
data? Any multivariate, classification, or inference test would need at
least as much memory again to operate on the stack of grids.

Not knowing what the purpose of this operation is (although I would
guess something related to soil property or landscape modeling of some
sort), it is hard to suggest a better approach. For a grid that size I
would use an algorithm that operates on strips or tiles. There are
several great starting points in the GRASS source code. Doing all of
the pre-processing, and possibly some aggregating to larger support
size, in GRASS would allow you to test any R-centric operations on a
coarser version of the original dataset.
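The strip idea can be sketched in base R: visit the grid in horizontal strips so only one strip is in memory at a time, accumulating running statistics. The `read_strip()` function below is a stand-in that fabricates deterministic data; in practice it would be replaced by a real windowed read (for example `rgdal::readGDAL()` with its `offset` and `region.dim` arguments, though check that against the rgdal version you have):

```r
# Strip-wise processing sketch: compute the global mean of a ~20-million
# cell grid without ever holding the whole grid in memory.
nrow_total <- 5000; ncol_total <- 4000   # 5000 x 4000 = 20 million cells
strip_rows <- 500                        # rows read per strip

read_strip <- function(row0, nrows, ncols) {
  # Placeholder for a windowed raster read: returns deterministic fake
  # data so the sketch is self-contained and runnable.
  matrix(sin(seq(row0 * ncols + 1, length.out = nrows * ncols)),
         nrow = nrows)
}

# Running sums give the global mean from one strip at a time.
total <- 0; n <- 0
for (row0 in seq(0, nrow_total - 1, by = strip_rows)) {
  nrows <- min(strip_rows, nrow_total - row0)
  strip <- read_strip(row0, nrows, ncol_total)
  total <- total + sum(strip)
  n     <- n + length(strip)
}
grid_mean <- total / n
```

The same pattern extends to variances, histograms, or per-strip model predictions; only statistics that genuinely need all pixels at once (e.g. a full covariance decomposition) force the whole stack into memory.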

Cheers,

Dylan