
best practice for reading large shapefiles?

Vinh

Even if this might be off-topic on this list, IMHO R is not the best tool
for dealing with this amount of vector data. Actually, I agree completely
with Roger's remarks, and in line with his point about a "competent
platform", you may also want to consider software designed for big data...

As Roger has already clarified: the recommendation of what might be best
depends strongly on your questions and issues, or on the type of analysis
you need to run, and cannot be answered in general.

I think Edzer can clarify up to which size sp objects are still "usable";
from my experience I would guess something like 500K polygons, 1M lines,
and up to 5M points, but it depends strongly on the number of attributes.
So you are far beyond this.


If you want to deal with this amount of spatial vector data from R, it is
well worth having a look at one of the mature GIS packages such as GRASS
or QGIS, which you can drive from R via their APIs.
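As a rough illustration of the GRASS route, here is a minimal sketch using the rgrass7 package; all paths, the location name, and the map names ("big.shp", "big") are placeholders you would replace with your own setup:

```r
library(rgrass7)

# Point R at an existing GRASS installation and location
# (gisBase/gisDbase/location are examples, adjust to your install)
initGRASS(gisBase = "/usr/lib/grass70",
          gisDbase = "~/grassdata",
          location = "mylocation",
          mapset = "PERMANENT",
          override = TRUE)

# Import the shapefile into GRASS and inspect it there;
# the heavy geometry work stays outside of R's memory
execGRASS("v.in.ogr", input = "big.shp", output = "big")
execGRASS("v.info", map = "big")

# Pull only a (small) result back into R as an sp object
result <- readVECT("big")
```

The idea is that GRASS does the import and the expensive vector operations on disk, and only summary layers come back into the R session.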
Nevertheless, you can easily load the data into PostgreSQL/PostGIS and
perform all operations/analyses with the built-in spatial functions of
PostGIS, if you are an experienced PostGIS user.
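A hedged sketch of that workflow from R, using RPostgreSQL; the database name, table name, and columns ("gisdb", "big", "county", "geom") are made-up examples:

```r
library(RPostgreSQL)

# Assumes the shapefile was already loaded into PostGIS, e.g. with
#   shp2pgsql -s 4326 -I big.shp public.big | psql -d gisdb
con <- dbConnect(PostgreSQL(), dbname = "gisdb")

# Let PostGIS do the heavy lifting; only the small
# aggregated result ever reaches the R session
res <- dbGetQuery(con,
  "SELECT county, SUM(ST_Area(geom)) AS total_area
     FROM big
    GROUP BY county")

dbDisconnect(con)
```

This way R never has to hold the 29M-feature geometry in memory; it only sees the query results.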

cheers
Chris


On 26.04.2016 at 22:33, Vinh Nguyen wrote: