
working with large ascii files in R

5 messages · Rocio Ponce, Roman Luštrik, Mauricio Zambrano-Bigiarini +1 more

On 07/18/2013 09:26 AM, Roman Luštrik wrote:
As Roman mentioned, I also suggest that, as a first step, you update R
(and all your packages) to version 3.0.0 or later, because one of the
significant user-visible changes introduced in R 3.0.0 was related to
memory (from the NEWS file):

      "It is now possible for 64-bit builds to allocate amounts of
       memory limited only by the OS.  It may be wise to use OS
       facilities (e.g. ulimit in a bash shell, limit in csh), to set
       limits on overall memory consumption of an R process,
       particularly in a multi-user environment.  A number of packages
       need a limit of at least 4GB of virtual memory to load.

       64-bit Windows builds of R are by default limited in memory usage
       to the amount of RAM installed: this limit can be changed by
       command-line option --max-mem-size or setting environment
       variable R_MAX_MEM_SIZE."
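For example, on Linux the OS-level cap mentioned above can be set with the
shell's ulimit builtin before starting R. A minimal bash sketch (the 8 GB
figure is only an illustration; bash's `ulimit -v` takes its argument in
KiB):

```shell
# Run in a subshell so the limit does not affect the rest of the session.
(
  # Cap virtual memory at ~8 GiB (illustrative value; units are KiB).
  ulimit -v 8388608
  # With no argument, `ulimit -v` prints the current limit.
  ulimit -v
)
```

Any R process started inside that subshell inherits the limit. On Windows,
use the --max-mem-size command-line option or the R_MAX_MEM_SIZE
environment variable instead, as the NEWS entry notes.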

Kind regards,

Mauricio Zambrano-Bigiarini, Ph.D
Rocio,

You should not use the functions read.asc and raster.from.asc because they
read all the values into memory. Also, please indicate which packages
you use. The code below should fix the problem:

library(raster)
setwd('G:/future_A2a/2020/bccr_bcm2_0_sres_a2_2020s/')
# Reference raster that defines the extent to crop to
mask <- raster("C:/xalapa/worldclim_pres/bio01.asc")
# Stack all .asc files in the working directory; raster reads them
# lazily rather than pulling all values into memory
files <- list.files(pattern='\\.asc$')
s <- stack(files)
x <- crop(s, mask)

And it would indeed be good to use a current version of R and its packages.
Robert
On Thu, Jul 18, 2013 at 12:05 AM, Rocio Ponce <r.ponce.reyes at gmail.com> wrote: