
Graphic CPU usage in R

4 messages · Maurizio Marchi, Rich Shepard, Roger Bivand +1 more

#
Hi all,
I'm a Linux user (Ubuntu 18.04) and I was trying to find a way to
speed up my R code. Reading on the web I found that some packages such
as parallel or gpuR have been developed to allow faster calculation in
R. Anyway, I was wondering whether any other approaches were available.
My question concerns how to speed up old R code involving GIS
procedures, mainly using rgdal, raster, biomod2, dismo, sp and other
spatial packages.
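For context, the kind of parallel-package approach I mean looks like
this (slow_fn is just a placeholder for a real computation):

library(parallel)

# A deliberately slow stand-in for a real computation
slow_fn <- function(x) { Sys.sleep(0.1); x^2 }
inputs <- 1:100

# Serial version
res_serial <- lapply(inputs, slow_fn)

# Parallel version: mclapply() forks worker processes, so it works
# on Linux without extra setup; adjust mc.cores to your machine
res_parallel <- mclapply(inputs, slow_fn, mc.cores = 4)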
According to this post (here
<https://starbeamrainbowlabs.com/blog/article.php?article=posts%2F254-run-program-on-amd-dedicated-graphics-card.html>)
it seems that as Linux users we can launch a specific program on the
GPU, so I was wondering whether this could be done with R. In other
words, I would like to solve the issue from the start by opening an R
session from the terminal that runs on the GPU instead of on the
CPU(s). Is that possible? Does anyone have experience with it?
Here
<https://www.researchgate.net/post/Parallel_computing_and_graphic_CPU_GCPU_usage_in_R_is_it_possible_with_ALL_R_packages>
is the question I opened on ResearchGate.
Thank you in advance and happy new year to everybody
#
On Tue, 31 Dec 2019, maurizio marchi wrote:

Maurizio,

How many cores are in the CPU of your machine? AMD processors have two
threads per core (e.g., the Ryzen7 in my desktop has 8 cores and 16
threads). Programs need to be compiled to use multiple threads, and you
need libraries such as Mesa or OpenGL to take advantage of that.
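You can check this from within R itself:

library(parallel)
detectCores()  # logical threads the OS reports (e.g., 16 on an
               # 8-core / 16-thread Ryzen7)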

Also, how much memory is installed on that system? More is always better.
Something else for you to consider is that there are two types of video
cards: those designed for gamers and those designed for technical work. An
explanation of the differences (focused on nVidia's products) is here:
<https://www.quora.com/What-is-the-different-between-gaming-GPU-vs-professional-graphics-programming-GPU>.

There are multiple factors involved, so there is no simple solution. Of
course, if you have a long-running spatial model you can start it under
screen and it will continue running even after you log out, as long as
the computer stays on.

Hope this helps,

Rich
#
On Tue, 31 Dec 2019, Rich Shepard wrote:
GPUs were where the action was about ten years ago, but not any more.
Many of the spatial packages that can benefit from multiple processors
already facilitate their use, but often inter-process communication is
the bottleneck, not per-processor computation. A report from ten years
ago is:
https://papers.ssrn.com/sol3/Delivery.cfm/SSRN_ID1690584_code1391513.pdf?abstractid=1690584&mirid=1
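As an illustration of that multi-process support, the raster package
can farm cell-wise operations out to worker processes; a minimal
sketch, using a synthetic raster in place of real data:

library(raster)

# Synthetic raster standing in for real data
r <- raster(nrows = 1000, ncols = 1000)
values(r) <- runif(ncell(r))

beginCluster(4)  # start 4 worker processes
r2 <- clusterR(r, calc, args = list(fun = sqrt))
endCluster()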

Now, look to stars and gdalcubes and many others, where the data are
held in the cloud, and processing may be assigned to cloud nodes, with
only output at the target resolution needing to be downloaded. The
cloud nodes may actually be GPUs, but for the user this is transparent.
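A minimal sketch of that pattern with stars (the URL is hypothetical;
GDAL's /vsicurl driver streams only the blocks actually requested):

library(stars)

url <- "/vsicurl/https://example.com/some_cog.tif"  # hypothetical COG
x <- read_stars(url, proxy = TRUE)  # proxy = TRUE defers reading
plot(x)  # fetches only enough pixels for the plot resolution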

There are plenty of R packages accessing GPUs, described on the HPC task 
view: https://cran.r-project.org/view=HighPerformanceComputing.
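One of those, for example, is gpuR; a minimal sketch, assuming working
OpenCL drivers for your card:

library(gpuR)  # needs OpenCL drivers installed

n <- 2000
A <- matrix(rnorm(n * n), n, n)

gpuA <- gpuMatrix(A, type = "float")  # copy to GPU memory
gpuB <- gpuA %*% gpuA                 # multiply on the GPU
B <- gpuB[]  # bracket extraction copies the result back to host memory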

Hope this clarifies,

Roger
#
Hello!

Are you using the PGI compiler? If not, you may want to check it out;
it works on Linux.
Thanks,
Erin

On Tue, Dec 31, 2019 at 5:54 AM maurizio marchi <maurizio.marchi at ibbr.cnr.it>
wrote: