
Exploratory analyses: Experience using the gputools package for Nvidia graphics accelerators?

Gero Schwenk wrote:
This is my personal experience and thoughts only, and not as 
well-informed as I might like, ymmv.

I know firms in finance that are making extensive use of different GPU 
architectures.  They are *all* doing a lot of low-level C programming to 
do it, in many cases using the API directly, or using reference 
implementations of linear and matrix algebra packages tuned for the GPU 
they've chosen.  I appreciate that approach if you have the resources to 
engage in it.

My personal feeling is that the "general purpose" in "general purpose 
GPU" will not be realized until the linear algebra libraries that are 
hidden from most users transparently support execution on GPUs.  See for 
example the MAGMA project, run by the folks who brought us the widely 
deployed ATLAS.

After experimenting with some of the tools that are available now, I 
made the decision here at my work not to do anything serious with GPUs 
right now.  I expect to revisit that decision in a few months, as the 
machines at my desk already have reasonably powerful GPU hardware in 
them.  Right now, though, the payoff isn't large enough to justify the 
effort for the kind of work I do.

I think that over time, commonly available math libraries and 
parallelization frameworks will embrace GPUs, and *then* I'll have more 
reason to spend time working with them.

Cheers,

    - Brian