[Bf-funboard] GPU accelerated rendering

Timothy Baldridge tbaldridge at alertacademy.com
Fri Oct 22 16:23:10 CEST 2004


Okay, this is getting ridiculous. I have been interested in GPU 
rendering for Blender for some time, and I have researched it quite a 
lot. I must say that I am rather surprised at how little work it really 
takes. Basically, there are two ways of doing GPU rendering:

OpenGL + Cg (or equivalent)  -  In this type of rendering, the renderer 
uses OpenGL to draw the scene and Cg (NVIDIA's free GPU programming 
language) for shading. This is the fastest approach, but it is the 
hardest one in which to get raytracing and GI working.
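
To make that concrete, here is a minimal Cg fragment program for the 
per-pixel shading half of that approach (a Blinn-Phong shader). This is 
only a sketch; the parameter and semantic names are illustrative and 
are not anything from Blender:

    // Per-pixel Blinn-Phong shading, run by the GPU for every fragment
    // that OpenGL rasterizes. Inputs are interpolated from the vertex stage.
    float4 main(float3 normal   : TEXCOORD0,
                float3 lightDir : TEXCOORD1,
                float3 viewDir  : TEXCOORD2,
                uniform float4 diffuseColor,
                uniform float4 specularColor,
                uniform float  shininess) : COLOR
    {
        float3 N = normalize(normal);
        float3 L = normalize(lightDir);
        float3 H = normalize(L + normalize(viewDir));   // Blinn half vector

        float diff = max(dot(N, L), 0);
        float spec = pow(max(dot(N, H), 0), shininess);

        return diffuseColor * diff + specularColor * spec;
    }

The program would be compiled and bound through the Cg runtime while 
OpenGL draws the geometry as usual.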

Cg only  -  Uses Cg for the whole renderer. Basically, all the triangle 
data is handed to the GPU via a texture, and each output pixel is then 
calculated on the GPU. For more information, look up Purcell's documents 
under raytracing on www.gpgpu.org.
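
As a rough idea of what the Cg-only path looks like, here is a heavily 
simplified single-pass sketch: each fragment casts one ray and loops 
over triangles fetched from a floating-point texture. I am assuming a 
profile with fragment looping (fp40 on GeForce 6 class hardware); 
Purcell's actual scheme is multi-pass and streams the work differently. 
The names (triTex, numTris, eye) are made up for the example:

    // One ray per output pixel; triangle vertices packed three texels
    // per triangle into a floating-point RECT texture.
    float4 main(float3 rayDir : TEXCOORD0,   // per-pixel ray direction
                uniform samplerRECT triTex,  // hypothetical triangle texture
                uniform float numTris,       // hypothetical triangle count
                uniform float3 eye) : COLOR
    {
        float bestT = 1e30;

        for (float i = 0; i < numTris; i += 1) {
            float3 v0 = texRECT(triTex, float2(i * 3 + 0.5, 0.5)).xyz;
            float3 v1 = texRECT(triTex, float2(i * 3 + 1.5, 0.5)).xyz;
            float3 v2 = texRECT(triTex, float2(i * 3 + 2.5, 0.5)).xyz;

            // Moller-Trumbore ray/triangle intersection
            float3 e1 = v1 - v0;
            float3 e2 = v2 - v0;
            float3 p  = cross(rayDir, e2);
            float  det = dot(e1, p);
            if (abs(det) > 1e-6) {
                float  inv = 1.0 / det;
                float3 tv  = eye - v0;
                float  u   = dot(tv, p) * inv;
                float3 q   = cross(tv, e1);
                float  v   = dot(rayDir, q) * inv;
                float  t   = dot(e2, q) * inv;
                if (u >= 0 && v >= 0 && u + v <= 1 && t > 0 && t < bestT)
                    bestT = t;
            }
        }

        // Hits white, misses black; real shading and secondary rays
        // would go here.
        if (bestT < 1e30)
            return float4(1, 1, 1, 1);
        return float4(0, 0, 0, 1);
    }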


It can be done. It takes a lot of work and can be very frustrating (Cg 
is a pain to debug), but I would be willing to work with someone on this 
project.

Note: the minimum requirement for something like this would be a 
GeForce FX ($100) or equivalent.

tbc++

