[Bf-committers] Parallel Blender

Knapp magick.crow at gmail.com
Mon May 10 16:51:14 CEST 2010


On Mon, May 10, 2010 at 2:44 PM, André Susano Pinto
<andresusanopinto at gmail.com> wrote:
> Hi Knapp,
>
> On 8 May 2010 08:02, Knapp <magick.crow at gmail.com> wrote:
>
>> I was reading these other posts about where Blender might go and just
>> wanted to add my thoughts. Naturally this is just IMHO. And I know
>> Blender is taking steps in this direction.
>>
>> Parallel is where it is at.
>>
>> I am sure most of you know this, but I just wanted to highlight its
>> total importance and make sure it is in front of everyone's face.
>> This is not something to put off, as I see it.
>>
>> Computers are very quickly going parallel. Just a few years ago my
>> computer had one CPU and the GPU was not much to look at. My current
>> computer has 2 CPUs and a GPU that is old and out of date but still
>> very very fast. My next computer will likely have 6 CPUs on a chip and
>> if all goes well 2 of these chips for a cost of about 400 USD and 2
>> graphics cards with high end GPUs that work together, this will happen
>> this year! Can Blender make the most of this?
>>
>> Just like everything else in computing, the number of CPUs will
>> likely double about every 12 to 18 months. This means that many of
>> our users will be running computers with over 4 CPUs by the time 2.5
>> is released. Just 2 years later, I would not be surprised if most
>> users had 12 CPUs. Two years after that, many top-end users might be
>> running systems with 48 CPUs, and god only knows what the GPUs and
>> physics systems will look like. Blender should be able to eat that
>> stuff up with ease to stay in the lead.
>>
>
> I actually doubt the number of processors will increase like that.
> It will probably increase, but I wonder about your numbers :p
> Since there are other problems, like cache limits and memory bandwidth.
> Also, the fact that most applications are not able to take advantage
> of an increased number of CPUs may lead hardware manufacturers to
> decrease their efforts.

I am totally open to being wrong; however, even basic computers will
have at least 4 CPUs in the very near future. That trend is here and
clear. I am no expert on parallel programming, but I would assume that
making it run on 4 cores is about the same as making it run on 8 or 32.
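As a hedged illustration of that assumption (plain Python, not Blender code, and all names here are made up for the example): if work is split into independent pieces and handed to a worker pool sized to the machine, the same code spreads over 4, 8, or 32 cores without change.

```python
import os
from multiprocessing import Pool

def shade_tile(tile_index):
    # Stand-in for per-tile render work (purely illustrative, CPU-bound).
    return sum(i * i for i in range(10_000)) + tile_index

def render_all(num_tiles=64):
    # Pool() defaults its worker count to os.cpu_count(), so this same
    # code uses 4, 8, or 32 cores depending on the machine it runs on.
    with Pool() as pool:
        return pool.map(shade_tile, range(num_tiles))

if __name__ == "__main__":
    tiles = render_all()
    print(len(tiles), "tiles rendered across", os.cpu_count(), "cores")
```

Of course, real render work is rarely this independent; shared scene data and memory bandwidth are where it gets hard.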

As for your reasons that it will not happen, I have seen things like
that stated since I started with computers in the mid 80s. What always
happens is that someone comes along and either finds a way to make
some super new idea work, or they just push right through the problem
with extra hardware. If a supercomputer can do it now, a desktop will
do it better in 10 years' time.

For a long time we watched as CPUs just got faster. Then they started
to gain special parts like the MPU, GPU, etc., and now even a physics
processor. Now we have reached the end of that push and are expanding
into the multi-CPU and GPU areas. I would guess that we might start
seeing computers with multiple physics processors in a few years too.

Have you read this:
http://en.wikipedia.org/wiki/Avatar_(2009_film)
It blows my mind!
Wiki:
"To render Avatar, Weta used a 10,000 sq ft (930 m2) server farm
making use of 4,000 Hewlett-Packard servers with 35,000 processor
cores running Ubuntu Linux and the Grid Engine cluster manager. The
render farm occupies the 193rd to 197th spots in the TOP500 list of
the world's most powerful supercomputers. Creating the Na'vi
characters and the virtual world of Pandora required over a petabyte
of digital storage, and each minute of the final footage for Avatar
occupies 17.28 gigabytes of storage."

I would love it if Blender could have done that work. Yeah, I know,
big dreams. :-)

>>
>> Another thing to keep in mind is that in 4 years' time most users
>> will likely be expecting everything to be 3D (as in Avatar). That
>> seems to be the big push in the industry right now, in movies, HDTV,
>> and gaming.
>>
>> Making Blender take advantage of this in every way possible is a
>> must and something that, IMHO, everyone should be working on very
>> hard right now. As I understand it, Python is currently the biggest
>> problem.
>>
>
> Yes! Blender is in an excellent position to use the increased
> processing power, and we should strive to use it :)
> But don't overdo it and parallelize everything! Some stuff works just
> fine, and there might not be a point in spending time on it.

Naturally, we should do wise programming.

> We should also start plugging the GPU into the really intensive parts?

I think so, for the places it will help. GPUs are becoming super fast
and powerful, and users often have more than one, with very big data
pipes. I think that trend will only increase as computers are pushed
harder to drive HD and true 3D displays as well as gameplay. Not to
use this would be a huge waste of available computing power.

> And btw whats the problem with python?

Again, I am in over my head, and perhaps this is changing or has
changed, but I Googled up this answer:
"The most simple and common way to write parallel applications for SMP
computers is to use threads. Although, it appears that if the
application is computation-bound using 'thread' or 'threading' python
modules will not allow to run python byte-code in parallel. The reason
is that python interpreter uses GIL (Global Interpreter Lock) for
internal bookkeeping. This lock allows to execute only one python
byte-code instruction at a time even on an SMP computer."
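To make the quoted limitation concrete, here is a small, hedged sketch (plain Python, nothing Blender-specific, with made-up function names): the threaded version returns correct results but gains no CPU-bound speedup because the GIL serializes bytecode execution, while the process-based version runs in separate interpreters, each with its own GIL, and can use multiple cores.

```python
import threading
from multiprocessing import Pool

def count_primes(limit):
    # Deliberately CPU-bound, pure-Python work (trial division).
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def with_threads(limits):
    # Threads interleave, but the GIL lets only one of them execute
    # Python bytecode at any instant: correct results, no CPU speedup.
    results = [None] * len(limits)

    def worker(i, lim):
        results[i] = count_primes(lim)

    threads = [threading.Thread(target=worker, args=(i, lim))
               for i, lim in enumerate(limits)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

def with_processes(limits):
    # Separate processes each have their own interpreter and GIL,
    # so the same work can run on multiple cores in parallel.
    with Pool() as pool:
        return pool.map(count_primes, limits)

if __name__ == "__main__":
    print(with_threads([1000, 1000]))
    print(with_processes([1000, 1000]))
```

Note that this bites pure-Python scripts most; C code can release the GIL around heavy computation, which I would guess is why Blender's C core can still thread internally.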

I would like to know where Blender is at with regards to this too.

> And is there any compiled list of what needs to be improved and what current
> problems block it from being parallelized?

Good question!

> Best regards,
> André

-- 
Douglas E Knapp

Massagen Arbeit in Gelsenkirchen Buer
http://bespin.org/~douglas/tcm/ztab1.htm
Please link to me and trade links with me!

Open Source Sci-Fi mmoRPG Game project.
http://sf-journey-creations.wikispot.org/Front_Page
http://code.google.com/p/perspectiveproject/

