[Bf-committers] Challenges for distributed rendering

Sergey Sharybin sergey.vfx at gmail.com
Mon Aug 4 15:17:09 CEST 2014


Linked files you can handle using the "Pack Blender Libraries" operator: it
packs all the .blend libraries into the current .blend, which you can then
send to the render farm.
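
From Python, a minimal sketch of the same thing (assuming the standard bpy
operators behave like their menu entries; untested here):

import bpy

# Pack all linked .blend libraries into the current file, same as
# File -> External Data -> Pack Blender Libraries.
bpy.ops.file.pack_libraries()

# Save so the self-contained .blend can be sent to the farm.
bpy.ops.wm.save_mainfile()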

But there are more concerns about rendering on the farm in general:
- You can't pack movies
- You can't pack image sequences
- You can't pack Movie Clip datablocks
- You can't pack bakes
- Anything else I'm missing?

Those are to be addressed by the render farm add-on, which should pass those
files along separately.
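
For illustration, a rough sketch of how an add-on could collect those
unpackable files so they can be shipped alongside the .blend (the exact set
of cases here is an assumption, and bakes/caches are not covered):

import bpy

external = set()

# Movies and image sequences can't be packed; plain images can, so skip
# anything already packed. For a SEQUENCE, filepath names only one frame,
# so the whole frame range has to be shipped.
for img in bpy.data.images:
    if img.packed_file is None and img.source in {'MOVIE', 'SEQUENCE'}:
        external.add(bpy.path.abspath(img.filepath))

# Movie Clip datablocks can't be packed either.
for clip in bpy.data.movieclips:
    external.add(bpy.path.abspath(clip.filepath))

print("\n".join(sorted(external)))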


On Sun, Aug 3, 2014 at 4:39 AM, Jonas Eschenburg <indyjo at gmx.de> wrote:

> I am developing a distributed rendering add-on for Blender, based on
> BitWrk (http://bitwrk.net). So far, I've come a long way by studying
> what others have been doing and getting inspiration from add-ons such as
> "netrender" and "render_renderfarmfi".
>
> One challenge in distributed rendering is replicating the complete set
> of files needed for rendering onto the worker computers.
>
> There are a couple of different cases:
>
> - Textures. Blender's Python API makes iterating over all images quite
> easy, even those that were linked from a library. There is
> "File->External Data->Pack all into blend" and the associated Python
> function (a short sketch follows after this list).
>
> - Libraries, i.e. linked blend files. Just like with images, it is easy
> to iterate over them and get to the file name. Also, there is
> "Object->Make Local->All" which copies all linked objects into the main
> blend file.
>
> - Caches and other resources. I haven't really looked into those yet.
> Let's exclude them for now.
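>
> As a sketch of the iteration above (nothing beyond the standard bpy API
> is assumed):
>
> import bpy
>
> # Textures: every image datablock, including library-linked ones.
> for img in bpy.data.images:
>     state = "packed" if img.packed_file else "external"
>     print(img.name, img.filepath, state)
>
> # Pack them, same as "File->External Data->Pack all into blend".
> bpy.ops.file.pack_all()
>
> # Libraries: every linked .blend file and its path.
> for lib in bpy.data.libraries:
>     print(lib.name, lib.filepath)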
>
> So, in order to have a nice, all-encompassing blend file that can be
> sent over the wire, the user would need to perform two distinct actions
> and then save the file. That might get annoying. Worse yet, it requires the
> user to make modifications to the blend file that have to be rolled back
> later on (you generally don't want to include all linked resources
> permanently).
>
> The renderfarm.fi add-on simply packs all textures, makes all objects
> local and saves the scene, without telling the user. Not very nice, if
> you ask me.
>
> The netrender add-on transmits all linked resources separately, but has
> to go to great lengths to restore referential integrity
> between files on the worker side. File names have to be changed for safe
> transmission, e.g. you can't use absolute paths for obvious security
> reasons.
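>
> Purely as an illustration of such a renaming scheme (not what netrender
> actually does), one could hash the original path:
>
> import hashlib
> import os
>
> def safe_name(path):
>     # A flat, collision-resistant name with no directory parts; keep
>     # the extension so file type detection still works.
>     digest = hashlib.sha1(path.encode("utf-8")).hexdigest()
>     return digest + os.path.splitext(path)[1]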
>
> My own approach was inspired by netrender's, in that I also transmit all
> connected files separately (but in one big transmission) and assign a
> unique ID to each file. Then, on the worker side, I store all files
> using their unique IDs, not under the original filenames. Blender is
> then started with a Python script that walks through all linked
> resources and replaces the original file names (which can't be resolved
> on the worker) with the new unique IDs.
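>
> A reduced sketch of that startup script (the mapping file name is
> hypothetical; the library part is exactly what breaks, see below):
>
> import json
> import bpy
>
> with open("mapping.json") as f:  # original absolute path -> unique ID
>     mapping = json.load(f)
>
> for img in bpy.data.images:
>     orig = bpy.path.abspath(img.filepath)
>     if orig in mapping:
>         img.filepath = mapping[orig]
>         img.reload()
>
> for lib in bpy.data.libraries:
>     orig = bpy.path.abspath(lib.filepath)
>     if orig in mapping:
>         # Too late: Blender already tried (and failed) to load it.
>         lib.filepath = mapping[orig]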
>
> This has turned out to be very troublesome because
> - I can't prevent Blender from trying to load all linked resources under
> their original file names initially,
> - and to make things even more complicated, Blender removes all
> references to libraries it can't load on startup.
>
> Ideally, there would be a function
> "save_main_file_under_another_name_with_all_external_resources_included()".
>
> Second best would be a way to manipulate all data on save time, perhaps
> by employing a visitor pattern.
>
> Third best would be a method to defer loading of external resources
> until a Python startup script has had a chance to replace file names
> properly.
>
> OK, does anyone have a suggestion for how to handle linked files nicely?
>
> Thanks,
> Jonas
>
>
>



-- 
With best regards, Sergey Sharybin

