[Bf-blender-npr] BEER: a possible action plan and request for comments/feedback.
ton at blender.org
Thu Oct 30 19:38:49 CET 2014
I've spent an hour with the BEER presentation at the conference, and it only left me more confused.
When I look at the goals and the project's presentation style (we, BEER, versus the rest), with that very narrow focus on doing what has apparently already been decided, it leaves me with only one piece of advice: just add this to Blender as an independent rendering engine, like Lux, Yafray, Mitsuba, or V-Ray. Use an add-on for the UI and editing. Start your own independent project; that's really much more fun and quite doable. Forget about realtime viewport drawing for now, and do it after we have our own viewport project done.
A project to gradually improve Blender's render system with better NPR features is welcome too, but it has been made clear to me that this is not in BEER's interest.
Ton Roosendaal - ton at blender.org - www.blender.org
Chairman Blender Foundation - Producer Blender Institute
Entrepotdok 57A - 1018AD Amsterdam - The Netherlands
On 29 Oct, 2014, at 7:25, Roberto Maurizzi wrote:
> Dear Ton (and all people that might be able to help),
> I am Roberto, one of the people interested in adding better/simpler 'expressive' rendering capabilities to Blender and helping with the BEER project.
> BEER's plan, after I joined the merry group, was to "copy-paste" the Cycles code, reusing its internal architecture but replacing the rendering part, mainly because we don't want to risk breaking BI (or Cycles) and we'd like to use C++ to be more productive. If this isn't advisable or acceptable, there are other possibilities.
> To enable anyone to experiment with their preferred render styles and algorithms, we will need to add user-programmable shader algorithms to the BI render code.
> My first guess on how to do this would be to try to "extract" the current fixed-pipeline BI render code and turn it into a pluggable module, so that, based on some condition (material? layer?), the render path can call either the original code or one of a series of new render modules, each of which implements the core rendering functions using whatever algorithm or technology its author chooses.
> Since we think GLSL is a good path (far more standard, stable, and simple than OSL, and useful as a game-design pre-visualization tool), we'd like to go that route. This will require us to extract any required scene, model, lighting, material, etc. information from the Database_FromScene data and prepare it for submission to the GLSL code, which will do the work currently done in new_render_result() (based on this document: http://wiki.blender.org/index.php/Dev:Source/Render/Pipeline). All the existing tile shader functions will have to be reimplemented in GLSL.
> Needless to say, my main worries are:
> 1) Will we be able to cleanly separate data extraction from data rendering without requiring a major rewrite of BI's render pipeline or breaking the current multithreaded, multi-tile architecture?
> 2) What would be the best way to let users choose one renderer or the other? Per scene? Per (Blender) layer? Per material? Per object or object group? Or is final compositing enough?
> 3) Will it be possible to use this renderer for the viewport too? (My guess is "yes", but I still have to check the existing code, let alone what's happening in the new Viewport FX code.)
> 4) How should we document all of this? AKA "what happened to the Sphinx/Doxygen docs?"
> At this point, we'd like very much to know your ideas about this and the best way(s) to implement it.
> What are the preferred improvement targets you (Ton) and the BF have for BI? What would you prefer we avoid?
> Any feedback (including "this is totally wrong, there's a much better way to do it!") will be very much appreciated!
> Roberto Maurizzi
> (with some help from the others)