[Bf-funboard] Freestyle into Blender: Howto?
psyborgue at mac.com
Thu Mar 8 23:29:28 CET 2007
On Mar 8, 2007, at 3:51 PM, Matt Ebb wrote:
> On 09/03/2007, at 12:08 AM, Michael Crawford wrote:
>> On Mar 8, 2007, at 2:20 AM, Stéphane Grabli wrote:
>>> In short: How should a contour-line renderer be integrated
>>> to Blender?
>> Well. I think the compositing node was a really bad idea
> However, even if freestyle doesn't work as a post process, it would
> still be great to have various render pass outputs to the
> compositor via the render result node, so you can tweak it later.
Just to be clear, in case I wasn't earlier (I was in a hurry):
multiple renderers already work with linked scenes. I often use this
to my advantage: for example, I render a background scene at half or
quarter resolution, point one of the "render result" nodes in another
scene at that scene, scale the result up, set up a blur... etc. The
result is, when I hit render, every scene, layer, etc. that the final
composite depends on gets rendered at once. This lets me set up a
render with multiple renderers and composite it all by hitting one
button. I can use fancy effects (Yafray HDRI etc.) in the background
and get away with it since I'm rendering at such a low resolution.
Since the result is blurred, one sample per pixel should even be enough.
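The low-res background trick above can be sketched in plain Python (hypothetical helpers, not Blender API): render the background small, scale it up with a cheap nearest-neighbour filter, then blur so the upscaling blockiness disappears.

```python
# Toy model of the half/quarter-res background workflow described above.
# A 2x2 "quarter res" background costs 4 samples instead of 64 at 8x8.

def upscale_nearest(img, factor):
    """Nearest-neighbour upscale of a 2D grayscale image (list of rows)."""
    out = []
    for row in img:
        wide = [px for px in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out

def box_blur(img):
    """3x3 box blur with edge clamping, hiding the upscale artifacts."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = n = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

small = [[0.0, 1.0],
         [1.0, 0.0]]
full = box_blur(upscale_nearest(small, 4))  # 8x8 result from a 2x2 render
```

In the real compositor the same roles are played by the Render Result, Scale, and Blur nodes; the point is only that the expensive render happens at a fraction of the final pixel count.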
If all Freestyle does is plop out fancy outlines, that's great. If
it's integrated as another renderer, it can be used through another
scene. For example, I can have:

scene 1: Blender
scene 2: Freestyle

I can either choose one scene to be the master or create a third, such as:

scene 3: Composite, in which I can call the two other scenes.

In the Freestyle scene I could set objects to use separate materials
(Blender conveniently lets you do this) while still maintaining any
other changes that are made (such as tweaks to an animation, etc.).
This would allow me to adjust Freestyle shading attributes on a
per-object basis and control exactly how each object is rendered.
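The linked-scene material trick can be sketched as a toy model (plain Python, not Blender API): both scenes reference the same object data, but each scene carries its own object-to-material mapping, so a mesh or animation tweak shows up everywhere while shading stays per-scene.

```python
# Shared object data, referenced by every linked scene.
objects = {"Cube": {"verts": 8}}

# Each scene keeps its own material assignments for the same objects.
scene_blender   = {"materials": {"Cube": "Clay"}}
scene_freestyle = {"materials": {"Cube": "LineStyle"}}

# An animation/mesh tweak touches the shared data once...
objects["Cube"]["verts"] = 26  # e.g. after a subdivide

# ...while each scene resolves shading independently.
def material_for(scene, name):
    return scene["materials"][name]
```

This is only an illustration of the data-sharing idea; in Blender the split is between linked object data and per-scene (or per-object) material overrides.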
> This is already the case with Blender's current 'edge' output.
> Throwing another idea out there: What if freestyle was integrated
> as another material type? It could be done in a similar way to how
> Halo works. That way it would be much easier to integrate both line
> rendered and non-line rendered imagery in the one shot.
A few questions: how would lines and the regular material work at
the same time? I mean, I can't assign two materials to the same
object, can I? And if Freestyle were changed into another material
type like "halo", wouldn't that involve integration into the main
renderer? Hey, if Ton goes for it, great! But I can see reasons he
might not.
Personally, I'd like to see a plethora of renderers supported by
Blender, so I could use the best renderer per element/pass. Perhaps
a Universal Rendering Interface / API of some sort could be proposed,
designed, and implemented so third parties can easily integrate with
Blender without Python scripts (Python being a language that is
constantly changing). If this were done, perhaps the renderers could
even send previews to Blender on request (very handy for the preview
window).
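One possible shape for the "Universal Rendering Interface" suggested above (purely hypothetical; no such Blender API exists) is a small contract every external renderer would implement, including the on-request preview the preview window could use:

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    """Hypothetical contract an external renderer would implement."""

    @abstractmethod
    def render(self, scene, width, height):
        """Return the final frame for `scene` as rows of pixel values."""

    @abstractmethod
    def preview(self, scene, width, height):
        """Return a quick, lower-quality frame for the preview window."""

class FlatGray(Renderer):
    """Trivial stand-in renderer used to exercise the interface."""

    def render(self, scene, width, height):
        return [[0.5] * width for _ in range(height)]

    def preview(self, scene, width, height):
        # Previews trade quality for speed; here we just halve resolution.
        return self.render(scene, width // 2, height // 2)
```

Blender would only ever talk to the abstract interface, so Pixie, Indigo, or PRMan back-ends could be swapped in without touching the host.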
Right now, I use Blender's internal renderer almost exclusively,
chiefly because it is WYSIWYG. I would love to be able to use
renderers like Pixie (ah, the glory of REYES), Indigo, or even PRMan
within Blender. Being a CVS addict, however, I find that scripts that
work one day often break the next (Python, grrrr...).
This could also have added benefits. AFAIK, if renderers were
completely separate external processes (merely communicating with
Blender, for example), they could either work around the 64-bit
limitation or, in the case of Mac OS / Linux, run 64-bit natively.
As I understand it, the reluctance to switch to 64-bit is only
because of the problems with Blender's SDNA design (breaking file
compatibility). So who cares if the renderer, a separate, output-only
process, is 64-bit? That's the only place it is really needed anyway.
And OS X Tiger supports 64-bit command-line apps... Perhaps the
renderer could act like a frameserver, receiving scene data and
forwarding rendered frames back to Blender.
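The frameserver idea can be sketched in a few lines of Python: the renderer runs as a separate process (so it could be 64-bit even if the host isn't), receives scene data on stdin, and streams the rendered frame back on stdout. The "renderer" here is a stand-in one-liner, not a real render engine.

```python
import subprocess
import sys

# Stand-in external renderer: reads a scene description from stdin and
# writes a fake "frame" back to stdout.
CHILD = (
    "import sys;"
    "scene = sys.stdin.read();"
    "sys.stdout.write('frame-for:' + scene)"
)

def render_external(scene_data):
    """Hand scene data to the external renderer process, return its frame."""
    proc = subprocess.run(
        [sys.executable, "-c", CHILD],
        input=scene_data, capture_output=True, text=True, check=True,
    )
    return proc.stdout

frame = render_external("cube.blend")
```

A real version would stream binary pixel data (or use a socket, so the renderer could live on another machine), but the host/renderer separation is the same.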
At least that's the way I understand it. I may be wrong; I quit
developing a few years back (burnout) and went into art.
> Matt Ebb . matt at mke3.net . http://mke3.net