[Bf-viewport] Rendering to a cubemap

Clément FOUCAULT foucault.clem at gmail.com
Tue Mar 22 14:32:55 CET 2016


In case anyone is interested, I managed to get cubemap capturing to work
inside Blender's viewport.

http://blenderartists.org/forum/showthread.php?343278-GLSL-PBR-Shader-for-viewport&p=3025302&viewfull=1#post3025302
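
The core of the approach boils down to something like the sketch below
(a simplified standalone example, not the actual code from my branch;
the GLEW setup and the draw_scene callback are placeholders):

    /* Minimal cubemap-capture sketch, assuming a GL 3.3 context and GLEW.
     * draw_scene() stands in for whatever draws the viewport for one face. */
    #include <GL/glew.h>

    void render_to_cubemap(GLuint fbo, GLuint cubemap_tex, int size,
                           void (*draw_scene)(int face))
    {
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glViewport(0, 0, size, size);

        for (int face = 0; face < 6; face++) {
            /* Attach one cube face at a time as the color target. */
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_CUBE_MAP_POSITIVE_X + face,
                                   cubemap_tex, 0);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            /* The caller is expected to set up a 90 degree FOV view
             * matrix for this face before drawing. */
            draw_scene(face);
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }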

If someone more experienced than me wants to take a look at my code and tell
me where I'm going wrong, I would be very grateful.

I don't have a diff file for now, but I have a GitHub repository:
https://github.com/Hypersomniac/blender-shader

2016-01-29 2:06 GMT+01:00 Brecht Van Lommel <brechtvanlommel at pandora.be>:

> Hi Clément,
>
> Thanks for your insights. It's a great overview of the trade-offs of
> the various methods.
>
> I agree about deferred shading, it adds limitations for artists in the
> shaders and lighting they can create, and it really would be a very
> different rendering system. It may give better performance for games,
> but for use cases like realtime rendered short movies (e.g. Glass
> Half) or Cycles previews it's not so great. There are more important
> things to solve before complicating the system with deferred shading.
>
> Brecht.
>
>
> On Fri, Jan 29, 2016 at 12:55 AM, Clément FOUCAULT
> <foucault.clem at gmail.com> wrote:
> > About the ubershader:
> > I don't think it's easier. The current viewport rendering is done in
> > forward shading and is already built to allow a wide variety of shaders.
> > That's why my build exists: because it was "easy".
> > Making the viewport work in deferred shading would imply another render
> > "engine" (I'm talking about the drop-down menu in the top header), as the
> > limitations of such a method are not compatible with previewing Cycles
> > materials. It's not easier. It would just be faster to render with
> > complex materials (but those materials would be subject to limitations).
> >
> > Clément
> >
> >
> > 2016-01-28 23:49 GMT+01:00 Hadrien Brissaud <hadriscus at gmail.com>:
> >>
> >> Just my two cents: from a user POV, having a monolithic ubershader for
> >> the viewport seems fine; it probably covers most uses, and you make it
> >> sound like it's way easier to make. Now, my use case is more about having
> >> a nice preview of actual Cycles shaders (i.e. not necessarily
> >> super-accurate) rather than a full-fledged CryEngine, but maybe not
> >> everyone is in the same boat.
> >>
> >> By the way, thanks a lot Clément for contributing.
> >>
> >> Hadrien
> >>
> >> On 28 January 2016 at 23:37, Clément FOUCAULT <foucault.clem at gmail.com>
> >> wrote:
> >>>
> >>> Hi Ton,
> >>>
> >>> I don't have a design doc, but I can share some thoughts about what I
> >>> have read on this mailing list.
> >>>
> >>> Please note that I am not very familiar with all the concepts below, so
> >>> I may well be mistaken. I am a 3D artist before being a coder.
> >>> Also please note that I'm primarily focusing on the viewport and not on
> >>> the game engine.
> >>> Some of these notes have already been expressed by other participants.
> >>>
> >>> About deferred shading:
> >>> I think it's a bad idea overall.
> >>> Why? Because of the number and flexibility of Blender's node setups, we
> >>> can have all sorts of BSDFs inside one shader, and thus a lot of
> >>> parameters.
> >>> In fact the number of parameters is not fixed, and we cannot pass them
> >>> to the shading step without using a gigantic G-buffer.
> >>> So deferred shading would be nice ONLY if we had a static "ubershader"
> >>> like Unreal's, with consistent inputs. But that would mean dropping
> >>> features.
> >>> Even Epic is reluctant to add anisotropic shading because of the cost of
> >>> another parameter in the G-buffer.
> >>> I know we could support both and switch to whichever fits the task
> >>> better, but in the short run I think we should focus on only one of
> >>> these.
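> >>>
> >>> To make the G-buffer point concrete, here is a minimal sketch (purely
> >>> illustrative, not code from my branch) of how a deferred renderer
> >>> typically allocates fixed render targets with OpenGL; any material
> >>> parameter that does not fit one of these attachments never reaches the
> >>> shading pass (the layout and names below are invented for the example):
> >>>
> >>>     /* Hypothetical fixed G-buffer: albedo+metallic, normal+roughness,
> >>>      * emission. Assumes a GL 3.3 context and GLEW. */
> >>>     #include <GL/glew.h>
> >>>
> >>>     static GLuint gbuffer_target(int w, int h, GLenum internal_format)
> >>>     {
> >>>         GLuint tex;
> >>>         glGenTextures(1, &tex);
> >>>         glBindTexture(GL_TEXTURE_2D, tex);
> >>>         glTexImage2D(GL_TEXTURE_2D, 0, internal_format, w, h, 0,
> >>>                      GL_RGBA, GL_FLOAT, NULL);
> >>>         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
> >>>         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
> >>>         return tex;
> >>>     }
> >>>
> >>>     void create_gbuffer(int w, int h, GLuint *fbo)
> >>>     {
> >>>         glGenFramebuffers(1, fbo);
> >>>         glBindFramebuffer(GL_FRAMEBUFFER, *fbo);
> >>>
> >>>         GLuint albedo_metallic  = gbuffer_target(w, h, GL_RGBA8);
> >>>         GLuint normal_roughness = gbuffer_target(w, h, GL_RGBA16F);
> >>>         GLuint emission         = gbuffer_target(w, h, GL_R11F_G11F_B10F);
> >>>
> >>>         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
> >>>                                GL_TEXTURE_2D, albedo_metallic, 0);
> >>>         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
> >>>                                GL_TEXTURE_2D, normal_roughness, 0);
> >>>         glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT2,
> >>>                                GL_TEXTURE_2D, emission, 0);
> >>>
> >>>         GLenum bufs[3] = {GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1,
> >>>                           GL_COLOR_ATTACHMENT2};
> >>>         glDrawBuffers(3, bufs);
> >>>     }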
> >>>
> >>> About light probes:
> >>> They need to be HDR (kind of obvious). They also need to be dynamic
> >>> (rendered from OpenGL); otherwise it won't be worth it for the user to
> >>> hit a render button and wait at least 30 seconds for a preview.
> >>> Pre-convolving (baking) the BSDF into the cubemap can be slow even on
> >>> the GPU (500 ms for a 64 px cubemap with a diffuse BRDF on an average
> >>> GPU), so I would prefer to have this as an option.
> >>> Also, we can't convolve for every BSDF; that would mean a lot of
> >>> cubemaps.
> >>> As I said previously, I'm in favor of a global cubemap which would be
> >>> the result of the world node tree. This one would be applied by default
> >>> to all materials / objects.
> >>> Then, similarly to the Blender Internal envmap texture, we could specify
> >>> a viewpoint for a per-object cubemap.
> >>> To get accurate reflections, Unreal Engine corrects the ray that samples
> >>> the cubemap so that it fits a sphere or box volume. This would be useful
> >>> for indoor scenes.
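> >>>
> >>> That correction is roughly the usual box-projection trick; here is a
> >>> small illustrative sketch (helper names are made up, and it assumes the
> >>> shading point is inside the box and a non-degenerate reflection vector):
> >>>
> >>>     /* Parallax-corrected cubemap lookup against an axis-aligned box:
> >>>      * intersect the reflection ray with the box, then use the direction
> >>>      * from the cubemap capture position to the hit point. */
> >>>     typedef struct { float x, y, z; } vec3;
> >>>
> >>>     vec3 box_corrected_lookup(vec3 pos, vec3 r, vec3 box_min,
> >>>                               vec3 box_max, vec3 cube_center)
> >>>     {
> >>>         /* Distance along r to the box wall in each axis. */
> >>>         float tx = ((r.x > 0.0f ? box_max.x : box_min.x) - pos.x) / r.x;
> >>>         float ty = ((r.y > 0.0f ? box_max.y : box_min.y) - pos.y) / r.y;
> >>>         float tz = ((r.z > 0.0f ? box_max.z : box_min.z) - pos.z) / r.z;
> >>>         float t = tx < ty ? (tx < tz ? tx : tz) : (ty < tz ? ty : tz);
> >>>
> >>>         /* Hit point on the box, re-expressed relative to the capture
> >>>          * position; this is the corrected cubemap lookup direction. */
> >>>         return (vec3){pos.x + r.x * t - cube_center.x,
> >>>                       pos.y + r.y * t - cube_center.y,
> >>>                       pos.z + r.z * t - cube_center.z};
> >>>     }
> >>>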
> >>> As of now, my branch uses importance sampling
> >>> ( http://http.developer.nvidia.com/GPUGems3/gpugems3_ch20.html ) because
> >>> it does not need to precompute anything.
> >>> The number of samples could be exposed to the user: raise it for the
> >>> most accurate reflections when rendering, or lower it for performance.
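> >>>
> >>> For illustration, a common way to generate the sample directions is the
> >>> Hammersley + GGX construction; a CPU-side sketch of it follows (the real
> >>> thing lives in GLSL, the names below are mine, and this is not
> >>> necessarily exactly what my branch does):
> >>>
> >>>     /* Generates the i-th of n GGX-distributed half-vectors in tangent
> >>>      * space (+Z is the surface normal) for a given roughness. */
> >>>     #include <math.h>
> >>>     #include <stdint.h>
> >>>
> >>>     static float radical_inverse_vdc(uint32_t bits)
> >>>     {
> >>>         bits = (bits << 16u) | (bits >> 16u);
> >>>         bits = ((bits & 0x55555555u) << 1u) | ((bits & 0xAAAAAAAAu) >> 1u);
> >>>         bits = ((bits & 0x33333333u) << 2u) | ((bits & 0xCCCCCCCCu) >> 2u);
> >>>         bits = ((bits & 0x0F0F0F0Fu) << 4u) | ((bits & 0xF0F0F0F0u) >> 4u);
> >>>         bits = ((bits & 0x00FF00FFu) << 8u) | ((bits & 0xFF00FF00u) >> 8u);
> >>>         return (float)bits * 2.3283064365386963e-10f; /* 1 / 2^32 */
> >>>     }
> >>>
> >>>     void importance_sample_ggx(uint32_t i, uint32_t n, float roughness,
> >>>                                float h_out[3])
> >>>     {
> >>>         const float a   = roughness * roughness;
> >>>         const float xi1 = (float)i / (float)n;            /* Hammersley */
> >>>         const float xi2 = radical_inverse_vdc(i);
> >>>
> >>>         const float phi = 2.0f * (float)M_PI * xi1;
> >>>         const float cos_t =
> >>>             sqrtf((1.0f - xi2) / (1.0f + (a * a - 1.0f) * xi2));
> >>>         const float sin_t = sqrtf(1.0f - cos_t * cos_t);
> >>>
> >>>         h_out[0] = sin_t * cosf(phi);
> >>>         h_out[1] = sin_t * sinf(phi);
> >>>         h_out[2] = cos_t;
> >>>     }
> >>>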
> >>> For diffuse lighting I'm using spherical harmonics.
> >>> https://seblagarde.wordpress.com/2012/01/08/pi-or-not-to-pi-in-game-lighting-equation/
> >>> Computing them depends on the size of the map (every texel is read) and
> >>> on the type of mapping, but it's fairly quick: only about 1 second for a
> >>> 2000x1000 px equirectangular envmap.
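> >>>
> >>> The computation is just one pass over the texels, something along these
> >>> lines (a rough CPU sketch; the function name and RGB layout are my own
> >>> choices for the example):
> >>>
> >>>     /* Project an equirectangular HDR map onto 9 spherical-harmonic
> >>>      * coefficients (bands 0-2). "pixels" holds width*height RGB floats. */
> >>>     #include <math.h>
> >>>
> >>>     void project_equirect_to_sh9(const float *pixels, int width,
> >>>                                  int height, float sh_out[9][3])
> >>>     {
> >>>         for (int i = 0; i < 9; i++)
> >>>             sh_out[i][0] = sh_out[i][1] = sh_out[i][2] = 0.0f;
> >>>
> >>>         for (int y = 0; y < height; y++) {
> >>>             const float theta = (float)M_PI * (y + 0.5f) / height;
> >>>             /* Solid angle covered by one texel at this latitude. */
> >>>             const float domega = (2.0f * (float)M_PI / width) *
> >>>                                  ((float)M_PI / height) * sinf(theta);
> >>>
> >>>             for (int x = 0; x < width; x++) {
> >>>                 const float phi = 2.0f * (float)M_PI * (x + 0.5f) / width;
> >>>                 const float dx = sinf(theta) * cosf(phi);
> >>>                 const float dy = sinf(theta) * sinf(phi);
> >>>                 const float dz = cosf(theta);
> >>>
> >>>                 /* Real SH basis functions, bands 0..2. */
> >>>                 const float sh[9] = {
> >>>                     0.282095f,
> >>>                     0.488603f * dy, 0.488603f * dz, 0.488603f * dx,
> >>>                     1.092548f * dx * dy, 1.092548f * dy * dz,
> >>>                     0.315392f * (3.0f * dz * dz - 1.0f),
> >>>                     1.092548f * dx * dz, 0.546274f * (dx * dx - dy * dy),
> >>>                 };
> >>>
> >>>                 const float *rgb = &pixels[3 * (y * width + x)];
> >>>                 for (int i = 0; i < 9; i++) {
> >>>                     sh_out[i][0] += rgb[0] * sh[i] * domega;
> >>>                     sh_out[i][1] += rgb[1] * sh[i] * domega;
> >>>                     sh_out[i][2] += rgb[2] * sh[i] * domega;
> >>>                 }
> >>>             }
> >>>         }
> >>>     }
> >>>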
> >>> Any kind of (lengthy) pre-computation should not freeze Blender's UI,
> >>> much like the render engines already handle it.
> >>>
> >>> About lamps:
> >>> I'm already using Epic's approach to area lights.
> >>> I'm using tube lights for area (rectangular) lights, as I was not able
> >>> to find a correct algorithm for rectangles.
> >>> But energy conservation is not perfect with this solution, and it does
> >>> not quite match Cycles at grazing angles.
> >>> We could use this paper https://www.youtube.com/watch?v=O3TG1VXx1tg but
> >>> I highly doubt the computational cost is worth it.
> >>> Another technique is by Michal Drobot in GPU Pro 5, but I did not have
> >>> much success with it.
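> >>>
> >>> For context, the core of Epic's approach is the "representative point"
> >>> idea; here is a sketch of the sphere-light case (the tube case adds a
> >>> closest-point-on-segment step first; the helpers and names are mine):
> >>>
> >>>     /* Pick the point on a sphere light closest to the reflection ray
> >>>      * and shade as if the light came from that point. */
> >>>     #include <math.h>
> >>>
> >>>     typedef struct { float x, y, z; } vec3;
> >>>
> >>>     static float v_dot(vec3 a, vec3 b)
> >>>     {
> >>>         return a.x * b.x + a.y * b.y + a.z * b.z;
> >>>     }
> >>>
> >>>     /* l: shading point to light center, r: normalized reflection vector,
> >>>      * radius: light radius. Returns shading point to representative point. */
> >>>     vec3 sphere_light_representative_point(vec3 l, vec3 r, float radius)
> >>>     {
> >>>         /* Vector from the light center to its closest point on the ray. */
> >>>         float d = v_dot(l, r);
> >>>         vec3 center_to_ray = {r.x * d - l.x, r.y * d - l.y, r.z * d - l.z};
> >>>
> >>>         float len = sqrtf(v_dot(center_to_ray, center_to_ray));
> >>>         float t = radius / (len > 1e-6f ? len : 1e-6f);
> >>>         if (t > 1.0f) t = 1.0f; /* the ray already hits the sphere */
> >>>
> >>>         return (vec3){l.x + center_to_ray.x * t,
> >>>                       l.y + center_to_ray.y * t,
> >>>                       l.z + center_to_ray.z * t};
> >>>     }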
> >>>
> >>> About aligning with UE4:
> >>> Their implementation of PBR is already achievable inside Cycles.
> >>> They just use baking to do less computation at runtime.
> >>> My branch already borrows a lot of their methods.
> >>> Supporting Cycles shaders implies supporting UE4 shaders.
> >>> If you want to talk about the rendering pipeline, that's another topic.
> >>>
> >>> About the viewport framebuffer:
> >>> We need a float framebuffer for a lot of effects, like bloom,
> >>> tone/color mapping, and quality depth of field.
> >>> We could also render quick passes with float information (like Z depth
> >>> or world normals, for instance) instead of doing a slow Cycles pass.
> >>> I suggested the R11F_G11F_B10F format, but that means having a separate
> >>> alpha buffer.
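> >>>
> >>> Concretely, that split would look something like this (a sketch only;
> >>> the 8-bit alpha choice and the GLEW assumption are mine):
> >>>
> >>>     /* HDR color in a packed-float texture plus a separate single-channel
> >>>      * alpha texture, assuming a GL 3.3 context and GLEW. */
> >>>     #include <GL/glew.h>
> >>>
> >>>     void create_viewport_targets(int w, int h, GLuint tex_out[2])
> >>>     {
> >>>         glGenTextures(2, tex_out);
> >>>
> >>>         /* HDR color, no alpha: 11+11+10 bits of packed float. */
> >>>         glBindTexture(GL_TEXTURE_2D, tex_out[0]);
> >>>         glTexImage2D(GL_TEXTURE_2D, 0, GL_R11F_G11F_B10F, w, h, 0,
> >>>                      GL_RGB, GL_FLOAT, NULL);
> >>>
> >>>         /* Separate alpha buffer. */
> >>>         glBindTexture(GL_TEXTURE_2D, tex_out[1]);
> >>>         glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, w, h, 0,
> >>>                      GL_RED, GL_UNSIGNED_BYTE, NULL);
> >>>     }
> >>>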
> >>> To get "correct shading", with only the indirect light being multiplied
> >>> by ambient occlusion, we need to separate the direct lighting from the
> >>> indirect lighting.
> >>> Unfortunately, that would mean yet another float buffer.
> >>> Maybe there is a way to encode everything, alpha included, inside one
> >>> big 64-bit buffer, but it can get a bit complicated.
> >>> I've seen people do this kind of thing, but I'm not completely sure it's
> >>> worth it.
> >>>
> >>> RGBA16    R16   G16   B16   A16
> >>> Direct    8bits 8bits 8bits 4bits -> R9G10B9
> >>> Indirect  8bits 8bits 8bits 4bits -> R9G10B9
> >>> Alpha     -     -     -     8bits
> >>>
> >>> I don't know whether those extra bits would be enough.
> >>> Also, let's not forget the depth buffer.
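> >>>
> >>> One possible reading of the table, written out as plain 64-bit packing
> >>> (just an illustration; how the bits map onto the four 16-bit channels,
> >>> and how the HDR values get range-compressed into [0,1] first, is left
> >>> open and is part of what makes this complicated):
> >>>
> >>>     /* Quantize direct RGB (9/10/9 bits), indirect RGB (9/10/9 bits) and
> >>>      * alpha (8 bits) into the 64 bits one RGBA16 texel provides. */
> >>>     #include <stdint.h>
> >>>
> >>>     static uint64_t quantize(float v, int bits)
> >>>     {
> >>>         const float max = (float)((1u << bits) - 1u);
> >>>         if (v < 0.0f) v = 0.0f;
> >>>         if (v > 1.0f) v = 1.0f;
> >>>         return (uint64_t)(v * max + 0.5f);
> >>>     }
> >>>
> >>>     uint64_t pack_direct_indirect_alpha(const float direct[3],
> >>>                                         const float indirect[3],
> >>>                                         float alpha)
> >>>     {
> >>>         uint64_t p = 0;
> >>>         p |= quantize(direct[0],    9) << 0;
> >>>         p |= quantize(direct[1],   10) << 9;
> >>>         p |= quantize(direct[2],    9) << 19;
> >>>         p |= quantize(indirect[0],  9) << 28;
> >>>         p |= quantize(indirect[1], 10) << 37;
> >>>         p |= quantize(indirect[2],  9) << 47;
> >>>         p |= quantize(alpha,        8) << 56;
> >>>         return p;
> >>>     }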
> >>>
> >>> I'd really also like to see screen-space local reflections in the
> >>> viewport, but that requires a way to tag the relevant areas of the
> >>> framebuffer that should blend with a reflection buffer.
> >>> And it should overwrite only the indirect specular (envmap), which
> >>> basically means splitting up the color buffer yet again, and it can go
> >>> pretty far.
> >>> So I'm not positive about this, but if someone has an idea here, that
> >>> would be cool.
> >>>
> >>> I can populate some design docs (or at least try to), but as I
> >>> mentioned earlier, I'm not familiar with all the OpenGL drawing methods
> >>> and how Blender handles them.
> >>>
> >>> I'll be around on IRC when I have more time. I'm starting a job on
> >>> Monday, so it won't be easy for me.
> >>>
> >>> I hope we can decide where to go and help each other.
> >>>
> >>> Thank you for your interest.
> >>>