[Bf-committers] Camera, displacement and normals

Gregor Mückl bf-committers@blender.org
Tue, 20 Jan 2004 23:30:15 +0100


Hi!

Unfortunately I was not able to solve this problem by using xparts and yparts 
(I always hit the limit of 64 parts while trying different variants). In fact 
I do not see any change in the images: the seams between the images are still 
the same with normal mapping enabled.

The geometry and the color texture fit together seamlessly in any case. Only 
normal textures spoil the fun. I haven't tried displacement yet, since you 
pointed out that it is a bit shakier than normal textures.

Just so that you understand me correctly: the images that blender creates are 
just fine. I only cannot assemble them the way I want because of how the 
normal mapping is taken into account in the renderer (and I don't see any 
reason why it is done the way it is in blender right now; in fact I think it 
is wrong, because it looks like the camera direction is used instead of the 
ray direction).
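To put numbers on this, the divergence between the camera front vector and the per-pixel ray direction can be checked with a short sketch (plain Python, assuming a simple pinhole camera model; the function name and coordinate convention are illustrative, not Blender's internals):

```python
import math

def ray_camera_angle(x, y, fov_deg=90.0):
    """Angle in degrees between the camera front vector (0, 0, -1) and
    the ray through normalized image coordinates x, y in [-1, 1]."""
    half = math.tan(math.radians(fov_deg) / 2.0)  # = 1.0 for a 90-degree fov
    # Ray direction through a point on the image plane at z = -1.
    rx, ry, rz = x * half, y * half, -1.0
    length = math.sqrt(rx * rx + ry * ry + rz * rz)
    # Front vector is (0, 0, -1), so cos(angle) = -rz / |ray| = 1 / |ray|.
    return math.degrees(math.acos(1.0 / length))

print(ray_camera_angle(1.0, 0.0))  # edge center: 45 degrees
print(ray_camera_angle(1.0, 1.0))  # image corner: about 54.7 degrees
```

So at a 90-degree fov the two vectors disagree by 45 degrees at the edge centers and by roughly 54.7 degrees at the corners, which matches the magnitude of the seams described below.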

For now I would be happy if someone could give me a good explanation of the 
reasons for this implementation.
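For context, the six-camera capture described further down in the thread amounts to pointing a 90-degree camera along each signed coordinate axis. A minimal sketch of that setup (plain Python; the particular up-vector convention is a hypothetical choice, any consistent one works):

```python
# Six cube-map faces: camera front vector paired with a matching up vector.
# (Illustrative convention; any consistent right-handed set is fine.)
CUBE_FACES = {
    "+x": ((1, 0, 0), (0, 1, 0)),
    "-x": ((-1, 0, 0), (0, 1, 0)),
    "+y": ((0, 1, 0), (0, 0, -1)),
    "-y": ((0, -1, 0), (0, 0, 1)),
    "+z": ((0, 0, 1), (0, 1, 0)),
    "-z": ((0, 0, -1), (0, 1, 0)),
}

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Sanity check: each face's front and up are perpendicular unit vectors,
# so the six renders tile the full sphere without gaps or overlap.
for name, (front, up) in CUBE_FACES.items():
    assert dot(front, front) == 1 and dot(up, up) == 1
    assert dot(front, up) == 0, name
```

The seams only stay invisible if every shading term depends on the actual ray through each pixel, not on a single per-camera vector; anything tied to the camera front vector will jump at the face boundaries.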

Regards,
Gregor

On Tuesday, 20 January 2004 at 21:03, Ton Roosendaal wrote:
> Hi,
>
> This is what panorama render was meant for. You can set any lens, Xsize
> (pixels per part), and Xparts for that. But whether this solves the
> problem for displacement mapping you'll have to check.
> If it doesn't, we just have a bug. I can't think of a reason why it
> should not work.
>
> Another report here (glass + displacement) shows that there's still an
> issue to work on.
>
> -Ton-
>
> On Tuesday, Jan 20, 2004, at 15:41 Europe/Amsterdam, Gregor Mückl wrote:
> > On Tuesday, 20 January 2004 at 15:38, Gregor Mückl wrote:
> >> Hi!
> >>
> >> I'm currently in the process of building a game engine which allows
> >> the player to look at panoramas, with viewing angles of 180 degrees
> >> in the vertical direction and 360 degrees in the horizontal direction.
> >>
> >> I am able to render these panoramas within blender with a small trick:
> >> I use a camera fov of 90 degrees and render 6 images - two along each
> >> coordinate axis - and project them onto a cube which is rendered in
> >> realtime. This method works quite well.
> >>
> >> Obviously, the rendered images must fit together seamlessly to create
> >> this illusion. However, I discovered that when mapping texture output
> >> onto the displacement or normal of a material, the images won't fit as
> >> expected. This is quite disappointing, because it means these features
> >> aren't available to the artists working on this game.
> >>
> >> Why is this? Does blender's renderer use the camera front vector
> >> instead of the ray direction for this? With a field of view of 90
> >> degrees the angle between these reaches 45 degrees at the edge centers
> >> and is even greater at the corners of the image. This would explain
> >> the effects I'm seeing.
> >>
> >> Is there a reason for doing it the way it is done right now? Would it
> >> be a big deal to alter this behaviour?
> >>
> >> Regards,
> >> Gregor
> >
> > I'm sorry to reply to my own email, but I've just had another thought
> > on this: what about blender's panorama rendering option? Would the same
> > problem arise there if xparts and yparts are not chosen large enough?
> >
> > Regards,
> > Gregor
> > _______________________________________________
> > Bf-committers mailing list
> > Bf-committers@blender.org
> > http://www.blender.org/mailman/listinfo/bf-committers
>
> --------------------------------------------------------------------------
> Ton Roosendaal  Blender Foundation ton@blender.org
> http://www.blender.org
>