[Bf-committers] Camera, displacement and normals

Gregor Mückl bf-committers@blender.org
Wed, 21 Jan 2004 09:31:16 +0100


On Wednesday 21 January 2004 03:15, Robert Wenzlaff wrote:
> On Tuesday 20 January 2004 09:38, Gregor Mückl wrote:
> > Hi!
> >
> > I'm currently in the process of building a game engine which allows the
> > player to look at panoramas, with viewing angles of 180 degrees in the
> > vertical direction and 360 degrees in the horizontal direction.
> >
> > I am able to render these panoramas within blender with a small trick: I
> > use a camera fov of 90 degrees and render 6 images - two along each
> > coordinate axis - and project them onto a cube which is rendered in
> > realtime. This method works quite well.
> >
> > Obviously, the rendered images must fit together seamlessly to create
> > this illusion. However, I discovered that when mapping texture output
> > onto the displacement or normal of a material, the images won't fit as
> > expected. This is quite disappointing, because it means these features
> > aren't available to the artists working on this game.
> >
> > Why is this? Does blender's renderer use the camera front vector instead
> > of the ray direction for this? With a field of view of 90 degrees the
> > angle between these gets to 45 degrees at the edge centers and even
> > greater than this at the corners of the image. This would explain the
> > effects I'm seeing.
> >
> > Is there a reason for doing it the way it is done right now? Would it be
> > a big deal to alter this behaviour?
>
> It may have been related to the normal flipping, or I may not quite
> understand the steps to recreate this.  But after the last commit, I
> rendered these two images with the camera at the origin, rotating it 90
> degrees between #1 and #2.
>
> http://www.soylent-green.com/Test1.jpg
> http://www.soylent-green.com/Test2.jpg
>
> Pasting them together, I see no seam:
>
> http://www.soylent-green.com/Test1-2.jpg
>
> Robert Wenzlaff
>

Thanks for your effort. This is indeed quite strange, then, because I do see
seams when I render in blender. I've created an example in blender of what I'm
trying to do (using yafray as the renderer turns out to work, btw).
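
For reference, the full cube setup from my first mail uses one camera per
axis direction, all sitting at the same point with a 90 degree fov (a lens
value of 16 should give exactly that, if I remember the conversion
correctly), and only rotated to face along the axes. Here is a rough sketch
of the euler angles I set by hand (plain Python just to document the values,
not an actual blender script):

from math import radians, sin, cos

def view_dir(rx, ry, rz):
    # direction a blender-style camera (looking down its local -Z axis)
    # ends up pointing at, for XYZ euler angles given in degrees
    x, y, z = 0.0, 0.0, -1.0
    a = radians(rx); y, z = y*cos(a) - z*sin(a), y*sin(a) + z*cos(a)
    a = radians(ry); x, z = x*cos(a) + z*sin(a), -x*sin(a) + z*cos(a)
    a = radians(rz); x, y = x*cos(a) - y*sin(a), x*sin(a) + y*cos(a)
    return round(x, 6), round(y, 6), round(z, 6)

# one 90 degree fov camera per cube face, all at the origin
faces = [
    ("+Y", ( 90, 0,   0)),
    ("-Y", ( 90, 0, 180)),
    ("+X", ( 90, 0, -90)),
    ("-X", ( 90, 0,  90)),
    ("+Z", (180, 0,   0)),
    ("-Z", (  0, 0,   0)),
]

for name, rot in faces:
    print("%s  rot=%s  looks along %s" % (name, rot, view_dir(*rot)))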

The scene is:

http://quark.futureware.at/panotest.blend

The created images are:

http://quark.futureware.at/test0001.jpg
http://quark.futureware.at/test0002.jpg

If I put the images together I clearly see a seam:

http://quark.futureware.at/test-complete.jpg
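
If eyeballing the pasted image is not convincing enough, the shared edge of
the two renders can also be compared directly. A quick sketch using the
Python Imaging Library (I'm assuming here that the right edge of test0001
should continue into the left edge of test0002; swap the edges if the
rotation goes the other way, and small differences are expected from the
jpeg compression alone):

from PIL import Image, ImageChops

a = Image.open("test0001.jpg")
b = Image.open("test0002.jpg")
w, h = a.size

# the columns that should meet at the seam
edge_a = a.crop((w - 1, 0, w, h))   # rightmost column of the first image
edge_b = b.crop((0, 0, 1, h))       # leftmost column of the second image

# per-band (min, max) of the absolute difference along the seam;
# a clean seam should stay close to zero apart from jpeg noise
diff = ImageChops.difference(edge_a, edge_b)
print("difference extrema along the seam: %s" % (diff.getextrema(),))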

The two camera positions I used for this test are stored as keys in frames 1
and 2.
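
As a quick sanity check of the numbers from my first mail: for a 90 degree
field of view the viewing ray diverges from the camera front vector by 45
degrees at the edge centres and by roughly 54.7 degrees at the image
corners, e.g.:

from math import atan, sqrt, tan, radians, degrees

# half-width of the image plane at unit distance for a 90 degree fov
half = tan(radians(90.0) / 2.0)   # = 1.0

# angle between the camera front vector and the viewing ray
edge_centre = degrees(atan(half))              # 45.0 degrees
corner      = degrees(atan(half * sqrt(2.0)))  # ~54.7 degrees

print("edge centre: %.1f degrees" % edge_centre)
print("corner:      %.1f degrees" % corner)

So if the renderer really used the front vector somewhere in the texture
mapping path, the error would be largest exactly where the cube faces have
to meet, which matches the seams I see.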

Regards,
Gregor