[Bf-committers] Camera, displacement and normals

Gregor Mückl bf-committers@blender.org
Tue, 20 Jan 2004 15:38:16 +0100


I'm currently in the process of building a game engine which allows the player 
to look at panoramas, with viewing angles of 180 degrees in the vertical 
direction and 360 degrees in the horizontal direction.

I am able to render these panoramas within Blender with a small trick: I use 
a camera FOV of 90 degrees and render 6 images - two along each coordinate 
axis - and project them onto a cube which is rendered in realtime. This 
method works quite well.
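For clarity, here is a minimal sketch of that six-view setup - not Blender's
actual API, just an illustration of the camera arrangement: one 90-degree-FOV
camera looking along each positive and negative coordinate axis, so the six
rendered images tile a cube without gaps. The up vectors follow a common
cube-map convention; `render_view` is a hypothetical stand-in for the renderer.

```python
FOV_DEGREES = 90

CUBE_FACES = [
    # (face name, camera front vector, camera up vector)
    ("+X", (1, 0, 0), (0, 0, 1)),
    ("-X", (-1, 0, 0), (0, 0, 1)),
    ("+Y", (0, 1, 0), (0, 0, 1)),
    ("-Y", (0, -1, 0), (0, 0, 1)),
    ("+Z", (0, 0, 1), (0, 1, 0)),
    ("-Z", (0, 0, -1), (0, 1, 0)),
]

def render_panorama(render_view):
    """Render one image per cube face.

    render_view is a placeholder for whatever call produces an image
    for a camera with the given front vector, up vector, and FOV.
    """
    return {name: render_view(front, up, FOV_DEGREES)
            for name, front, up in CUBE_FACES}
```

With a 90-degree FOV each face covers exactly one sixth of the full sphere,
which is why the seams must line up exactly for the illusion to hold.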

Obviously, the rendered images must fit together seamlessly to create this 
illusion. However, I discovered that when mapping texture output onto the 
displacement or normal of a material, the images won't fit as expected. This 
is quite disappointing, because it means these features aren't available to 
the artists working on this game.

Why is this? Does Blender's renderer use the camera front vector instead of 
the ray direction for this? With a field of view of 90 degrees the angle 
between these reaches 45 degrees at the edge centers and even more than 
that at the corners of the image. This would explain the effects I'm seeing.
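To make the magnitude of that discrepancy concrete, the angle between the
camera front vector and the per-pixel ray direction is easy to compute. This
is a small sketch under the assumption of a square 90-degree frustum with the
image plane at unit distance, so the plane spans [-1, 1] in both directions:

```python
import math

def angle_to_front(px, py):
    """Angle (degrees) between the camera front vector and the ray
    through image-plane point (px, py) at unit distance, for a
    90-degree FOV where px, py range over [-1, 1]."""
    # The ray is (px, py, 1) relative to the front vector (0, 0, 1),
    # so cos(angle) = dot(ray, front) / |ray| = 1 / |ray|.
    ray_len = math.sqrt(px * px + py * py + 1.0)
    return math.degrees(math.acos(1.0 / ray_len))

print(angle_to_front(1.0, 0.0))  # edge center: 45 degrees
print(angle_to_front(1.0, 1.0))  # corner: about 54.7 degrees
```

So if displacement or normal perturbation is evaluated relative to the front
vector rather than the actual ray, adjacent faces would disagree by up to
roughly 55 degrees at the shared corners, which would be clearly visible at
the seams.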

Is there a reason for doing it the way it is done right now? Would it be a big 
deal to alter this behaviour?