[Bf-committers] Camera, displacement and normals
Tue, 20 Jan 2004 15:41:13 +0100
On Tuesday, 20 January 2004 at 15:38, Gregor Mückl wrote:
> I'm currently in the process of building a game engine which allows the
> player to look at panoramas, with viewing angles of 180 degrees in the
> vertical direction and 360 degrees in the horizontal direction.
> I am able to render these panoramas within blender with a small trick: I
> use a camera fov of 90 degrees and render 6 images - two along each
> coordinate axis - and project them onto a cube which is rendered in
> realtime. This method works quite well.
> Obviously, the rendered images must fit together seamlessly to create this
> illusion. However, I discovered, that when mapping texture output onto the
> displacement or normal of a material, the images won't fit as expected.
> This is quite disappointing, because it means these features aren't
> available to the artists working on this game.
> Why is this? Does blender's renderer use the camera front vector instead of
> the ray direction for this? With a field of view of 90 degrees, the angle
> between the two reaches 45 degrees at the edge centres and grows even
> larger at the corners of the image. This would explain the effects I'm
> seeing. Is there a reason for doing it the way it is done right now? Would
> it be a big deal to alter this behaviour?
I'm sorry to reply to my own email, but I've just had another thought on this:
What about blender's panorama rendering option? Would the same problem arise
there if xparts and yparts are not chosen large enough?
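For reference, the angles mentioned above are easy to verify. The sketch below (Python; the helper names are mine, not anything from blender) computes the per-pixel ray direction on one cube face with a 90-degree FOV and its angle to the camera front vector: 45 degrees at the centre of a face edge, and arccos(1/sqrt(3)), roughly 54.7 degrees, at a corner.

```python
import math

def ray_direction(u, v):
    """Direction through pixel (u, v), with u and v in [-1, 1], on a
    cube face with a 90-degree FOV; the face looks along +Z."""
    x, y, z = u, v, 1.0
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)

def angle_to_front(u, v):
    """Angle in degrees between the per-pixel ray and the camera
    front vector (0, 0, 1)."""
    _, _, dz = ray_direction(u, v)
    return math.degrees(math.acos(dz))

print(angle_to_front(1.0, 0.0))  # edge centre: 45.0 degrees
print(angle_to_front(1.0, 1.0))  # corner: about 54.74 degrees
```

So if the renderer evaluates displacement or normal mapping against the front vector rather than these per-pixel rays, the error at the face borders is large enough that adjacent faces cannot match up.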