[tuhopuu-devel] IRIX Tuhopuu gameengine crash

Kester Maddock tuhopuu-devel@blender.org
Sat, 14 Aug 2004 15:49:23 +1200


Hi Carsten,

You're almost on the right track: the B{blahblah.xx}:: means "put the 
following indented text into a literal block, and label it blahblah.xx" 
(B{...} turns ... into bold when run through epydoc, and :: starts a 
fixed-spacing literal block [code mode]).
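
In the doc's docstring the layout ends up looking roughly like this (a sketch 
of the format only; the text indented under the :: line is what epydoc treats 
as literal code):

B{vertex.vs}::

	// vertex shader source goes here, indented under the heading
	void main()
	{
		gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
	}

and the same again under a B{fragment.fs}:: heading for the fragment shader.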

Alternatively, pass the shader in as a string (like you are doing), but remove 
the B{fragment.fs}:: and B{vertex.vs}:: lines, which aren't valid GLSL.
Grab the GLSL reference: 
http://oss.sgi.com/projects/ogl-sample/registry/ARB/GLSLangSpec.Full.1.10.59.pdf
for the GLSL syntax (C-like), the builtin variables, and the builtin functions.
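
If you go the string route, the Python side just holds the raw GLSL in an 
ordinary string, with no header lines - something like this (the variable 
name is mine, and the body is only a cut-down version of the vertex shader 
below):

# Plain GLSL source in a Python string; the B{...}:: lines are epydoc
# markup only and must not appear here.
vertex_source = """
uniform mat4 MVI;
varying vec3 lightvec;
varying vec3 viewvec;

void main()
{
	gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
	// ... rest of the vertex shader from the tutorial ...
}
"""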

Quick GLSL tutorial:
You need two shaders - a vertex shader and a fragment shader. (Fragment = 
pixel in GL lingo.)

The vertex shader is run for every vertex in the mesh:
// ----------- INPUTS ---------------------------
// Uniforms are the same for every vertex in the mesh:
// A 4x4 matrix called MVI.
uniform mat4 MVI; // Inverse ModelView Matrix
// Uniforms are set from Python; see setUniform and genUniform.

// Attributes are different for each vertex.  Position, colour and normal
// are examples of vertex attributes.  These attributes are built in to
// OpenGL, so we don't need to do anything special for them.
// The only extra attribute Blender exports at the moment is the vertex
// tangent.
attribute vec4 vertex_tangent;
// Attributes are set with the Python bindAttribute method.

// ---------------- OUTPUTS -----------------
// Varyings are output variables.  The hardware will interpolate them
// across the triangle for the fragment shader.
// The GeforceFX and Radeon 9x00 only support 8 varyings (including builtins);
// the Geforce 6800 supports 10, I believe.
// This is the major resource limitation for shaders.
varying vec3 lightvec;
varying vec3 viewvec;

// The main function is where the shading action happens.
void main ()
{
	// Transform vertex position from local to clip space.
	// (Clip space is the camera view)
	// This is pretty much required by all vertex shaders.
	gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;

	// ---- CUSTOM VERTEX Calculation --------
	// This part does stuff that the fixed OpenGL transform doesn't -
	// which is why we are using a vertex shader in the first place.

	// Create Tangent Space Matrix
	// Tangent space is the easiest way of doing normal mapping.
	// Vectors can be 'swizzled':
	// the .xyzw, .rgba, and .stpq suffixes select and reorder components of a vector:
	// eg:
	// vec4 foo = vec4(1.0, 2.0, 3.0, 4.0);
	// foo.xyz is a vec3 (1.0, 2.0, 3.0)
	// foo.zyxw is a vec4 (3.0, 2.0, 1.0, 4.0)
	// .rgba are for colours, .stpq are for texture coords
	mat3 tangent = mat3(vertex_tangent.xyz,
			cross(vertex_tangent.xyz, gl_Normal.xyz)*vertex_tangent.w,
			gl_Normal);

	// Transform light 0's position to mesh local space:
	vec3 lightpos = (MVI*gl_LightSource[0].position).xyz;
	// Create the light vector (vertex to light) in tangent space.
	lightvec = (lightpos - gl_Vertex.xyz)*tangent;

	// Create the view vector (vertex to camera) in tangent space.
	// MVI[3].xyz is the camera position in mesh local space.
	viewvec = (MVI[3].xyz - gl_Vertex.xyz)*tangent;

	// ----- OUTPUT ----------
	// This section outputs stuff from the vertex shader.
	// Variables starting with gl_ are builtin, and gl_ is reserved
	// as a variable prefix.
	
	// UV coordinates
	gl_TexCoord[0] = gl_MultiTexCoord0;
	// Vertex colour
	gl_FrontColor = gl_Color;
}

-----------------
Onto the fragment shader:

// sampler2Ds are texture handles.
// They are uniforms, and are set from Python.
uniform sampler2D colourmap;
uniform sampler2D normap;

// The input from our vertex shader.  The names
// must match.
varying vec3 lightvec;
varying vec3 viewvec;

// The main program.  This is executed for each fragment.
void main()
{
	// Tangent space light vector
	vec3 lv = normalize(lightvec);

	// Tangent space view vector
	vec3 vv = normalize(viewvec);

	// Lookup normal
	vec3 normal = texture2D(normap, gl_TexCoord[0].st).rgb;
	// Scale & bias normal
	// Textures range from 0.0 - 1.0 per component.
	// The normal should range from -1.0 to 1.0:
	normal = 2.0*normal - 1.0;
	// Normalise normal
	normal = normalize(normal);

	// Compute diffuse lighting (standard Lambertian diffuse)
	float diffuse = dot(normal, lv);

	// Reflection vector:
	vec3 refl = normalize(2.0*diffuse*normal - lv);
	// Compute specular lighting (Phong specular)
	float spec = pow(max(dot(refl, vv), 0.0), gl_FrontMaterial.shininess);

	// Lookup colour map
	vec4 colour = texture2D(colourmap, gl_TexCoord[0].st) *
		gl_LightSource[0].diffuse * diffuse;

	// -------- OUTPUT ------------
	// Specular highlight plus the lit colour map
	gl_FragColor = spec * gl_FrontMaterial.specular *
		gl_LightSource[0].specular + colour;
	// Pass alpha
	gl_FragColor.a = colour.a;
}

The main purpose of the fragment shader is to write gl_FragColor.  There are 
some utility library shaders at:
http://projects.blender.org/viewcvs/viewcvs.cgi/tuhopuu2/source/gameengine/Ketsji/Shaders/?cvsroot=tuhopuu
Currently you have to copy them into your blend - I haven't decided how they 
should integrate into blender.
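
To give you an idea of the Python side for the pair above, here is a very 
rough sketch using the setUniform and bindAttribute methods mentioned earlier.
I'm assuming the signatures, the way you get hold of the shader object, and 
the texture channel numbers, so treat it as an illustration of the idea only 
and check the KX_PolygonMaterial.py example for the real calls:

# Sketch only: 'shader' stands for whatever object the material gives you,
# and every call below is an assumed signature, not the confirmed API.
def bind_tutorial_shader(shader, inverse_modelview):
	# sampler2D uniforms: point them at the texture channels that hold
	# the colour map and the normal map (channel numbers assumed).
	shader.setUniform("colourmap", 0)
	shader.setUniform("normap", 1)

	# The MVI uniform the vertex shader expects (the inverse ModelView
	# matrix, passed in here in whatever form setUniform accepts -
	# another assumption).
	shader.setUniform("MVI", inverse_modelview)

	# The extra per-vertex attribute Blender exports: the tangent.
	shader.bindAttribute("vertex_tangent")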

Good luck.

Kester
On Friday 13 August 2004 23:11, Carsten Wartmann wrote:
> Kester Maddock wrote:
> > Hi Carsten,
>
> [...]
>
> > Then you need to use Python to load a shader, bind the texture uniforms,
> > bind your own uniforms.
> >
> > The Python reference has an example shader pair and the Python to load
> > them.
> > http://projects.blender.org/viewcvs/viewcvs.cgi/tuhopuu2/source/gameengine/PyDoc/KX_PolygonMaterial.py?rev=1.4&cvsroot=tuhopuu&content-type=text/vnd.viewcvs-markup
>
> I tried it this way; it's all very new to me, so maybe it's complete nonsense:
>
> ---------------------------------------------------------------------------
> [snip]
> ---------------------------------------------------------------------------
>
> Am I on the right track?
>
> Carsten.
>
> _______________________________________________
> tuhopuu-devel mailing list
> tuhopuu-devel@blender.org
> http://www.blender.org/mailman/listinfo/tuhopuu-devel