Archive for March, 2008


Vertex Displacement Mapping + Normalmap

Normal mapping and vertex-displacement mapping from the same texture.
This one uses the Kineme texture tools to create the displacement texture. It’s a classic use of the technique.

Not sure I’ll ever use this one ‘in anger’ but it’s an interesting test. I’m glad the lighting works so well, too.

The base colour comes from the normal map, incidentally.

Vertex Displace Mountains 03
Vertex Displace Mountains 04
Vertex Displace Mountains 05

The final screenshot shows the mesh from the back. Not quite sure why it looks so different from that side…

Next I have to work out if I can use the same technique with radial displacement. Think that might be a bit more tricky.


Bump Mapping

Thought I’d have a go at using the Normal Map code from the last post to do some bump-mapping.
Turns out it’s quite easy to use the output of the Normal Map CIFilter as normals in a Fragment Shader. All that needs to be done is to transform it into the correct range by multiplying the RGB value of each pixel by 2.0, then subtracting 1.0.
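As a quick sanity check, that range transform can be sketched outside the shader; here’s a minimal Python version (the function name is my own, not part of any QC or Core Image API):

```python
def decode_normal(rgb):
    """Map an RGB triple stored in [0, 1] back to a normal in [-1, 1]."""
    return tuple(c * 2.0 - 1.0 for c in rgb)

# Mid-grey blue (0.5, 0.5, 1.0) decodes to the 'straight up' normal (0, 0, 1)
print(decode_normal((0.5, 0.5, 1.0)))
```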

Some examples using a simple Phong lighting model, and normals extracted from first a static image, then the live video from the iSight camera, run through the Normal Map CIFilter:

Bump Mapping 01
Bump Mapping 02

Here’s the GLSL code:

Vertex Shader:

varying vec3 normal, lightDir, eyeVec;

void main()
{
	// Normal from mesh, transformed into eye space
	normal = gl_NormalMatrix * gl_Normal;

	// Current Vertex coordinates transformed into eye space
	vec3 Vertex = vec3(gl_ModelViewMatrix * gl_Vertex);

	// Light Direction
	lightDir = vec3(gl_LightSource[0].position) - Vertex;
	// Eye Vector
	eyeVec = -Vertex;

	gl_Position = ftransform();

	gl_FrontColor = gl_Color;
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

Fragment Shader:

varying vec3 normal, lightDir, eyeVec;
uniform sampler2D texture;
uniform sampler2D normalMap;

void main (void)
{
	// Base color (ambient terms)
	vec4 final_color =
	(gl_FrontLightModelProduct.sceneColor * gl_FrontMaterial.ambient) +
	(gl_LightSource[0].ambient * gl_FrontMaterial.ambient);

	// Normal from the normal map, remapped from [0,1] to [-1,1]
	vec3 N = normalize(texture2D(normalMap, gl_TexCoord[0].st).rgb * 2.0 - 1.0);
	//vec3 N = normalize(normal);	// Normal from the mesh instead
	vec3 L = normalize(lightDir);	// Light Direction (normalized)
	float lambertTerm = dot(N, L);	// Base lighting (no ambient or specular contribution)

	if(lambertTerm > 0.0)
	{
		// Add diffuse contribution
		final_color += gl_LightSource[0].diffuse * gl_FrontMaterial.diffuse * lambertTerm;
		// Compute specular contribution
		vec3 E = normalize(eyeVec);
		vec3 R = reflect(-L, N);
		float specular = pow(max(dot(R, E), 0.0), gl_FrontMaterial.shininess);
		// Add specular contribution
		final_color += gl_LightSource[0].specular * gl_FrontMaterial.specular * specular;
	}

	// Output
	gl_FragColor = final_color;// * texture2D(texture, gl_TexCoord[0].xy);
}

You’ll need to stick the GLSL shader inside a Lighting Patch in order to be able to set the properties for the light.
You’ll get an error in the Frag shader about the varying normal not being used.
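For reference, the diffuse/specular maths in that fragment shader is just the standard Lambert + Phong model; here’s a rough Python sketch of the same terms (the vector helpers and shininess value are mine, not QC’s):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def reflect(i, n):
    # Same as GLSL reflect(): I - 2 * dot(N, I) * N
    d = dot(n, i)
    return tuple(ix - 2.0 * d * nx for ix, nx in zip(i, n))

def phong(N, L, E, shininess=32.0):
    """Return (diffuse, specular) scalar terms, as in the shader above.
    N and L are assumed already normalized."""
    lambert = dot(N, L)
    if lambert <= 0.0:
        return 0.0, 0.0
    R = reflect(tuple(-x for x in L), N)
    spec = max(dot(R, E and normalize(E)), 0.0) ** shininess
    return lambert, spec

# Light straight along the normal: full diffuse and a full specular highlight
print(phong((0, 0, 1), (0, 0, 1), (0, 0, 1)))
```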


Normal Map

Inspired by Angelo Pesce’s comment here, I decided this morning I’d have a go at generating a Normal Map from a live video input, using a CIFilter patch. The code is surprisingly simple.

////	FUNCTIONS      ////

Function to calculate the luminosity of a pixel

float luminosity(vec4 color)
{
	vec4 lumcoeff = vec4(0.299, 0.587, 0.114, 0.0);
	return dot(color, lumcoeff);
}

Creates Normal Map for use in Displacement Map / Height Map GLSL shaders

vec4 tb_normalMap(sampler inPix)
{
	// Sampler dimensions
	vec2 dims = samplerSize(inPix);
	// Working pixel coordinates
	vec2 xy = samplerCoord(inPix);
	// Normalised coordinates of working pixel (used to calculate Normal Z component)
	vec2 xyNorm = xy / dims;
	// Normal grid size (in pixels)
	float offset = 1.0;

	// Working pixel luminosity
	vec4 pix = sample(inPix, samplerTransform(inPix, xy));
	float lumCurrent = luminosity(pix);

	// Derivative X
	float lumdx = luminosity(sample(inPix, samplerTransform(inPix, vec2(xy.x + offset, xy.y))));
	float dx = lumCurrent - lumdx;
	// Derivative Y
	float lumdy = luminosity(sample(inPix, samplerTransform(inPix, vec2(xy.x, xy.y + offset))));
	float dy = lumCurrent - lumdy;
	// Normal Z. I'm not sure if this is strictly-speaking correct, but it seems to work!
	float dz = 1.0;

	// Output Normal Map
	return vec4(dx, dy, dz, 1.0);
}

////    MAIN LOOP      ////

kernel vec4 normalMap(sampler Image)
{
	// Output Normal Map
	return tb_normalMap(Image);
}
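Stripped of the Core Image plumbing, the kernel’s forward-difference idea can be sketched in plain Python on a 2D grid of luminance values (names and the edge-clamping are my own; like the kernel above, it leaves Z fixed at 1.0 and skips normalisation):

```python
def normal_map(lum):
    """lum: 2D list of luminance values in [0, 1].
    Returns per-pixel (dx, dy, 1.0) normals via forward differences,
    clamping the lookup at the right/bottom edges."""
    h, w = len(lum), len(lum[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            here = lum[y][x]
            dx = here - lum[y][min(x + 1, w - 1)]   # luminance gradient in X
            dy = here - lum[min(y + 1, h - 1)][x]   # luminance gradient in Y
            row.append((dx, dy, 1.0))
        out.append(row)
    return out

flat = [[0.5] * 4 for _ in range(4)]
# A flat image has zero gradient everywhere: every normal is (0.0, 0.0, 1.0)
print(normal_map(flat)[0][0])
```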

The idea is that this map can be used to aid lighting of meshes created with Vertex Displacement Mapping / Heightmap techniques.

May or may not be useful…

Here’s what the effect looks like (I actually think it’s quite a nice effect in its own right).

Normal Map 01

Corrected mistake in function name in main loop.


More Radial VDM Screenshots (again)

Radial VDM 17
Radial VDM 19
Radial VDM 15
Radial VDM 21

In the third and fourth screenshots, I’ve swapped the Blending Mode of the mesh. I like the ethereal look of these. The downside is, it’s much more difficult to discern the 3-dimensional form of the mesh. I may try adding some kind of fog-like effect, so that the mesh colour or alpha changes with the z-coordinate of each fragment. This may help things look more 3D (and is something I’ve been meaning to try, anyway).
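For what it’s worth, the fog-like fade I have in mind would just be a linear factor on colour or alpha based on depth; a hypothetical sketch in Python (the near/far values are made up):

```python
def fog_factor(z, near=0.0, far=10.0):
    """Linear fog: 1.0 at the near plane, fading to 0.0 at the far plane.
    Multiply the fragment colour or alpha by this to fade with depth."""
    t = (far - z) / (far - near)
    return max(0.0, min(1.0, t))  # clamp to [0, 1]

# A fragment halfway between the planes keeps half its colour/alpha
print(fog_factor(5.0))
```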


More Radial VDM Screenshots

Radial VDM 05Radial VDM 06
Radial VDM 07Radial VDM 08
Radial VDM 12Radial VDM 11

I’ve changed the blending mode of the GLSL Mesh in the last two (to ‘Over’ and ‘Add’ respectively).


Radial Vertex Displacement Mapping

Thanks to some great work (and a fortuitous accident) from the good people at Kineme, it’s now possible to use moving images as Displacement Maps in a GLSL Vertex Shader in QC. So I thought I’d give it a go.

Radial VDM 01Radial VDM 02Radial VDM 04Radial VDM 03

The input is the iSight camera, with luminosity modulating the Radius parameter of a function to bend a GLSL grid into a sphere, so what we have is essentially a radial distortion of a sphere, based on the luminosity of the input texture.
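In other words, each vertex’s unit-sphere position gets scaled by a radius modulated by the texture’s luminosity at that point; a rough Python sketch of the idea (function and parameter names are mine):

```python
import math

def sphere_point(u, v):
    """Map grid coords u, v in [0, 1] to a point on the unit sphere."""
    theta = u * 2.0 * math.pi   # longitude
    phi = v * math.pi           # latitude
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))

def displace(u, v, lum, base_radius=1.0, amount=0.5):
    """Push the sphere point radially outward by the luminosity at (u, v)."""
    r = base_radius + amount * lum
    return tuple(r * c for c in sphere_point(u, v))

# A fully bright texel pushes the vertex out to radius 1.5
print(displace(0.0, 0.5, 1.0))
```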

This is very cool, and something I’ve been wanting to do for ages. It opens the doors to all sorts of mesh-related fun, as you can potentially use custom CIFilter or GLSL Fragment shader code to create different displacement images to feed into the patch.

The only problem is, it runs quite slowly on my machine. It’s undoubtedly much faster to do this kind of thing with Vertex Buffer objects, but that means custom OpenGL code in a QC plugin, which is way beyond my capabilities currently. Sadly.


Spherical Harmonics Rendering Issues

It seems some people have been having problems getting the Spherical Harmonics qcClip to work properly. Everything works fine on my 1st-generation MacBook Pro, but it seems some other systems/graphics cards have issues. In fact, I just tried it on my soon-to-be-replaced G5, and it didn’t render correctly on this system either.

Apologies to anyone else affected. I’m not sure what the issue is as yet, but I’ll try and track it down.

Any clues anyone might have as to why it might not be working would be greatly appreciated.
