Archive for the 'Algorithms' Category


Adaptive Tiling

It’s a lonnnnnnng time since I did anything in QC, so I thought I’d try and keep my hand in a bit, by doing something really simple.

As usual, it didn’t turn out to be quite as simple as I thought it would in the end, but it looks very simple 😉

It’s basically just a 2D tiling effect, but works iteratively, so brighter areas of the image are subdivided into small tiles.

Uses a JavaScript loop inside a Core Image Kernel. I don’t know what the latest state-of-play is with JavaScript in CIFilter patches. There used to be memory leaks. Dunno if this is still the case.
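The subdivision logic itself is simple enough to sketch outside QC. Here's a hypothetical JavaScript illustration of the idea (not the actual Core Image kernel code): a tile is recursively split into four while the brightness sampled at its centre stays above a threshold, so brighter regions end up with smaller tiles. The brightnessAt() function is just a stand-in for a real image lookup.

```javascript
// Hypothetical sketch of brightness-driven adaptive tiling (not the QC patch).
// brightnessAt() stands in for an image lookup; here it's a simple gradient
// that gets brighter towards the top-right corner of a unit square.
function brightnessAt(x, y) {
    return (x + y) / 2;
}

// Recursively subdivide the tile at (x, y) with the given size while the
// brightness sampled at its centre is at or above the threshold, stopping
// at maxDepth. Returns a flat list of {x, y, size} tiles.
function adaptiveTiles(x, y, size, depth, maxDepth, threshold) {
    var b = brightnessAt(x + size / 2, y + size / 2);
    if (depth >= maxDepth || b < threshold) {
        return [{ x: x, y: y, size: size }];
    }
    var h = size / 2;
    return [].concat(
        adaptiveTiles(x,     y,     h, depth + 1, maxDepth, threshold),
        adaptiveTiles(x + h, y,     h, depth + 1, maxDepth, threshold),
        adaptiveTiles(x,     y + h, h, depth + 1, maxDepth, threshold),
        adaptiveTiles(x + h, y + h, h, depth + 1, maxDepth, threshold)
    );
}

// Darker regions stay as large tiles; brighter regions get cut smaller.
var tiles = adaptiveTiles(0, 0, 1, 0, 3, 0.5);
```

In the QC version the equivalent loop runs in JavaScript inside the Core Image kernel, with brightness sampled from the input image.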


Reaction-Diffusion Shader Interesting Errors

I’ve been getting some interesting results with a Reaction-Diffusion shader. It’s definitely not set up the way it should be yet, but it has produced some interesting images nonetheless. The stills are a bit misleading: in motion, the effect is very hard to watch at the moment, as it flickers horribly.

This is based on a GLSL port of Sanch’s HLSL shader for vvvv, posted on the vvvv website, so all credit goes to him.
It’s huge fun to play with, though the results I’m getting look very different from the screenshots on his (still amazing) website.
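For context, reaction-diffusion effects like this typically simulate two interacting chemical fields. Here's a rough JavaScript sketch of one common model, Gray–Scott, on a small grid. This is purely illustrative, and not necessarily the model Sanch's shader uses; the parameter values are typical textbook ones.

```javascript
// Minimal Gray–Scott reaction–diffusion on a small 2D grid (illustrative only,
// not Sanch's shader). A is the 'substrate' chemical, B the 'activator'.
var N = 16;                      // grid size
var dA = 1.0, dB = 0.5;          // diffusion rates
var feed = 0.055, kill = 0.062;  // feed / kill rates (typical textbook values)
var dt = 0.2;                    // scale factor applied to the diffusion term

function makeGrid(value) {
    var g = [];
    for (var y = 0; y < N; y++) {
        g.push(new Array(N).fill(value));
    }
    return g;
}

var A = makeGrid(1.0);           // chemical A everywhere
var B = makeGrid(0.0);
B[N / 2][N / 2] = 1.0;           // seed chemical B in the centre

// 5-point Laplacian with wrap-around edges
function laplacian(g, x, y) {
    return g[y][(x + 1) % N] + g[y][(x + N - 1) % N] +
           g[(y + 1) % N][x] + g[(y + N - 1) % N][x] -
           4.0 * g[y][x];
}

// One simulation step: diffusion plus the A + 2B -> 3B reaction
function step() {
    var A2 = makeGrid(0), B2 = makeGrid(0);
    for (var y = 0; y < N; y++) {
        for (var x = 0; x < N; x++) {
            var a = A[y][x], b = B[y][x];
            var reaction = a * b * b;
            A2[y][x] = a + dA * dt * laplacian(A, x, y) - reaction + feed * (1.0 - a);
            B2[y][x] = b + dB * dt * laplacian(B, x, y) + reaction - (kill + feed) * b;
        }
    }
    A = A2; B = B2;
}

// Run a few steps: B diffuses and reacts outwards from the seed
for (var i = 0; i < 10; i++) step();
```

On the GPU the two fields usually live in the colour channels of a texture, with the shader ping-ponging between two render targets each frame, which is one place flicker can creep in if the setup isn't quite right.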


Interference Patterns

2D 2 and 3-point light-interference patterns, converted from WebGL code at:
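The underlying maths is just summing sinusoidal waves radiating from each source point and using the result as brightness. A minimal JavaScript sketch of the idea (my own illustration, not the converted WebGL code):

```javascript
// Intensity at (x, y) of circular waves radiating from a set of point sources.
// Each source contributes cos(frequency * distance); the normalised sum is
// used as brightness. Illustrative values only.
function interference(x, y, sources, frequency) {
    var sum = 0;
    for (var i = 0; i < sources.length; i++) {
        var dx = x - sources[i][0];
        var dy = y - sources[i][1];
        sum += Math.cos(frequency * Math.sqrt(dx * dx + dy * dy));
    }
    // Map from [-n, n] to the 0..1 range
    return (sum / sources.length + 1) * 0.5;
}

// 2-point pattern: classic fringes between two sources
var twoPoint = interference(0.5, 0.5, [[0.2, 0.5], [0.8, 0.5]], 40.0);

// 3-point pattern: a more complex interference lattice
var threePoint = interference(0.5, 0.5, [[0.2, 0.2], [0.8, 0.2], [0.5, 0.8]], 40.0);
```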


Truchet Tile

Lots of people seem to have been downloading my Truchet Tile QTZ lately. Just wondering what you’ve been doing with it. If any of you guys have examples of using it on YouTube/Vimeo etc. I’d love to post them here. Same goes for anything else you’ve done using tb code/QC files.

This is my quick demo of the effect in action with webcam input.


2D Perlin Vertex Noise GLSL Sourcecode

Vertex Shader:

/*
	2D Perlin-Noise in the vertex shader, based originally on
	vBomb.fx HLSL vertex noise shader, from the NVIDIA Shader Library.
	Original Perlin function substituted for Stefan Gustavson's
	texture-lookup-based Perlin implementation.
	Quartz Composer setup
	toneburst 2009
*/

//  2D Perlin Noise   //

/*
	2D Perlin-Noise from example by Stefan Gustavson, found at
*/

uniform sampler2D permTexture;			// Permutation texture
const float permTexUnit = 1.0/256.0;		// Perm texture texel-size
const float permTexUnitHalf = 0.5/256.0;	// Half perm texture texel-size

float fade(in float t) {
	return t*t*t*(t*(t*6.0-15.0)+10.0);
}

float pnoise2D(in vec2 p) {
	// Integer part, scaled and offset for texture lookup
	vec2 pi = permTexUnit*floor(p) + permTexUnitHalf;
	// Fractional part for interpolation
	vec2 pf = fract(p);
	// Noise contribution from lower left corner
	vec2 grad00 = texture2D(permTexture, pi).rg * 4.0 - 1.0;
	float n00 = dot(grad00, pf);
	// Noise contribution from lower right corner
	vec2 grad10 = texture2D(permTexture, pi + vec2(permTexUnit, 0.0)).rg * 4.0 - 1.0;
	float n10 = dot(grad10, pf - vec2(1.0, 0.0));
	// Noise contribution from upper left corner
	vec2 grad01 = texture2D(permTexture, pi + vec2(0.0, permTexUnit)).rg * 4.0 - 1.0;
	float n01 = dot(grad01, pf - vec2(0.0, 1.0));
	// Noise contribution from upper right corner
	vec2 grad11 = texture2D(permTexture, pi + vec2(permTexUnit, permTexUnit)).rg * 4.0 - 1.0;
	float n11 = dot(grad11, pf - vec2(1.0, 1.0));
	// Blend contributions along x
	vec2 n_x = mix(vec2(n00, n01), vec2(n10, n11), fade(pf.x));
	// Blend contributions along y
	float n_xy = mix(n_x.x, n_x.y, fade(pf.y));
	// We're done, return the final noise value.
	return n_xy;
}

// Sphere Function //

const float PI = 3.14159265;
const float TWOPI = 6.28318531;
uniform float BaseRadius;

vec4 sphere(in float u, in float v) {
	u *= PI;
	v *= TWOPI;
	vec4 pSphere;
	pSphere.x = BaseRadius * cos(v) * sin(u);
	pSphere.y = BaseRadius * sin(v) * sin(u);
	pSphere.z = BaseRadius * cos(u);
	pSphere.w = 1.0;
	return pSphere;
}

// Apply 2D Perlin Noise //

uniform vec3 NoiseScale;	// Noise scale, 0.01 > 8
uniform float Sharpness;	// Displacement 'sharpness', 0.1 > 5
uniform float Displacement;	// Displacement amount, 0 > 2
uniform float Speed;		// Displacement rate, 0.01 > 1
uniform float Timer;		// Feed incrementing value, infinite

vec4 perlinSphere(in float u, in float v) {
	vec4 sPoint = sphere(u, v);
	// The rest of this function is mainly from the vBomb shader from the NVIDIA Shader Library
	vec4 noisePos = vec4(NoiseScale, 1.0) * (sPoint + (Speed * Timer));
	float noise = (pnoise2D(noisePos.xy) + 1.0) * 0.5;
	float ni = pow(abs(noise), Sharpness) - 0.25;
	// Displace along the sphere normal
	vec4 nn = vec4(normalize(sPoint.xyz), 0.0);
	return (sPoint - (nn * (ni - 0.5) * Displacement));
}

// Calculate Position, Normal //

const float grid = 0.01;	// Grid offset for normal-estimation
varying vec3 norm;			// Normal

vec4 posNorm(in float u, in float v) {
	// Vertex position
	vec4 vPosition = perlinSphere(u, v);
	// Estimate normal by 'neighbour' technique
	// with thanks to tonfilm
	vec3 tangent = (perlinSphere(u + grid, v) - vPosition).xyz;
	vec3 bitangent = (perlinSphere(u, v + grid) - vPosition).xyz;
	norm = gl_NormalMatrix * normalize(cross(tangent, bitangent));
	// Return vertex position
	return vPosition;
}

// Phong Directional VS //

// -- Lighting varyings (to Fragment Shader)
varying vec3 lightDir0, halfVector0;
varying vec4 diffuse0, ambient;

void phongDir_VS() {
	// Extract values from gl light parameters
	// and set varyings for Fragment Shader
	lightDir0 = normalize(vec3(gl_LightSource[0].position));
	halfVector0 = normalize(gl_LightSource[0].halfVector.xyz);
	diffuse0 = gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse;
	ambient =  gl_FrontMaterial.ambient * gl_LightSource[0].ambient;
	ambient += gl_LightModel.ambient * gl_FrontMaterial.ambient;
}

// Main Loop //

uniform vec2 PreScale, PreTranslate;	// Mesh pre-transform

void main() {
	vec2 uv = gl_Vertex.xy;
	// Offset XY mesh coords to 0 > 1 range
	uv += 0.5;
	// Pre-scale and transform mesh
	uv *= PreScale;
	uv += PreTranslate;
	// Calculate new vertex position and normal
	vec4 spherePos = posNorm(uv[0], uv[1]);
	// Calculate lighting varyings to be passed to fragment shader
	phongDir_VS();
	// Transform new vertex position by modelview and projection matrices
	gl_Position = gl_ModelViewProjectionMatrix * spherePos;
	// Forward current texture coordinates after applying texture matrix
	gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
}

And the Fragment Shader, which just implements a generic Phong Directional lighting model:

/*
	Generic Fragment Shader
	with Phong Directional lighting
*/

// Phong Directional FS //

// -- Lighting varyings (from Vertex Shader)
varying vec3 norm, lightDir0, halfVector0;
varying vec4 diffuse0, ambient;

vec4 phongDir_FS() {
	vec3 halfV;
	float NdotL, NdotHV;
	// The ambient term will always be present
	vec4 color = ambient;
	// compute the dot product between normal and ldir
	NdotL = max(dot(norm, lightDir0),0.0);
	if (NdotL > 0.0) {
		color += diffuse0 * NdotL;
		halfV = normalize(halfVector0);
		NdotHV = max(dot(norm, halfV), 0.0);
		color +=	gl_FrontMaterial.specular * 
				gl_LightSource[0].specular * 
				pow(NdotHV, gl_FrontMaterial.shininess);
	}
	return color;
}

// Main Loop //

void main() {
	// Call lighting function and return result
	gl_FragColor = phongDir_FS();
}


You’ll need to put a GLSL Grid patch inside the GLSL Shader macro. I’d set the Vertical and Horizontal Resolution of the Grid to a higher value than the default 50. Try to balance resolution (higher the better) against frame-rate. Setting it to 256×256 will give an ultra-smooth mesh, but will potentially slow down the rendering, depending on your system. You could put the GLSL Grid inside a Trackball if you wanted, so you can spin the whole thing around.

You’ll also need to put the whole thing inside a Lighting patch.

The complete setup would be:
Lighting : GLSL Shader : (optional) Trackball : GLSL Grid

I’ve tried to organise the code into blocks to make it easier to understand. I’m sure it could be made more efficient and/or elegant by combining some of the functions, but my aim was to make it easier to copy-paste discrete functions into other code, in nice self-contained chunks. It’s probably terrible coding practice, but I’ve placed all the variables for each function with the function definition itself, rather than declaring them all at the top. I found this helped keep the code ‘modular’, and it seems to work, so I guess the compiler can work its way through it OK.

I’ve suggested ranges for the various parameters in the shader code.

You’ll also need this picture, as the permutation texture, connected to the ‘permTexture’ input on the shader.

Permutation Texture

And here’s a couple of examples of the code in action, courtesy of Marcos Prack.

Very smooth, and looks like it’s running faster than it does on my MacBook Pro.
Cheers Marcos!


Texture-Lookup-Deform By 2-Point Bezier-Curve

Same Bezier Curve formula applied in CIFilter to deforming texture-lookup X and Y coordinates.


2-Point Bezier Curve

Inspired by a query from 2bitpunk, I decided yesterday to have another go at Bezier curves.

I’ve looked at this in the past, and have always either been really confused by the maths, or found sourcecode but been unable to separate out the relevant maths from the rest of the code. Yesterday, though, I stumbled on this page, which had the necessary equation without a load of confusing explanation that, with my mathematical myopia, I wouldn’t stand a cat’s chance in hell of understanding (as the phrase goes).

Soo… after an hour or so of swearing at QC’s buggy JavaScript module, I managed to put together a simple 2-point Bezier demo. Ironically, in the end, it was the dragging-control-points bit that took the most hair-tearing to implement. My eventual solution to that problem wasn’t ideal, but was functional, at least.

Here’s the JS code to generate a Bezier Curve with a settable number of steps, taking as input 4 2D points (2 anchor-points, 2 control-points/handles) and outputting a structure of points in 3D space describing the resulting curve. There’s also a chunk of code to output a second structure containing the coordinates of the anchor and control-points, so you can see where they are:

/*
	Bezier curve between 2 points, with 2 control-points.
*/

function bezier(A, B, D, C, t) {
	// Note I've reversed the order of C and D so that
	// anchor and control-points are in more intuitive order
	var a = t;
	var b = 1 - t;
	var out=	A * Math.pow(b, 3) +
			3 * B * Math.pow(b, 2) * a +
			3 * C * b * Math.pow(a, 2) +
			D * Math.pow(a, 3);
	return out;
}

function (__structure Points_Struct, __structure Controlpoints_Struct)
	main (__number P0_X, __number P0_Y,
		 __number P1_X, __number P1_Y,
		 __number P2_X, __number P2_Y,
		 __number P3_X, __number P3_Y,
		 __index Steps)
{
	if(!_testMode) {
		/*
			P0 is first anchorpoint, P1 first controlpoint
			P2 is second anchorpoint, P3 second controlpoint
			(see note in bezier function)
		*/
		// Init output arrays
		var points = new Array();
		var controlpoints = new Array();
		// Time-increment amount
		var stepSize = 1 / Steps;
		// Curve with steps segments
		for(i = 0; i <= Steps; i++) {
			var thisPoint = new Object();
			t = stepSize * i;
			thisPoint.X = bezier(P0_X, P1_X, P2_X, P3_X, t);
			thisPoint.Y = bezier(P0_Y, P1_Y, P2_Y, P3_Y, t);
			thisPoint.Z = 0;
			points[i] = thisPoint;
		}
		// Control-points
		var thisControlpoint = new Object();
		thisControlpoint.X = P0_X;
		thisControlpoint.Y = P0_Y;
		thisControlpoint.Z = 0;
		controlpoints[0] = thisControlpoint;
		var thisControlpoint = new Object();
		thisControlpoint.X = P1_X;
		thisControlpoint.Y = P1_Y;
		thisControlpoint.Z = 0;
		controlpoints[1] = thisControlpoint;
		var thisControlpoint = new Object();
		thisControlpoint.X = P2_X;
		thisControlpoint.Y = P2_Y;
		thisControlpoint.Z = 0;
		controlpoints[2] = thisControlpoint;
		var thisControlpoint = new Object();
		thisControlpoint.X = P3_X;
		thisControlpoint.Y = P3_Y;
		thisControlpoint.Z = 0;
		controlpoints[3] = thisControlpoint;

		// Output
		var result = new Object();
		result.Points_Struct = points;
		result.Controlpoints_Struct = controlpoints;
		return result;
	}
}

It would be easy to extend the curve into 3D space by simply passing the Z-coords to the Bezier function too, rather than just setting Z to a constant.
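That extension is a one-liner per axis: pass the Z coordinates through the same blend function instead of hard-coding Z to 0. A sketch of what that might look like (the bezier() function is repeated here so the snippet stands alone; bezier3D() is a hypothetical name):

```javascript
// Cubic Bezier blend, same formula as in the main patch code
// (note the reversed C/D parameter order, as in the original)
function bezier(A, B, D, C, t) {
    var a = t;
    var b = 1 - t;
    return A * Math.pow(b, 3) +
           3 * B * Math.pow(b, 2) * a +
           3 * C * b * Math.pow(a, 2) +
           D * Math.pow(a, 3);
}

// Hypothetical 3D version: evaluate the same blend on Z as well,
// instead of setting thisPoint.Z to a constant. Points are objects
// with X/Y/Z properties, in anchor, control, anchor, control order.
function bezier3D(P0, P1, P2, P3, t) {
    return {
        X: bezier(P0.X, P1.X, P2.X, P3.X, t),
        Y: bezier(P0.Y, P1.Y, P2.Y, P3.Y, t),
        Z: bezier(P0.Z, P1.Z, P2.Z, P3.Z, t)
    };
}

// At t = 0 the curve sits exactly on the first anchor point
var start = bezier3D({X: 0, Y: 0, Z: 0}, {X: 1, Y: 0, Z: 2},
                     {X: 1, Y: 1, Z: 2}, {X: 2, Y: 1, Z: 0}, 0.0);
```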

Example video clip:

I think my mistake in the past has been to try to implement Bezier curve-drawing in a fragment shader/CIFilter, whereas it’s much easier to simply generate a load of points, and then draw them using 3D geometry.

Check out this lovely piece of work by Jeogho Park, using the above bezier JavaScript code.



