Inspired by this page on Inigo Quilez’ website, I decided to try the same thing in Quartz Composer.
The basic idea is this: you find some interesting formula that deforms the 2D plane, evaluate that formula into a 2D lookup table (LUT), and then use the LUT as texture coordinates to sample a second image. As Inigo says, the whole thing can now be done in a single pixel/fragment shader, but I thought I’d try it this way anyway.
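To make the precompute-then-fetch split concrete, here’s a minimal sketch in Python rather than a CoreImage kernel. The tunnel formula, the LUT size, and all the names are my own illustrative choices, not taken from the actual patch:

```python
import math

SIZE = 64  # LUT resolution (illustrative)

def tunnel_uv(x, y):
    """Map a point in [-1, 1]^2 to deformed (u, v) texture coordinates."""
    r = math.hypot(x, y) or 1e-6           # avoid division by zero at the centre
    u = 0.3 / r                            # radius -> tunnel depth
    v = math.atan2(y, x) / math.pi         # angle -> position around the tunnel wall
    return u % 1.0, (v * 0.5 + 0.5) % 1.0  # wrap both into [0, 1)

# Step 1: evaluate the formula once, into a 2D lookup table.
lut = [[tunnel_uv(2 * i / (SIZE - 1) - 1, 2 * j / (SIZE - 1) - 1)
        for i in range(SIZE)] for j in range(SIZE)]

# Step 2: rendering is then just an indexed fetch per pixel.
def sample(texture, i, j):
    u, v = lut[j][i]
    return texture(u, v)

# A trivial checkerboard "texture" stands in for the base image.
def checker(u, v):
    return 1.0 if (int(u * 8) + int(v * 8)) % 2 == 0 else 0.0

pixel = sample(checker, 10, 20)
```

The point of the split is that the (possibly expensive) formula runs once per LUT cell, while per-frame rendering only ever does cheap table lookups.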
In the example clip below, I’m passing in an image of a blurry circle, made with a simple smoothstep function, and adjusting the parameters while swapping the LUT and animating the deformation.
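For reference, a smoothstep-shaped blurry circle is just a radial gradient with a cubic falloff between two radii. A quick sketch (the edge radii here are illustrative, not the values used in the clip):

```python
import math

def smoothstep(edge0, edge1, x):
    """GLSL-style smoothstep: clamp, then cubic Hermite between the edges."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def blurry_circle(u, v, inner=0.2, outer=0.45):
    """Brightness 1 inside radius `inner`, 0 outside `outer`, smooth between."""
    r = math.hypot(u - 0.5, v - 0.5)  # distance from the image centre
    return 1.0 - smoothstep(inner, outer, r)
```

Widening the gap between `inner` and `outer` is what makes the circle blurrier, which matters later for how the deformation aliases.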
The whole thing is done in two CoreImage filters, one for the base image, and one to generate the LUT and apply the deformation.
Apparently, this is a ’90s demoscene staple. I wasn’t involved in realtime graphics back then, so it’s new and exciting to me!
Here are some examples of the same effect applied to the iSight input:
With a less blurry base image, you can see some quite nasty aliasing. The effect could do with some kind of adaptive antialiasing, but I’m not quite sure how to go about that, to be honest. Mind you, it’s an old-school effect, so maybe it’s OK if it’s a bit rough around the edges (literally, in fact)…
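One brute-force option (not adaptive, and not what the clip does) would be plain supersampling: average several sub-pixel lookups per output pixel. A sketch, with all names my own:

```python
def supersample(shade, x, y, n=4):
    """Average an n*n grid of sub-pixel samples of shade() within pixel (x, y)."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            # offsets land at the centres of an n*n sub-grid inside the pixel
            total += shade(x + (i + 0.5) / n, y + (j + 0.5) / n)
    return total / (n * n)

# A hard vertical edge at x = 0.5 shows the effect:
def hard_edge(x, y):
    return 1.0 if x >= 0.5 else 0.0

# The pixel at (0, 0) straddles the edge, so it averages to mid-grey.
aa = supersample(hard_edge, 0.0, 0.0, n=4)  # -> 0.5
```

The cost is n² lookups per pixel, which is why an adaptive scheme (only supersampling where the LUT gradient is steep) would be the nicer answer.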
QTZ is in the Box.net widget.
With thanks to Inigo Quilez.