Some screenshots from something I’ve been tinkering with for a while, which I finally got working last night.
The input is the iSight cam on my laptop. The heights of the blocks are scaled according to brightness samples taken with the new Image Pixel patch, at gridpoints on the input image.
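Roughly, the sampling step can be sketched like this. It's a hypothetical Python sketch, not the actual patch: I'm assuming the frame arrives as a 2-D list of brightness values in the 0–1 range, and the function name and parameters are made up for illustration.

```python
def block_heights(frame, grid_w, grid_h, max_height=1.0):
    """Sample brightness at the centre of each grid cell and
    scale each sample to a block height (assumed 0-1 inputs)."""
    img_h = len(frame)
    img_w = len(frame[0])
    heights = []
    for gy in range(grid_h):
        row = []
        # y coordinate of this grid row's cell centres
        y = int((gy + 0.5) * img_h / grid_h)
        for gx in range(grid_w):
            x = int((gx + 0.5) * img_w / grid_w)
            row.append(frame[y][x] * max_height)
        heights.append(row)
    return heights
```

So a bright pixel under a gridpoint produces a tall block, a dark one a short block.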
Aliasing is much less obvious when the effect is live. Having said that, I may still investigate supersampling (i.e. rendering to an image at a higher resolution, then scaling it back down for display) to smooth out the edges a little.
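The downscale half of that idea amounts to a box filter: average each N×N block of high-resolution pixels into one output pixel. A rough pure-Python sketch (function name and list-of-lists image format are my own assumptions, not anything Quartz Composer provides):

```python
def downsample(hi_res, factor):
    """Box-filter downscale: average each factor x factor block
    of the high-resolution image into a single output pixel."""
    out_h = len(hi_res) // factor
    out_w = len(hi_res[0]) // factor
    out = []
    for y in range(out_h):
        row = []
        for x in range(out_w):
            total = 0.0
            for dy in range(factor):
                for dx in range(factor):
                    total += hi_res[y * factor + dy][x * factor + dx]
            row.append(total / (factor * factor))
        out.append(row)
    return out
```

In practice the GPU would do this (render to a larger texture, draw it back at normal size with filtering enabled), but the averaging is what softens the jagged edges.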