Last night before I went to bed I read about the so-called Sobel operator, which is often used to find edges in images in programs like Photoshop. The filter basically gives every pixel a number that is large wherever adjacent pixels differ a lot. It was a very easy algorithm to implement.
Running the filter against the raw image colors would be a bad idea though. Instead, to find the edges I extracted the shape of the surface (normals and depth values) from the partially rendered image. When I run the Sobel filter against this data, I get large values wherever there is a jump in the shape. Exactly what I want!
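As a rough illustration of the filter itself, here is a minimal sketch of the Sobel operator applied to a 2-D array (the kernel constants and helper name are my own, not from this post; the same idea applies whether the array holds colors or depth values):

```python
import numpy as np

# Standard Sobel kernels: horizontal and vertical difference estimates.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_magnitude(img):
    """Per-pixel gradient magnitude; large where neighbors differ a lot."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y-1:y+2, x-1:x+2]
            gx = np.sum(patch * SOBEL_X)   # horizontal difference
            gy = np.sum(patch * SOBEL_Y)   # vertical difference
            out[y, x] = np.hypot(gx, gy)
    return out

# A vertical step edge: the magnitude peaks along the jump.
img = np.zeros((5, 5))
img[:, 3:] = 1.0
edges = sobel_magnitude(img)
print(edges[2])  # large values at the columns straddling the jump
```

Note how a unit step already produces values of 4, well above 1; that matters for the edge widening described next.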
But that alone gives equally sized edges everywhere, and to be honest it looks quite bad. Fortunately the Sobel values often go well above 1, which is the maximum I can actually display. I figured that if I blur these unclamped values first, I get wider edges wherever the shape had larger jumps.
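The blur-before-clamp trick can be sketched like this (a hypothetical box blur of my own, not the author's actual code): a strong response smeared by the blur still saturates at 1 across several pixels, while a weak one stays faint.

```python
import numpy as np

def box_blur(values, radius=1):
    """Simple box blur; averages each pixel with its neighbors."""
    h, w = values.shape
    out = np.zeros_like(values)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = values[y0:y1, x0:x1].mean()
    return out

# Unclamped Sobel responses: one big jump (8.0) and one small one (1.0).
mag = np.zeros((1, 9))
mag[0, 2] = 8.0   # would have been clipped to 1.0 if clamped first
mag[0, 6] = 1.0
wide = np.clip(box_blur(mag), 0.0, 1.0)
print(wide)  # the big jump stays fully saturated over a wide band
```

Clamping first would destroy the difference: both edges would blur down to the same faint smear.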
Today I read some papers about High Dynamic Range imaging. Be sure to look at the images Google returns for "HDR"; they're very cool. The idea is that normally every rendered pixel has to have color values between 0 and 1, which leaves the image very flat. The sun can be dozens of times brighter than a room, so they simply don't fit in the same range.
To fix this, the scene is first rendered without that cramped restriction. Only after everything has been drawn completely to a texture, with color values ranging more like 0 to 100, is the image scaled back down to between 0 and 1 by some curve. This gives a very satisfying result, though finding a good enough curve takes time.
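The post doesn't say which curve is used, so as one common example here is the simple Reinhard operator, x / (1 + x), which squeezes any non-negative HDR intensity into [0, 1):

```python
def reinhard(hdr_value):
    """Map an unbounded HDR intensity into [0, 1) with x / (1 + x)."""
    return hdr_value / (1.0 + hdr_value)

# Bright values are compressed hard, dim values barely change.
for v in (0.5, 1.0, 10.0, 100.0):
    print(v, "->", round(reinhard(v), 3))
```

The curve is monotonic, so relative brightness ordering is preserved even though a value of 100 ends up only slightly below 1.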
Oh, and by the way, the fps I get on this laptop keeps going down. A few days ago it was still 60. Then I added soft shadows, rendering of normals and depths to textures, and all kinds of post-processing effects like HDR rendering.
Each of those rendering phases takes roughly 2 to 5 fps off. Now I'm at 20. I hope I won't need any more ;)