October 15, 2012 | By Simon Yeung

SSAO (screen space ambient occlusion) is a common post-processing effect that approximates how much of the ambient light reaching a surface point is occluded by the surrounding objects.

At this year's SIGGRAPH, there were a few slides in "The Technology behind the Unreal Engine 4 Elemental Demo" about how Epic implements SSAO. Their technique can use either the depth buffer alone or the depth buffer together with per-pixel normals.

I tried to implement both versions with a slight modification:

To approximate this in screen space, we design our sampling pattern as paired samples:

paired sample pattern

So for each pair of samples, we can approximate how much the shading point is occluded in 2D instead of integrating over the hemisphere:

The AO term for each pair of samples is min((θleft + θright)/π, 1). Then, by averaging the AO terms of all the sample pairs (in my case, there are 6 pairs), we get the following result:
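As a small Python sketch (the function names are mine; `theta_left`/`theta_right` are the per-pair angles from the figure, and I assume a term of 1 means fully unoccluded), the per-pair term and the average over all pairs could look like:

```python
import math

def pair_ao(theta_left, theta_right):
    # AO term of one sample pair: on a flat, unoccluded surface the two
    # angles sum to pi, giving 1; occluders shrink the angles and darken
    # the result. The clamp keeps open surfaces from exceeding 1.
    return min((theta_left + theta_right) / math.pi, 1.0)

def ao_from_pairs(pairs):
    # Average the AO terms of all sample pairs (6 pairs in this post).
    return sum(pair_ao(l, r) for l, r in pairs) / len(pairs)
```

For example, a pair with both angles at π/2 contributes 1 (flat, open surface), while a pair with both angles at π/4 contributes 0.5.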

If one of the paired samples is too far away from the shading point, say the red point in the following figure, it will be replaced by the pink point, which lies on the same plane as the other, valid sample of the pair:

So we can interpolate between the red point and the pink point to deal with the large depth difference. Now the dark halo is gone:
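A minimal sketch of this fallback, assuming view-space positions with z as depth; the helper name `fallback_sample`, the threshold parameter, and the linear fade used for the interpolation are my assumptions, not taken from the demo:

```python
def fallback_sample(p, valid, invalid, depth_threshold):
    # Mirror of the valid sample through the shading point p: this is the
    # "pink point", which lies on the same plane as the valid sample.
    mirrored = [2.0 * p[i] - valid[i] for i in range(3)]
    depth_diff = abs(invalid[2] - p[2])
    if depth_diff <= depth_threshold:
        return list(invalid)  # close enough: keep the original sample
    # Fade from the far sample (the red point) toward the mirrored one
    # (the pink point) as the depth difference grows; the fade shape here
    # is a guess, not the demo's exact formula.
    t = min((depth_diff - depth_threshold) / depth_threshold, 1.0)
    return [invalid[i] + t * (mirrored[i] - invalid[i]) for i in range(3)]
```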

The above treatment only happens when one of the paired samples is far away from the shading point. What if both samples have large depth differences?

In this case, we get the dark halo around the sword in the above screenshot. Remember that we average all the sample pairs to compute the final AO value, so to deal with this artifact, we just assign a weight to each pair and then re-normalize the final result.

For each sample pair: if both samples are within a small depth difference of the shading point, the pair gets a weight of 1; if only one sample is far away, the pair gets a weight of 0.5; and if both samples are far away, the weight is 0. This eliminates most (but not all) of the artifacts:
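The weighted average could be sketched like this (returning 1, i.e. fully unoccluded, when every pair has weight 0 is my assumption, not stated in the post):

```python
def weighted_ao(pair_terms, pair_weights):
    # pair_weights[i]: 1.0 if both samples of pair i are valid,
    # 0.5 if only one is, 0.0 if neither is.
    total = sum(pair_weights)
    if total == 0.0:
        return 1.0  # no reliable pair at all: assume unoccluded (assumption)
    # Re-normalize by the total weight instead of the pair count.
    return sum(t * w for t, w in zip(pair_terms, pair_weights)) / total
```

A fully invalid pair then simply drops out of the average instead of dragging the result toward darkness.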

Also, computing the angle θ for each sample involves the arccos function, which is relatively expensive, so my first attempt was to approximate acos(x) with the linear function π(1 - x)/2. The resulting AO looks much darker with this approximation:

Note that the maximum error between acos(x) and the linear approximation π(1 - x)/2 is around 18.946 degrees.

This may affect the AO in areas of curved surfaces with low tessellation. You might either need to increase the bias angle threshold or switch to a more accurate function. So my second attempt is to approximate acos(x) with a quadratic function: π(1 - sign(x)·x²)/2.

This approximation gives a result very similar to the one using the arccos function.

The maximum error of this quadratic approximation is around 9.473 degrees.
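Both approximations and their maximum errors can be checked numerically with a brute-force sweep over [-1, 1] (a quick sketch; the function names are mine):

```python
import math

def acos_linear(x):
    # first attempt: linear approximation of acos(x)
    return math.pi * (1.0 - x) / 2.0

def acos_quadratic(x):
    # second attempt: pi * (1 - sign(x) * x * x) / 2
    return math.pi * (1.0 - math.copysign(x * x, x)) / 2.0

def max_error_deg(approx, n=200001):
    # largest |acos(x) - approx(x)| over a dense grid on [-1, 1], in degrees
    xs = (2.0 * i / (n - 1) - 1.0 for i in range(n))
    return max(abs(math.degrees(math.acos(x) - approx(x))) for x in xs)
```

A sweep like this lands close to the 18.946 and 9.473 degree figures quoted above.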

And here is the final result:

Finally, more time needs to be spent on generating the sampling pattern in the future; the pattern I currently use is nearly uniformly distributed (with some jittering).
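For reference, a nearly uniform, jittered pattern of mirrored pairs like the one described could be generated along these lines (the parameter names, defaults, and amount of jitter are illustrative guesses, not the pattern actually used):

```python
import math
import random

def make_pair_pattern(num_pairs=6, radius=1.0, jitter=0.5, rng=None):
    # Spread the pair directions uniformly over a half circle (each pair
    # already covers the opposite half via its mirrored partner), then
    # jitter each angle within its slot so the pattern is not perfectly
    # regular.
    rng = rng or random.Random(0)
    pairs = []
    for i in range(num_pairs):
        slot = math.pi / num_pairs
        angle = i * slot + (rng.random() - 0.5) * jitter * slot
        s = (radius * math.cos(angle), radius * math.sin(angle))
        pairs.append((s, (-s[0], -s[1])))  # the mirrored partner sample
    return pairs
```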

[1] The Technology behind the Unreal Engine 4 Elemental Demo

[2] Rendering techniques in Toy Story 3

[3] Image-Space Horizon-Based Ambient Occlusion

[4] Wolfram Alpha

[5] The models are exported from UDK and extracted from