Gamasutra: The Art & Business of Making Games
A More Accurate Volumetric Particle Rendering Method Using the Pixel Shader

June 11, 2008

Many games, even on current "next-gen" hardware, render particles as camera-facing quads. In many cases these particles represent volumes of many smaller, microscopic particles. Such a volume is typically simulated by a simple blend function that determines how much it contributes to the view, i.e. how much the simulated volume of particles obscures the scene behind it.

Although this method has been employed in games for many years, this article defines a method that uses shader technology to represent these volumetric particles more physically. This method gives a more accurate visual representation of the simulated volumes, and can potentially decrease the number of particles required, which in turn improves render performance.

It should first be stated that the method defined in this article is limited to particles that represent volumes of sub-particles. Note also that the analysis that follows assumes a uniform density of the particles. There are methods that would allow the user to define more complex density functions, but they will not be covered here.

Many games have represented environmental effects such as dust, mist, gases, and energy volumes using the same method for years. Essentially, they define how much a particle obscures the rendered environment behind it using the alpha channel of its texture.

The alpha channel can be used to define either transparency or opacity, but in either case the algorithm is the same: the resulting pixel colour (Cr) is interpolated by some factor (A) between the already-rendered environment colour (Ci) and the particle colour (Cp). In most cases I have seen, the alpha channel supplies the A value and the function is:

Cr = A*Cp + (1-A)*Ci;

This is the basic interpolation function.
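As a minimal sketch of that interpolation (written here in plain Python rather than shader code, with hypothetical names), applied per colour channel:

```python
def blend(particle_rgb, scene_rgb, alpha):
    """Basic interpolation between the scene colour already in the
    framebuffer (Ci) and the particle colour (Cp), weighted by the
    particle's alpha (A): Cr = A*Cp + (1-A)*Ci per channel."""
    return tuple(alpha * cp + (1.0 - alpha) * ci
                 for cp, ci in zip(particle_rgb, scene_rgb))

# A half-transparent white particle over a black background:
# blend((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), 0.5) -> (0.5, 0.5, 0.5)
```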

So what is wrong with this function? It uses two adds and two multiplies and in most cases is handled in hardware. It is simple and easy to use.

For the most part, semi-transparent camera-facing polygons rendered with this function represent the macroscopic volumes of simulated microscopic particles as the user intended. Assuming the art is correct, the particles are sorted correctly from back to front, and the quads they are rendered to don't collide with any other quads in the scene, the visual effect is sufficient to simulate microscopic particle volumes.

Art correctness is handled by the art department, and many engines sort the particles as they are rendered, so there is no problem there. What about collisions with other geometry? Because all the particles are rendered as camera-facing polygons, they almost never visibly intersect one another; we can't make the same claim about their intersections with other geometry in the scene.
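The back-to-front sorting mentioned above is usually a sort on view-space depth followed by compositing in that order. A rough sketch, assuming a hypothetical particle list of (depth, colour, alpha) tuples:

```python
def composite_sorted(particles, scene_rgb):
    """Sort particles far-to-near by view-space depth, then apply the
    basic interpolation function for each one in turn. `particles` is a
    list of (depth, rgb, alpha) tuples; larger depth = farther away."""
    result = scene_rgb
    for depth, rgb, alpha in sorted(particles, key=lambda p: p[0], reverse=True):
        result = tuple(alpha * cp + (1.0 - alpha) * ci
                       for cp, ci in zip(rgb, result))
    return result
```

Because the basic interpolation is order dependent, compositing the same particles near-to-far would give a different (wrong) result, which is why engines must sort them every frame.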

In some cases, the particles can be rendered without considering the depth buffer, but that assumes nothing will ever obscure the particles themselves from view. In other cases, there might be CPU time available to handle each particle's collision with the world and simply ensure this case never arises. For most games, however, the artifacts caused by particles intersecting the environment have simply been considered acceptable.

Fig 1. Example from the Chrome Dragon Engine using the basic modulate blend mode render method for smoke particles

Unfortunately, due to the visual aliasing problems that arise from this method of particle rendering, art teams have developed some problematic procedures:

  1. Typically, on a first pass of a volumetric particle effect, the artist will create the particles at the size and density that best represent the intended visual quality. When the effect is added to the game, it typically becomes immediately apparent that the particles are geometrically too big.

    Once they render in-game, the artist is required to tweak the visual look away from the intended quality and retrofit the effect to avoid any possible collision with environmental geometry.

  2. In cases where particles must sit close to geometry, the artist is required not only to artificially shrink the particles to avoid noticeable aliasing artifacts, but typically also to increase the number of particles to fill the now-empty space. As soon as they increase the particle count, they must change the alpha values in the function to account for the increased likelihood of overlapping particles.

    This reduces the effective resolution of the alpha channel (potentially causing visual artifacts of its own), makes the effect grainier and denser, and worst of all requires the artist (and possibly a programmer) to waste time tweaking around the inadequacies of the basic interpolation function.
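The overlap problem in point 2 can be illustrated numerically: with the basic interpolation function, N stacked particles of alpha A leave only (1-A)^N of the background visible, so every change in particle count forces the artist to re-derive a new per-particle alpha. A small sketch of that relationship, with hypothetical function names:

```python
def combined_opacity(alpha, n):
    """Effective opacity of n stacked particles, each with the given
    alpha, composited with the basic interpolation function."""
    return 1.0 - (1.0 - alpha) ** n

def alpha_for_target(target_opacity, n):
    """Per-particle alpha needed so that n stacked particles reach the
    overall opacity the artist originally tuned for."""
    return 1.0 - (1.0 - target_opacity) ** (1.0 / n)

# Doubling from 4 to 8 overlapping particles while keeping ~60% opacity:
# alpha_for_target(0.6, 4) -> ~0.205, alpha_for_target(0.6, 8) -> ~0.108
```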




Ronald Mexico
Very interesting. Nice work.

Not sure what method UBI used, but GRAW 2 took care of this issue. Pretty amazingly realistic smoke in that game. No plane seams on the ground at all. They even made mention of their new particle tech in a pre-release, PR video.

How is this different from the DX10 soft particle sample?

Josiah Manson
What happens when particle volumes intersect? Do the particles act independently and double up the density, or do they somehow preserve uniform density? I think that uniform density may be achieved by having the particles write to the depth buffer.

Aaron Casillas
Great article!

I'm also interested in learning more about "baked" particle animations like the ones found in WoW. Is there a package out there that allows the particle system to be baked in?

Karsten Schwenk
Nice article, but I think variants of this method have been used for some time now.

My implementation uses the extinction factor to calculate the alpha value, which is a bit more physically based, and optionally also does a short ray-marching (for shadowing and non-homogeneous media). I called them 'thick particles'.

Unfortunately, in my experience the default soft particles are usually preferable, at least in scenarios where the improved physical correctness isn't needed - and that includes almost all applications in games - simply because they essentially do the same thing but are cheaper to render and easier for artists to tweak.

Space Games
I remember in older games like Call of Duty the smoke would just be ridiculous, going through walls and just not really conforming to what it should be doing. Recent games seem to be a lot better in that regard, including the newer Activision stuff. Interesting article either way. :)