Gamasutra: The Art & Business of Making Games
Let There Be Light!: A Unified Lighting Technique for a New Generation of Games
July 29, 2005

Game developers have always strived to provide a greater sense of realism to the player in order to further immerse them in the worlds they create. With the coming of vertex and pixel shaders, a new power became available that allowed developers to advance this goal by applying complex lighting and advanced visual effects to the scenes they created. In the last generation of gaming (i.e. the Xbox/PS2/GC era), dynamic lighting took a giant leap forward thanks to programmable vertex and pixel pipelines. Most games now support lighting calculated on a per-vertex basis, with some later entries into the market offering the more aesthetic per-pixel lighting solution. Now it's time to make another leap: from the vertex to the pixel shader.

Modern GPUs (Graphics Processing Units), and most notably those that will power the next generation of game consoles, offer an incredible amount of processing power. It is now possible to have all lighting in a scene computed at the per-pixel level using a variety of different techniques. While it is true that per-pixel lighting has been available through past shader models (SM), the introduction of SM3.0 has permitted developers to remove the bounds on the number of lights that could traditionally be calculated in a single pass.

The goal of this article is to present the reader with a unified per-pixel lighting solution capable of handling an arbitrary number of dynamic lights, and techniques for optimizing such a solution should it be adopted. We will start by taking a look at the current state of lighting techniques and examining their limitations. The unified model will then be explained in detail and its limitations discussed, along with optimizations that can be made to ensure that the application runs at a solid frame rate.

Throughout the article, various bits of shader code will be presented to help explain implementation details. To this end, a test scenario has been established to illustrate the various limitations of existing lighting techniques, and how the unified lighting solution can implement the same scenario with superior results. The test scenario will consist of a static, non-skinned object (i.e. an environment) being affected by a number of lights. All light sources are assumed to be point lights. Adding code to support other kinds of lights, as well as the specular lighting model, is left to the reader. The lighting equations are readily available on the Internet for those interested.

For readers unfamiliar with lighting equations, a point light's contribution is computed as follows:

Given: N, the normal of the point on the surface to be lit
       P_point, the position of the point to be lit
       P_light, the position of the light
       A0, A1 and A2, the attenuation factors for the light

(1) dist = |P_point - P_light|, the distance from the light

(2) L_point = (P_point - P_light) / |P_point - P_light|, the normalized direction from the light to the surface point

(3) att = 1.0 / (A0 + A1*dist + A2*dist^2), the attenuation of the light

and finally,

(4) C_point = att * (N · -L_point), the color contribution to the overall lighting

For this article, we will simply be using inverse linear drop-off as our attenuation factor. Inverse linear attenuation is 1/dist; it is equivalent to setting A0 = 0.0, A1 = 1.0, and A2 = 0.0.
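Equations (1) through (4) translate almost directly into shader code. The sketch below is our own illustration, not part of the article's listings; the function and parameter names are assumptions, and the multiplication by a light color is an addition to equation (4), which computes only a scalar intensity.

```hlsl
// Diffuse contribution of one point light, following equations (1)-(4).
// Uses inverse linear attenuation (A0 = 0, A1 = 1, A2 = 0), as chosen above.
float3 ComputePointLight(float3 surfacePos,    // P_point
                         float3 surfaceNormal, // N, assumed normalized
                         float3 lightPos,      // P_light
                         float3 lightColor)    // our addition; (4) is scalar
{
    float3 toSurface = surfacePos - lightPos;
    float  dist      = length(toSurface);              // (1)
    float3 L         = toSurface / dist;               // (2)
    float  att       = 1.0 / dist;                     // (3), inverse linear
    float  NdotL     = saturate(dot(surfaceNormal, -L));
    return att * NdotL * lightColor;                   // (4), tinted
}
```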

Overview of current lighting techniques

Vertex Lighting

The most common form of lighting in games today is vertex lighting. This is the fastest form of dynamic lighting presented in this article. Vertex lighting works much like it sounds; the color contribution of each light is calculated for each vertex on the surface and interpolated across the surface during rasterization.

For a quick lighting fix, this solution is robust and inexpensive. However, this method does have its drawbacks, which will be explored later in the article. In the meantime, see Listing 1.1 for an abbreviated example of vertex lighting in HLSL.


Listing 1.1: Simple point lighting using HLSL
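A minimal sketch of such a vertex shader is shown below. All identifiers, the light count, and the constant layout are our own assumptions, not taken from the original listing; it evaluates the point-light equations above once per vertex and lets the rasterizer interpolate the result.

```hlsl
// Per-vertex point lighting: the light contribution is computed once per
// vertex and interpolated across the triangle during rasterization.
#define NUM_LIGHTS 4    // assumed fixed light count for illustration

float4x4 g_WorldViewProj;
float4x4 g_World;
float3   g_LightPos[NUM_LIGHTS];    // world-space light positions
float3   g_LightColor[NUM_LIGHTS];

struct VS_INPUT
{
    float4 Position : POSITION;
    float3 Normal   : NORMAL;
};

struct VS_OUTPUT
{
    float4 Position : POSITION;
    float4 Color    : COLOR0;   // lit color, interpolated for the pixel stage
};

VS_OUTPUT VertexLightingVS(VS_INPUT input)
{
    VS_OUTPUT output;
    output.Position = mul(input.Position, g_WorldViewProj);

    float3 worldPos    = mul(input.Position, g_World).xyz;
    float3 worldNormal = normalize(mul(input.Normal, (float3x3)g_World));

    float3 color = 0;
    for (int i = 0; i < NUM_LIGHTS; ++i)
    {
        float3 toSurface = worldPos - g_LightPos[i];
        float  dist      = length(toSurface);
        float  att       = 1.0 / dist;   // inverse linear attenuation
        float  NdotL     = saturate(dot(worldNormal, -toSurface / dist));
        color += att * NdotL * g_LightColor[i];
    }
    output.Color = float4(color, 1.0);
    return output;
}
```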

Per-pixel normal map based lighting

Normal map lighting takes a different approach to lighting by encoding tangent-space normals for the surface in a texture, allowing the lighting equation to be computed at each pixel rather than at each vertex. Object-space normal maps are also possible and are generally used to light dynamic objects. This form of bump mapping has quickly become the standard for games that want to push the graphical limits. Most new games rely on this as their primary lighting technique because it allows artists to achieve incredible levels of detail while still keeping the polygon count low. There is also a variation on normal map lighting called parallax mapping, which encodes an additional height map value into the normal texture in order to simulate the parallax effect. You can find more information and implementation details about it at both ATI's and Nvidia's developer sites.

Performing normal map lighting is a three-step approach. The normal map must first be created, applied to the model and exported with tangent space information. Next, when processing the vertices of the surface to be normal mapped in the vertex shader, a tangent matrix must be created to transform all positional lighting information into tangent space (all lighting equations must be performed in the same coordinate space). The tangent space matrix is a 3x3 matrix made up of the vertex's tangent, binormal and normal vectors. The binormal vector is obtained by computing the cross product of the tangent and the normal. Finally, the color contribution of each light is calculated in the pixel shader using the normal information fetched from the normal map and the tangent-space lighting vectors computed from data transformed in the vertex shader. See Listing 1.2 for a simple HLSL implementation of normal map lighting.



Listing 1.2 : Vertex/Pixel shader for simple normal map lighting with directional lights
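The sketch below illustrates the three-step approach for a single directional light (one light, for brevity). It is our own reconstruction, not the original listing; the identifiers are assumptions, and the light direction is assumed to already be in object space, which for the article's static, non-skinned test object can be taken to coincide with world space.

```hlsl
// Normal-map lighting with one directional light. The vertex shader builds
// the tangent-space basis and rotates the light direction into tangent
// space; the pixel shader fetches the normal from the map and evaluates
// the diffuse term per pixel.
float4x4 g_WorldViewProj;
float3   g_LightDir;     // direction the light travels, object space (assumed)
float3   g_LightColor;

sampler2D g_NormalMap;

struct VS_INPUT
{
    float4 Position : POSITION;
    float3 Normal   : NORMAL;
    float3 Tangent  : TANGENT;
    float2 TexCoord : TEXCOORD0;
};

struct VS_OUTPUT
{
    float4 Position    : POSITION;
    float2 TexCoord    : TEXCOORD0;
    float3 LightDirTan : TEXCOORD1;   // light direction, tangent space
};

VS_OUTPUT NormalMapVS(VS_INPUT input)
{
    VS_OUTPUT output;
    output.Position = mul(input.Position, g_WorldViewProj);
    output.TexCoord = input.TexCoord;

    // Binormal is the cross product of the normal and the tangent.
    float3 binormal = cross(input.Normal, input.Tangent);
    float3x3 tangentMatrix = float3x3(input.Tangent, binormal, input.Normal);

    // Rows are T, B, N, so mul(matrix, vector) projects onto each axis,
    // transforming the light direction into tangent space.
    output.LightDirTan = mul(tangentMatrix, g_LightDir);
    return output;
}

float4 NormalMapPS(VS_OUTPUT input) : COLOR0
{
    // Normals are stored in [0,1]; expand to [-1,1].
    float3 N = tex2D(g_NormalMap, input.TexCoord).xyz * 2.0 - 1.0;
    float3 L = normalize(input.LightDirTan);
    float  NdotL = saturate(dot(N, -L));
    return float4(NdotL * g_LightColor, 1.0);
}
```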



