Three Normal Mapping Techniques Explained For the Mathematically Uninclined
by Robert Basler on 11/22/13 04:53:00 am   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

[Originally from onemanmmo.com.]

I've spent quite a bit of time over the last couple of weeks piecing together bits of different articles to figure out how to implement normal mapping in the Lair Engine. The problem I encountered is that there is huge variation in the methods and terminology used in articles on normal mapping, which makes it a very confusing topic for non-math-lovers like myself. So here I'm going to explain the three common normal mapping techniques for the mathematically uninclined.

I've talked about the setup of my renderer before, so if you want to compare it to what you're doing, read this.

Originally I started by implementing Normal Mapping Without Precomputed Tangents which sounded like the easiest way to go, but was probably not actually the best place to start. I wasn't able to get that working. Even with some help from the author of the paper, no matter what I did it appeared that the light would rotate when I rotated the model.

After a couple of days of debugging frustration I moved on to other things for a couple of weeks, reading articles on normal mapping in my spare time. When I thought I had an understanding of it, I integrated this code to calculate tangent vectors for an arbitrary mesh, as well as shader code to do normal mapping in View Space (which, by general consensus, is the best coordinate space for normal mapping). To my disappointment, this implementation had the exact same problem as the original one: the lighting rotated with the model. At this point I was a little baffled, so I put a question to stackoverflow.com, which was kind of frustrating, but I did get two helpful things from it: 1) nobody pointed out any issues with my math, and 2) the suggestion to implement normal mapping in World Space so you can render the different elements to the screen during debugging and have a hope of figuring out what's going on.

Coordinate Spaces

When you do 3D rendering, there are a lot of different coordinate spaces you have to deal with. I'm only going to explain the first one here, so if you aren't sure what the others are, go look them up. With normal mapping, the chain of transformations looks like:

Tangent Space <-> Model Space <-> World Space <-> View Space <-> Clip Space

(You might hear View Space called Eye Space or Camera Space. Some people call Clip Space Homogeneous Space. So confusing.)

Tangent Space is the one we are interested in today. It is the coordinate space that the normals in a normal map are in.

The TBN Matrix

Any article you read about normal mapping will talk about the TBN matrix. It's so named because of the elements that make it up, the Tangent, Bitangent and Normal vectors. (Some people call the Bitangent the Binormal - don't do that.) What the TBN matrix does is allow you to convert normals from the normal map (in Tangent Space) to Model Space. That's all it does.

To build a TBN matrix from the normal of a face and the tangent as calculated with this code, you need GLSL code something like this:

vec3 n = normal;
vec3 t = tangent.xyz;
vec3 b = cross( normal, tangent.xyz ) * tangent.w;
mat3 tbn = mat3( t, b, n );

The normal you've seen a million times before: it's a vector perpendicular to the face, in Model Space. The tangent points along the positive U texture coordinate axis for the face. To calculate the bitangent, we take the cross product of the normal and tangent vectors, then multiply it by the constant in tangent.w, which is the handedness of the tangent space. The bitangent points along the V texture coordinate axis of the face. I made a texture-mapped cube to debug the normal mapping, and on it, the TBN vectors looked like:

[Image: the TBN vectors visualized on a texture-mapped debug cube]
 
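To make the cross product concrete, here is a small CPU-side sketch in Python (not engine code; the axis-aligned face is a made-up debug case) of the bitangent calculation from the GLSL above:

```python
# A CPU-side sketch of the bitangent calculation, using plain Python tuples
# for vectors. The face chosen here is hypothetical: its U axis runs along
# +X and its normal points along +Z.

def cross(a, b):
    """Standard 3D cross product."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

n = (0.0, 0.0, 1.0)  # face normal
t = (1.0, 0.0, 0.0)  # tangent.xyz, along the U axis
w = 1.0              # tangent.w: +1 or -1, the handedness of the UV mapping

b = tuple(c * w for c in cross(n, t))
print(b)  # (0.0, 1.0, 0.0): the bitangent points along +Y, the V axis
```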

This TBN matrix isn't particularly helpful for us right now, because we are trying to do lighting, and all it does is convert normals from Tangent Space to Model Space. To make it more useful, let's build it like this:

vec3 n = normalize( ( modelMatrix * vec4( normal, 0.0 ) ).xyz );
vec3 t = normalize( ( modelMatrix * vec4( tangent.xyz, 0.0 ) ).xyz );
vec3 b = normalize( ( modelMatrix * vec4( ( cross( normal, tangent.xyz ) * tangent.w ), 0.0 ) ).xyz );
tbn = mat3( t, b, n );

By multiplying each vector by the model matrix we get a TBN which converts from Tangent Space to World Space. You'll notice that when we make a vec4 out of the normal, tangent and bitangent we use 0.0 for the w value, not 1.0 which you usually see. The reason for this is to eliminate any translation which might be present in the model matrix, since it doesn't make sense to translate direction vectors.
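To see what the 0.0 w value buys us, here's a small CPU-side sketch in Python (not engine code; the matrix and vectors are made-up debug values) showing that with w = 0.0 the translation part of a model matrix drops out:

```python
# A sketch showing why directions use w = 0.0: with w = 0 the translation
# column of a 4x4 model matrix contributes nothing to the result.

def mul(mat, v):
    """Multiply a 4x4 row-major matrix by a 4-component vector."""
    return tuple(sum(mat[r][c] * v[c] for c in range(4)) for r in range(4))

# Identity rotation plus a translation of (5, 6, 7).
model = [[1, 0, 0, 5],
         [0, 1, 0, 6],
         [0, 0, 1, 7],
         [0, 0, 0, 1]]

normal = (0, 0, 1)
as_point     = mul(model, normal + (1,))  # w = 1.0: gets translated
as_direction = mul(model, normal + (0,))  # w = 0.0: translation ignored

print(as_point[:3])      # (5, 6, 8) - nonsense for a direction
print(as_direction[:3])  # (0, 0, 1) - unchanged, as a normal should be
```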

This brings us naturally to:

World Space Normal Mapping

The idea with World Space normal mapping is to convert both the normal taken from the normal map and the direction vector to the light source into World Space, so that we can take the dot product of the two to get the magnitude of the diffusely reflected light, or lambert value. You can read about why it has this name on Wikipedia.

float lambert = max( 0.0, dot( normal, normalize( lightDirection ) ) );
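If you want to play with the lambert term outside a shader, here's a minimal Python sketch with hypothetical vectors; the max( 0.0, ... ) clamp keeps faces that point away from the light black instead of negative:

```python
# The lambert term from the GLSL above, sketched on the CPU.

import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_direction):
    l = normalize(light_direction)
    d = sum(n * c for n, c in zip(normal, l))
    return max(0.0, d)  # clamp: back-facing contributions become zero

n = (0.0, 0.0, 1.0)
print(lambert(n, (0.0, 0.0, 5.0)))   # light straight ahead -> 1.0
print(lambert(n, (0.0, 0.0, -5.0)))  # light behind the face -> clamped to 0.0
```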

So, in the full World Space Normal Mapping example code below, I generate a TBN matrix which converts from Tangent Space to World Space, plus a lightDirection vector in World Space, in the vertex shader. In the fragment shader I use the TBN matrix to convert the normal from the normal map from Tangent Space to World Space and dot it with the normalized lightDirection, also in World Space.

You may have noticed this

normalize( texture2D( normalMap, texCoord0.st ).xyz * 2.0 - 1.0 )

in the fragment shader. Since the data in the normal map has to be stored in the range [0.0, 1.0], we need to rescale it back to its original [-1.0, 1.0] range. If we were confident about the accuracy of our data, we could get rid of the normalize here.
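The rescale is easy to sanity-check on the CPU. A quick Python sketch of the round trip (the values are hypothetical):

```python
# Storage squeezes [-1, 1] into [0, 1]; the shader's "* 2.0 - 1.0" undoes it.

def encode(c):
    """Component in [-1, 1] -> stored texel value in [0, 1]."""
    return c * 0.5 + 0.5

def decode(s):
    """Stored texel value in [0, 1] -> component in [-1, 1]."""
    return s * 2.0 - 1.0

print(decode(encode(0.0)))  # 0.0: a flat component survives the round trip
print(decode(0.5))          # 0.0: texel 0.5 means "no tilt on this axis"
print(decode(1.0))          # 1.0
print(decode(0.0))          # -1.0
```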

In all of the example code here I only generate the lambert element of the full ADS (Ambient/Diffuse/Specular) lighting pipeline. I then render that value as greyscale so that you can see what the contribution of the normal map would be to the final image. I'm hoping that by the time I'm done you'll understand the lambert term well enough to plug it into the full ADS lighting setup yourself. viewMatrix, modelMatrix, normalMatrix, modelViewMatrix etc. are all equivalent to the deprecated OpenGL built-ins gl_NormalMatrix, gl_ModelViewProjectionMatrix etc.

World Space Normal Mapping Vertex Shader

#version 120
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
uniform mat3 normalMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform vec3 lightPosition; /* World coordinates */

attribute vec3 vertex3;
attribute vec3 normal;
attribute vec4 tangent; /* Calculated as per www.terathon.com/code/tangent.html */
attribute vec2 texCoords0;

varying vec3 lightDirection;
varying vec2 texCoord0;
varying mat3 tbn;

void main()
{
    vec4 vertex = vec4( vertex3.xyz, 1.0 );
    vec3 vertexWorld = ( modelMatrix * vertex ).xyz;

    vec3 n = normalize( ( modelMatrix * vec4( normal, 0.0 ) ).xyz );
    vec3 t = normalize( ( modelMatrix * vec4( tangent.xyz, 0.0 ) ).xyz );
    vec3 b = normalize( ( modelMatrix * vec4( ( cross( normal, tangent.xyz ) * tangent.w ), 0.0 ) ).xyz );
    tbn = mat3( t, b, n );
    lightDirection = lightPosition - vertexWorld;

    texCoord0 = texCoords0;
    gl_Position = modelViewProjectionMatrix * vertex;
}

World Space Normal Mapping Fragment Shader

#version 120
uniform sampler2D normalMap;

varying vec3 lightDirection;
varying vec2 texCoord0;
varying mat3 tbn;

void main (void)
{
    vec3 pixelNormal = tbn * normalize( texture2D( normalMap, texCoord0.st ).xyz * 2.0 - 1.0 );
    vec3 normalizedLightDirection = normalize( lightDirection );
    float lambert = max( 0.0, dot( pixelNormal, normalizedLightDirection ) );
    gl_FragColor = vec4( lambert, lambert, lambert, 1.0 );
}

So the great thing about working entirely in World Space is that you can check the tangent, bitangent and normal values by passing them to the fragment shader, rendering them to the screen, and looking at what color they are. If a face ends up red, the vector you are rendering on it points along the positive X axis. (You get green for the Y axis and blue for the Z axis.) You can do the same thing for the normal map values. Note that these vectors can also have negative components, so unless you rescale them back into the [0.0, 1.0] range, you'll see some black polygons.

I built a cube model then spent most of an afternoon checking every single input into the shader to try to figure out why the light was turning with the camera. In the end I figured out that that wasn't really what was happening at all.

Tangent Space Normal Maps

If you search for normal maps on Google Images you'll see a lot of powder-blue images. These are tangent space normal maps. The reason they're bluish is that the up vector for a normal map is the positive Z axis, which is stored in the blue channel of the bitmap. So in a tangent space normal map, flat is powder blue, normals tilting upwards are cyan, and ones tilting downward are magenta.
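You can verify the powder-blue color yourself. Here's a quick Python sketch of the encoding, assuming the usual 8-bit-per-channel storage:

```python
# Why tangent-space maps look powder blue: encode the flat normal (0, 0, 1)
# into 8-bit RGB the same way normal maps are stored.

def to_rgb8(normal):
    """Map each [-1, 1] component to a [0, 255] channel value."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in normal)

print(to_rgb8((0.0, 0.0, 1.0)))  # (128, 128, 255): the familiar powder blue
```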

There isn't any sort of standardization for normal maps. Some will have only two channels of data (it's up to you to reconstruct the third yourself, but you can use the freed-up channel for some other rendering info, like a specular map), and some have the normals oriented in the opposite direction vertically, so look very closely at the normal maps you use to make sure they are in the format you expect. There are also bump maps, which are often green; those are completely different, so don't try to use them with this.

View Space Normal Mapping

Once I had a handle on World Space normal mapping and had figured out my problem, I was ready to give View Space normal mapping a try. The idea here is the same as World Space normal mapping, except that this time we convert the vectors to View Space instead. The advantage is that you can shift more work into the vertex shader and simplify some of the calculations as well.

So let's calculate our TBN again:

vec3 n = normalize( normalMatrix * normal );
vec3 t = normalize( normalMatrix * tangent.xyz );
vec3 b = normalize( normalMatrix * ( cross( normal, tangent.xyz ) * tangent.w ) );
mat3 tbn = transpose( mat3( t, b, n ) );

This one is a little different. First we multiply by the normalMatrix to convert the normal, tangent and bitangent to View Space. Since the normalMatrix is already 3x3, we don't need the ", 0.0" trick we used in World Space. Next we make a mat3 out of t, b and n, but this time we take its transpose. The transpose reverses the action of the TBN matrix, so instead of converting from Tangent Space to View Space it now converts from View Space to Tangent Space. The math reason this works here is that t, b and n are orthonormal (unit length and mutually perpendicular), and for an orthonormal matrix the transpose equals the inverse. This trick does not work on all matrices.
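Here's a small CPU-side check of that property in Python (the matrix is a made-up orthonormal basis, not engine data): multiplying an orthonormal matrix by its own transpose gives the identity, which is exactly what "transpose equals inverse" means.

```python
# Verify that for an orthonormal 3x3 matrix, M^T * M is the identity.

def transpose(m):
    return [[m[c][r] for c in range(3)] for r in range(3)]

def matmul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

# An orthonormal TBN-style basis: a 90-degree rotation about the Z axis.
tbn = [[0.0, -1.0, 0.0],
       [1.0,  0.0, 0.0],
       [0.0,  0.0, 1.0]]

identity = matmul(transpose(tbn), tbn)
print(identity)  # [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```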

What we do with that backwards TBN matrix is convert the direction-to-the-light-source vector from World Space to View Space, and then use the TBN matrix to convert it on to Tangent Space.

vec3 temp = ( viewMatrix * vec4( lightPosition, 1.0 ) ).xyz - vertexView;
lightDirection = tbn * temp;

Tricky! Now our lightDirection vector is in Tangent Space, the same space as our normal map vectors.

Now you'll notice that the TBN construction code above is commented out in the shader below. That's because there's a mathy way to make this calculation a little simpler:

vec3 temp = ( viewMatrix * vec4( lightPosition, 1.0 ) ).xyz - vertexView;
lightDirection.x = dot( temp, t );
lightDirection.y = dot( temp, b );
lightDirection.z = dot( temp, n );
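The three dot products above are exactly the transpose-times-vector product written out by hand: mat3( t, b, n ) has t, b and n as its columns, so its transpose has them as rows, and multiplying a vector by that transpose is one dot product per row. A Python sketch with hypothetical vectors:

```python
# Show that transpose(mat3(t, b, n)) * temp is the same as three dots.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# A made-up orthonormal basis and a made-up view-space light vector.
t = (1.0, 0.0, 0.0)
b = (0.0, 0.0, -1.0)
n = (0.0, 1.0, 0.0)
temp = (2.0, 3.0, 4.0)

# Row r of the transposed matrix is t, b or n, so each component is a dot.
light_direction = (dot(temp, t), dot(temp, b), dot(temp, n))
print(light_direction)  # (2.0, -4.0, 3.0)
```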

So with our tricky lightDirection vector in Tangent Space the fragment shader is super-simple and fast.

vec3 pixelNormal = normalize( texture2D( normalMap, texCoord0.st ).xyz * 2.0 - 1.0 );
vec3 normalizedLightDirection = normalize( lightDirection );
float lambert = max( 0.0, dot( pixelNormal, normalizedLightDirection ) );

View Space Normal Mapping Vertex Shader

#version 120
uniform mat4 viewMatrix;
uniform mat3 normalMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform vec3 lightPosition; /* World coordinates */

attribute vec3 vertex3;
attribute vec3 normal;
attribute vec4 tangent;
attribute vec2 texCoords0;

varying vec3 lightDirection;
varying vec2 texCoord0;

void main()
{
    vec4 vertex = vec4( vertex3.xyz, 1.0 );
    vec3 vertexView = ( modelViewMatrix * vertex ).xyz;

    vec3 n = normalize( normalMatrix * normal );
    vec3 t = normalize( normalMatrix * tangent.xyz );
    vec3 b = normalize( normalMatrix * ( cross( normal, tangent.xyz ) * tangent.w ) );
    /*
    mat3 tbn = transpose( mat3( t, b, n ) );
    vec3 temp = ( viewMatrix * vec4( lightPosition, 1.0 ) ).xyz - vertexView;
    lightDirection = tbn * temp;
    */
    vec3 temp = ( viewMatrix * vec4( lightPosition, 1.0 ) ).xyz - vertexView;
    lightDirection.x = dot( temp, t );
    lightDirection.y = dot( temp, b );
    lightDirection.z = dot( temp, n );

    texCoord0 = texCoords0;
    gl_Position = modelViewProjectionMatrix * vertex;
}

View Space Normal Mapping Fragment Shader

#version 120
uniform sampler2D normalMap;

varying vec3 lightDirection;
varying vec2 texCoord0;

void main (void)
{
    vec3 pixelNormal = normalize( texture2D( normalMap, texCoord0.st ).xyz * 2.0 - 1.0 );
    vec3 normalizedLightDirection = normalize( lightDirection );
    float lambert = max( 0.0, dot( pixelNormal, normalizedLightDirection ) );
    gl_FragColor = vec4( lambert, lambert, lambert, 1.0 );
}

Normal Mapping Without Precomputed Tangents

Once I had View Space normal mapping working, it was no effort at all to get the precomputed-tangent-free normal mapping working. This does normal mapping in World Space like the first example, but it computes the tangent and bitangent in the fragment shader. I can't see any significant visual difference between its results and those of the precomputed tangents, but your mileage may vary. You might want to look at the original article and its comments if you are thinking about using this.

There is a lot more GPU calculation in the precomputed-tangent-free implementation, but you save transferring a 12-byte vertex attribute to the GPU, so which one you choose really depends on your platform and other rendering load. Apparently on some mobile platforms the precomputed-tangent-free implementation is significantly slower. I'm going to continue calculating the tangent offline and passing it as a vertex attribute to the vertex shader because some of my other shaders already put quite a heavy load on the GPU. I'm keeping this implementation though for the case where I have a model with a small normal-mapped element and the rest is not normal mapped, as I don't currently support enabling precomputed tangent generation on a per-material basis.

Normal Mapping Without Precomputed Tangents Vertex Shader

#version 120
uniform mat4 modelMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform vec3 lightPosition;
uniform vec3 cameraPosition;

attribute vec3 vertex3;
attribute vec3 normal;
attribute vec2 texCoords0;

varying vec3 varyingNormal;
varying vec3 lightDirection;
varying vec3 viewDirection;
varying vec2 texCoord0;

void main()
{
    vec4 vertex = vec4( vertex3.xyz, 1.0 );
    vec3 vertexWorld = ( modelMatrix * vertex ).xyz;

    varyingNormal = ( modelMatrix * vec4( normal, 0.0 ) ).xyz;
    lightDirection = lightPosition - vertexWorld;
    viewDirection = cameraPosition - vertexWorld;

    texCoord0 = texCoords0;
    gl_Position = modelViewProjectionMatrix * vertex;
}

Normal Mapping Without Precomputed Tangents Fragment Shader

#version 120
uniform sampler2D normalMap;

varying vec3 varyingNormal;
varying vec3 lightDirection;
varying vec3 viewDirection;
varying vec2 texCoord0;

// "Followup: Normal Mapping Without Precomputed Tangents" from http://www.thetenthplanet.de/archives/1180
mat3 cotangent_frame( vec3 N, vec3 p, vec2 uv )
{
    /* get edge vectors of the pixel triangle */
    vec3 dp1 = dFdx( p );
    vec3 dp2 = dFdy( p );
    vec2 duv1 = dFdx( uv );
    vec2 duv2 = dFdy( uv );

    /* solve the linear system */
    vec3 dp2perp = cross( dp2, N );
    vec3 dp1perp = cross( N, dp1 );
    vec3 T = dp2perp * duv1.x + dp1perp * duv2.x;
    vec3 B = dp2perp * duv1.y + dp1perp * duv2.y;

    /* construct a scale-invariant frame */
    float invmax = inversesqrt( max( dot( T, T ), dot( B, B ) ) );
    return mat3( T * invmax, B * invmax, N );
}

vec3 perturb_normal( vec3 N, vec3 V, vec2 texcoord )
{
    /* N is the interpolated vertex normal and V the view vector (vertex to eye) */
    vec3 map = texture2D( normalMap, texcoord ).xyz;
    // WITH_NORMALMAP_UNSIGNED
    map = map * 255./127. - 128./127.;
    // WITH_NORMALMAP_2CHANNEL
    // map.z = sqrt( 1. - dot( map.xy, map.xy ) );
    // WITH_NORMALMAP_GREEN_UP
    // map.y = -map.y;
    mat3 TBN = cotangent_frame( N, -V, texcoord );
    return normalize( TBN * map );
}

void main (void)
{
    vec3 faceNormal = perturb_normal( normalize( varyingNormal ), normalize( viewDirection ), texCoord0.st );
    float lambert = max( 0.0, dot( faceNormal, normalize( lightDirection ) ) );
    gl_FragColor = vec4( lambert, lambert, lambert, 1.0 );
}
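The cotangent_frame() solve can be sanity-checked on the CPU: build position and UV derivatives from a known tangent frame (hypothetical values, not engine data), then confirm the formula recovers the tangent and bitangent. A Python sketch:

```python
# Feed cotangent_frame()'s math derivatives built from a known T and B, and
# check that the dp2perp/dp1perp formula recovers that frame.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

N = (0.0, 0.0, 1.0)
T = (1.0, 0.0, 0.0)  # the frame we expect to recover
B = (0.0, 1.0, 0.0)

# Screen-space derivatives for a surface where U runs along T and V along B:
dp1, duv1 = T, (1.0, 0.0)  # dFdx of position and uv
dp2, duv2 = B, (0.0, 1.0)  # dFdy of position and uv

dp2perp = cross(dp2, N)
dp1perp = cross(N, dp1)
T_rec = tuple(dp2perp[i] * duv1[0] + dp1perp[i] * duv2[0] for i in range(3))
B_rec = tuple(dp2perp[i] * duv1[1] + dp1perp[i] * duv2[1] for i in range(3))

print(T_rec)  # (1.0, 0.0, 0.0): tangent recovered
print(B_rec)  # (0.0, 1.0, 0.0): bitangent recovered
```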

Mystery Solved

I found the solution to my rotating light problem when I replaced the code to get the surface normal from the map with

vec3( 0.0, 0.0, 1.0 )

All of a sudden I started getting flat shaded lighting which looked as expected. The light was no longer rotating.

Gamma and Normal Maps

The normal maps I was using were PNG files. The first I made in GIMP for debugging: just a completely flat surface, which should give me flat shading if everything is working correctly. The second was of a carving that I downloaded from the internet as a sanity check. It turns out both images had the same problem! I've written before about how the Lair Engine is gamma correct. Well, both images had a gamma value of 2.2 stored in their PNG files, but the data in the files was actually gamma 1.0. When OpenGL transferred the normal maps to the video card it automatically converted them from sRGB space to linear space, thus mangling all the normals contained within. This isn't the first time I've run into this issue with PNG files, so it was time to make a tool: I wrote a little utility to load a PNG, change the gamma value without modifying the data, then write out a new PNG.
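Here's a Python sketch of the damage, approximating the wrong conversion as a pure 2.2 power curve (a simplification of the real sRGB transfer function): a texel that should decode to a flat component comes out strongly tilted.

```python
# A "flat" X or Y channel is stored as 128/255. If the driver wrongly
# linearizes it as sRGB (approximated here by a 2.2 power), the decoded
# normal component is no longer near zero.

stored = 128 / 255          # ~0.502, should decode to ~0.0
linearized = stored ** 2.2  # what arrives after a wrong sRGB-to-linear pass

print(round(stored * 2.0 - 1.0, 3))      # 0.004: essentially flat, correct
print(round(linearized * 2.0 - 1.0, 3))  # roughly -0.56: a badly tilted normal
```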

Here's the test-cube with the normal map I downloaded from the web mapped onto it. The light is above and to the right of the camera.

[Image: the test cube rendered with the downloaded normal map]


Comments


Kelly Kleider
Don't take this as a jerky challenge, but your article seems to have a lot of math to be for the "uninclined".

Maybe uninitiated?

I don't know if you have seen this:
http://www.bencloward.com/tutorials_normal_maps1.shtml
It isn't really a functional programming approach, but it is a great description from a techart/art perspective.
In any case, nice write-up even if it was mathy. ;)

Robert Basler
Thanks, I actually thought about that when I was writing this. Granted there is math here, but I was going for it to be of the "this does this" and "just do this" variety and hoping that combined with the source code it might help someone else avoid the math entirely.

That article has some good info on making normal maps.

Terry Matthes
Great breakdown. I bookmarked this to share with a few friends of mine :)

nicholas ralabate
I don't understand the "mystery solved" part at all... can you explain what was wrong with Lengyel's method and how hardcoding your surface normal fixed it?

Sandor Domokos
The "mystery solved" part... I'm very interested in this too, please explain if possible. Thank you.

Joao Almeida
Hello. It's my first time in this blog. :)

Your View Space normal mapping technique may have a bug. Yes, it's faster than your World Space normal mapping technique (though not much faster; in my case, 88 fps for the first and 83 fps for the second). But the World Space technique gives perfect results, while the View Space technique has some light issues.
Is it normal for this to happen? :(

Please, answer me.

