Refractive Texture Mapping, Part One

November 10, 2000

This article presents, in two parts, a detailed implementation of refractive texture mapping for a simple water wave simulation using directional sine waves applied to a flat polygonal mesh. It is assumed that the reader is familiar with the 3D transformation pipeline, simple vector algebra, surface modeling with polygons, and UV coordinates for texture mapping.

This article is better suited to console development, but for the sake of clarity the sample program was implemented on a Pentium 300 with a 3D graphics card, using DirectX 7. Even though DirectX 7 has its own implementation of one of the mapping techniques described in this article (sphere mapping), DirectX 7 was only used to transform and render the final polygons in the sample program. All the UV coordinate calculations, perturbations, normal calculations, and so on were done internally by the program. It's also important to note that the hardware did all the transformations, and all the calculations described here are in world space. However, these calculations can easily be done in object space if necessary, as described later.

Motivation

Last year I started to play around with a particular texture mapping technique called sphere mapping. As I went further in my research, I found that sphere mapping and refractive texture mapping are similar, and eventually I became interested in writing a real-time water wave application using refractive texture mapping. Meanwhile, I also found that the literature on the subject was not very clear: most of the papers, articles, and books either described the techniques too briefly, missing a number of details, or dove straight into topics such as fluid dynamics.

In this series of articles, I'll present some experiments I did with sphere and refractive texture mapping, as well as a few tricks I learned to speed up the code without compromising the quality of the final rendering too much. The first part focuses on using sphere mapping to simulate curved-surface reflections; in part two I will describe how to use refractive texture mapping to simulate water refraction, complete with optimizations and a sample program.

Sphere Mapping

Sphere mapping is a technique used to simulate reflections on curved surfaces. There is a really good discussion of sphere mapping in Real-Time Rendering by Möller and Haines, and also in the DirectX 7 SDK documentation. Here I present a simplified version of the technique that can be done in basically two steps. First, the reflected rays from the camera's position to the object are computed. Second, the reflected rays' values are interpolated as texture coordinates. Note that a special texture is normally used for the final rendering.

Figure 1: Reflected ray.

By looking at the figure above (Figure 1), the reflected ray is computed by the formula:

R = 2(N · V)N - V
In the particular case of a polygonal mesh, the surface normals (N) are the vertex normals. A simple way to compute a vertex normal is to average the surface normals of all polygons sharing that vertex (a sketch of this follows). The incident rays (V) are the vectors between the mesh vertices and the camera position.
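For reference, here is a minimal sketch of that averaging approach in C. The VEC3 type and the function name are illustrative assumptions, not code from the sample program:

#include <math.h>

/* Illustrative sketch: a vertex normal is the normalized sum of the
   face normals of all polygons sharing the vertex. */
typedef struct { float x, y, z; } VEC3;

void average_vertex_normal (const VEC3 *face_normals, int face_count, VEC3 *out)
{
    int   i;
    float len;

    out->x = out->y = out->z = 0.0f;
    for (i = 0; i < face_count; i++)
    {
        out->x += face_normals[i].x;
        out->y += face_normals[i].y;
        out->z += face_normals[i].z;
    }

    /* normalize the accumulated sum */
    len = (float)sqrt (out->x*out->x + out->y*out->y + out->z*out->z);
    if (len > 0.0f)
    {
        out->x /= len;
        out->y /= len;
        out->z /= len;
    }
}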

Once the reflected rays are computed, their results need to be transformed into texture coordinates. Since the reflected ray has three components (X, Y, Z) while texture mapping is a 2D entity, the values need to be transformed in some way to generate the final UV coordinates. There are several ways of transforming the reflected ray values into UV coordinates. An easy and simple way is to just ignore one of the components (in my particular implementation, the Z value) and interpolate the remaining two as UV coordinates. This approach is reasonably fast and generates good results. After the UV coordinates are stored in the mesh data structure, the polygons are sent to the hardware to be transformed and rendered.

Even though the process is simple, there are a few details to consider in the implementation, especially if the target machine is a console. Let's look at the pseudocode below to see these details.

Listing 1. Sphere mapping

//main loop
vertex_current = (VERTEX_TEXTURE_LIGHT *)water->mesh_info.vertex_list;
for (t0 = 0; t0 < water->mesh_info.vertex_list_count; t0++, vertex_current++)
{
    //incident ray, from the vertex toward the camera
    camera_ray.x = camera.camera_pos.x - vertex_current->x;
    camera_ray.y = camera.camera_pos.y - vertex_current->y;
    camera_ray.z = camera.camera_pos.z - vertex_current->z;

    //avoid round-off errors
    math2_normalize_vector (&camera_ray);

    //let's be more clear (the compiler will optimize this)
    vertex_normal.x = vertex_current->nx;
    vertex_normal.y = vertex_current->ny;
    vertex_normal.z = vertex_current->nz;

    //reflected ray: R = 2(N.V)N - V
    dot = math2_dot_product (&camera_ray, &vertex_normal);

    reflected_ray.x = 2.0f*dot*vertex_normal.x - camera_ray.x;
    reflected_ray.y = 2.0f*dot*vertex_normal.y - camera_ray.y;
    reflected_ray.z = 2.0f*dot*vertex_normal.z - camera_ray.z;

    math2_normalize_vector (&reflected_ray);

    //interpolate [-1,1] to [0,1] and assign as uv's
    vertex_current->u = (reflected_ray.x + 1.0f)/2.0f;
    vertex_current->v = (reflected_ray.y + 1.0f)/2.0f;

}//end main loop
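Listing 1 calls two small vector helpers whose bodies are not shown in the article. A minimal sketch of what they might look like follows; the function names match the listing, but the bodies and the VEC3 type are assumptions:

#include <math.h>

typedef struct { float x, y, z; } VEC3;   /* assumed vector type */

/* Normalize a vector in place, guarding against zero length. */
void math2_normalize_vector (VEC3 *v)
{
    float len = (float)sqrt (v->x*v->x + v->y*v->y + v->z*v->z);
    if (len > 0.0f)
    {
        v->x /= len;
        v->y /= len;
        v->z /= len;
    }
}

/* Standard 3D dot product. */
float math2_dot_product (const VEC3 *a, const VEC3 *b)
{
    return a->x*b->x + a->y*b->y + a->z*b->z;
}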

Figure 2: Reflected rays.

First, all the coordinates are assumed to be in world space. This is because the mesh vertices are in world space (in the sample program), and the vectors from the camera to the mesh vertices (the incident rays) are also in world space. If you need the object (mesh) to stay in object space, you must either transform it into world space before applying the calculations, or transform the incident rays into object space instead (a sketch of this second option follows). Whichever is more convenient depends on the hardware and the application.
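As a concrete illustration of the second option, if the object's world matrix is a rigid transform (a rotation plus a translation), the camera position can be brought into object space by undoing that transform; the incident rays can then be built directly in object space. This sketch and all of its names are assumptions, not code from the sample program:

/* Illustrative sketch: undo a rigid world transform (3x3 rotation r
   plus translation t) to move the camera position into object space. */
typedef struct { float x, y, z; } VEC3;

void camera_to_object_space (const float r[3][3], const VEC3 *t,
                             const VEC3 *cam_world, VEC3 *cam_object)
{
    VEC3 d;

    /* undo the translation */
    d.x = cam_world->x - t->x;
    d.y = cam_world->y - t->y;
    d.z = cam_world->z - t->z;

    /* undo the rotation: the inverse of a pure rotation is its transpose */
    cam_object->x = r[0][0]*d.x + r[1][0]*d.y + r[2][0]*d.z;
    cam_object->y = r[0][1]*d.x + r[1][1]*d.y + r[2][1]*d.z;
    cam_object->z = r[0][2]*d.x + r[1][2]*d.y + r[2][2]*d.z;
}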

Second, the mesh normals need to be computed and normalized before the loop above starts; the sample code uses the simple averaged surface normals approach. Third, all the other vectors need to be normalized, both to avoid round-off errors and for the final interpolation. Also, looking at how the incident ray is computed in the code above, you can see that it's inverted. This is for debugging purposes: the implementation can display the rays (see Figure 2), and if they pointed in the "right" direction, they would pierce the object. The inverted incident ray does not cause any problems; it just hits the map inverted. Finally, since the components of each normalized reflected ray range from -1.0 to 1.0, by simple algebra they can be linearly interpolated as:

u = (Rx + 1.0) / 2.0
v = (Ry + 1.0) / 2.0

where Rx and Ry are the X and Y components of the normalized reflected ray.
If the reflected ray is normalized and the texture coordinates come from this vector, the rays will sample the map in the shape of a sphere. For this reason, a special texture is normally used: it needs to represent a full view of the environment, distorted as if seen through a fish-eye lens. Figure 3 is an example of this type of texture.

Figure 3: Sphere mapping texture (courtesy of Paul Haeberli, SGI).

In water simulation, sphere mapping can also be used to simulate water reflections. Note that the code above contains a couple of vector normalizations, which might not be too efficient on some platforms; however, most console hardware can perform vector normalization reasonably fast. Also, the spherical texture is not really a requirement for water simulation. If the application does not need to generate a reflection of the environment, graphic artists can experiment with different types of textures until good results are achieved.


