Animation and Rendering
In the last section, we described how the water surface causes caustics by focusing and defocusing light rays. However, as the rays pass through the water, they scatter from small particles suspended in it (plankton, dirt), which makes them visible and causes the streaks of light known as Godrays. Rendering this phenomenon correctly would require volumetric rendering. If we don't insist on absolute correctness, however, and prefer the visual look of the result, we can use a quite simple algorithm to create relatively convincing pictures. We already have the caustics texture, which represents the shape and position of the individual ray streaks (even though only as a slice at a given depth). If we define this slice to represent the light intensity of the whole volume, we can render it using volume-rendering techniques.
Depending on the position of our camera, we create several (in our experiments 32) slices of the volume, as seen in Figure 3-10. We then render them into the completed scene with additive alpha-blending (and z-buffer writes disabled).
Because this method shows visible artefacts revealing the low sampling rate, we use a non-uniform distribution of the samples. We use high density in front of the camera; these samples are responsible for the smooth look of the result and for bright spots appearing where they should. The lower-density samples further away from the camera ensure that the rays extend into the distance.
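This slice placement can be sketched as follows; the helper below is hypothetical (the function name, the warp exponent `bias` and the near/far distances are our own illustrative choices, not the original implementation's tuned values):

```python
import numpy as np

def slice_distances(num_slices=32, near=1.0, far=200.0, bias=2.0):
    """Place volume slices non-uniformly along the view direction:
    dense close to the camera (smooth look, bright spots where they
    should be), sparse far away (rays extending into the distance).
    bias > 1 warps the samples toward the camera; all constants here
    are illustrative."""
    t = np.linspace(0.0, 1.0, num_slices)   # uniform parameters in [0, 1]
    t = t ** bias                           # warp toward 0 (the camera)
    return near + t * (far - near)

d = slice_distances()
gaps = np.diff(d)                           # spacing grows with distance
```

Each returned distance would then position one textured slice, rendered into the scene with additive blending and z-writes disabled as described above.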
Since increasing the number of rendering passes considerably slows down the whole process, we can use the multitexturing capabilities of graphics hardware to increase the number of samples, as suggested in . So even if we render just one slice, we apply four textures to it at once, as if they represented subsequent samples of the volume. In this way we obtain 128 samples on the GeForce3, which gives us smooth enough pictures in most cases (as seen in Figure 3-11).
Note that we can “skew” the volume resulting from repeating our caustics texture in any way, to simulate rays coming from a given direction (according to the position of the sun). An additional improvement (which we didn’t implement) would be to use a shadow buffer to take shadows cast by objects in the water into account.
Spray and Bubbles
Whenever the water surface is violent enough, or the water meets obstacles, we should see foam resulting from the breaking waves. Probably the best way to render this would be to use a particle system, but that would be quite costly for our purposes. Instead, we take advantage of the fact that the foam always lies on the water surface and render it as another texture layer on top of it. In our implementation, each vertex of the grid has an “amount of foam” assigned to it. Then, when rendering the surface, we use this amount as the transparency of a foam texture stretched over the whole surface (the texture itself is rendered with additive blending).
Now the only thing left to solve is the spawning of the foam itself. Here we use a modification of the algorithm suggested in . For a given vertex and its two neighbours (in the x and z directions), we compute the difference between their slopes. If you remember the way we animated choppy waves, the displacement used does in fact represent how close these points get. Now if the computed difference is less than a chosen (negative) limit, we increase the foam amount of the given vertex by a small number. Otherwise, we decrease its current foam amount (causing existing foam to fade away). In this way we get foam spawning near the tops of big choppy (and possibly meeting) waves. See Figure 3-12 for a typical result of the foam generation.
It is important to note that even though the alpha factor of the foam texture is limited to the range [0,1], this is not true for the foam amount (it can be more than one, but should still be limited). Also, when we detect a foam-producing point, we shouldn’t set its foam amount to the maximum immediately; the vertex is likely to spawn foam during the next few frames as well, and increasing the foam amount slowly gives a better visual result. The limitations of this technique are quite obvious: the rendered foam looks quite similar in different places (since it’s just a repeated texture, not a uniquely generated pattern), and it doesn’t move on the water surface according to its slope (though one might get the impression that this is happening when the choppy waves algorithm is used).
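The foam update described above can be sketched roughly as follows; the grid layout, the constants `limit`, `grow`, `decay` and `foam_max`, and the use of displacement differences as the convergence measure are our assumptions for illustration, not the article's actual code:

```python
import numpy as np

def update_foam(disp_x, disp_z, foam, limit=-0.15, grow=0.05,
                decay=0.01, foam_max=2.0):
    """One per-frame foam update over a grid of vertices (all tuning
    constants are made up for illustration).

    disp_x, disp_z: horizontal choppy-wave displacements per vertex.
    foam: "amount of foam" per vertex, updated in place; it may exceed
    one (clamped to foam_max), but only min(foam, 1) is used as the
    alpha of the foam texture."""
    # Displacement difference between each vertex and its +x / +z
    # neighbours: strongly negative values mean the points converge.
    dx = np.diff(disp_x, axis=0)[:, :-1]
    dz = np.diff(disp_z, axis=1)[:-1, :]
    squeeze = dx + dz
    core = foam[:-1, :-1]            # vertices that have both neighbours
    spawning = squeeze < limit
    core[spawning] += grow           # grow foam slowly where waves meet
    core[~spawning] = np.maximum(core[~spawning] - decay, 0.0)
    np.clip(core, 0.0, foam_max, out=core)
    return np.minimum(foam, 1.0)     # alpha actually used when texturing
```

Calling this once per frame grows foam where the choppy-wave displacements squeeze vertices together and fades it everywhere else.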
When water collides with obstacles, we generate a spray of water using a particle system with simple Newtonian dynamics, see . Each particle is given an initial velocity taken directly from the water surface’s velocity at the spawning position, with added turbulence. It’s then updated according to gravity, wind and other global forces thereafter. Rendering of the particles is done with a mixture of alpha-transparency and additive-alpha sprites. See Figure 3-13 for a screenshot of this effect. The particle system is also used for drawing bubbles from objects dropped into the water. For this effect we simply move the bubbles on a sine path around the buoyancy vector up to the surface, where they are killed.
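A rough sketch of these particle dynamics follows; the function names, constants and the Gaussian turbulence model are our illustrative choices, not the original implementation:

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def spawn_spray(surface_velocity, turbulence=0.3, rng=None):
    """New spray particle: the water surface's velocity at the spawn
    point plus a random turbulence term (magnitude is made up)."""
    rng = np.random.default_rng() if rng is None else rng
    return np.asarray(surface_velocity) + rng.normal(scale=turbulence, size=3)

def step_particle(pos, vel, dt, wind=(0.0, 0.0, 0.0)):
    """Simple Newtonian update under gravity, wind and any other
    global forces folded into `wind`."""
    vel = vel + (GRAVITY + np.asarray(wind)) * dt
    return pos + vel * dt, vel

def bubble_position(t, start, rise_speed=0.5, wobble=0.05, freq=6.0):
    """Bubble rising along a (here vertical) buoyancy vector on a sine
    path; the particle is killed once it reaches the surface."""
    up = np.array([0.0, 1.0, 0.0])
    side = np.array([1.0, 0.0, 0.0])
    return np.asarray(start) + up * rise_speed * t + side * wobble * np.sin(freq * t)
```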
We implemented the algorithms described in this paper on a PC platform with Windows. Both the FFT-based and the physics-based animations were realised for grids of 64x64 elements. Two FFTs were required for the animation: one complete complex-to-complex transform for the surface slope (later used either for the choppy waves or for the surface normals) and one complex-to-real transform for the surface height. Our first implementation used routines from , but we later replaced them with faster routines from the Intel Math Kernel Library. Rendering is implemented in DirectX 8.1, using nVidia’s GeForce3 hardware. While the basic computations (heights, normals, foam etc.) are done only once for a single water tile (which can be repeated all over the place), many other computations depend on the viewer position (we use a local viewer everywhere) and thus have to be done separately for each tile. This offers a perfect opportunity to use vertex shaders, offloading the burden of those computations from the CPU. Some of these effects (especially per-pixel bump mapping) require the use of pixel shaders as well, but in general most of the algorithms described here should be possible on DirectX7-class hardware.
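As an illustration of the two transforms just mentioned, here is a NumPy sketch; NumPy stands in for the Numerical Recipes / MKL routines, the random spectrum is a placeholder for the actual ocean spectrum, and packing the two slope spectra into one complex transform is one common way to realise the complex-to-complex pass (an assumption, not necessarily the article's exact scheme):

```python
import numpy as np

N = 64                               # grid resolution used in the article
rng = np.random.default_rng(0)

# Complex-to-real FFT for the surface height: a half-spectrum of shape
# (N, N//2 + 1) yields a real N x N height field.
spectrum = (rng.normal(size=(N, N // 2 + 1))
            + 1j * rng.normal(size=(N, N // 2 + 1)))
height = np.fft.irfft2(spectrum, s=(N, N))

# Complex-to-complex FFT for the slope: pack the x and z slope spectra
# (i*kx*H and i*kz*H) as real/imaginary parts so that one complex
# transform returns both slope fields at once.
k = 2.0 * np.pi * np.fft.fftfreq(N)
k[N // 2] = 0.0                      # zero the asymmetric Nyquist bin
kx, kz = k[:, None], k[None, :]
Hfull = np.fft.fft2(height)
packed = np.fft.ifft2(1j * kx * Hfull + 1j * (1j * kz * Hfull))
slope_x, slope_z = packed.real, packed.imag
```

The packing works because both slope spectra are Hermitian, so their inverse transforms are real and land cleanly in the real and imaginary parts of the single complex result.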
Conclusion and Future Extensions
We have presented a new scheme for deep-water animation and rendering. Its main contribution on the animation side is the blending of proven methods for realistic object/ocean interaction. On the rendering side we have presented a new method for foam rendering and shown clever use of the features of new 3D graphics cards to reach new levels of (realtime) realism. There are many extensions to the current implementation that we want to try out in the future. First of all, we are not too impressed by our Phong-shaded water shimmering. We believe this is mainly because of too low a contrast in the final image. Contrast enhancement can probably be realised by using High Dynamic Range Images (HDRI), as described in . We also want to try prefiltering of the environment map to approach the BRDF shading of water. When it comes to animation, there’s so much cool stuff out there to follow up! Foremost, we are trying to get the ocean sine model from  to work with our system…breaking waves next?
We would like to thank Richard Lee for implementing the choppy-waves modification and for “forcing” us to add light shimmering. A super-hero star to Tore Blystad and Christian Morgan Enger for their excellent demo artwork, and a final thank-you to Mads Staff Jensen for the slide illustrations!
References
Alan Watt and Mark Watt. Advanced Animation and Rendering Techniques. ISBN 0-201-54412-1.
Jerry Tessendorf. Simulating Ocean Water. SIGGRAPH 2001 Course Notes.
Miguel Gomez. Interactive Simulation of Water Surfaces. Game Programming Gems. ISBN 1-58450-
Anis Ahmad. Improving Environment-Mapped Reflection Using Glossy Prefiltering and the Fresnel Term. Game Programming Gems. ISBN 1-58450-
Alex Vlachos and Jason L. Mitchell. Refraction Mapping for Liquids in Containers. Game Programming Gems. ISBN 1-58450-049-2.
Press, Teukolsky, Vetterling, Flannery. Numerical Recipes in C: The Art of Scientific Computing. Second edition. Cambridge University Press. ISBN
Jim X. Chen, Niels da Vitoria Lobo, Charles E. Hughes and J. Michael Moshell. Real-Time Fluid Simulation in a Dynamic Virtual Environment. IEEE Computer Graphics and Applications. May-June
Nick Foster and Dimitri Metaxas. Realistic Animation of Liquids. Graphical Models and Image Processing, 58(5), 1996, pp. 471-483.
Nick Foster and Dimitri Metaxas. Controlling Fluid Animation. Proceedings of Computer Graphics
Nick Foster and Dimitri Metaxas. Modeling the Motion of a Hot, Turbulent Gas. Computer Graphics Proceedings, Annual Conference Series, 1997, pp. 181-188.
Jos Stam. Stable Fluids. SIGGRAPH 99.
C. Rezk-Salama, K. Engel, M. Bauer, G. Greiner, T. Ertl. Interactive Volume Rendering on Standard PC Graphics Hardware Using Multi-Textures And
David Baraff, Andrew Witkin. Physically Based Modeling. SIGGRAPH 98 Course Notes.
Mark J. Kilgard. Improving Shadows and Reflections via the Stencil Buffer. nVidia white paper.
Foley, van Dam, Feiner and Hughes. Computer Graphics: Principles and Practice. ISBN 0-201-
Tomoyuki Nishita, Eihachiro Nakamae. Method of Displaying Optical Effects within Water using
Michael Kass and Gavin Miller. Rapid, Stable Fluid Dynamics for Computer Graphics. Computer Graphics, Volume 24, Number 4, August 1990.
Jos Stam. A Simple Fluid Solver Based on the FFT. Journal of Graphics Tools.
Hugh D. Young. University Physics. Eighth edition. ISBN 0-201-52690-5.
Lasse Staff Jensen. Game Physics. Part I: Unconstrained Rigid Body Motion.
Wolfgang Heidrich. Environment Maps And Their
Jonathan Cohen, Chris Tchou, Tim Hawkins and Paul Debevec. Real-Time High Dynamic Range Texture Mapping. Eurographics Rendering
Alain Fournier and William T. Reeves. A Simple Model of Ocean Waves. SIGGRAPH 1986.