What is Adaptive Music?
This section didn’t previously exist. When I originally wrote this
piece, I assumed that readers would already know what adaptive music
was. I also assumed that they would have some familiarity with the
kinds of game implementations that are out there. So, in the
introduction, I jumped right into exploring the value of a formal
definition, without first introducing the basic idea of adaptive music
itself. And, in the body of the article, I focused exclusively on non-gaming examples – figuring the reader would be less familiar with (and therefore much more interested in) these.
As it turned out, the original approach didn’t work at all for
non-expert readers. I got some preliminary feedback like: “Wow,
interesting piece. But are there any, you know, video game examples of adaptive music?”
So! If you don’t already know what adaptive music is, or how it’s used
in games, this brand-spanking-new section is for you. On the other
hand, if you are an expert reader, you may want to skip ahead.
“Adaptive music” is a game industry term that seems to have entered
general usage in the last ten years or so. (I don’t know who first
coined the term.) It replaces the term “interactive music”, which was
more immediately obvious, but had some accuracy issues.
If I started talking about “interactive music”, you’d probably
immediately know what I was talking about… so actually, let’s start
there. (In a minute, I’ll explain why we don’t call it that any more.
For now, I’ll use “interactive”, since it’s a bit more user-friendly.)
Making Multimedia Music Meaningful
Interactive music is the game industry’s answer to the Hollywood film score.
At the movies, music has become a powerful tool for clarifying and
emphasizing emotional contexts. It complements and reinforces onscreen
action and beats. It can add or resolve tension, or create
anticipation. It can subtly (or not so subtly) associate intangible
aural qualities with different characters, while evoking the spirit of
times long past and places long vanished (or yet to come… or never to
be).

It would be really cool if game music could
complement onscreen action with the same kind of subtlety, depth, and
expression. The complication is that, in games, the timing, pacing,
contexts, and outcomes of the onscreen action are constantly in flux,
depending on the actions of the player. Is the
player winning? How many orcs are left? Is it important that the player
just ran out of painkillers? Did the player find the AK47 yet, or are
the dragons going to eat the bowling ball before her plate-mail is
repaired? How long is this battle going to last? Are the tides turning?
Was that hit significant? Or… did the battle start at all, or did the
player sneak past with the Cloaking Cloak of Cloaking +2?
Most importantly: how can a game composer score a scene intelligently and compellingly, when she doesn’t know what is going to happen, or when?
Film Scores Versus Game Scores
When a film composer starts working, she sits down with a finished,
fixed-length, edited scene. She can custom tailor a musical
accompaniment to fit the film perfectly, with subtlety, elegance, and
grace. (One hopes.) Linear music is composed to match the linear scene.

Interactive music in games attempts to
score non-linear, indeterminate scenes with non-linear music. It uses
game code and data to track changing game contexts on the fly, and to
cue appropriate score responses. It also has to track the current music
context, in order to avoid ugly or musically inappropriate transitions
or pacing. It’s an interdisciplinary challenge, since game logic
synchronization is squarely in the programmer’s domain, while music
logic is best left to the composers.
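That division of labor, game code reporting context while the music system commits changes only at musically safe moments, can be sketched roughly as follows. This is a minimal illustration with hypothetical names throughout, not the API of any particular engine:

```python
# Minimal sketch of an adaptive music controller (all names hypothetical).
# Game code reports context via on_game_state(); the audio-side tick()
# only commits a cue change on a barline, keeping transitions musical.

from dataclasses import dataclass

@dataclass
class Cue:
    name: str
    bar_length: float  # seconds per bar; transitions land on barlines

class AdaptiveMusicController:
    def __init__(self, cues_by_intensity):
        # e.g. {0: exploration cue, 1: tension cue, 2: combat cue}
        self.cues = cues_by_intensity
        self.current = cues_by_intensity[0]
        self.pending = None
        self.time_in_bar = 0.0

    def on_game_state(self, intensity):
        # Game-logic side: request a change; never switch immediately.
        target = self.cues[intensity]
        self.pending = target if target is not self.current else None

    def tick(self, dt):
        # Music-logic side: commit pending changes only at bar boundaries.
        self.time_in_bar += dt
        if self.time_in_bar >= self.current.bar_length:
            self.time_in_bar -= self.current.bar_length
            if self.pending is not None:
                self.current = self.pending
                self.pending = None
```

The point of the split is the one made above: the game side knows what should be playing, but only the music side knows when a change will sound right.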
Why Don’t We Just Call It “Interactive Music”?
In most cases, “interactive” music falsely implies direct user interaction with the music. (In some games, this is in fact the case – as in the musical game-play of “PaRappa the Rapper” or “Guitar Hero”.) In general, though, the player interacts with the game, not the music. The music system supports the dramatic action by adapting intuitively and discreetly, so as to remain contextually appropriate. Hence “adaptive” music.
Examples of Adaptive Music in Games
The following survey of examples is by no means comprehensive – it
barely scratches the surface. Nor should it be considered
representative – the selection criteria were rather sketchy. (By and
large these are adaptive music systems that have been presented at GDC
lectures, with some arbitrary references to titles I happen to own and
enjoy.) But some attempt was made to present a broad sample of
important trends, technologies, and techniques.
Totally Games / LucasArts’ “X-Wing” series
The “X-Wing” (PC DOS) series, which debuted in 1993, featured MIDI
versions of John Williams and John-Williams-esque orchestral music.
LucasArts’ patented iMUSE music engine handled sophisticated run-time
interactions between dramatic onscreen action and a database of music
loops, cues, and transitions. (Evolving versions of iMUSE were also
used on a number of later LucasArts projects.)
Monolith’s “Blood II: The Chosen” and “Shogo: Mobile Armor Division”
These Windows titles (released in 1997 and 1998) featured Microsoft
Interactive Music Architecture (IMA) technology – a precursor to the
(now deprecated) DirectMusic SDK. [3DSoundSurge01] The technology
provided runtime musical alignment tools, and design-time tools for
managing (and auditioning) adaptive music building blocks. Advanced
functionality included runtime MIDI variation generation based on
composer-designed templates (“Styles” and “Chordmaps”), and
standardized methods for switching between context-sensitive content
versions via “Groove Levels”. DirectMusic was fully supported in
DirectX 7 through early versions of DirectX 9.
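The “Groove Level” idea can be illustrated generically: each authored content variant declares the range of intensity levels it covers, and the engine selects whichever variant matches the current level. The sketch below is loosely modeled on that concept; it is not the actual DirectMusic API, and all names are hypothetical:

```python
# Generic illustration of groove-level content selection (hypothetical
# names; NOT the DirectMusic API). Each variant covers a range of levels;
# raising or lowering the groove level switches which content plays.

def pick_variant(variants, groove_level):
    """variants: list of (low, high, content) tuples; return the content
    whose inclusive [low, high] range contains groove_level."""
    for low, high, content in variants:
        if low <= groove_level <= high:
            return content
    raise ValueError(f"no variant covers groove level {groove_level}")

# One theme, authored in three intensities:
battle_theme = [
    (1, 25, "sparse_pattern"),     # stealth / exploration
    (26, 75, "standard_pattern"),  # normal engagement
    (76, 100, "frantic_pattern"),  # all-out combat
]
```

Because the composer authors every variant, the switch stays musically coherent no matter how abruptly the game’s intensity jumps.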
Sony’s “The Mark of Kri”
In a 2003 GDC presentation, Chuck Doud described custom adaptive music
technology and techniques that were developed specifically for this PS2
title. Doud stressed that constant, diligent communication and
coordination between programmers, designers, and composers were
essential for the interdisciplinary project. The team found that an
emphasis on percussive elements and irregular meters allowed even
startlingly fast transitions to remain musically consistent and
cohesive. [Doud03] The examples he showed demonstrated some incredibly tight synchronization to on-screen state changes.
Stormfront Studios’ “The Lord of the Rings: The Two Towers”
This multi-platform 2002 release was able to mine its adaptive building
blocks from hours of recorded orchestral cues for the film of the same
name. It is a very interesting case, in which big-budget music by a big
Hollywood composer (Howard Shore) receives a very ambitious interactive
video game treatment. [Boyd06]
Ubisoft’s “Rainbow Six 3”
This 2003 Xbox title is a good example of a contrasting approach to
adaptive game music, which could more properly be described as adaptive
music editing than adaptive music composition. In “Rainbow Six 3”,
use of in-game music is much more sparing than in the other examples.
The title avoids the wall-to-wall approach to in-game music, and thereby
sidesteps the challenge of changing musical forms on the fly. Cues are
reserved for a small number of special dramatic contexts, which are, in
general, designed not to overlap. Much of the game-play is not
scored, relying instead on immersive sim sound design elements.