David Kanaga, the musician behind Dyad and Proteus, explores the connections lying at the heart of video games and music, in this reprint from Game Developer's May issue.
An "isomorphism" is a 1:1 structural relationship between different forms. As a simple example, we could say that the equations "2x - 2 = 0" and "3y - 3 = 0" are isomorphic, because each defines a single variable with the same solution (x = y = 1). Douglas Hofstadter famously hypothesized a relationship between the perception of isomorphisms and our experience of meaning (see Gödel, Escher, Bach). There has, of course, been a great deal of recent interest in what it might mean to make games more meaningful.
As events and processes in time, musical structures and game structures can be described isomorphically. That is to say: games are musics, musics are games.
I have a theory that the vague concept of "musical meaning" is relevant to finding a new kind of played meaning in games. Designing games as music spaces ("music designing" any game), we can tap into existing structural relationships (isomorphisms, harmonics) in the system/unit operations--and by "hugging"/"skinning" these structures with composed sound materials variably manipulated in ways true to the space and the time-structures of play, we can make their presence sensuously felt. It is in this presence that gameplay becomes musical.
Naturally, we ask: "Is it really true that games are music?"
How do we begin to explore the possibility that they are, or might be?
The following is a practice/method/game/process I've been experimenting with to test the idea that games are music--in two parts, looped:
1. Reading Games as Scores
When designing (composing) music for games--any game--try to "read" the game mechanics and structures and feels as a score to be realized sonically, a compositional map. The musical movement exists ready-made in the game's structure, its rhythms, all of its motion. Be true to these time-structures. Video games are dense with mechanical variation. This is musical variation.
For every process or change of state in a game, there should be a corresponding process or change of state in its soundtrack. So, don't think in terms of background music and sound effects, but rather events, states, processes, textures, rhythms, and forms. There is no sound design or composition, only improvisation and music design--improvisation plays, and music design builds.
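As a hedged sketch of this principle, the mapping might be expressed in code: every state change in the game drives a corresponding change in a continuously running sound process, rather than triggering a one-shot "sound effect" over "background music." All names here (SoundProcess, MusicDesign, the pitch mapping) are illustrative assumptions, not any particular engine's API.

```python
class SoundProcess:
    """A continuously running musical layer, not a one-shot effect."""
    def __init__(self, name, pitch):
        self.name = name
        self.pitch = pitch  # abstract parameter a synth would read

    def on_state_change(self, state):
        # Map game state directly onto a musical parameter
        # (an arbitrary illustrative mapping).
        self.pitch = 220 + 10 * state["player_height"]

class MusicDesign:
    """Routes every game event to a corresponding musical process."""
    def __init__(self):
        self.processes = {}

    def bind(self, event_name, process):
        self.processes[event_name] = process

    def dispatch(self, event_name, state):
        if event_name in self.processes:
            self.processes[event_name].on_state_change(state)

design = MusicDesign()
drone = SoundProcess("drone", pitch=220)
design.bind("player_moved", drone)
design.dispatch("player_moved", {"player_height": 3})
print(drone.pitch)  # 250
```

The point of the sketch is structural: there is no event in the game loop that lacks a musical correlate, which is what "events, states, processes, textures, rhythms, and forms" implies in practice.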
Now, if you embrace this method, you will have to compose spaces that tend toward fluidity and disorder (because they're being played), built of modules and processes as opposed to full "pieces." Depending on the game, many modules could be required, perhaps as many as there are sprites or 3D models in the visual design, and producing all of these might be a lot of work. To make this tolerable, play all of it! Improvise the music, don't get caught up in how things sound. It doesn't matter, everything is music. Music doesn't need sound, only movement.
Algorithmic techniques are useful, but don't lose track of the touch of musical play (we'll touch on this in the next section). It might be helpful to practice compositional techniques on video footage. For example, take the "Mickey Mousing" technique (the process of scoring music events to coincide with motion-image events); while it's perceived humorously in movies, the fact is that games are music, and as played music, they are effectively musical instruments. In other words, we can think of Mickey Mousing in games as the process of designing a musical instrument such that it can produce sound.
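A minimal sketch of Mickey Mousing as instrument design might look like the following: motion events and sound events share one timeline, so playing the game is playing the instrument. The event names and cue names are hypothetical.

```python
def mickey_mouse(motion_events):
    """Return a sound cue for every motion-image event, 1:1."""
    # Illustrative cue table; unrecognized motions fall back to a click.
    cue_for = {"jump": "rising_gliss", "land": "thud_chord", "dash": "tremolo"}
    return [(t, cue_for.get(kind, "click")) for t, kind in motion_events]

timeline = [(0.0, "jump"), (0.6, "land"), (1.2, "dash")]
print(mickey_mouse(timeline))
# [(0.0, 'rising_gliss'), (0.6, 'thud_chord'), (1.2, 'tremolo')]
```

What reads as comedy in film reads as playability here: the 1:1 binding is exactly what makes the mechanic sound-producing.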
Think of your music-design process as adding vibrational effects (real perceptual experience) to informational architectures (music spaces as virtual compositions/scores). Graphics (lights, images) do this too; they are information sensualizers--but graphics are only seen in front of us while music surrounds us.
2. Playing Music
When you're playing music outside of video games, try to find spaces that you love independently of video games. Continue to explore new territories (play spaces): Improvise, "de-quantize" (an aesthetic-ethical tactic of turning constants into variables described in Adam Harper's excellent book of music space theory, Infinite Music), find music in all things. We don't know what kinds of play spaces are possible, so finding new spaces in play is essential research.
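Harper's "de-quantizing" tactic — turning constants into variables — can be given a toy illustration in code. Here a fixed tempo constant becomes a continuous function of play; the mapping and parameter names are assumptions for illustration, not anything from Infinite Music itself.

```python
FIXED_BPM = 120  # the "quantized" constant a traditional score would assume

def dequantized_bpm(player_speed, base=120.0, sensitivity=0.5):
    """Tempo is no longer a constant; it varies with how the game is played."""
    return base * (1.0 + sensitivity * (player_speed - 1.0))

print(dequantized_bpm(1.0))  # 120.0  (at reference speed, nothing changes)
print(dequantized_bpm(2.0))  # 180.0  (faster play, faster music)
```

Any constant in a music space — tempo, tuning, timbre, meter — is a candidate for this treatment; each one de-quantized opens a new dimension of the play space.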
We can too easily grow accustomed to all sorts of value-qualifiers as to what music "should be." If we are unable to push past these boundaries, to feel the infinite potential and presence of music in all situations (the project of John Cage's 4'33"), we will be hesitant to read a game structure as musical (it might be arrhythmic, for instance). Find new music, new movements--we cannot consider motion to be separate from music.
Music design is quantitative manipulation of qualitative affect--experience becoming numerical value, becoming variable, becoming experience. Numbers, harmonics, vibrations. Our conception of systemic design should not be opposed to sensuousness or irrationality. On the contrary, music design will need to be totally irrational in its pursuit of real subjective experience, while at the same time highly disciplined in its engagement with quantity.
Imagine game music design as a process similar to music notation or data visualization: following their lead in making raw quantitative data intuitively knowable, tangible, playable. It is in transforming material presence that we experience the nonrational, qualitative time flows of differences, repetitions, harmonies, textures--these qualities that describe the lived space of musical playing.
David Kanaga is an improviser and music designer. He has produced award-winning dynamic scores for the indie titles Dyad and Proteus. He is currently researching smooth and shifting dimensionality in improvised music spaces with Ilinx Group and working on music designs for morphing landscapes in Panoramical.