Although all of these visual elements do elicit general emotional responses on their own, a director can assign further meaning to each element.
In the 2004 movie Hero, directed by Yimou Zhang, the color green signified vengeance, red stood for passion, and white denoted truth. These artistic decisions were governed by the director's interpretation of the screenplay.
As seen in the accompanying graphs, each visual component can be mapped to the story. The first graph, depicting story intensity, shows the standard rise and fall of a linear narrative.
A visual component can stay constant, follow the story, or spike at key story events. The screenplay serves as a blueprint for the visuals.
Block's visual theory applies to any time-based visual medium, including video games. Not all of these visual components can be controlled during gameplay, but some, such as color, tone, shape, and line quality, can clearly be used to make a game stronger.
For the game Company of Heroes: Opposing Fronts, Dinehart created the following Game Intensity Graph to generate a color script.
Peak intensity for the game and story occurred during "red" missions while "green" signified a return to normalcy. This loading screen, a prelude to the final conflict, shows affinity of color.
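The mapping from an intensity graph to a color script can be sketched in a few lines. This is only an illustration of the idea: the thresholds, palette, and per-mission intensity values below are invented for the example and are not Dinehart's actual numbers from Opposing Fronts.

```python
# Illustrative sketch: deriving a color script from a mission intensity
# graph, in the spirit of Dinehart's approach. Thresholds and palette
# are hypothetical, not taken from the actual game.

def color_for_intensity(intensity: float) -> str:
    """Map a 0.0-1.0 story-intensity value to a palette color."""
    if intensity >= 0.75:
        return "red"      # peak conflict
    if intensity >= 0.4:
        return "amber"    # rising tension
    return "green"        # return to normalcy

# One intensity value per mission, tracing a rise-and-fall arc.
mission_intensity = [0.2, 0.5, 0.9, 0.3]
color_script = [color_for_intensity(i) for i in mission_intensity]
# color_script == ["green", "amber", "red", "green"]
```

Once the script exists, artists can check each mission's loading screens and environments against its assigned color, keeping affinity where the story is calm and contrast where it peaks.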
Even where there is no traditional narrative, these visual principles still apply. Advertisements and music videos may not tell stories, but they have structure. A music video director can use the structure of verse and refrain, or tie the rhythm of the visuals to the musical intensity.
In this example, Jenova Chen, creative director of thatgamecompany, plots out a game's Emotional Intensity Graph, which could later form the foundation for visual design, story design, and sound design.
In his book, Block cautions that the worst visual design is no plan at all. Without a plan, viewers would still have a reaction, but that gut reaction may work against your purpose. "That's just how our brains work," says Drew Davidson, Director of the Entertainment Technology Center at Carnegie Mellon University. "We're pattern-making people. We look for things even when they're not there."
Narrative design drives home the meaning of a game. In a meaningful game, art isn't created because it's cool but to convey a specific impression to the player. Filmmakers have gone beyond the cool factor to try to make films that provoke thought. They've already learned that these visual components evoke certain feelings. Now, it's time for video game developers to use visual structure to dramatic advantage.
Narrative design and cinematographic techniques can be embedded into gameplay as well. As Brian Hawkins, CEO of Soma, Inc., describes more fully in his book, Real-Time Cinematography for Games, programmers can create algorithms and agents so that gameplay events have a more dramatic effect.
As lead game core and interface programmer on the game Star Trek: Armada, Hawkins sought to make the real-time strategy game more exciting by highlighting cinematic events like explosions or confrontations. He took it upon himself to learn cinematographic techniques so that he could program them into the game.
Beyond cinematographic techniques, Hawkins advocates the judicious use of scripted sequences to push forward the narrative. These would not be minutes-long cut scenes, but infrequent pre-scripted moves created for key story moments.
Although many developers despise scripted sequences for taking control away from the player, Hawkins believes that these sequences can be done in a way that would not elicit complaints from the players. "The trick is to take control away from the players without them realizing it," says Hawkins.
In his book, Hawkins describes the creation of a jump sequence for dramatic purposes. Normally, in this hypothetical game, the player jumps from ledge to ledge, but if, during a chase, the player slips and almost misses a ledge, that near miss is purely a matter of chance. The player may recall later, "I barely pulled myself up by my fingertips!"
Instead of leaving it to chance, the scripted sequence recreates this moment at a time when it would make the most dramatic sense for the story. The player is still in control of the jump, and if the player begins the jump within the confines of a predetermined jump zone, the player always lands with fingers on the ledge. Outside the jump zone, the player would fall into the chasm as usual. Of course, this jump sequence could only be used once or twice, because overuse would reduce the dramatic effect.
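The jump-zone logic described above can be sketched in a few lines of code. This is a minimal illustration of the technique, not Hawkins's actual implementation: the function name, parameters, and coordinate values are all hypothetical.

```python
# Hypothetical sketch of a scripted "near-miss" jump in the style
# Hawkins describes: inside a designer-placed jump zone, the dramatic
# fingertip catch always plays out; outside it, normal physics decide.

def resolve_jump(jump_start_x: float, zone_start: float, zone_end: float,
                 jump_distance: float, gap_width: float) -> str:
    """Return the outcome of a chase-sequence jump."""
    if zone_start <= jump_start_x <= zone_end:
        # Scripted moment: regardless of the exact distance covered,
        # the player catches the far ledge by the fingertips.
        return "fingertip catch"
    # Outside the zone, ordinary jump physics apply.
    if jump_distance >= gap_width:
        return "clean landing"
    return "fall"

# Jump launched inside the zone [9.0, 12.0]: the near miss is scripted.
resolve_jump(10.0, 9.0, 12.0, 3.0, 4.0)   # "fingertip catch"
# Launched outside the zone with too short a jump: normal failure.
resolve_jump(5.0, 9.0, 12.0, 3.0, 4.0)    # "fall"
```

Because the player still initiates the jump, control is never visibly taken away; the "cheat" is only in guaranteeing the outcome inside the zone.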
Hawkins calls this the video game equivalent of cinematographic "cheating," since players probably wouldn't know the exact distance needed to land the jump. In film, cinematographic "cheating" occurs when props are moved closer to actors in close shots to indicate spatial relationships, or when footage from two locations is edited together to make it seem like one location.
In some cases, set designers have actually painted shadows onto curbs. Video game developers often employ "cheating" in other areas, such as in the simplification of physics or accelerated time.