Heads-up displays, or HUDs for short, have been an integral part of video game design since the industry's infancy. Currently, however, three factors are working to shift the console game player away from a reliance on HUDs. This has created a challenge for developers: how do you convey necessary information to the player without utilizing a traditional HUD?
First, it's important to answer the question, “What is a HUD?” A HUD is simply a collection of persistent onscreen elements whose purpose is to indicate player status. HUD elements can be used to show, among many other things, how much health the player has, in which direction the player is heading, or where the player ranks in a race. This makes the HUD an invaluable method of conveying information to the player during a game. It is an accepted shorthand, a direct pipeline from the developer to the end-user. So what would make console developers suddenly rethink the necessity of such a seemingly essential and time-honored technique as the HUD? Here are three compelling reasons.
It is only recently that console developers have begun to address the hi-def revolution taking place in living rooms around the globe. According to the Consumer Electronics Association, over 12 million high-definition televisions (HDTVs) were sold in the United States between 1998 and 2004, and the market continues to grow rapidly; research firm Strategy Analytics has predicted HDTVs in almost 30 million American homes by 2008. With the advent of a new generation of consoles, developers are finally taking advantage of the ultra-sharp screen resolutions and theater-quality sound offered by these increasingly common home entertainment systems. However, millions of high-definition televisions have an Achilles heel that can hinder developers as well: burn-in. Burn-in can occur on different types of phosphor-based HDTVs, including plasma and traditional rear-projection units; it is caused by persistent onscreen elements that, over time, create a ghost image on the screen even after they are no longer shown. Hmm… persistent onscreen elements? Like a HUD? The short answer is yes—traditional HUDs can pose a risk to many who play console games for extended periods of time on their HDTVs. Although some newer types of HDTVs are not prone to burn-in, these are generally more expensive and often have shortcomings of their own. In addition, while the dangers posed by burn-in may be less frequent or severe than many consumers believe, this doesn't stop the consumer from worrying; many an HDTV owner has had a marathon gaming experience effectively ruined by the nagging concern that the health bar at the top of the screen might never go away after the game is done.
Peter Jackson's King Kong by Ubisoft offers a HUD-less, cinematic level of player immersion.
For many years, game developers have spoken of the goal of achieving a cinema-quality experience in a video game. One of the key ingredients for such an experience is the successful immersion of the player into the game world. Just as a filmmaker doesn't want a viewer to stop and think, “This is only a movie,” a game developer should strive to avoid moments that cause a gamer to think, “This is just a game.”
How, then, does a developer avoid such moments? Increasingly sophisticated home theater systems have helped create a sense of immersion for those who have them. More detailed graphics and more refined storytelling techniques can also draw a player into a rich and complex game world. However, nothing screams "this is just a game" louder than an old-fashioned HUD. It is not a part of the game world; it is an artificial overlay that is efficient but often distracts the player from the environment in which he or she is immersed.
The rise of the casual gamer.
As video games attempt to reach new audiences beyond the core gamer market, developers are realizing the need to simplify interface design. While hardcore gamers might not be intimidated by numerous status bars and gauges onscreen, a casual gamer is much more likely to feel overwhelmed. Gamers looking for a “pick up and play” experience are not inclined to spend time figuring out what all those bars and gauges are for. The simpler and more intuitive the interface, the more accessible the game can be to non-traditional gamers.
Together, these three factors are prompting a sea change in how and when HUDs are used in video games, especially those developed for console systems. (This shift is also noticeable in certain types of PC games, and many of the games used as examples herein are also available for the PC platform. The discussion here focuses on console design due to the added problems of console-specific issues such as burn-in.) Peter Jackson's King Kong by Ubisoft (Xbox 360) is a perfect example of things to come: the game features essentially no HUD elements, and offers a level of immersion comparable to the film that shares its name.
In Doom 3, some weapons show an ammunition count directly on the weapon model.
How to Go HUD-less
How, then, does one convey player status information without a HUD? Many techniques can reduce dependency on HUD elements, or lessen the intrusiveness (and potential screen damage) of those elements that remain necessary. A survey of recent games illustrates a number of these techniques, though in most cases the solutions are applied piecemeal: redundantly, partially, or only within certain contexts. A well-planned, comprehensive interface design that addresses these concerns during preliminary design stages can lead to a more consistent and successful end result.
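For HUD elements that must remain onscreen, one common burn-in mitigation (not drawn from any specific game mentioned here) is to drift the HUD's anchor point by a pixel or two over time, so that no single pixel stays lit indefinitely. A minimal sketch, with illustrative names and parameters:

```python
import math

def hud_offset(elapsed_seconds, period=60.0, radius=2):
    """Return an (x, y) pixel offset for persistent HUD elements.

    Slowly orbits the HUD around its anchor point so no pixel stays
    lit constantly, reducing burn-in risk on phosphor-based displays.
    `period` is seconds per full orbit; `radius` is the maximum shift
    in pixels -- small enough that players won't notice.
    """
    angle = 2 * math.pi * (elapsed_seconds % period) / period
    return (round(radius * math.cos(angle)),
            round(radius * math.sin(angle)))
```

Each frame, the renderer would add this offset to every HUD element's position; a two-pixel drift over a minute is imperceptible during play but keeps the lit region moving.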
Decide what you need… and what you don't.
Many elements found on a typical HUD are there not out of necessity, but out of convention; they represent a sort of “info overkill” that, for the vast majority of players, has no impact on gameplay at all. For every piece of information you offer the player, ask, “Is this information essential to the game experience?” In doing so, you might find that you don't need to bombard the player with quite as much data as you once thought you did.
Call of Duty 2 (Xbox 360) provides a good example of eliminating one type of unnecessary information. Although the game does feature some elaborate HUD elements, it's also notable for what it doesn't feature: a visible health meter. It may seem illogical for a first-person shooter not to include a health meter of some sort; and yet, the game plays beautifully, relying on a very simple and intuitive visual cue that warns the player when health is dangerously low: the screen periphery turns red and pulses. It doesn't take long for a player to realize that this means, "Take cover and give yourself a few seconds to heal, or you're going to die." This not only removes unnecessary onscreen information, but also creates a much deeper sense of immersion in the game world.
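The general shape of such a cue is easy to sketch. The following is a hypothetical illustration in the spirit of Call of Duty 2's low-health effect, not its actual implementation; the function name, threshold, and pulse rate are all assumptions:

```python
import math

def damage_vignette_alpha(health, max_health, t, threshold=0.3):
    """Opacity (0..1) of a red screen-edge vignette overlay.

    Below `threshold` (fraction of max health), the vignette fades in
    and pulses to signal danger; at or above it, the overlay is
    invisible. `t` is elapsed time in seconds, driving the pulse.
    """
    frac = max(0.0, min(1.0, health / max_health))
    if frac >= threshold:
        return 0.0
    severity = 1.0 - frac / threshold             # 0 at threshold, 1 at death
    pulse = 0.5 + 0.5 * math.sin(2 * math.pi * 2.0 * t)  # 2 Hz pulse
    return severity * (0.4 + 0.6 * pulse)
```

Because the overlay intensifies as health drops and throbs on a fixed rhythm, the player reads urgency directly from the screen itself, with no numeric readout required.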
At the same time, however, Call of Duty 2 features an indicator that lets the player know if he or she is standing, crouching, or crawling. While this sort of indicator might have been valuable back when camera height was the only differentiating factor between the different stances (as in, say, the original Half-Life), the more intuitive visual cues offered in a game like Call of Duty 2 have rendered it essentially redundant. Even if it were not redundant, though, it is still unnecessarily distracting; with a HUD element such as this, it is well worth the time and asset investment to…