The release of Halo on the original Xbox brought many things to the games industry. It was, alongside GoldenEye 007 and Perfect Dark on the Nintendo 64, one of the first truly successful console-developed first-person shooters, and it took full advantage of Microsoft's hardware - both the controller and the console's processing power - to deliver an experience unlike any other. While its own inventions outside of controls were few, Halo popularized open-ended single-player levels, vehicle sequences, the two-weapon carry limit, the dedicated "throw grenade" button, and more. By understanding the limitations of the console hardware and controller, Bungie were able to build a game whose mechanics and controls compensated for many of the flaws inherent in less precise gamepad input and turned them into strengths.
One of Halo's most-debated design choices was its regenerating shields. Until then, the idea of any health or armor regeneration in a first-person game was nearly unheard of. Bungie's decision was no doubt made both to appeal to wider audiences and to make up for the slower movement, aiming and turning speeds inherent in analogue stick control. Many gamers at the time, however, claimed that the regeneration made the game too easy - usually the old guard of PC players still riding high on games like Quake III Arena, Unreal Tournament, Half-Life, and others.
Though not the first game to feature health regeneration, Halo birthed a trend that would define action games for more than a decade.
With Halo 2, Bungie chose to take things one step further and implemented not just regenerating shields, but regenerating health as well. With this decision, the dynamic of the game changed in a fundamental way. Halo 2 was no longer based so much on long-term attrition and perseverance as on mastery of the combat mechanics within specific encounters and challenges. The pace of the game changed, and much of the exploration inherent in the original title was stripped away in favor of a more focused and linear experience; since the developers no longer had to think about the player's health levels from one room to the next, hunting around for health and shields became less important.
With Call of Duty 2, health regeneration became entrenched in first-person shooters, where it has remained standard to this day. Call of Duty 2 was even more tightly focused and straightforward than its predecessor, and while a fine shooter, at the time there was, as with Halo, some outcry amongst hardcore gamers, who felt that the decision to include regenerating health had been made to appeal to more casual console audiences. Whatever the reasons, health regeneration was here to stay, and has since appeared in everything from platformers, to action-adventures, to "old-school" RPGs.
I was one of the gamers upset at the rise of regenerating health years ago. While I have certainly played and enjoyed many games featuring the mechanic, it's never something I've been happy with, though the real reason why has always eluded me. After all, I've played Call of Duty 4, Gears of War, and more - titles built entirely around their regenerating health mechanics - and I enjoyed them plenty at the time. It was only after going back to earlier games that I realized what modern games were missing. In this piece, I'd like to get to the heart of the matter and discuss why the added convenience of regenerating health doesn't always make for a better game.
Why Regenerating Health?
I already touched on this in the introduction, but it's worth discussing in more detail: why do developers include regenerating health in games? What are its advantages? Since it has become so popular, surely there must be some kind of consensus as to what makes it superior to traditional health systems.
The reasons for including regenerating health in a game are actually manifold, and extend beyond just the obvious ones. The implications are far-reaching and have a profound effect not just on the dynamic of combat, but on nearly every facet of the gameplay experience.
Look alive, soldier! Regenerating health makes it easier for developers to build scenarios and ensure game balance - but at what cost?
It is worth pointing out that, of all of these justifications, none really works to the player's advantage in terms of providing more interesting, complicated, or engaging experiences. Virtually every point on this list is a way of saying that regenerating health makes games shorter, easier, and simpler - both for players and, more importantly, for developers. The intent isn't necessarily malicious, and I'd argue the rise of health regeneration can be pinned on trend-hopping more than anything else. But the fact remains: not having to account for this layer of resource management beyond the simple 30-second gameplay loop that makes up every combat encounter takes a substantial load off of a developer's shoulders.
But is it Right for Gamers?
I admit that this part of the article gets into matters that depend far more on personal experience and opinion. This is just my own perspective on gaming, and the particular reason I choose to play certain types of games over others. In the same format, I'd now like to offer refutations of each of the justifications for regenerating health listed above.
The most realistic games of all tend to stay far, far away from regenerating health. Managing finite resources is, in fact, what makes a simulation game like ArmA as compelling as it is.
It's worth qualifying all of this by saying that I understand why many games are built the way they are. I realize players want and expect a certain kind of gameplay these days, and it is more than legitimate for developers to chase that very large and profitable market.
At the same time, I fear for the ever-lowering standards of gameplay, and have to wonder whether, in the long run, giving the crowd exactly what it wants will eventually harm the industry by raising an audience that no longer craves novelty or challenge. I know many younger players, and it can be alarming how many of them are lost when they play games without health regeneration, without quest compasses, without completely linear levels. The gravy train may still be running along, but how long until those gamers - who have limited themselves and been limited by developers - lose interest in gaming entirely, when the same old stuff no longer appeals to them?
In Defense of Regenerating Health
All that said, I'd like to take some time to argue that regenerating health can work well, in certain contexts. As is often the case, the problem with regenerating health is not that it exists, but rather the way in which it is used in the majority of games it appears in.
First off, I think Halo's regenerating shield was a master stroke by Bungie. With the game's semi-open-ended levels, the desire to explore is strong for many players. Providing regenerating shields, but not regenerating health, still gives players an incentive to look for power-ups, but also invites a degree of caution that blanket health regeneration does not. It likewise compensates for the slower, more limited input of a gamepad versus a keyboard and mouse, and ensures players don't feel whittled down by cheap shots.
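The split described above can be sketched in a few lines of code. This is a minimal, hypothetical model - the class and the numbers are my own, not Bungie's actual implementation - but it captures the key asymmetry: shields soak damage and recharge on their own, while health only comes back through pickups.

```python
import dataclasses


@dataclasses.dataclass
class HaloStyleVitals:
    """Hypothetical sketch of a Halo-style shield/health split."""
    max_shield: int = 70
    max_health: int = 100
    shield: int = 70
    health: int = 100

    def take_damage(self, amount: int) -> None:
        # Shields absorb damage first; only the remainder hits health.
        absorbed = min(self.shield, amount)
        self.shield -= absorbed
        self.health = max(0, self.health - (amount - absorbed))

    def tick_regen(self, rate: int = 10) -> None:
        # Called periodically while out of combat: only the shield
        # recharges, so health losses are permanent until a pickup.
        self.shield = min(self.max_shield, self.shield + rate)

    def pick_up_health(self, amount: int) -> None:
        # Health packs remain the only way to restore health,
        # which is what keeps exploration worthwhile.
        self.health = min(self.max_health, self.health + amount)
```

Because `take_damage` drains the shield before health, a player who retreats and waits returns to full shields but keeps whatever health damage they took - exactly the mix of forgiveness and caution described above.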
With this in mind, I think limited regenerating health is preferable to a standard non-regenerating health system - perhaps not in every single game, but in the majority. Games are made to be finished and enjoyed, and being stuck with 1 health point left right before fighting a room full of powerful monsters can be incredibly frustrating - this always was, and still is, the downfall of some of those classic shooters.
As such, health that regenerates only up to 15 or 20 percent of the maximum is preferable. Players no longer feel whittled down by the occasional stray bullet or compelled to save-scum their way through combat, while players who just barely scrape by will always have enough health to see them to the next health pack. Several games already do this, including Just Cause 2, whose open-world nature encourages exploration and experimentation but also makes finding health supplies more difficult - leaving a combat encounter to find a supply point is a "softer" penalty than outright death, but serves the same purpose.
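A regeneration floor of this kind is simple to express. The sketch below is my own illustration, assuming a floor of 20% of maximum health and a slow per-tick trickle; the function name and numbers are hypothetical, not taken from any particular game.

```python
def regenerate(current: int, maximum: int, floor_fraction: float = 0.2) -> int:
    """Tick a limited-regeneration health value.

    Health trickles back only up to `floor_fraction` of the maximum;
    anything above that floor must come from pickups.
    """
    floor = int(maximum * floor_fraction)
    if current <= 0:
        return current  # dead players don't regenerate
    if current >= floor:
        return current  # above the floor, healing requires health packs
    return min(floor, current + 1)  # slow trickle toward the floor
```

A player reduced to a sliver will always crawl back to 20% and be able to reach the next supply point, but the upper 80% of the bar still behaves like a classic finite resource.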
The insane stunts and madcap action of Just Cause 2 arguably wouldn't be possible without at least a little health regeneration - but its limited nature ensures that players still have reason to play well.
One alternative with some interesting implications is the idea of "overcharging" health. This was seen last year in Deus Ex: Human Revolution, where the player's health always recharged to 100%, but could be boosted up to 200% using consumable items. While I don't think the mechanic was used to full effect, since the game was balanced around the player having 100% health, it functions similarly to Just Cause 2's system - except that the default health level gives the player a bit more leeway.
That said, I am absolutely opposed to health regeneration in certain games. Resource management is a critical component of many role-playing games, and in my opinion the shift of that resource management over to cooldowns in many of them, especially those inspired by MMOs, leads to mechanical simplicity and a lack of long-term risk and reward - traditionally a major hallmark of the RPG genre, and above all of games inspired by the Dungeons & Dragons model.
Similarly, horror games do not benefit much from health regeneration: the intense tension of clinging on by a thread, and the relief of finding health supplies just in the nick of time, are among the key things that keep the experience engaging. It's been said before that suspense and the fear of death are the most compelling aspects of horror, and my experience with horror games certainly bears that out.
As I've tried to stress, I don't think regenerating health is enough to "ruin" any game, nor do I think that using it in the currently popular manner is a bad thing in every instance, especially when the goal is to design a game for as wide an audience as possible. At the same time, health management is one of the most fundamental components of videogame design, and casting away its long-term component saps a great deal of interesting gameplay potential - not to mention much of the mechanical identity behind the gameplay.
More and more developers have begun to deviate from the usual health regeneration over the last few years, so there's hope that by next generation the trend will have ended. More than anything, though, I'd simply like developers (and publishers) to keep in mind the benefits of more complex mechanics, and not to brush something off simply because it isn't the easiest, quickest way.