Video games have clearly been trying to give players meaningful moral choices for quite some time, but the solutions remain imperfect. This is no simple design problem. The difficulties that arise when games try are varied, as the previous examples demonstrate, but the key problem is singular: context.
The BioWare model used in Mass Effect and other titles allows for a dominant moral strategy because of the gameplay implications of the player's moral choices. That is, BioWare puts the choice in the wrong context: each choice becomes not an ethical decision but a gameplay decision, sharing its context with character builds and combat tactics (among other things).
This extra layer of strategy may make these games stronger as gameplay experiences, but the moral aspect is really just window dressing. The difference between these choices and, say, which weapon to equip to a character is largely semantic.
The advantage to this approach is that gameplay is the context: the game does not need to spend additional time and assets to set the stage for the choice. The player will be familiar with the way the game works already, and so understands the implications of the decision and its consequences.
Insofar as this makes the decision meaningful to the player in some way, it succeeds. Perhaps, since games are already abstractions of the actions they approximate, this moral abstraction is acceptable, too.
In fairness to BioWare, the approach it takes with the Dragon Age series is, while still mechanically defined at its core, stronger. Rather than saddling the main character with an alignment measure, the Dragon Age games instead track how much each party NPC likes the player character.
Actions a companion agrees with raise his or her Friendship rating, while actions he or she disagrees with lower it, and there is ample opportunity to make up for past disagreements.
The benefit to maintaining high Friendship scores is access to some minor statistical buffs for the character in question -- helpful, but not dramatically so. The significant choices, then, feel somewhat more divorced from the game's mechanics than their Mass Effect counterparts. (Frankly, I also believe that Dragon Age's writing is simply much stronger than Mass Effect's, resulting in far more compelling situations regardless.)
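To make the distinction concrete, the mechanic can be sketched in a few lines of code. (This is a purely hypothetical model -- the class, the names, and every number below are my own invention for illustration, not BioWare's implementation.)

```python
# Hypothetical sketch of a Dragon Age-style approval system.
# All names and thresholds are invented for illustration.

class Companion:
    def __init__(self, name):
        self.name = name
        self.friendship = 0  # rises and falls with the player's choices

    def react(self, approval_change):
        # Disagreements lower the score, but nothing is permanent:
        # later choices can always make up the difference.
        self.friendship += approval_change

    def stat_bonus(self):
        # High Friendship grants only a minor statistical buff --
        # helpful, but never enough to dominate a player's strategy.
        if self.friendship >= 75:
            return 2
        if self.friendship >= 25:
            return 1
        return 0

alistair = Companion("Alistair")
alistair.react(+30)   # a choice he agrees with
alistair.react(-10)   # a disagreement
alistair.react(+60)   # making up for it later
print(alistair.friendship, alistair.stat_bonus())  # 80 2
```

The point of the sketch is that the reward curve is shallow: because the buff is minor, the score fades into the background and the choices themselves carry the weight.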
The context of choices in Dragon Age is twofold: primarily, they are placed within the context of the world and the story, and secondarily within the context of the player character's relationship with the party NPCs.
More narrative-focused games like Heavy Rain approach the problem of context from the other direction. They do all they can to invest the player in the game's story and characters so that moral decisions can be significant in the context of the characters they affect.
This is truer to the situation at hand -- the player can base decisions on the facts at hand, not on a gameplay strategy -- but it takes a great deal of effort to reach that level of investment. The method is, therefore, high-risk: many players simply don't have the patience for the amount of narrative this usually requires, and those who do may still not become as invested in the game narrative as the designers intend. Making anyone care about fictional characters and situations is difficult at the best of times.
So this context needs to change. The gameplay context tends to overwhelm the moral concepts at play; meters like Mass Effect's, or even out-of-game achievements, "invariably guide players to push to one extreme," as Louis Castle observed. Even just tying the morals to the gameplay tends to boil game morality down to a choice between using violence and using more violence. But a fully dramatic context is largely impractical; the execution alone adds huge costs in time and money to the development of an average contemporary game release.
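Castle's observation is easy to demonstrate with a toy model: when rewards unlock only at the ends of a single meter, a mixed, more human set of choices is strictly worse than committing to one pole from the first decision onward. (The thresholds, values, and reward names below are invented for illustration, not taken from any shipping game.)

```python
# Toy model of a single-axis morality meter (hypothetical numbers).
# Because rewards sit only at the extremes, mixed play is strictly
# dominated by committing to one pole -- the "push to one extreme"
# effect Castle describes.

def unlocked_rewards(meter):
    # meter runs from -100 (one pole) to +100 (the other)
    rewards = []
    if meter >= 80:
        rewards.append("top-tier 'good' dialogue options")
    if meter <= -80:
        rewards.append("top-tier 'evil' dialogue options")
    return rewards

committed = sum([+10] * 10)          # ten aligned choices in a row
mixed = sum([+10] * 7 + [-10] * 3)   # a more believable mix

print(unlocked_rewards(committed))   # the extremist is rewarded
print(unlocked_rewards(mixed))       # the mixed player gets nothing
```

Once the player understands this structure, picking against the meter stops being a moral decision at all; it is simply a build mistake.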
Building context can be hugely time consuming and difficult to achieve fluidly. That is the strength of a mechanical context: players will learn the game anyway. A narrative context -- whether a choice is significant to a character, a plotline, or the entire game world -- takes more effort, and requires that the player buy into it in the first place. RPGs like Dragon Age or The Witcher have the luxury of side quests to build this in small pieces; besides, players expect a great deal of narrative content from that genre and will generally forgive it.
Other genres do not have this advantage. Players of first-person shooters, for instance, would likely never put up with Mass Effect's encyclopedic codex of background information. That does not mean there is no way to give such genres a thoroughly narrative context, however.
The obvious (and easy) answer is to use cutscenes and in-game dialogue to tell the story, both of which have ample precedent. But there is also an immense amount of power in a game's setting: this is the environment a player will actually be exploring throughout, and with the current graphics capabilities of game consoles the sky is arguably the limit. Consider BioShock: the environment of Rapture alone does so much narrative work that a discussion of it would easily warrant its own essay, and then some. Immersive worlds like this encourage investment.
Of the titles I've discussed, I would argue that The Witcher gets nearest to a solution by hybridizing the systematic and narrative contexts in subtle fashion. There is no meter, no clear quantifier for the player's morality, but there are (eventually) gameplay rewards in the form of items and assistance for certain actions.
There is substantial narrative context for the game's decisions, but most of this is revealed piecemeal during quests, not through lengthy (and expensive) cutscenes. In essence, the game is player-driven, rather than system-driven, at least in this respect, but to allow a meaningful amount of flexibility within a game's design constraints, the game itself must sacrifice some fidelity.
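That player-driven structure can be sketched as nothing more than a set of remembered flags: no score, no visible meter, just choices that later content quietly checks. (The quest and flag names here are hypothetical, invented for illustration rather than drawn from The Witcher itself.)

```python
# Hypothetical sketch of Witcher-style deferred consequences: the game
# never shows a morality quantity, it only remembers what the player did.
# All flag names and outcomes are invented for illustration.

flags = set()

def record_choice(flag):
    # No feedback at decision time -- the choice is simply remembered.
    flags.add(flag)

def later_quest_outcome():
    # Much later, an earlier choice quietly pays off (or doesn't) as an
    # item or an ally: a gameplay reward, but one the player could not
    # have optimized for in the moment.
    if "spared_the_smith" in flags:
        return "the smith forges you a silver sword"
    return "no help arrives"

record_choice("spared_the_smith")
print(later_quest_outcome())
```

Because the reward is delayed and never quantified, the player cannot treat the decision as a build optimization; all that remains to reason with is the narrative situation itself.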
Like The Witcher, Infamous uses largely static, but stylized and attractive, images in lieu of pre-rendered cutscenes.
Indeed, reducing the overall fidelity of a game in exchange for additional content could also allow for a return to some advantageous methods of past games. In the late 1990s, it was in vogue for RPGs to include literally dozens of characters -- far more than a player could ever need or want in a single play through the game.
Games like Baldur's Gate forced the player to consider her actions in the context of the party she had built as much as anything, and characters were free to leave (and die) when their differences became too heated. Party sizes shrank dramatically over the years as games were increasingly expected to deliver characters with fully 3D models, complex animations, and full voice acting. So much effort is put into a single character in a modern RPG that players are left with little party flexibility in the long run.
We need to be able to step back from the sort of gee-whiz technical achievements that characterize AAA titles these days when they won't add much to the game experience. Strong art direction can do as much as, or more than, high polygon counts. If we want to see video games move beyond increasingly complex struggles for high scores, we need to be willing to consider qualitative, not quantitative, approaches to game concepts, at least on the surface. We need not just game designers, but experience designers, writers, and world builders.
Morality is incredibly complicated. The fact that learned individuals have debated it for as long as we can remember is testament enough to that. Engaging such a dilemma is not going to be as simple as assigning a number to it. Really, truly tackling this complex topic will require an equally complex solution with lots of moving parts. But I think we have the tools; we just need to apply them.