Measure for Measure
One likely reason we don't often incorporate learning goals is that building them into a game is more difficult and requires more thought than traditional performance goals do. It requires breaking the mold and doing something new. It's much easier to pop up a "level completed" message, a story cinematic, or an "achievement unlocked" notification after the player hits a predefined milestone than it is to integrate learning goals that reflect the improvements players make.
Another contributing factor may be that only recently have games tracked player data in a way that could support learning goals. Most likely, though, we have simply been stuck following conventional wisdom about how we reward players and provide feedback.
There are ways to start implementing learning goals in your games. One of the easiest (and most likely to have a significant effect on player motivation) is to tell the player how he has improved. While this is not a goal per se, it provides the player the information he requires to track his progress and set his own goals, and it also provides the foundations upon which you can build actual learning goals.
First, break the game down into component skills. What skills does the player need to be successful? Does he need to do double-jumps? Does he need to master aiming? Does he need to figure out how to counter an attack? Learning goals should focus on behaviors or skills that, when combined, give the player tools to complete more complex activities. Of course, these also need to be skills or strategies that your game can track. For example, if the player needs to understand how to play with stealth, it might be impossible to track [understands the stealth system], but you could track [was hit by enemy] or [used crouch].
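To make this concrete, here is a minimal sketch of how a game might count trackable stand-ins for a skill it can't observe directly. The SkillTracker class, the event names, and the stealth proxy formula are hypothetical examples, not drawn from any particular game or engine.

```python
from collections import Counter

class SkillTracker:
    """Counts concrete, observable events that stand in for a skill."""

    def __init__(self):
        self.events = Counter()

    def record(self, event_name):
        # Called from gameplay code, e.g. record("was_hit_by_enemy")
        # or record("used_crouch").
        self.events[event_name] += 1

    def stealth_proxy(self):
        # We can't measure "understands the stealth system" directly,
        # so approximate it: how often the player crouched relative to
        # how often he was hit.
        hits = self.events["was_hit_by_enemy"]
        crouches = self.events["used_crouch"]
        total = hits + crouches
        return crouches / total if total else 0.0
```

A real game would snapshot or reset these counts per level or per session, so that change over time can be computed rather than a single lifetime total.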

[Image caption: Feedback can help teach players more effective strategies.]
Then display progress on these component skills to the player. Rather than listing how many times x or y event happened, communicate metrics that relate to improvement, much like the example cited previously from Team Fortress. The obvious places to display progress information to players are 1) at the end of a level, 2) when they pause or quit the game, and 3) when they die. Better yet, display a progress chart that players can access whenever they want. One example from Gears of War 2 is the set of messages that appear as a player nears a new achievement. Another example from Gears of War 2 is the "war journal," which keeps track of the player's current campaign status. There's no reason we couldn't put similar messages in other games to keep players informed about their progress in mastering basic skills.
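Turning per-level proxy scores into that kind of improvement-focused feedback might look something like the sketch below. The 5 percent threshold and the message wording are placeholders, and the scores are assumed to come from a tracker like the one sketched earlier.

```python
def improvement_message(skill_name, previous_score, current_score):
    """Report improvement on a component skill rather than raw event counts."""
    if previous_score <= 0:
        return f"{skill_name}: first results recorded -- keep playing to build a trend."
    change = (current_score - previous_score) / previous_score
    if change > 0.05:
        return f"{skill_name}: up {change:.0%} since the last level. Your practice is paying off."
    if change < -0.05:
        return f"{skill_name}: down {abs(change):.0%}. Try a different approach next time."
    return f"{skill_name}: holding steady."

# Shown at the end of a level, on pause, or after the player dies.
print(improvement_message("Stealth", previous_score=0.40, current_score=0.52))
# -> "Stealth: up 30% since the last level. Your practice is paying off."
```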
Of course, people have their own motivations and mindsets that they bring to games. Some people have a learning mindset and are likely to focus on getting better at a game. Others prefer goal-based achievements and do in fact feel motivated by them. In both cases, players are likely to have some preexisting beliefs about their game-playing abilities. However, the type of goals presented and the feedback players receive during both success and failure can have a significant effect on how they respond to setbacks.
Through better feedback and goal-setting, we can encourage a mindset of competence, reduce frustration, and encourage players to play longer, try harder, and feel more confident about future gameplay challenges.
Resources
Anderson, C. A., & Jennings, D. L. (1980). "When experiences of failure promote expectations of success: The impact of attributing failure to ineffective strategies," Journal of Personality, 48, 393-407.
Ames, C., & Archer, J. (1981). "Competitive versus individualistic goal structures: The salience of past performance information for causal attributions and affect," Journal of Educational Psychology, 73, 411-418.
Butler, R. (1987). "Task-involving and ego-involving properties of evaluation: Effects of different feedback conditions on motivational perceptions, interest, and performance," Journal of Educational Psychology, 79, 474-482.
Clifford, M. M. (1986a). "The comparative effects of strategy and effort attributions," British Journal of Educational Psychology, 56, 75-83.
Clifford, M. M. (1986b). "The effects of ability, strategy, and effort attributions for educational, business, and athletic failure," British Journal of Educational Psychology, 56, 169-179.
Elliott, E. S., & Dweck, C. S. (1988). "Goals: An approach to motivation and achievement," Journal of Personality and Social Psychology, 54, 5-12.
Kamins, M. L., & Dweck, C. S. (1999). "Person versus process praise and criticism: Implications for contingent self-worth and coping," Developmental Psychology, 35, 835-847.
Seijts, G. H., & Latham, G. P. (2006). "Learning goals or performance goals: Is it the journey or the destination?" Ivey Business Journal, 70, 1-6.
Comments
Console-based games seem to be getting into the habit of showing real-time advice, likely due to their (generally) more casual audiences. For example, Ninja Gaiden Sigma 2 will remind the player about blocking (and which button to use for it) if they're taking a lot of hits and not blocking them. Little things like this make a big difference, in my opinion.
Small point of contention: while your ideas and suggestions are sound, your interpretation of the achievement data isn't. In your second table on the first page (campaign completion percentage), you assume this is a sign of how many people "quit" playing the game. That's an unsupported jump in logic. All it says is that people quit playing the campaign. Seven of the nine games listed have very heavy multiplayer components, and the other two (Fable II and GTAIV) are open-world titles. This data might be more useful on a per-genre basis rather than a top-nine or top-13 basis. You suggest that there might be other explanations, but I'm not sure that it's reasonable to assume that "frustration" is the most significant one (at least in the case of the games listed in those tables).
Also, some achievements are multiplayer-based rather than single-player, so won't that also throw off the gamerscore completion percentage?
These two games felt more like an obligation and catch-up than fun and compelling. It was a chore to play them because I felt like I was not playing on my own terms.
I don't have to tell you that there are a lot more reasons they are putting down the controller. You already know that.
But until you start giving gamers what they really desire, the number of "detractors" is going to keep rising until they just say "F" it and take up knitting or something.
Much of this work is still in the research phase, but I hope industry-academia collaborations will begin to transition these techniques into production in the near future.
Along with many approving comments, the criticisms I've heard elsewhere and in the comments here appear to center on two things:
1. The data showing how many people don't finish games do not show *why* people aren't finishing games. This fails to provide adequate support for the author's later analysis and recommendations, which rest on an assumption that at least some players quit a game because they're failing at performance-based goals and not getting any help or encouragement for improvement.
2. The *how* matters, too. Like most people, critics tend to be practical -- they want to see how something will really work before they will accept that it can work at all. The author indicated that describing ways to implement effective feedback would be a complete article in itself -- so perhaps this criticism may actually be taken as encouragement for writing just such an article.
I'd very much enjoy reading that article.
This also gives rise to another question: what else is the player playing at the time? This is perhaps less easily measured. It would be very easy to see that my progress through Fable 2 has been hampered by my also playing Saint's Row 2 (or that somebody who came late to the GHIII party has moved on to World Tour, Aerosmith, Van Halen, GH5, etc.), but it is less clear that I have also been making bits of progress through Little King's Story on the Wii, which I've been playing in short bursts and which, if it were a 360 game, would also fall foul of my first point.
This is, of course, stat-pedantry, and not directly related to the issue of frustration; I accept that the statistics are simply a framing device. However, without taking these factors into account, we risk drawing some very wrong conclusions.
One question I do feel the need to ask, though: GTA IV has a lot of gamerscore dedicated to two large, expensive packages of downloadable content, one of which is not yet available to the general public. How have these been handled in working out the stats for the title in the first graph?
http://www.neurosciencemarketing.com/blog/articles/praise-your-child.htm
Hint: "praise the effort, not the abilities."
Still, the statistics here need to be more precise, with more structure:
- hardcore gamer stats and numbers, casual gamer stats and numbers, etc., using consecutive hours played. We need percentile categories here because the profiles are so different by nature (hardcore gamers "always" finish games, even multiple times), so an average might not be as interesting.
I never had the patience to finish the UT3 campaign, but I may have had around 100 or 200 hours of gameplay with it... I don't care about achievements, completion, or goals, and in FEAR2 it was common to kill myself just to play a nice grunt fight again, since the game has a checkpoint system...