Pacing And Gameplay Analysis In Theory And Practice

August 3, 2011
 

Analysis

Comparing the data from the two games, we can draw quite a few interesting conclusions. The first and most obvious is that Wolverine has far more combat encounters with more than four enemies. We can also see that the average combat sequence in Wolverine is much longer than in Batman: Wolverine's fight sequences can run up to 10 minutes, while Batman's longest is five minutes.
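To make that kind of comparison concrete, here is a minimal sketch (in Python, not the author's actual tooling) of how such figures could be computed from hand-annotated gameplay segments. The Segment fields and the example annotations are assumptions made purely for illustration.

    # Minimal sketch: compute combat statistics from hand-annotated segments.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        kind: str         # e.g. "combat", "roaming", "boss"
        start_min: float  # segment start time, in minutes
        end_min: float    # segment end time, in minutes
        enemies: int = 0  # enemy count, only meaningful for combat

        @property
        def length_min(self) -> float:
            return self.end_min - self.start_min

    def combat_stats(segments):
        combat = [s for s in segments if s.kind == "combat"]
        return {
            "combat_sequences": len(combat),
            "avg_combat_min": sum(s.length_min for s in combat) / len(combat) if combat else 0.0,
            "longest_combat_min": max((s.length_min for s in combat), default=0.0),
            "fights_over_four_enemies": sum(1 for s in combat if s.enemies > 4),
        }

    # Hypothetical annotations for two playthroughs:
    wolverine = [Segment("combat", 0, 10, enemies=6), Segment("roaming", 10, 14),
                 Segment("combat", 14, 22, enemies=8)]
    batman = [Segment("roaming", 0, 3), Segment("combat", 3, 8, enemies=3),
              Segment("boss", 8, 13)]

    print(combat_stats(wolverine))
    print(combat_stats(batman))

Grouping the segments by chapter before running the same statistics gives the per-chapter breakdown used in the variation discussion below.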

Looking at variation, we can see that in Wolverine's "Snow Landscape" chapter the game simply alternates back and forth between combat and roaming, while Batman always maintains good variety. Boss fights in Batman are also spread much more evenly across the whole game, giving more interesting variation and more frequent "wow moments," whereas Wolverine squeezes almost all of its boss fights toward the end of the game.

Even though Batman and Wolverine appeal to slightly different audiences, we can still use this method to compare them. At a glance, we can draw interesting conclusions and spot patterns suggesting that Wolverine is the more repetitive game -- something reviews confirm, as they often use the word "repetitive" when describing Wolverine.

In Practice

Instead of analyzing other developers' games, you could use this method the other way around -- by creating a chart first and then building your levels using that chart as a framework.

You can then have playtesters play through your work while you time them, listen to their feedback on how they experienced the levels, and compare that with how they actually progressed through the game.
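As a rough illustration of the timing step, here is a minimal sketch of a segment timer an observer could run during a supervised playtest; the class name and segment labels are invented for this example.

    # Minimal sketch: log when a tester enters each named segment.
    import time

    class SegmentTimer:
        def __init__(self):
            self.events = []                    # (segment_name, elapsed_seconds)
            self._start = time.monotonic()

        def enter(self, segment_name: str):
            elapsed = time.monotonic() - self._start
            self.events.append((segment_name, elapsed))

        def report(self):
            for name, seconds in self.events:
                print(f"{seconds / 60:6.2f} min  {name}")

    # Usage during a supervised playtest session:
    timer = SegmentTimer()
    timer.enter("roaming: apartment block")
    # ... tester plays ...
    timer.enter("combat: courtyard ambush")
    timer.report()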

In a personal project called "Project 25", where the goal was to create a single-player experience as true to Half-Life 2's gameplay as possible, I used this method to first analyze Half-Life 2 and then create a chart to serve as a framework for my level layouts.

It greatly helped me when drawing the first floor plan, as I had a pacing chart to follow, and it was also useful during playtesting: I could see exactly how the testers progressed through the levels, use that data to build an average player progression, and compare it with my target completion time.
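A small sketch of that aggregation step, assuming each tester's per-segment times have already been collected; the tester names, times, and the 15-minute target below are made-up values.

    # Minimal sketch: average tester progression vs. a target completion time.
    tester_times_min = {            # minutes each tester needed per segment
        "tester_a": [3.1, 4.8, 2.2, 6.0],
        "tester_b": [2.7, 5.5, 2.9, 7.4],
        "tester_c": [3.4, 4.2, 2.5, 6.8],
    }
    target_completion_min = 15.0    # assumed design target

    segment_count = len(next(iter(tester_times_min.values())))
    avg_per_segment = [
        sum(times[i] for times in tester_times_min.values()) / len(tester_times_min)
        for i in range(segment_count)
    ]
    avg_total = sum(avg_per_segment)

    print("Average per segment (min):", [round(t, 1) for t in avg_per_segment])
    print(f"Average completion: {avg_total:.1f} min vs target {target_completion_min:.1f} min")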

By timing the players and watching how they behaved when playing, I also made some unexpected discoveries that helped me to further improve the gameplay.

One of them came during the early stages of Project 25, when it became obvious that the final battle was far too long. For some testers it exceeded 12 minutes, and many showed signs of fatigue and frustration as they died after such a long stretch of intense arena combat, only to have to play it all over again. Because I had timed this, I could solve the problem by splitting the battle into two attack waves, giving players a bit of a breather between the intense stretches of fighting.

If you would like to read more about Project 25, you can read an article about it on Gamasutra sister site GameCareerGuide.

Final Words

If you decide to undertake this analysis, I strongly advise you to play through the game and cut away the parts where you got stuck or died, to create the progression from the point of view of the "ideal player". This data will of course be somewhat skewed when you use yourself as the test subject. However, the rise of "let's play" videos on YouTube helps solve this problem: people from all over the world upload captures of themselves playing through various games, and these are a great source for collecting more accurate data.
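That "ideal player" trimming amounts to subtracting the stuck-and-retry intervals from the raw playthrough time; a minimal sketch, with invented numbers:

    # Minimal sketch: approximate the "ideal player" time by removing
    # intervals where the player died or was stuck (all values in minutes).
    raw_run_min = 62.0
    failure_intervals = [(12.0, 15.5), (40.0, 44.0)]   # (start, end) of retries/stuck sections

    ideal_run_min = raw_run_min - sum(end - start for start, end in failure_intervals)
    print(f"Ideal player completion: {ideal_run_min:.1f} min")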

When I discuss this method with my colleagues, some are very reluctant to embrace it, while others react very positively to the ideas.

Those positive about the method have quickly found applications for it in their own work. One area where it was picked up fast was the pitching process: with the help of these pacing charts, my colleagues gained an efficient way to communicate with publishers and to determine what type of pacing the publishers were looking for.

On the other hand, those who are reluctant wrongly see this as an attempt to measure "fun". I don't believe we will ever be able to measure how fun a game is using any single method, but to be a good storyteller you really need to know how stories work -- and the same goes for games.

I'm certain that studying games using many methods will greatly expand our knowledge and understanding of what makes games fun. I'm sure that one day, in an era when we as game developers have matured, we will be able to look back on this time and congratulate ourselves on how well we did.

