Researchers break strategy game competition down to a science
August 14, 2013 | By Mike Rose

Researchers at North Carolina State University claim to have devised a precise technique that can greatly increase players' chances of winning online team-based strategy games -- and the research could also be used to improve gameplay experiences.

The team says that through various analytics tools, it has managed to evaluate logs of player actions in games like Dota, Warcraft III and Starcraft II, to develop a specific set of rules that govern team gameplay strategies.

By looking at the way in which player attributes and stats change over time, the research team was able to pinpoint the optimal timing for upgrades and improvements, in order to raise the likelihood of team success.
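To illustrate the kind of analysis described, here is a minimal sketch of how a predictive threshold might be mined from match logs. The data shape and field meanings are invented for illustration; the papers' actual features and methods are more sophisticated.

```python
# Hypothetical sketch: given per-match logs, find the mid-game stat lead
# that best separates wins from losses with a single threshold rule.

def best_threshold(matches):
    """matches: list of (gap, won) pairs, where `gap` is a team's stat
    lead at the mid-game mark and `won` is True/False.
    Returns (threshold, accuracy) for the rule `predict win if gap >= t`."""
    candidates = sorted({gap for gap, _ in matches})
    best_t, best_acc = None, 0.0
    for t in candidates:
        # Count matches where the rule's prediction matches the outcome.
        correct = sum((gap >= t) == won for gap, won in matches)
        acc = correct / len(matches)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Toy data: large positive gaps mostly won, small or negative gaps lost.
logs = [(80, True), (65, True), (59, False), (30, False),
        (-10, False), (120, True), (5, False), (70, True)]
threshold, acc = best_threshold(logs)  # threshold=65 separates this toy set
```

The papers evaluate candidate rules over many metrics and time windows rather than a single stat, but the core idea -- scanning logged attributes for cutoffs that predict outcomes -- is the same.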

The research papers contain extremely specific scenarios as examples, such as the claim that if a team in a game of Dota has amassed 59.7 more damage points than the opposing team by the second quarter of a game, the first team has more than an 80 percent chance of winning.

And in another example, the researchers found that if a Starcraft II team is made up of Terrans and Zerg, its chances of winning the game are more than 70 percent from the get-go if the Zerg population has a high growth rate, and the Terran population has a low one.
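Rules like the two above could be applied as a simple lookup at play time. The sketch below is illustrative only: the thresholds echo the article's examples, but the function and field names are invented, not taken from the papers.

```python
# Hypothetical rule-based win estimator built from published-style findings.

def estimated_win_chance(match):
    """Return a coarse win-probability estimate for a team, using two
    hand-coded rules of the kind described in the article."""
    # Rule 1 (Dota-style): a lead of 59.7+ damage points by the second
    # quarter was associated with a >80 percent win rate.
    if match.get("damage_lead_q2", 0) >= 59.7:
        return 0.80
    # Rule 2 (SC2-style): a Terran/Zerg team with fast Zerg population
    # growth and slow Terran growth won more than 70 percent of the time.
    if (match.get("races") == {"terran", "zerg"}
            and match.get("zerg_growth") == "high"
            and match.get("terran_growth") == "low"):
        return 0.70
    return 0.50  # no rule fires: no information either way

estimated_win_chance({"damage_lead_q2": 75})  # rule 1 fires
```

A real-time visualization tool of the kind Roberts describes would presumably evaluate many such rules continuously as the match state changes.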

Using this data, the research team has devised optimal gameplay approaches that greatly raise a team's chances of winning. Dr. David Roberts, co-author on the papers, says that this research could potentially help game developers to fine-tune their strategy titles.

"We're currently working to use these findings to develop visualization tools that let players know how they are doing in real-time, relative to the strategies we know are predictive of success," Roberts adds.

Of course, if the techniques discussed in these papers prove as watertight as described, this research could have notable implications for the various strategy game eSports currently available.

The two papers, "Cig2013 Roles" and "IKE13 Rules," can be found below.

Comments


Joel Nystrom
Money Ball.

Michael Joseph
You could be right. For competitive e-sports, this sort of research could become commonplace. For all we know studies like this have already been done privately.

The concept is not new, but studies like this can encourage a renewed focus.

Ramin Shokrizade
This analysis of relatively simplistic games seems to me to itself be relatively simplistic. Just measuring some statistics to show the growth/performance patterns of a team tells you more about the lack of balance of a title than anything useful about strategy. Further, such information does not translate well to other titles because again you are just measuring design weaknesses that are unique to one title, and also in an "after the fact" fashion. A method of evaluating designs that was predictive, that could tell you about the performance of a game design before it was populated by consumers would be much more impressive and useful. The kind of analysis done here by academics has been performed fairly regularly inside industry for years, we just don't publish our information.

Put another way, if I have a 100 meter sprint that takes 10 seconds to win, and I figure out that if you are 0.1 seconds or more behind the leader after 7 seconds, I can tell you that you have 95+% chance of losing. Is this really helpful? Is this newsworthy? Is this an achievement? It makes a good homework assignment in an advanced statistics class.

Michael Joseph
"relatively simplistic games"

I think the point of this article, frankly, is that these are not relatively simplistic games. Try writing an AI to beat "master" level players of Starcraft II. Am I wrong in suggesting that writing such an AI is easy? Am I wrong for implying that writing an AI for a "relatively simplistic game" is harder than analyzing the gameplay for said game? Maybe I am... wouldn't be the first time.

Just seems to me if you've "figured it out" then you can write a master AI bot for it.

Ramin Shokrizade
Chess is an extremely simplistic game. It still took a while to write effective AI for it. These games are at least 100 times more complex than chess, and perhaps 1000 times more complex. Still, their short duration, simplistic rules/options, limited victory conditions, and limited number of participants make them quite accessible to almost all gamers.

Writing an AI for this would be a lot harder than it would be for Chess, but still not that hard for a race like Zerg in SC, where the key is to build up rapidly. A computer can think and move faster than a human. The level of non-mechanical "strategic speed" for a game like this is still not high.

Now if your point is that writing AI is harder than analyzing the metrics in a game, I would agree. It is much harder. If this article was about creating automated methods of designing AI for simple real time strategy games, it would be a hell of a lot more impressive. You almost would have to have full AI in the first place to make AI, and in this case you would be talking about a learning AI that could take the place of a human.

Michael Joseph
With all due respect, chess is not an extremely simplistic game. You're simply saying the rules are simple which says nothing about the play.

Many systems have simple rules. For all we know the universe has very simple rules.... yet results in complex behavior.

Analyzing the effects and relationships of rules within a system is not trivial. And this appears to be what these researchers have done unless I'm mistaken.

http://www.youtube.com/watch?v=jNDt1DCE25c

I probably misunderstand you. But again as I see it, if we understand a simple system, then we can write a master AI for it. Otherwise we don't understand it and the solution is non trivial.


Ramin Shokrizade
I learned chess on my fifth birthday and went undefeated for 10 years. To me it always seemed like a very simplistic game. The first play by mail games I started playing when I was 10 were much more complex, because they had many more options, strategies, and multiplayer dynamics. Bigger computers have allowed even more complex games, though we rarely take advantage of this. The advent of MMOGs really raised the complexity of games in my mind, especially when economies and non-linear play were allowed.

To me the great frontier in game complexity, and for game design/analysis, is in games with complex economies and social systems. A game like chess has a finite number of options and is fairly easily understood by a computer. Even if you made a chess game with 100 players playing it at once in turn based fashion, it might be 10,000 times more complex than chess, but it would still be finite and easily learned by a computer. It just would have to be a very fast computer.

Creating models that allow you to predict behavior before it happens in social environments again is much more interesting to me than looking at something very small scale and saying that 80% of all activity occurs on one side of a line in a graph, and 20% on the other, when measuring just one simple metric.

Adam Blake
It's easy to go undefeated in chess if you never play against anybody good! I thought I was good at it until I made it to the third round of a tournament (after making it out of my school and then my local region), where I was soundly defeated by a number of players who had been taking lessons and dedicating time to practice. It's a very deep game strategically, although at the higher levels requires much more memorization than I like in a strategy game. For this reason I think I prefer the design of Go -- you can remember certain recurring patterns, but cannot rely completely on memory in nearly as many situations. As a bonus, it has even simpler rules. That doesn't mean it's easy to write an AI that can defeat top players, though!

I think you're right that this study is a bit simplistic, but these researchers need to start somewhere. Also, there is a big difference between a game that's meant to be a balanced competition between two players or several small teams, and massively-multiplayer persistent worlds. I don't know that they're really at all comparable as they have completely different goals.

Ramin Shokrizade
Adam, I'm in agreement with you. I think the only thing thrilling about chess for me was that it let me beat adults that were 8 or 10 times my age. It didn't take long for this to get boring. I mean honestly how many times can you move the same pieces around a tiny 64 space board? It becomes a lot of memorization as you say, which is not in my mind so much about strategy, and certainly didn't meet my dopamine needs even at that age.

Don't get me wrong, chess is a pretty sophisticated game if you consider that it is 1500 years old. Even a fairly simple game like LoL or SC is so much more complex. That said, those games are far simpler than persistent virtual worlds, especially those with functioning virtual economies.

I guess the point here that I am trying to make is that almost every decent business intelligence unit attached to a AAA game company has been doing much more sophisticated analysis than what is described in this article for some time now. They just rarely publish their data, which makes research and academics in this space challenging. I know I'm not the only one trying to push the envelope on monetization models for F2P games, but again who else is publishing that work? The public space needs work like this, but we should at the same time put this stuff in context because what we are looking at here is not revolutionary. On an industry level, it is not even timely.

Adam Blake
Balance, complexity, and strategic depth are all very different things. Chess is well-balanced because both sides have exactly the same units, whereas even a single overpowered unit in StarCraft would be a game-breaker (this of course leads to any number of online arguments). A game can have arbitrary complexity -- hundreds of units, a huge tech tree, etc. -- and still be perfectly balanced, if both players start with the same resources. By contrast, SC is asymmetrical (each player can control completely different units). It's also a game of imperfect knowledge (due to fog of war), whereas in Chess both players can always see the whole board.

In Chess, the near-perfect balance (white always has initiative, of course) and moderate complexity is what leads to its strategic depth (checkers is just as balanced, but not complex enough to be interesting). Memorization helps a lot in the early game, but tactics like forking, pinning, and gambits are always useful, as are larger strategic principles like wanting to control the center of the board or looking to force trades when at a material advantage.

You seem very focused on virtual economies and that sort of complex simulation, but the point of a game like Chess or SC is to be a balanced tournament game. It's a very different animal. All I'm saying is that, while a certain level of complexity is necessary, it's not the be-all and end-all of "depth" and I don't think it's what these researchers were focused on.

However, I'm sure you're right, even when it comes to game balance and strategic depth, that e.g. Blizzard has done plenty of internal research they're not publishing. I'm sure researchers like these would love to get their hands on that sort of data.

Tony Dormanesh
Really cool article and stuff like this could lead to interesting discoveries, but I agree with Ramin on this one.

Saying that if the Zerg has high population growth and their Terran partner has high damage output they have a good chance of winning doesn't really reveal much to actual SC2 players. It does validate some high level strategies, but saying a Zerg with low population growth has a good chance of losing is common knowledge to SC2 players. The high level strategies are already known.

When you get down to the micro management and specific tactics on how units are used, all these numbers go out the window.

Sean Chau
Blizzard revolutionized RTS games with AAA designers, programmers, and rigorous balance testing. Starcraft and Warcraft are complex games in terms of development, but the game rules are basic and straightforward.

Having said that, I feel that all competitive gaming, not just RTS or video games, comes down to the basic science of human competition.

If you want to get good at competitive Starcraft or RTS's, it's not about picking the right strategy or race, but about dedication, hard work, etc. Even though it's a game, once it's a "competition" it's not a game anymore: it's about winning, and all the human psychology it takes to win, and the game rules.

