Postmortem: Intelligence Engine Design Systems' City Conquest

February 6, 2013
 

Intelligence Engine Design Systems and our first game, City Conquest, came about as a result of a surprising personal journey.

My career began at Brøderbund in 1994, working as the lead designer on a real-time strategy title. After it shipped, I conceived a design concept for a new game -- a huge, epic, insanely ambitious dream game that I am still working toward even to this day. I knew that it would be possible to make it a reality someday, but I also recognized that I wasn't yet qualified to design it. My dream game required a huge amount of artificial intelligence, and I needed a much deeper understanding of AI to have any chance to design it properly.

I spent well over a decade learning everything I could about game AI, helping to develop the AI in games like Metroid Prime 2 and 3, MechWarrior 4, and some others, writing articles on game AI, evangelizing the use of navigation meshes for pathfinding, speaking at AIIDE conferences, and generally pretending to be a programmer while picking up everything I needed to know to design that game. Along the way, I got a golden opportunity to learn from a few of the industry's most acclaimed designers.

At the end of that process, I finally did learn what I needed to know to design the dream game. Intelligence Engine Design Systems is still working toward that game and growing toward the funding level and company size that can make its development possible.

But there was an unexpected revelation along the way. I had not expected that AI would change everything I thought I knew about game design.

The more I learned about AI and game design, the more I began to understand that they are two sides of the same coin. Game design is the creation of possibility spaces in which players can act; game AI is about developing systems to explore those possibility spaces and act within them -- usually on the part of game characters or entities, but sometimes by separate tools. When used wisely, AI can give us an incredible number of tools and techniques for improving our designs and helping us explore the ramifications of our design decisions more quickly.

The more I looked at the kinds of problems we encounter in game design, the more I realized just how many of them are fundamentally either decision optimization problems that are eerily similar to the problems we solve in artificial intelligence, or combinatorial optimization problems that we can optimize with the kinds of decision modeling processes already used in many other engineering fields. As a result, there are some fairly remarkable unexplored possibilities for different kinds of tools we can create to optimize the design process.

Game design is a process of learning and exploration. All game designs evolve, and even the most focused and well-planned projects spend a considerable amount of design iteration to get the design just right. Well-run projects are able to manage their exploration of the design space in a way that balances design evolution and its impact on product quality against all of the costs of that design iteration and experimentation. Less well-run projects get lost in creative thrashing, unfocused design exploration, or perpetual preproduction.

When we design, we are struggling in the dark. We have endless options and ideas for design changes that can make our games more fun, engaging, immersive, or addictive. But we have very little ability to accurately predict all of the countless ramifications of these changes and additions to a game's design, or to truly understand how the game's character will change without actually implementing the changes we're considering.

We have precious few tools to explore that space beyond our own imaginations and our ability to actually go ahead and experiment, prototype, and test our ideas.

Professional aircraft designers have powerful engineering tools at their disposal to simulate "virtual wind tunnels" that can estimate the performance characteristics of their aircraft. Long before they ever need to actually build a prototype and place it in a physical wind tunnel, aircraft designers can quickly and cheaply model an aircraft using CAD-like tools and instantly see how its design influences its performance characteristics.

There is no equivalent for game design. We have no tools to show us all of the ramifications of our core design decisions. I believe we need to grow beyond the current, purely anthropocentric approach to design and accept the need for a process that involves some level of cooperation with machine intelligence, as so many other industries have done. Our industry doesn't need a "Photoshop of AI": it needs a virtual wind tunnel for game design.

So I decided to make one.

City Conquest began in August 2011, while I was earning an MSE degree from a University of Pennsylvania technology management program co-sponsored by the Wharton School. As the result of a market analysis in a Wharton marketing class, I identified a market opportunity for games that combined elements of tower defense games like Kingdom Rush with the depth of real-time strategy games like StarCraft.

I designed City Conquest as a hybrid TD/RTS to combine the simplicity, accessibility, addictiveness, and feel of tower defense with some of the depth and strategic elements of an RTS. It needed to feel like a head-to-head TD game, with each player given a Capitol building to defend and special "dropship pads" that produce new units every turn.

The goal was to limit the scope to a one-year development cycle while ensuring that it was a serious game development effort with genuine potential to grow into a franchise, rather than a quickie game for the sake of "learning." In today's saturated mobile market, there's no point wasting time on any development effort that isn't at least making an honest attempt to be ambitious, polished, and unique.

Our goal with City Conquest was to optimize the "cost-to-quality ratio": the best possible product at the lowest possible cost. Our years at Retro Studios / Nintendo imbued us with an obsessive focus on product quality and polish and a conviction that this diligence pays serious long-term dividends.

At the same time, we were determined to reduce our risks by minimizing our overhead -- we would avoid growing the team any more than absolutely necessary, and we would outsource all of our art and audio needs while handling the design and production and the majority of the engineering ourselves. In large part because of this risk-centric attitude, IEDS maintained a very small team throughout development of City Conquest.

We had a basic playable prototype running within a month. The concept worked better than we had any right to expect. Our decision to let the fundamental gameplay drive the game's development, rather than any particular narrative arc or artistic "high concept," went a very long way toward ensuring that the game would be fun and playable.

Production went amazingly smoothly compared to even the most well-run triple-A projects we've worked on. There were a few missteps along the way, and some clear failures of due diligence, but nothing that negatively impacted the final quality of the product. The fun began to shine through in the first few months of development, and I increasingly found that I was so genuinely addicted to the game that playing it became a dangerous distraction from actually working on it.

We took it as an encouraging sign, and this more than anything pushed us to see City Conquest through all the way to the end.

I have worked with some teams that spent months to years trying to "find the fun" during development. This is confusing. If you haven't found it yet, why are you in development at all? How did the project get greenlit in the first place? Why are you even calling it development when you're really still in preproduction?

City Conquest was completed several months later than expected, but most of this delay was for the right reason: because we were determined to optimize for quality and had the time to make the game better. Although we were careful to limit the scope of the project by adding new features only when truly necessary, we spent a great deal of time and effort on polishing existing features and addressing our playtesters' feedback. We were consistently unwilling to pass up changes that would improve quality, schedule be damned.


Comments


Paul Tozour
I just wanted to note that Gamasutra's tagline on this article, "how automated testing saves time, and how Kickstarter wastes it," is Gamasutra's characterization, not my own, and is not the way I would have characterized this article. I'm working to get this fixed, but in the meantime, Evolver is not exactly an automated testing system, and Kickstarter isn't a waste of time.

Paul Tozour
EDIT: The Gamasutra editors have tweaked the tagline to avoid confusion. Thanks, Gamasutra!

Don Hogan
Excellent write-up, Paul. It's always good to hear your take on game development; there's never a shortage of food for thought. Glad to hear the project went well!

Michael DeFazio
Paul,
Fantastic article--

Wish the Kickstarter video could have been attached; it was also awesome. (Seeing how you created algorithms to find optimal strategies and had them play against each other was fabulous.)

Love your philosophy about games (problem spaces) and AI... And the number of times you mentioned "decisions" in the article put a smile on my face (I'm sorta a gameplay-first kinda guy, and great games to me always find a way of presenting "interesting decisions").

You completely sold me on this game (one Android copy sold!)... and I will continue to watch for other revelations/advancements you and your company find about making compelling (and balanced) gameplay in the future.

Cheers

Paul Tozour
Thanks, Michael! For anyone who's interested, the Kickstarter video is here: http://kck.st/GzJ324

I'm also going to be giving a talk on Evolver and some other aspects of my approach at the GDC AI Summit in March (along with Damian Isla and Christian Baekkelund, who will be discussing the role of AI in the design of Moonshot Games' terrific new game Third Eye Crime).


GameViewPoint Developer
I think the AI approach to game testing is definitely interesting, but it would be a lot of work for a small indie team to implement; perhaps if there were 3rd-party tools available it would be a usable solution.

Paul Tozour
To be clear, the point of Evolver was not as a game testing system. It was designed to help explore the design space and guide the game balancing. It was fundamentally a design tool, NOT a testing tool.

It did help find some bugs, of course, but that was just a nice side-effect. The real point was to help us optimize the balancing between all the different units and towers in the game.

The difficulty of implementing something like this really depends on the game. It's not a magic bullet and it's not an approach that will work for every game. And it's the type of system that has to be carefully designed for the particular game in question, so it's not the kind of thing where you can really create a tool that will work for any game.

In the case of City Conquest, it cost us about 2 weeks' worth of coding and other work, and easily saved us a good 3-4 weeks' worth of design time -- while also giving us better results than we would likely have been able to get by hand, AND giving us a system that would give us instant visibility into the ramifications of any given design decision. So, it was clearly a net win just in terms of the time savings alone, even before you consider all of the other benefits.
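
To make the shape of the idea concrete, a toy version of this kind of genetic-algorithm balancing pass might look something like the sketch below. This is not Evolver itself; the unit names, stats, budget, and fitness model are made-up placeholders, and a real system would plug candidate strategies into the actual game simulation rather than a toy power calculation.

import random

UNIT_STATS = {            # hypothetical cost / combat-power table
    "grunt":  {"cost": 10, "power": 3},
    "sniper": {"cost": 25, "power": 9},
    "tank":   {"cost": 60, "power": 20},
}
BUDGET = 200              # resources available to each build order
GENOME_LEN = 12           # units chosen per candidate strategy

def random_genome():
    return [random.choice(list(UNIT_STATS)) for _ in range(GENOME_LEN)]

def army_power(genome):
    # Toy stand-in for a full match simulation: spend the budget down the
    # build order and sum the combat power of what was affordable.
    spent, power = 0, 0.0
    for unit in genome:
        cost = UNIT_STATS[unit]["cost"]
        if spent + cost > BUDGET:
            break
        spent += cost
        power += UNIT_STATS[unit]["power"]
    return power

def fitness(genome, rivals):
    # Win rate of this build order against a sample of rival genomes.
    wins = sum(army_power(genome) > army_power(r) for r in rivals)
    return wins / len(rivals)

def mutate(genome, rate=0.1):
    return [random.choice(list(UNIT_STATS)) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def evolve(pop_size=40, generations=50):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        rivals = random.sample(population, 10)
        ranked = sorted(population, key=lambda g: fitness(g, rivals), reverse=True)
        elite = ranked[:pop_size // 4]
        population = elite + [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(pop_size - len(elite))
        ]
    return ranked[0]

if __name__ == "__main__":
    best = evolve()
    # If one unit crowds out everything else in the winning build orders,
    # that's a hint its cost/power ratio deserves another look.
    print({unit: best.count(unit) for unit in UNIT_STATS})

If the surviving strategies keep converging on a single unit, that is the "instant visibility into the ramifications of a design decision" mentioned above, just in miniature.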

In any event, as I mentioned in one of the previous comments, I'm going to be speaking about this in more detail at the GDC AI Summit in March. So be sure to stop by if you'll be at GDC.

Denis Timofeev
Hi! That's a great article, thanks. A lot of inspiration. I'm just wondering how many users you had on TestFlight, how you got them, and how helpful their feedback was.

Paul Tozour
We had around 50 or so at first, but once we opened the testing up to *all* the Kickstarter backers, we ultimately hit the maximum number of users allowed under Apple's developer restrictions on the number of devices per app (100). The actual number of users was below 100, since some of them had multiple devices (iPhone + iPad) and so took up multiple device ID slots.

We started with personal friends and industry contacts, then added backers at the appropriate backing level in Kickstarter, and then, a few months later, opened it up to absolutely all the backers who had iOS devices.

Their feedback was extremely helpful overall. We got a very wide range of feedback from a lot of people at a lot of different skill levels. We had a few industry veterans in there (see the game's Credits for the full list), who provided terrific and detailed feedback, along with a few non-gamers. We also did some diagnostics, such as adding buttons so the testers could send back valuable data on mission completion and achievements earned. So it was nice to see that about 90% of the ones who responded were able to finish the single-player campaign, and every mission was completed by at least one tester at every available difficulty level. Also, they were able to give me invaluable feedback on devices I didn't own / couldn't get my hands on at the time, such as the iPhone 5 and the "new" iPad, aka the iPad 3.

Louis Gascoigne
Great article, Paul; worth the read just for the tech section.

Bram Stolk
Impressive stuff, and great article.
Amazing that you could get Computer Aided Game Balancing working in just 2 weeks.
Writing a system like Evolver sounds like more fun than tediously going through manual iterations of game balance.

Jeremy Tate
@Paul How did you approach actual code testing? It seems like it would be crucial under this model to have technical integrated testers on the team if you were going to maintain a running total of 10 bugs. Otherwise, you are kind of pushing those undiscovered bugs to later in the project.

Paul Tozour
I definitely agree in principle with having as many testers as possible doing testing from day one. This is a great idea whenever you can afford it, and it worked very nicely for Retro Studios when I worked with them on Metroid Prime 2 and 3 -- the internal elite ninja testing team was super effective.

I think we were able to get away with not having dedicated testers on the City Conquest team thanks to a combination of factors: the relatively limited scope of the project, the involvement of our external playtesting team (friends and Kickstarter backers on TestFlight), our defensive programming practices, and our habit of playing through the entire game on a regular basis. There's also the fact that the Evolver tool found a few of the most difficult/subtle bugs on its own, simply because we were running a million simulations overnight and could pretty quickly find anything in the game logic that caused a crash or a hang.
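
As an illustration of that overnight sweep in isolation, a minimal harness along the following lines would catch both failure modes; run_headless_match is a hypothetical stand-in for the game's headless simulation entry point, and the 30-second limit is an assumed per-match budget rather than a figure from the project.

import multiprocessing as mp
import queue as queue_module
import random
import traceback

HANG_LIMIT_SECONDS = 30    # assumed per-match time budget; a hang trips this

def run_headless_match(seed):
    # Placeholder for simulating one full match with no rendering attached.
    random.seed(seed)
    return {"seed": seed, "winner": random.choice(["player1", "player2"])}

def _worker(seed, results):
    try:
        results.put(("ok", run_headless_match(seed)))
    except Exception:
        results.put(("crash", traceback.format_exc()))

def run_batch(num_matches=1000):
    defects = []
    for seed in range(num_matches):
        results = mp.Queue()
        proc = mp.Process(target=_worker, args=(seed, results))
        proc.start()
        proc.join(HANG_LIMIT_SECONDS)
        if proc.is_alive():                  # match never finished: likely a hang
            proc.terminate()
            proc.join()
            defects.append(("hang", seed))
            continue
        try:
            status, payload = results.get(timeout=5)
        except queue_module.Empty:           # process died without reporting back
            defects.append(("crash", seed, "no result returned"))
            continue
        if status == "crash":
            defects.append(("crash", seed, payload))
    return defects

if __name__ == "__main__":
    for defect in run_batch(100):
        print(defect)

Left running overnight with enough matches, a log of (seed, failure type) pairs is usually all that's needed to reproduce and then fix the underlying game-logic bug.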

But again, I do agree with you that having dedicated testers throughout the process is the better way to go if at all possible. The earlier you can find bugs, the better.

Paul Tozour
And when I say "10 bugs," of course I mean 10 *known* bugs. Naturally there's no way to count bugs that you don't know about.

But, yeah -- the earlier you find them, the earlier you can fix them.

Jeremy Tate
In addition, a game by its very nature is almost entirely focused on usability vs. functionality, so if something has to go, it has to be the latter. It's not like it's a piece of medical software where lives depend on the functionality. Which, by focusing on playtesting, was the choice you made.

I'm thinking that a person who would handle the upkeep of the Evolver scripts, manage the data, disseminate it, and make sure issues are tracked and acted on would be a logical next step, though. You could easily scale the concept out and get even more detail.

Paul Tozour
Absolutely. Evolver is really only scratching the surface of what's possible.

There's also been some very interesting academic work related to this, as well as interesting procedural content generation work, being done by folks like

Alexander Jaffe http://homes.cs.washington.edu/~ajaffe/
Adam Smith http://users.soe.ucsc.edu/~amsmith/
and Gillian Smith http://sokath.com/main/publications/

... and several others whose names escape me at the moment.

James Yee
Good to see your game came out, Paul.

As a writer in the Kickstarter community, I recall seeing your project at the time and not being overly impressed by it (hence the reset you did). Do you think the Kickstarter would have gone better if it wasn't up to YOU to create it? Basically, would it have been more cost-effective for you to keep working on the game while letting someone else handle the PR/campaign management of the Kickstarter?

Also, for future Kickstarter creators, how do you avoid the Apple promo code problem you ran into? Just make sure your full game is a separate app if you plan on a free version?

Paul Tozour
Hi James -- I'd love to get your thoughts on the Kickstarter and what could have been done better there. Please feel free to send us a direct message via Twitter or e-mail us directly; I always appreciate honest, direct feedback.

Yes, the Kickstarter probably would have been more successful if I'd hired separate PR for it, but I'm not sure it would have been cost-effective, or that it would ever have brought in more funding than the cost of hiring the PR firm in the first place.

I didn't bring on a PR firm until the game was ready to launch on iOS; I did consider doing it for the Kickstarter campaign but it seemed like overkill.

As for avoiding the Apple Promo code problem: I don't have a good way to recommend that developers make their app available to backers for free outside of the limited promo codes Apple gives you.

Other developers I've spoken with have recommended briefly making your IAPs free for a very short time frame and telling your audience exactly when they need to download them, or putting hidden features / secret codes in your app (players tap a certain pattern of invisible hotspots). But the latter approach especially is risky, since it can incur the wrath of Apple (and potentially get you banned from the App Store) or become open knowledge (allowing people to pirate your game easily once they know the secret code).

Damian Connolly
Thanks for the excellent post-mortem, and congrats on the game!

Can you elaborate a bit on how you used discovery driven planning with your game? Did it change the game design, and if so, to what extent? Did you use it mainly in the prototyping stage, or throughout the entire process?

Also, your Evolver tech sounds pretty amazing. Is it specific to the game, or do you see yourself eventually spinning it off as middleware? It seems like an ideal candidate, especially for smaller studios.

Paul Tozour
Hi Damian -- Thanks for the kind words!

Although I'd love to be able to build middleware someday to help with this aspect of game design, I don't think it will be a case of extending Evolver to do that. The genetic algorithm component of Evolver isn't really anything unique or protectable, and the aspects relating to its integration with City Conquest are too game-specific to be able to put into middleware.

Regarding Discovery-Driven Planning: As luck would have it, I'm working on an article (or two) on DDP for Gamasutra right now. I hope to have them up by the end of the month or maybe early in March.

DDP is really a project planning methodology, so you want to use it throughout the project to make sure you have a good handle on the risks of the project, and ensure that you're prioritizing your efforts to learn as much as you can to reduce uncertainty as cheaply and as quickly as you can.

So on City Conquest, it drove every milestone. It didn't really change the game design directly, but it did ensure that we worked on the riskiest aspects of the project first, reducing our risks and giving us the smoothest possible path to a completed, profitable game.

Paul Tozour
FYI, Gamasutra has now published my article on applying discovery-driven planning to games:

http://gamasutra.com/view/feature/191523/managing_risk_in_video_game_.php

Josh D
Hi Paul,

Thanks so much for writing this postmortem. I was hoping you could elaborate a little bit on your monetization decision. You said that in retrospect, you didn't think making a free app with one large IAP was the best choice. I'm wondering why that is and what you think would be more effective? I'm currently involved with an app considering a similar pricing structure, so I'm very curious as to your thoughts and experiences with this. Any feedback would be greatly appreciated, thanks again!

Paul Tozour
Hi Josh,

What it comes down to is that to really maximize your revenues, you need to be able to do proper price discrimination. And by that, I mean that you need to be able to serve all the customers across the whole curve of different levels of willingness to pay -- the 5% of "whales" who will pay $20+ per month in your app without hesitation, the 10% of "dolphins" who might pay $5 per month, and the "minnows" who will pay maybe $1 a month on average ... in addition to all the non-paying users who just won't pay anything, who will likely be 3-10x your number of paying users.

So, I've come around to the realization that the full-unlock-through-a-single-IAP model is suboptimal because it can only do price discrimination at a single point: the decision of whether or not to pay for that one IAP. So you are getting no revenue at all from the many users who consider your price point too high, and you're getting a lot less revenue than you could from the minority of very wealthy "whales" who would be willing to pay much more to enhance their game experience.
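
To put rough numbers on that curve, here is a back-of-the-envelope comparison using the segment percentages and price points above; the user counts, the 4x non-payer ratio, and the $5 single-IAP price are illustrative assumptions, not City Conquest data.

paying_users = 20_000                   # hypothetical count of paying users
non_paying_users = 4 * paying_users     # "3-10x the paying users", picking 4x
payer_mix = {                           # (share of paying users, avg. monthly spend)
    "whales":   (0.05, 20.0),
    "dolphins": (0.10, 5.0),
    "minnows":  (0.85, 1.0),
}

# Tiered IAP catalogue: each paying segment spends roughly what it's willing to.
tiered_revenue = sum(paying_users * share * spend
                     for share, spend in payer_mix.values())

# Single full-unlock IAP at $5: only whales and dolphins convert (their
# willingness to pay is at least $5), and nobody pays more than $5 once.
single_iap_price = 5.0
converting = sum(share for share, spend in payer_mix.values()
                 if spend >= single_iap_price)
single_iap_revenue = paying_users * converting * single_iap_price

total_users = paying_users + non_paying_users
print(f"paying share of audience: {paying_users / total_users:.0%}")   # 20%
print(f"tiered catalogue: ${tiered_revenue:,.0f}")                     # $47,000
print(f"single $5 unlock: ${single_iap_revenue:,.0f}")                 # $15,000

The gap between the two totals comes entirely from the whales paying far above the single price point and the minnows being priced out of it, which is the single-point price discrimination problem described above.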

Paul Tozour
I'd also recommend checking out GamesBrief -- http://www.gamesbrief.com/

They have some interesting papers on this.

