Next Generation Graphics vs. The Console Business
by Tom Battey on 06/11/12 09:33:00 am   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


I remember when I decided that videogames didn't need to look any better. It was 2001, I was twelve years old, and I was hunched over a beige PC in the school library watching clips of Final Fantasy X on some precursor to YouTube. I watched Yuna summon Valefor and I thought to myself: "Wow. This is it. This is as good as videogames will ever need to look."

Eleven years later, I still largely stick by that statement. The PS2 era will always represent, for me, a pinnacle moment where graphical fidelity and the imagination of game creators perfectly overlapped.

This was the first time that designers could create in-game models that actually looked human. Their faces were no longer textured on; they had real (digital) lips that could be synced to dialogue tracks, and eyes that could move and express emotion. There was no longer any need to cut away to pre-rendered cutscenes for narrative impact - these videogame people actually looked like people.

Games like Final Fantasy X/XII and Metal Gear Solid 2/3 are standout examples that spring to mind. Their predecessors featured boxy approximations that required some imagination to appear human, but that extra graphical processing power could now render emotive characters, characters that could actually tell a story.

At the same time, PS2 tech wasn't so prohibitively expensive to develop for that publishers couldn't afford to take risks. For a time, we had studios developing games with a relatable visual fidelity and the creativity to experiment, and we got games like Psychonauts, Okami, Dark Cloud and Killer 7, games that would struggle to be green-lit in our modern climate.

The tech-jump to the current console generation has caused game budgets to balloon to a scale that has proven prohibitive to creativity. Much of this has to do with the cost of generating the complex, high-resolution assets that have become the required standard.

This increase in budget has caused publishers to become increasingly risk-averse. With game budgets sometimes stretching into the hundreds of millions, anything but a chart-topping success can lead, and increasingly has led, to studio closures. As a result, games have become 'safer', more derivative and less experimental. I can't provide the figures to prove it, but I know that we've seen far fewer Okamis and Killer 7s on HD consoles.

And now Epic and Square Enix have revealed their visions of what next-generation videogames are going to look like. And while a small part of me still gets a little giddy at the thought of all those shiny new games, a greater part of me worries about the consequences of another tech-leap forward at a time when companies are struggling to meet the budgetary requirements of the current generation.

I'm not one of those people who believes that the console business will be erased entirely by the rise of mobile and tablet gaming. However, I consider it entirely possible that the console business could end up stagnating itself into obsolescence by demanding the sort of budget that no one bar four or five huge AAA publishers can afford.

Kotaku's Stephen Totilo has written a good piece on the advancements put forward by Epic's Unreal Engine 4, an engine that is likely to be fairly ubiquitous on next-generation consoles, if the success of Unreal Engine 3 is anything to go by.

While some of the points raised are expected graphical fluff - particle effects are of negligible actual importance to game development - it does seem that Epic have identified the issue, and are attempting to combat the inevitable next-gen budget-hike with their new technology.

Having real-time global illumination as standard, for example, will mean less time spent on creating elaborate lighting systems. That saves money, and it's only one of many alleged 'game-changing' features. Perhaps the most important addition is the ability to edit game code in 'real time', without having to recompile in order to prototype a change. A seemingly simple feature like this could save hundreds of hours per game project, and reduce the required budget accordingly.
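That 'edit without recompiling' pitch is essentially hot-reloading. As a toy analogue (entirely my own illustration in Python, nothing to do with Epic's actual tooling), here a hypothetical `tuning.py` module is edited while the program runs and the change is picked up without a rebuild or restart:

```python
# Toy hot-reload sketch (my own illustration, not Epic's tooling):
# a "designer" edits a source file mid-session and the running
# program picks up the change without a rebuild or restart.
import importlib
import pathlib
import sys

sys.dont_write_bytecode = True           # keep the demo free of .pyc caching
src = pathlib.Path("tuning.py")          # hypothetical tuning module
src.write_text("DAMAGE = 10\n")
sys.path.insert(0, str(src.parent))

import tuning
print(tuning.DAMAGE)                     # 10

src.write_text("DAMAGE = 12  # tweaked mid-session\n")
importlib.reload(tuning)                 # the edit is live immediately
print(tuning.DAMAGE)                     # 12
```

In a real engine the reload is of compiled gameplay code rather than an interpreted module, which is much harder, but the time saved per iteration is the same idea.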

This is the kind of thinking developers need to take forward into the next console generation. Next-gen console business needs to be agile and accessible. We cannot afford a year-long post-launch dearth of creativity while everyone tries to get to grips with the technology. For all the time wasted on creating a particularly luscious dirt texture, developers will be losing custom to the ever-stronger mobile and social markets.

In short, we can't afford to attempt the traditional generational leap this time around. An increase in graphical fidelity alone won't sell a new generation of consoles. In fact, the budgetary inflation caused by such could endanger the whole AAA game business. We need a business model where smaller studios can compete with the massive publishers, otherwise complacency and stagnation will drive customers away to the point where no console games are financially viable.

If Epic's technology, and similar efforts from other studios, can reduce the cost of making games while increasing the scope and vision of those games at the same time, then that's great. That's the sort of progress we need. But it's possible that the constant striving for ever-fancier graphics has turned the console business into its own worst enemy. We'll begin to see whether or not this is true in the next two or three years. While I'm waiting, I might play Final Fantasy X again. Rather that than XIII-2, at any rate.


Hakim Boukellif
More advanced engines that make development easier and take care of more boilerplate are great and all, but I get the feeling that the biggest problem may lie with content creation. How much easier has it become to make a 3D model of a certain amount of detail over the years? How much effort is being made in the progression of procedural generation?

Tom Battey
This is true. It's almost impossible to stop asset-generation from being more expensive come next gen. More polys and bigger textures are a given, and these are going to be more expensive to make. We can only hope that technological advancements, both in game engines and in 3D development software, offset this extra cost enough that complex games do not become prohibitively expensive to make.

Eric Schwarz
This is really my issue too. As content complexity/detail/etc. grows, so too does the time to create it, often exponentially. I think more emphasis on procedural content generation is what will keep creating game content like this viable. Things like having vegetation automatically appear based on the material type laid down on terrain, or generating terrain automatically for background scenes, or being able to lay down decals across an entire level rather than on individual objects (how great would it be to say "this is a dingy factory; put oil slicks, dirt and scratches around accordingly" and have it happen?) - all of those will make a huge difference in speeding things up.
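That sort of rule-driven pass is simple to prototype. A toy sketch (purely my own illustration, not any shipping tool) where each material carries decal-spawning rules and a single pass scatters decals across the level's tiles:

```python
# Toy rule-based decoration pass (my own illustration, not a real tool):
# each material names the decals that may spawn on it, with a per-tile
# probability; one pass decorates the whole level from those rules.
import random

RULES = {  # material -> [(decal, probability per tile), ...]
    "factory_floor": [("oil_slick", 0.3), ("scratches", 0.5), ("dirt", 0.6)],
    "grass":         [("flowers", 0.4), ("pebbles", 0.2)],
}

def decorate(tiles, seed=0):
    """Return (tile_index, decal) placements for a list of material names."""
    rng = random.Random(seed)  # seeded, so results are repeatable per level
    placements = []
    for i, material in enumerate(tiles):
        for decal, chance in RULES.get(material, []):
            if rng.random() < chance:
                placements.append((i, decal))
    return placements

level = ["factory_floor"] * 3 + ["grass"] * 2
for index, decal in decorate(level):
    print(f"tile {index}: {decal}")
```

The hard part in practice isn't the scattering, it's authoring rules that look intentional rather than noisy - but even this level of automation replaces a lot of hand-placement.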

Beyond that I'd also like to see more emphasis on stock models and textures - more libraries of freely-available high-quality assets that are open to be tweaked as needed (including things like particle effects) - and easier modeling tools that are less "placing vertices in 3D space" and more "Photoshop", or even a bit simpler. ZBrush of course is an excellent piece of software that's getting closer, but the interface and workflow still have a learning curve and it's very awkward to use without a tablet. There are games like Spore that have amazing tools for procedurally building content (used for characters, but also for buildings in the new SimCity), and Saints Row has an exceptionally detailed and easy character editor - why aren't tools like this the de facto standard for artists and designers?

It's true that lots of stock elements might reduce the uniqueness of art style, but honestly, unless you're trying to make a stylised game in the first place, I don't think it matters much that your trees, or ground textures, or building models might also show up in different games - it's how you use those assets that matters, and as we get closer to photo-realism all those more artistic considerations will become less important because artists no longer have to make up for technical limitations with style (even "stylized realism").

Tom Battey
I completely agree; it's beyond time developers started sharing resources in the name of creating great content. Epic are ahead of the game a bit here, by selling the engine they use for their own games as middleware, and I understand they make a good deal of money from doing so as well. If more companies could work like this, I think the industry as a whole would benefit.

I understand the arguments for competition and distinction, but I think we're past the point where companies can afford to jealously hoard technology. If it makes for better games, share it. Hell, license it for profit, like Epic do. That way everyone benefits, and we can all stay in business.

Rick Kolesar
But with applications like Z-Brush and Mudbox, making ultra-high polygon models for games isn't anything new. In fact, if Epic wanted to they could make a better looking Gears of War right now with the models they have on their hard drives. Most studios start high, and then using 3ds max, Maya, or other poly crunching tools, bring the models down to a size their engine can handle.

If anything, it will be quicker to make assets since the artists will have less poly constraint concerns.

But why chase the realistic goal when a nice, artistic style can save you time and polys? I would love to see what Zelda Wind Waker 2 would look like on the WiiU. Forget high polygon, I want a world where I can see for miles, with no loading screens, and crisp textures. You don't need a team of 100 people to make that happen.

Thom Q
I think the distance between the industry's current hardware status and the quality of the required artwork isn't necessarily getting bigger. Like Rick points out, a whole lot of artists already design at higher poly counts and then optimize for use in games. Almost every 3D artist who hasn't previously worked on game content actually has the most trouble trying to design with a low poly count.

Of course it seems way easier to animate a sprite than a biped. But don't forget, in the NES/SNES days, making those sprites was also a big challenge.

Tom Battey
I wish more developers would opt for a stylised art style. Not only do you negate the need for massive textures and the like, but the upshot is that your game won't immediately look dated as soon as new technology comes out.

Wind Waker is a great example. WW still looks great, and (resolution aside) could easily be a game released this year judged by visuals alone. Twilight Princess? Not so much.

Nathan Mates
"How much effort is being made in the progression of procedural generation?"

As far as I can tell, only a few products like SpeedTree have been relatively successful in providing useful, robust middleware that uses procedural generation. Not surprisingly (to me), they're modeling naturalistic items, where I think procedural content will work the best. As long as there's yet another corridor FPS in the pipeline, there's going to be a lot of man-made items needing artist time.

Load times are also a consideration with procedural generation. Yes, I've played (more like watched) tiny demos like kkrieger (96KB). The startup time was horrendous and would have flatly failed Sony/MS certification requirements. When there's ~45-60 seconds to load all content, procedural content would either have to be much, much faster or pre-processed offline and read off disk.

Also, videogame content creation pipelines have a history of looking to the film creation pipelines (e.g. Lucas's ILM, Pixar, etc), then optimizing, simplifying and smoke-and-mirroring to make them fit on the game's target hardware. Films have higher overall budgets and the enviable luxury of being able to spend hours to days of CPU time creating a single image. Techniques used by the film people tend to make it to videogames, albeit usually with a 5-15 year lag, because games need to be able to do things in realtime. So, look to the film special effects industry - wait for them to note they're using procedural generation regularly and often. When it's good enough for the pros, then there's a chance game developers will adopt it. Not the other way around.

In short: I like the idea of procedural generation. Like cheap reliable fusion power plants. Neither is really ready to take over existing technologies. Heck, we don't even have a reliable roadmap of the steps needed to get to truly useful general-purpose procedural generation. It's people playing around with ideas in basements still waiting for the eureka moment.

Ryan Marshall
Honestly, the game I'm looking forward to most from a big company is probably Final Fantasy Dimensions. Just because they want to make hundred-million dollar blockbusters with their main franchise doesn't mean they're going to stop making the games that I want to play.

While I can conceive of "modern" polygon-style graphics improving to life-like fidelity, I cannot conceive of noticeable improvements in "classic" sprite-style graphics. To me, the end has been here for a while.

Tom Battey
The game does look great. I agree that pixel-art is timeless, and have a particular soft spot for SE's FFVI-era sprite-work. It will also be interesting to see how they approach making a 'new' FF game in the 'old' template; it's been such a long time since they made a real 'old school' go at FF that the results should be interesting. Shame I don't own a smart phone or trendy tablet...

Thom Q
You do know why they call it dimensions, right? Because you can finally go left or right in some places, instead of an endless path forward ;)

Jeremy Reaban
And yet PC games exist (and have for a while) that look far, far better than their console counterparts, as well as running at 1080p and 60fps. And PC is a far smaller market for the most part...

Tom Battey
That's true, but high-end PC gaming is a niche market. You can get some fantastic looking games running at silly-high resolutions, but to do so you'll need a machine that costs upwards of £500 (closer to £1000 if you want to hit the proper high-end), and that's a hell of a lot of money to spend on videogames.

This is fine in an enthusiast market, but it doesn't work for the console business. The console business is punching for the mainstream, and a mainstream audience does not have that kind of money to spend on gaming.

I'd be happy to leave fidelity-chasing to the PC market, and settle for a next-gen console that can run games with 'current gen' graphics at 1080p, 60fps, and that costs less than £300. That would absolutely be enough for me, and more than likely for the average game consumer as well.

Merc Hoffner
Correct me if I'm wrong, but I'd estimate that dev costs have more or less been quadrupling every generation:
NES: ~$100,000
SNES: ~$400,000
PSX: ~$1,500,000
PS2: ~$6,000,000
PS3/360: ~$24,000,000
PS4/720: ~$100,000,000+?????
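A quick sanity check of those ratios in a couple of lines (the dollar figures above are ballpark guesses, not audited data):

```python
# Rough check of the "quadrupling per generation" estimate above
# (the dollar figures are ballpark guesses, not audited data).
costs = [
    ("NES", 100_000),
    ("SNES", 400_000),
    ("PSX", 1_500_000),
    ("PS2", 6_000_000),
    ("PS3/360", 24_000_000),
]
for (prev, a), (cur, b) in zip(costs, costs[1:]):
    print(f"{prev} -> {cur}: x{b / a:.2f}")   # ratios hover around 4x

# Extrapolating one more 4x step lands in the same ~$100M ballpark:
print(f"PS4/720 extrapolation: ~${costs[-1][1] * 4:,}")
```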

Epic is certainly promising big advances in productivity, but haven't there been significant advances every generation anyway, and the budgets still increased? Witness the advent of an actual SDK (NES to SNES) -> the progression from machine code to standardized languages (SNES to PSX) -> the provision of actual art tools and middleware (PSX to PS2) -> the availability of high-level shader languages and real-time building and scripting (PS2 to PS3/360). Under the best-case scenario, Unreal 4 is merely keeping this progress up. ART is what costs. ART is king, and artists cannot (yet) be replaced, but the content they're required to fill keeps scaling. This is all very worrying. Will it scale forever?

Now consider the frightening progression of hardware. Take a very 'fanboyish' view of power progression, and bear in mind that traditional Moore's-law increases should give a roughly 16-fold power gain every six years (one generation):

NES->Snes - ~10X (ish)
Snes->PSX - ~30X (efficiency gains made in task specific acceleration)
PSX->PS2 - ~30X (some efficiency gains in Vector unit type processing for general purpose type tasks)
PS2->PS3/360 - ~50X (efficiency gains from shaders yielding high psychovisual quality per unit power, but offset by power costs of HD)
PS4/720? Diminishing psychovisual quality per unit power returns and potential leaps into uncanny valley.

Sony bet big and pushed the boat a bit beyond affordable Moore's scaling with the PS2. This led to early losses but early market dominance. In a pissing contest, Sony and Microsoft then pushed the boat out again, breaking affordable Moore's scaling two generations running, leading to catastrophic losses. Pushing it a third time in a row will almost certainly break the business model on the hardware end. Unfortunately it's become the 'core' expectation. Nintendo has reset the counter by taking one generational step back. Either the other two make a modest 'improvement' and have a small chance of sustainability, or they make a massive improvement and have zero chance of sustainability.

Tiago Costa
@Merc Hoffner

I, unfortunately, somewhat want that crash to happen; the industry needs a big kick in the mouth and a wake-up call.
Unfortunately it would come at the expense of the studios' teams, and I don't like that.

I think the problem with this industry is that you have two and only two tiers: one for low-budget indie games and another for hyper-costly AAA games.
There is a big gap between them, and I only see a couple of games filling it (Torchlight, I'm looking at you DIRECTLY). This gap should be filled with games from dev teams numbering in the small dozens, not hundreds, with budgets below the millions - enough for the company to go on making its next game(s).

Tom Battey
@Merc Hoffner: This is all entirely true, and I for one hope that console manufacturers eschew another 'big leap' forward and instead make a more modest step-up, bolstered by robust hardware and real innovations in the fields of online connectivity.

Unfortunately, both Sony and Microsoft are under huge pressure, largely from companies like Epic who come from a traditional PC-development power-race background, to put out machines that can compete with high-end, £1000+ gaming PCs. The presentations by both Epic and Square Enix appear as something of a challenge to the hardware developers: THIS is what we're going to make our games look like, YOU have to provide the hardware to support this.

Epic famously pressured Microsoft into adding more RAM to the Xbox 360 in order to better support Unreal Engine 3. That worked well for this generation, but is it sustainable for the next? I'm not so sure.

@Tiago Costa: I'm in a similar position, in that I want the big studios to wake up to the unsustainability and stagnation of their current business model. If it takes a collapse in the high-end of the market to do so, then that's unfortunate for everyone in videogames. I want there to be room for huge high-budget blockbusters in gaming; I just don't want there to be ONLY room for such.

And I think the mid-tier developers you mention do exist; they're just not making games for the HD consoles. Mid-sized teams have been putting out mid-budget games on the Wii, the PSP and the DS, and more recently on the 3DS. It's just a shame that there doesn't seem to be room for such games on the 'big' consoles, and that is almost entirely down to the cost of developing in HD, and the requirement of selling a game at £40/$60.

k s
@Tom There are mid-tier games releasing on XBLA and PSN as well, and I hope to see more of that in the future, as that model is more sustainable than the AAA one.

Duong Nguyen
Development costs for this generation, I would say, have been averaging $50 million. This is probably twofold: the cost of supporting multiple platforms (with widely divergent architectures), and the cost of increased multiplayer backend support (which was non-existent for previous-generation games); otherwise the cost growth would have been as you predicted.

The graphical leap from this gen to the next will not be as great as previous leaps. People can get a glimpse of what "next gen" console games will look like from the current best-in-class PC games (i.e. Crysis, the Witcher series, Skyrim, etc.). They look good, but they are not going to be photo-realistic, mind-blowing stuff.

The issue of asset creation is always present. I suspect several new trends will emerge: more procedural generation, and crowd-sourced asset creation, either by leveraging built-in cross-platform modding communities or in-game tools.

Justin Speer
"The PS2 era will always represent, for me, a pinnacle moment where graphical fidelity and the imagination of game creators perfectly overlapped."

This is much the same reason that I'm so fond of the Dreamcast. If I were in an alternate reality where that was the last console ever released, I still think I'd be having a lot of fun with gaming.

Tom Battey
I never actually owned a Dreamcast; I was still too young to afford to buy my own consoles, and my parents got frowny at the thought of me owning more than one gaming machine.

I did, however, borrow one, with a stack of games, from an older family member for a couple of months. I played Skies of Arcadia, Sonic Adventure, Shenmue, Crazy Taxi, Jet Set Radio...those were a good few months...

Kevin Patterson
"I'd be happy to leave fidelity-chasing to the PC market, and settle for a next-gen console that can run games with 'current gen' graphics at 1080p, 60fps, and that costs less than £300. That would absolutely be enough for me, and more than likely for the average game consumer as well. "

I can understand where you're coming from, Tom, but I personally would go back to PC gaming and leave consoles behind if what you wanted came true. Most of my friends who are major console gamers are eagerly awaiting the next gen, and many who started as PC gamers have begun focusing on PC again, as consoles are seeming old hat performance-wise. Playing a game at 1080p with full detail on my PC looks way better than my 360 ever could, but I prefer the console experience over the PC.

If a consumer doesn't care about graphics then there are plenty of consoles and games out there that would make them happy, but I hope that MS and Sony continue the tradition of a more powerful new console each generation, as for many gamers like myself that is very exciting and what we want from next gen.

Tom Battey
I do understand. As gamers, we've become accustomed to the thrill of the hardware reveal, and the jaw-drop feeling at seeing what the next-generation of graphics can deliver.

However, I feel that as console and PC development have grown closer in this last generation - and the console market has cannibalised some of the PC market in the process - the resulting tech push has placed too heavy an expectation on console manufacturers to meet, or best, high-end PC graphics.

PC graphics look as good as they do because they are running on rigs that cost upwards of £1000, and it's unrealistic to expect console manufacturers to produce something that looks as good as that without pricing themselves out of the market. It won't be as easy to sell hardware at a loss this time around, given the current economic climate.

I for one would welcome a divergence in the console and PC markets. PC gaming works as a niche high-end market because there are consumers willing to pay the price to experience the best graphics. I don't believe there are enough of them, however, to support a £500+ videogame console with comparable graphical power.

I can see PC gaming continuing to push the envelope power-wise, while the console market operates in more of a mid-tier, somewhere between the PC experience and the (increasingly powerful) mobile market. This can only be positive; it means better games for PC gamers, with development unrestricted by 'watered down' console ports, and a more accessible, affordable, and stable console experience.