Carmack: Xbox One and PlayStation 4 are 'essentially the same'
August 1, 2013 | By Kris Ligman
In his annual keynote for QuakeCon, John Carmack wrote off the triple-A industry's obsession with high-end graphics, while also deeming the next-generation consoles Xbox One and PlayStation 4 "essentially the same."

"I haven't done really rigorous benchmarking [but] they're very close and they're both very good," Carmack told QuakeCon attendees, also noting that Sony had "made huge strides" in the last generation, especially regarding tools for developers.

"It's almost amazing how close they are in capabilities, how common they are," Carmack continued. He described the public reaction to initial fears regarding Xbox One's proposed preowned games lockdown and always-on Kinect 2.0 as "a bit on the side of a witch hunt," and that posterity would look upon these technologies differently.

"The future is obvious right there, and it will be good for us in general," said Carmack, who also described technologies such as Google Glass, while being controversial today, would be a "net positive."

"A lot of these things are inevitable."

Overall, the Kinect drew perhaps the sharpest crack of Carmack's keynote: he described the Microsoft peripheral as "sort of like a zero-button mouse with a lot of latency."

"Kinect still has some fundamental limitations with the latency and framerate... It's fundamentally a poor interaction."

Carmack also expressed disappointment that because modern computer systems are so vast, there is no way to truly master a platform.

"No one person has the entire capabilities of one of these modern platforms," he said, contrasting with older hardware the entire documentation for which could be contained in a manual. Instead, the "crystal jewel of perfection" of modern platforms was handled by specialists for every facet.

That might have something to do with why, even as the industry reaches for photorealism with its graphics, innovation is harder to come by.

"Priorities are out of whack," Carmack said. He posed a scenario where a developer could take five minutes of in-game play, and pass it through top-of-the-line offline rendering. Would play experience improve? "Not by much."

QuakeCon runs this weekend in Dallas, Texas. Two Twitch.tv livestreams have been set up to cover both matches and panels. You can view the entire schedule or learn more about the event here.


Comments


Ron Dippold
Well, that should stop all the internet arguments!

But seriously, it's good to have a performance expert weigh in. From this, and from talking with other devs, it sounds like the differences this time around aren't about performance at all, but about how much of a PITA the online integration APIs and suspend APIs are, the soft requirement on XBone to support Kinect, the hard requirement on PS4 to support Vita, that sort of thing. Ecosystem pain (and opportunity).

Kujel s
In the end, for consumers it's the games and the features that matter most, but for us it's always nice not to have to deal with things like the Cell processor.

Kujel s
I came across this today, and I think it's relevant to the discussion of graphics: http://www.100fps.com/how_many_frames_can_humans_see.htm

Josh Stratton
> That might have something to do with why, even as the industry reaches for photorealism with its graphics, innovation is harder to come by.

I respect John Carmack as a developer, but I think it's a little hypocritical to criticize people for a lack of gameplay. Doom 3 had some interesting graphics and theme, but I don't think its gameplay will be remembered over any other shooter of its day. Forcing gamers to listen to PDA entries, poorly developed and boring puzzles, walls that drop down with bad guys behind them for no reason a la Doom 1/2, seeing a piece of armor and knowing that the second I touch it two zombie guards are scripted to run into the room and shoot at me: it all kind of killed the mood for me.

Jakub Majewski
I don't think it's hypocritical as such - just because he failed to do gameplay well doesn't mean he can't think that gameplay is really important. Ironic, yes, but not hypocritical.

Soeren Andersen
Perhaps his statement should be read as a veiled criticism of id.

Carmack is not their gameplay designer, so you can't really fault him directly for DooM3, but it's clear that they seem to be struggling. It must be frustrating for him that despite his technological strides, the id projects fail due to game design.

Clearly the elephant in the room was DooM4, which wasn't mentioned at all. It's the first QuakeCon I've seen where no news of id titles was disclosed.

I'm actually surprised that the headlines I've seen so far have been about his comments on Kinect and Xbox One vs. PS4, and none mention the absence of DooM4 or other new projects.

Jonathan Murphy
Honestly, hardware doesn't concern me; it's coding for the consoles. If ports are easy I'll be happy. They have plenty for what I plan on using them for.

Michael Scala
I understand 1% of what Carmack says.

Too bad he and his boys at id seem to have forgotten how to make a badass gaming experience.
[opinion]

Christian Philippe Guay
"Carmack also expressed disappointment that because modern computer systems are so vast, there is no way to truly master a platform."

I'm no engine programmer, so maybe someone can clarify. I remember an interview with Crytek saying that Windows with DirectX doesn't allow game developers to code to the metal at a low level (hardware), and that's why PCs are 10 times more powerful than consoles but games don't look 10 times better. Does that mean the world would benefit from a new OS that would allow game developers to code to the metal at a low level on PC?

And as graphics are getting stagnant, if most game developers start to use a very specific card as the new standard (like right now it's pretty much around the Nvidia GTX 580) for years to come, I think the problem could almost fix itself.

Freek Hoekstra
Coding to the metal has mostly disappeared because it is just not practical; it takes way too much time and resources to create everything yourself.

OT: I think he is right, we have hit diminishing returns; twice as powerful means slightly better shaders and that's about it... and these devices are close enough for it not to matter a whole lot anymore.

The PS2 competed with (and beat hands down) the Xbox, even though the Xbox was over 3x as powerful and supported pixel shaders etc. The PS3 and 360 came out around the same time and turned out to be pretty much equal, and these consoles will do the same... they launch in the same window and will be close enough for the difference to be negligible.

Christian Philippe Guay
But wouldn't it make sense for engine developers to have the opportunity to code to the metal, so that their commercial engines can fully take advantage of the different platforms?

Because as graphics have recently stagnated due to high production cost and time, it would make a lot of sense now for game companies to make their own game engine that finally won't be outdated in just 3 years.

And considering that it takes a lot of time to code to the metal, that's precisely what could make Unreal Engine or CryEngine shine, if those companies code to the low level and heavily optimize their technologies for the greater good.

Ron Dippold
I think you're hoping for a bit too much from low-level coding. What really 'stagnated' graphics was hitting the absolute hardware limits of the current-gen consoles, which means that, because of sales realities, the PC side suffers as well even though it's capable of far more.

As far as I can tell, only one company managed to squeeze every single drop out of the PS3: Naughty Dog. A combination of being engine geniuses, having full access to Sony, and having enough resources to dedicate to coaxing every last cycle out of those @#$$ing SPEs and minuscule data pipelines. And it still took them over five years!

So what did it get us? Uncharted 3. Which was gorgeous, but was it a better game than Uncharted 2? No. Nor does it look better than something you could get on a mid-to-high-end PC with a lot less effort. So for The Last of Us they went with something that still looks damn good but concentrates more on telling an interesting story.

Also note that as a consequence of their complete and total PS3 optimization, their games are completely unplayable on anything else. Which is great for Sony, but sucks for everyone else. That's not a minor consideration if you're not a first party developer.

I still do some low-level coding and bit banging in ASM for critical loops, but I do it so we can provide an API and our users don't have to. If a dev has to resort to bit banging on a device they didn't make, it means that we (the people who made the device) screwed up. Applications (games) need to /consider/ the architecture, but they should not have to /constantly/ obsess about the low-level details. It's far too much duplicated work and far too prone to error and brittleness for too little gain.

The best thing you could provide is a super fast general purpose CPU + a super fast general purpose GPU + lots of shared RAM + a great API that lets you say 'do what I want' to a large extent. And standardize it so we don't have to worry about multiple configurations. Which is kind of what we have this time - except we can't manage a single super-fast CPU, so we have multiple fairly fast cores - but they're generic and symmetric, and people are used to that now, so it's okay. PS4 and XBone are close enough you can treat them as kissing configurations (for power - the ecosystems are another matter).

Okay, that came out a lot longer than I intended, but the key points are:
- To the metal is far too much effort for far too little return (though it's fun) and results in fragile, non-portable code. You do not want to have to program to the metal. If you need to, the people who made the thing messed up.
- Far better is for the hardware to be very powerful, very generic, and standardized. For the first time, we're there!
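
To make Ron's point concrete, here is a minimal C++ sketch of the pattern he describes: the people who made the device ship a plain API, and the bit-twiddling lives behind it. The function name and the popcount example are illustrative assumptions, not anything from a real console SDK.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>

// Hypothetical vendor-supplied API: count the set bits in a buffer.
// Callers see a plain function; the "to the metal" part stays inside.
std::uint64_t popcount_buffer(const std::uint64_t* data, std::size_t n) {
    std::uint64_t total = 0;
#if defined(__GNUC__) || defined(__clang__)
    // Fast path: the compiler lowers this builtin to the native
    // POPCNT instruction where the target CPU has one.
    for (std::size_t i = 0; i < n; ++i)
        total += static_cast<std::uint64_t>(__builtin_popcountll(data[i]));
#else
    // Portable fallback: same answer on any compiler, just slower.
    for (std::size_t i = 0; i < n; ++i) {
        std::uint64_t x = data[i];
        while (x) { x &= x - 1; ++total; }  // clear the lowest set bit
    }
#endif
    return total;
}

int main() {
    const std::uint64_t buf[] = {0xFFull, 0x1ull, 0x0ull};
    std::printf("%llu\n",
        static_cast<unsigned long long>(popcount_buffer(buf, 3)));  // prints 9
}
```

Application code calls popcount_buffer and never sees the instruction selection; only the vendor revisits it when the hardware changes, which is exactly the division of labor Ron is arguing for.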

John Ingato
Shut up, Carmack... you're saying what they want you to say. Explain to me how they could be "essentially the same" when one has 8GB of GDDR5 memory and the other has 5GB of DDR3? As a game developer, I know how important memory is. I know a lot of people are going to say "5GB is way more than enough." Yeah, well, it wasn't too long ago that people considered 1GB of memory to be overkill for a gaming rig.

Joel Bitar
Regarding the amount of memory: the PS4 also has a lot of memory reserved for the system, obviously.

Gryff David
This new console generation is disappointing in general, though. If you compare how the Xbox 360 and PS3 hardware stacked up against PC hardware at the time, they were incredibly high-end. It's approaching the point now where 8GB in a PC is fairly standard and 16GB-32GB is not unheard of. If they want this generation to last as long and as well as the previous one did, they're both gonna have to step up their game.

Tom Baird
@John Ingato
The PS4 and Xbox One have similar amounts of memory, and reserve similar amounts for the OS.

Source: http://arstechnica.com/gaming/2013/07/report-os-overhead-takes-up-3-5gb-of-ps4s-8gb-of-ram/

... and did you really just try to accuse him of speaking his opinion like it was a bad thing? Would you rather he simply said what he was told by higher-ups?

Daniel Carreras
You sure that's correct? I heard the rumour about the PS4's RAM was false, with games already using 6GB.

Source: http://www.thesixthaxis.com/2013/07/28/ps4-ram-rumours-absolutely-false-according-to-developer/

Zach Grant
Sorry John, but I'll take the word of one of the greatest graphics programmers of our time over your opinion. Unless you are equally qualified and have done tests yourself?

Dave Hoskins
I think what Carmack means is that he laments the days he could sit in a room dreaming up algorithms that would make his games stand out from the rest, and get astonished reviewers raving about them.

You can always code your graphics to the metal by using shader techniques and all those ideas coming out of SIGGRAPH every year. And you can't really compare the new consoles to a desktop PC, because the graphics processing shares RAM with the CPU, so a lot of the transfer times have been removed. This makes some interesting fast streaming techniques possible, plus many other things.
It's the same with audio, as there is nothing to stop people using all manner of FX hand-written in C++ or assembly.

Brett Williams
I would say at this point PC and Console are closer than they ever have been. So I don't feel it's fair to say we can't compare them.

I think there's a risk that, as the consoles move toward being multimedia devices, they introduce more layers that get in the way, similar to the system/driver latencies experienced with Windows. It's good to know that right now they are separating these out into separately run entities and giving dedicated cores to applications.

He talked a bit about unified memory and how it's a benefit, as long as the system reserves a specific amount for applications and that amount is deterministic and doesn't randomly change. I think we all want unified memory on the PC, but at this point AMD/Nvidia have a strong stance on keeping it separate for some time. Intel would benefit the most from GPU manufacturers moving to unified.
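
To illustrate the unified-memory point that Dave and Brett both raise, here is a deliberately schematic C++ sketch: no real GPU is involved, and the names and the 4 MB figure are made up for illustration. The discrete-memory path models the bus transfer as an extra memcpy on every handoff; the unified path just passes a pointer.

```cpp
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

constexpr std::size_t kFrameBytes = 4 * 1024 * 1024;  // illustrative 4 MB payload

// Discrete-memory model: CPU and GPU own separate pools, so every
// handoff pays a copy (standing in here for a PCIe/DMA transfer).
void handoff_discrete(const std::vector<char>& cpuPool, std::vector<char>& gpuPool) {
    std::memcpy(gpuPool.data(), cpuPool.data(), kFrameBytes);  // the transfer tax
    // ...the "GPU" would then work on gpuPool...
}

// Unified-memory model: one shared pool, so the handoff is just a
// pointer and the transfer time is never paid at all.
const char* handoff_unified(const std::vector<char>& sharedPool) {
    return sharedPool.data();  // zero-copy
}

int main() {
    std::vector<char> cpuPool(kFrameBytes, 1);
    std::vector<char> gpuPool(kFrameBytes);
    handoff_discrete(cpuPool, gpuPool);           // pays 4 MB of copying
    const char* view = handoff_unified(cpuPool);  // pays nothing
    std::printf("first byte via unified view: %d\n", view[0]);
}
```

The catch is the one Brett names: once everything lives in one pool, the OS reservation has to be fixed and deterministic, or the "free" handoff comes with an unpredictable memory budget.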

Lincoln Thurber
Carmack is probably a genius, but at this point, if I were at id Software, I'd probably rather be at Epic, Naughty Dog, or Rockstar. The engines he makes will probably always be good and push data in unique, if not more efficient, ways, but the engines coming out of Rockstar North, Epic, Naughty Dog, Square, Konami, etc. probably see more use.

The analogy would be that Daytona Motor Speedway might be the smoothest road ever made... but you'll never drive it, and you'll never drive it in a NASCAR, so even if you did, your car would be wasted on it. Carmack could make the best engine ever, but his team can't make the best game ever, nor do they make many games where you could play it or see it in action... so what's the point?

