The PlayStation 4 is due out this fall, and its technical specifications have been largely under wraps -- till now. While the company gave a presentation at GDC, the system's lead architect, Mark Cerny, hasn't talked publicly in any great depth about the platform since its unveiling this February.
Cerny approached Gamasutra in the hope of delivering a "no-holds-barred PlayStation 4 hardware exposé," he said, during the interview that resulted in this story. "That certainly is what we're here to do," said Cerny, before speaking to Gamasutra for well over an hour.
What follows is a total breakdown of the hardware from a developer's perspective: the chips on the board, and what they're capable of.
Questions on the UI and OS were off the table. What was up for discussion is what the system is capable of, and the thinking that led Cerny and his team to make the decisions they made about the components they chose and how they function together.
To get to the heart of this deeply technical discussion, Gamasutra was assisted by someone with an intimate knowledge of how console hardware really works: Mark DeLoura, THQ's former VP of tech and now senior adviser for digital media at the White House Office of Science and Technology Policy.
The Beginnings
"For me, this all started in late 2007," said Cerny, remembering how he embarked on the road to becoming lead architect of the PlayStation 4. "Because we'd been doing postmortems on the PlayStation 3 -- a very broad group of people across the Sony Computer Entertainment team were evaluating how well that had gone."
That led, naturally, to thoughts about what to do next. Musing on the architecture of Sony's next system, Cerny spent his Thanksgiving holiday reading up on the history of the X86 architecture -- realizing that not only had it evolved dramatically over the years, but that by the time the PlayStation 4 shipped, it would be powerful enough for Sony's needs.
It had evolved into something "that looked broadly usable by even the sort of extreme programmers we find in the games business," he said.
Realizing how passionate he was about the PlayStation 4 project, after Thanksgiving, Cerny went to Sony's then-execs Phil Harrison and Masa Chatani, "and asked if I could lead the next generation effort. And to my great surprise, they said yes."
"The Biggest Thing" About the PlayStation 4
Cerny approached the design of the PlayStation 4 with one important mandate above all else: "The biggest thing is we didn't want the hardware to be a puzzle that programmers would be needing to solve in order to make quality titles."
The PlayStation 3 was very powerful, but its unfamiliar CELL processor stymied developers. "There was huge performance there, but in order to unlock that performance, you really needed to study it and learn unique ways of using the hardware," said Cerny.
That situation led directly to the PS4's design philosophy: "The hope with PlayStation 4 was to have a powerful architecture, but also an architecture that would be a very familiar architecture in many ways."
In fact, this is something Cerny returned to again and again during the conversation. "We want to make sure that the hardware is easy to use. And so having the familiar CPU and the familiar GPU definitely makes it easier to use," he said.
Later, when asked whether Sony considers the fact that many third-party developers will also have to create versions of their games for the next Xbox, his response was: "When I say that our goal is not to create puzzles that the developers have to solve, that is how we do well in a multi-platform world."
But ease-of-use is far from Cerny's only goal. As a 31-year veteran of the industry, he well knows that the PC will march onward even as the PlayStation 4 stays frozen in time.
"Ultimately, we are trying to strike a balance between features which you can use day one, and features which will allow the system to evolve over the years, as gaming itself evolves," said Cerny. The "supercharged PC architecture" that the team has come up with -- to use Cerny's term -- is designed to offer significant gains the PC can't, while still offering a familiar technological environment for engineers.
To design the PlayStation 4, Cerny didn't just rely on research, or postmortems of the PlayStation 3. He also toured development teams and spoke to middleware partners to find out precisely what they wanted to see in a next generation console. The result? You'll read about it below.
What Does 'Supercharged' Mean, Anyway?
The PlayStation 4's architecture looks very familiar, at first blush -- and it is. But Cerny maintains that his team's work on it extends it far beyond its basic capabilities.
For example, this is his take on its GPU: "It's ATI Radeon. Getting into specific numbers probably doesn't help clarify the situation much, except we took their most current technology, and performed a large number of modifications to it."
To understand the PS4, you have to take what you know about Cerny's vision for it (easy to use, but powerful in the long term) and marry that to what the company has chosen for its architecture (familiar, but cleverly modified). That's what he means by "supercharged."
"The 'supercharged' part, a lot of that comes from the use of the single unified pool of high-speed memory," said Cerny. The PS4 packs 8GB of GDDR5 RAM that's easily and fully addressable by both the CPU and GPU.
If you look at a PC, said Cerny, "if it had 8 gigabytes of memory on it, the CPU or GPU could only share about 1 percent of that memory on any given frame. That's simply a limit imposed by the speed of the PCIe. So, yes, there is substantial benefit to having a unified architecture on PS4, and it’s a very straightforward benefit that you get even on your first day of coding with the system. The growth in the system in later years will come more from having the enhanced PC GPU. And I guess that conversation gets into everything we did to enhance it."
The CPU and GPU are on a "very large single custom chip" created by AMD for Sony. "The eight Jaguar cores, the GPU and a large number of other units are all on the same die," said Cerny. The memory is not on the chip, however. Via a 256-bit bus, it communicates with the shared pool of RAM at 176 GB per second.
"One thing we could have done is drop it down to 128-bit bus, which would drop the bandwidth to 88 gigabytes per second, and then have eDRAM on chip to bring the performance back up again," said Cerny. While that solution initially looked appealing to the team due to its ease of manufacturability, it was abandoned thanks to the complexity it would add for developers. "We did not want to create some kind of puzzle that the development community would have to solve in order to create their games. And so we stayed true to the philosophy of unified memory."
In fact, said Cerny, when he toured development studios asking what they wanted from the PlayStation 4, the "largest piece of feedback that we got is they wanted unified memory."
"I think you can appreciate how large our commitment to having a developer friendly architecture is in light of the fact that we could have made hardware with as much as a terabyte [Editor's note: 1000 gigabytes] of bandwidth to a small internal RAM, and still did not adopt that strategy," said Cerny. "I think that really shows our thinking the most clearly of anything."
However, there's an indie game revolution going on, and those games tend not to push the limits of the hardware so much as the licensing model. Sony seems to be playing their cards right on this as well, but it remains to be seen where the tiny developers end up, whether it be PC, Steambox, Ouya, PS4 or Xbox.
Installed user base will be a huge factor in deciding where the indies end up, and most of that will come down to price and the strength of their launch library. The things described in this article won't get a chance to play out unless the PS4 does well through the first couple of years.
PS Move and the Eye may look silly at times now, but they sure are more responsive and reliable than their competitors by miles. If you don't include one of the cardinal pieces, such as the Eye camera, in the box, it will have to be sold with games.
Fewer of those games get sold, and then the publishers don't have a lot of incentive.
This is a big problem because then gaming stays the same in terms of interactivity, and I for one DO NOT want that.
But I agree Sony does have to be reasonable, and then some.
That is simply untrue. The Blu-ray drive mechanism was expensive at the time compared to a DVD-ROM unit, but it was a well-established production line with low defect levels.
The bulk of the PS3's cost at launch was the silicon. Yields on the CELL were horrible, which is why the original design of multiple CELLs doing everything was scrapped and a dedicated GPU was added late in the process. This wasn't the first time Sony had very expensive silicon at launch. The PS2 was intended to launch with the EE and GS produced at .18 micron. The existing .25 micron chips were only for engineering samples and early developer kits. But Sony couldn't get a .18 micron production line going in time. (Intel was only starting to ship .18 micron parts at the time.) So the PS2 launch in Japan used big .25 micron chips that made those units very expensive for Sony. But not more expensive than it would have been to delay launching for several months and give Sega time to draw more consumers to the Dreamcast.
Blu-ray drives cost a lot less now, but by far the biggest cost reduction for Sony on the PS3 is shrinking and integrating the chipset. This meant better yields, lower cost per unit aside from yield levels, and subsequently lower power and cooling needs, which in turn allowed for more cost reduction in the later models.
I'm personally interested in how different their chip architecture will be from the new xbox, and whether the technical bells and whistles of the PS4 will attract developers to make exclusive titles for Sony.
The Xbox may go with 32MB of embedded DRAM (eDRAM) within the APU to compensate for future shader bottlenecks when it comes to graphics commands. The APU will use 8GB or more of DDR3 SDRAM. The PS4 also has 18 CUs, whereas the next-gen Xbox is supposed to have only 12 CUs. The cost of the APUs may be pretty much the same.
The benefits of this approach would be twofold:
1. Memory is abundant and cheap, so the box will cost considerably less.
2. Windows kernels are already designed to use the DDR3 addressing system.
That would leave Microsoft to make up for the difference by means of their strong software features and a truly next-gen Kinect.
As a consumer I couldn't be happier.
The hardware sounds great and powerful. It's up to the developers to exploit that potential. Unfortunately, the popularity of the console will come down to price. The PS4 will be extremely popular if it's competitively priced and not overpriced like the PS3 was at launch.
Also, the Windows 8 numbers are just that: Windows 8. You don't need a new desktop or OS to do PC gaming. I think a lot of people are reading too much into this. The low numbers for every other version of Windows probably correlate more with the fact that people don't "need" a new computer every 3 years...
People have been wrong for 20 years, and continue to be, about the PC. With a 1-to-2-billion-person installed user base, PCs and PC gaming aren't going anywhere anytime soon.
The PC (personal computer) is not going anywhere, and I'll be quite happy to say I told you so down the line.
The PC may not exist in the same form we use today -- which is a given; my Mac, as an example, is way different from the TI99 I had in the eighties, or anything I worked on in the nineties -- but even if it's a tablet I have plugged into an external monitor and keyboard/mouse (whatever input; I use a Wacom) down the line, it's still a PC.
There's plenty of room for mobile devices and PCs to coexist. Both platforms complement one another at the moment, and even if and when a convergence happens, it will only be because other devices have become more like PCs.
If a day comes where the PC does go away, and you're right, it won't be because we've replaced them; it will be because the "P" in PC has been eliminated. That will be a sad day, when people no longer own their device (computer), let alone their information, as they're tied into and completely reliant on some CLOUD service with a monthly subscription; only a fool would welcome this.
It's easy to say "I told you so" when your argument is just a semantics game. The point is very simple and clear: very few people buy desktop computers anymore. Very few companies even make them anymore. Laptops, tablets, and smartphones are the wave of the future. New hardware architectures will preclude Valve from simply declaring Steam the go-to portal for all games. Yes, there will always be hobbyists and people who love the latest GPU, but they won't dictate overall market trends.
Yes, when you account for less variables and live only in the now, things can be simpler and clearer.
And laptops are part of the wave of the future? Here's some semantics: this isn't the eighties; notebooks have been the now for a long time -- at least from my perspective. And even though there are crossovers, notebooks are more closely related to a desktop PC than to a tablet or smartphone; in my case, one is my primary desktop computer.
Your thinking isn't remotely new, it's only more relevant, since the technology is finally reaching a point that PCs can exist in more forms and fill in more niches, but they're still PCs. Unless whatever is trendy can truly live up to the task required, it will not replace; and mobile OSs are by no means a replacement as of yet; and locked down OSs will never be a replacement.
Anyways, good luck predicting the future with your tEh DOOM outlook on desktop PCs. Maybe generalization is the key?
Smartphones are the mediating variable in the conventional PC's death. We've already seen the transition begin to take place with Android and iOS being featured on tablets, so even dismissing these platforms as merely "mobile" is deceptive.
That said, while these platforms being locked down is a topic worth thinking on, the reason it will work towards replacing PCs is that they're dramatically cheaper and more convenient than conventional PCs.
No consumer is going to buy a "Dell X3594" when they can get a say, "Script Writer" tablet, that is very cheap, and yet has software completely specialized towards the task of word processing for instance, and thus actually better at it than a trojan horse PC is capable of.
A prime example of this actually is the Amazon Kindle, and it also helps refute your concern about Android being a "locked down" platform. By branding its own hardware Amazon is able to proliferate all its digital business channels, and make more money. They're able to curate and shape Android to the liking of their brand. There's no way they could do that sort of thing with a Windows based device---because it depends on Microsoft's legacy of apps, and relationships with Intel, etc.
Now you can raise semantic hell, and tell me that in fact the Amazon Kindle is a "PC" until the cows come home. But can Valve sell Steam games on such a platform? No. Does such a platform threaten the sellability of high end graphics cards that desktop PCs rely on? Yes. The best hope for the future of "PC gaming" is that ARM architecture increases in power exponentially, and essentially resets the field.
*Incidentally the other future for PC gaming is that they create external hardware designed to interact with mobile and "smart" devices and enhance their graphics capabilities. And in this sense, the PC will just turn into what consoles already are. And the PS4 is already simulating this future by bundling in an ARM processor to perform OS functions---in other words the "Windows" like functions.
If anything is going to be threatened by mobile devices it's consoles. A few years from now people will think: "sure these consoles can do some nifty stuff, but why should I pay extra for that when my phone can already hook to the TV and let me play full HD 3D games?"
I think that for now PC is relatively safe thanks to being the content creation platform. Once that changes, PC users will need to worry. (The PC will possibly move to hardware more like the PS4 though.)
Windows 8 numbers are Windows 8 numbers. Windows Vista sucked in the eyes of many when it came out (so did its sales), and when Windows 7 rolled around, sales bounced back. Windows 8 sucks in the eyes of many people today (Who wants to cripple the desktop with touchscreen controls that can't be used except at a very premium price? Who wants to use Metro on their desktop? Who wants another potential walled garden? Who wants to have to replace most of their expensive software libraries?), and in fact it offers severe incompatibilities with some older software, especially development software. The Windows 8 numbers reflect that the marketplace doesn't buy Windows 8 as a good platform, and people are sticking with 7, and even XP in some cases.
Different strokes for different folks, but for me consoles provide an experience that my PC doesn't quite live up to at times.
From what Sony has announced, the Unreal engine will be bundled with the toolkit.
What's your source on that? Epic announced they were supporting it and showed the Elemental demo, but I haven't seen any announcements about Unreal being bundled with the SDK.
Got tired of my PS3 "movie" collection; I'd like to see some better games for a change.
1) What kind/spec of chip is being used for the dedicated download (and other peripheral) CPUs? PowerPC? ARM? Is this how some bits of cross-platform/backward compatibility will be addressed?
2) Lots of use of the word "async" makes me ask: will we be getting a toolchain and library set that is C++11-native? Lambdas, atomics, copy elision, etc. are all things that high-performance, highly concurrent state machines and simulations will need to maximize the hardware.
3) will OpenMP be supported by the toolchain for the same reasons? A GCC 4.8-based toolchain would be awesome, but I would settle for a 4.7-based one with enhancements from Google's branches. We see ~20% better raw performance from Google's 4.7 branch versus the FSF 4.7.
4) Will Mono be a first-class citizen with full C# 5.0 async/concurrency support? Again, a necessity for allowing developers to use language-native constructs for distributing work to the CPU/GPU transparently, without crazy macros that can collide and whatnot. Note that I mean Mono in general, not specifically PSM or Unity.
5) The GPU can only access the main memory at 20 GB/sec, out of a total of 176 GB/sec of bandwidth? Why?
And 5) It seems that's the second bus on the GPU, for smaller read/write system memory data transfers that bypass the GPU's L1/L2 to circumvent any synchronization issues. In all the documentation so far, the GPU's main bus memory transfer is listed at the 176 GB/sec rate, but there's a degree of graphics/compute sharing to juggle.
I'm surprised they didn't use a scaled down PowerPC somewhere in there. Certainly SPU code can be dynamically recompiled on the fly for GPU/CPU, but PowerPC might be more difficult. Then again, if Microsoft was able to do dynamic recompilation+static patches for Xbox1->360 compatibility, I guess it's entirely possible.
Down the road, probably the simplest way to do an improved remake of a PS3 game that was multi-platform, would be to port from the PC version and see how much the visuals can be improved over the PS3 version without substantial investment. Actually, we might be seeing ports of a lot of PC games that never appeared on any Sony console. It could be Sony's own version of GoG.
With the above in mind, it doesn't really matter what architecture the assist chip uses. It will likely be completely reserved for the system and not accessible by app and game developers.
If the high-end titles, the ones making the most use of the hardware anyway, all use virtualized textures, that's at max a 16k buffer -- maybe two if you really want to get into complex materials. If a good, cheap, crack-free tessellation scheme can be found, you can get models down to almost a tenth the size they are now with displacement maps. Suddenly in 2015 you have a ton of RAM and you don't have anything to fill it with, because you can't compute that much that fast. I can foresee the new PlayStation and Xbox needing their own set of hacks, caching everything possible, trying to make use of those 8 gigs -- kind of the opposite of last generation's struggle to make every MB count.
At least, I can see that as a possibility.
(Of course, the Killzone demo we saw at the launch event did some distract+replace... not sure why, though.)
In another interview, Cerny gives the credit to Kaz Hirai for recognizing that Kutaragi's EE method of 'Okay, here's some hardware we designed in a windowless tower, now you software guys do whatever you do' wasn't going to be workable any more and going developer friendly for PS4 very early on.
No, PC gamers won't be impressed.
Macs used to use Motorola processors. Now they use Intel processors. Are they less of a Macintosh because of the hardware change? Of course not. They still use OS X and all the other services that Apple provides.
That is what matters in the end. The OS, the programs/games, and the services that Sony and Microsoft provide for the gamers.
How many PC gamers do you know with hardware that can selectively dedicate 4-6 GB of GDDR5 solely to the GPU's needs? All those PC gamers with $1000 graphics cards in 4GB GDDR5 flavors? Plus add-in PCIe RAM modules costing $400? And with those luxury cards you still won't get close to the effective PS4 memory bandwidth, given the PCIe bottlenecks.
While your PC tower features gigantic power-hogging add-in cards in all of its PCIe slots.
The RAM is quite impressive on the PS4. At least you can capitalize on it in every sense for a game, unlike the bloated system overhead on PC -- not to mention the fact that few PC games are even optimized for 4GB video cards at this time.
That lovely unified memory and processor architecture should give the PS4 some solid future-proofing.
Maybe the raw spec numbers of PS4 get overshadowed (already has) by higher end PC gaming hardware, but does the PC provide as efficient/elegant of an architecture yet, for games? Not yet.
Even when AMD goes all in with a consolidated, unified board/chipset in the near future, I wonder if the PC parts industry will be ready to keep up with third-party motherboard variants, or third-party RAM (and would the board feature both DDR3 and GDDR5 slots, or would GDDR5 still be relegated to the realm of add-ins, and does PCIe bus speed dramatically improve?).
But sure, it's altogether possible that by 2015 or so this form of unified architecture could be widespread in mainstream computers, as we shrink down component sizes, get more efficient, consolidated, leaner and greener across our computing devices.
Do you accuse the systems in a new car of being 'computer wannabes' because they're specialized and would not lend themselves to most common PC uses?
All CPUs today have a GPU built in, and AMD's Fusion platform is in many ways built for the same purpose described in the article: using the GPU to perform a lot of tasks instead of the CPU, like physics calculations, etc.
The PC has one big GPU for graphics, and a smaller GPU built into the CPU with shared memory.
Of course it is not exactly the same as the PS4, and a bit more complex. But the end result would probably be the same for the gamer.
They added a second bus from the GPU to system memory, and a bunch of dedicated hardware units, to real benefit.
Let's not forget the main disadvantage of such PCs: the Windows kernel is the same for every PC. It doesn't take into account at all whether the system can do this (HSA computing).
But in the future AMD may take things further and bring this tech to the PC, in which case AMD and Sony still benefit enormously.
What AMD is doing with the upcoming systems is creating a true unified pool of memory in which the CPU and GPU not only have access to the entirety of RAM but also use the same addressing scheme to reference a location. This is a big, big change over how things work now, and it makes using the GPU far simpler.
With all that said I really hope next gen is better than the current one has been. This gen was very pricey, unreliable (durability wise) and lacking in moderate priced software.
I have well over a hundred games on each machine. Nearly all were bought new and my average cost was between $10 and $20. Everything turns up cheap if you're patient and pay attention to the deals when they pop up.
I enjoy building PCs, especially if some whizzy new stuff is involved. But I also enjoy being able to just press a button and have the thing work every time when it comes to games. It's enjoyable to read the review of the latest GPU but when it comes time to play I'm perfectly happy to use old hardware if everything just works without a hassle.
Basic, no Plus but Gaikai; or Gaikai, no Plus, Basic... Basic would continue free online, where Plus and Gaikai get dedicated servers.
Basic would blend all online play onto non-dedicated servers, so KZ and CoD would share servers.
There's no need for the cattle prod of XBL on top of your ISP bill when free, discounted, and streaming games over dedicated servers are enough of a carrot to get people to subscribe... IMHO.