Another thing the PlayStation 4 team did to increase the flexibility of the console is to put many of its basic functions on dedicated units on the board -- that way, you don't have to allocate resources to handling these things.
"The reason we use dedicated units is it means the overhead as far as games are concerned is very low," said Cerny. "It also establishes a baseline that we can use in our user experience."
"For example, by having the hardware dedicated unit for audio, that means we can support audio chat without the games needing to dedicate any significant resources to them. The same thing for compression and decompression of video." The audio unit also handles decompression of "a very large number" of MP3 streams for in-game audio, Cerny added.
At the New York City unveiling of the system, Cerny talked about PlayGo, the system by which the console will download digital titles even as they're being played.
"The concept is you download just a portion of the overall data and start your play session, and you continue your play session as the rest downloads in the background," he explained to Gamasutra.
However, PlayGo "is two separate linked systems," Cerny said. The other is to do with the Blu-ray drive -- to help with the fact that it is, essentially, a bit slow for next-gen games.
"So, what we do as the game accesses the Blu-ray disc, is we take any data that was accessed and we put it on the hard drive. And if then if there is idle time, we go ahead and copy the remaining data to the hard drive. And what that means is after an hour or two, the game is on the hard drive, and you have access, you have dramatically quicker loading... And you have the ability to do some truly high-speed streaming."
To further help the Blu-ray along, the system also has a unit to support zlib decompression -- so developers can confidently compress all of their game data and know the system will decode it on the fly. "As a minimum, our vision is that our games are zlib compressed on media," said Cerny.
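The round trip Cerny describes is easy to picture in software. Here is a minimal sketch using Python's standard `zlib` module as a stand-in for the PS4's hardware decompression unit; the asset bytes are invented purely for illustration:

```python
import zlib

# Hypothetical game asset: a block of repetitive mesh/texture-like data.
asset = b"vertex;normal;uv;" * 4096

# Compress once at build time, so the data on media is smaller...
packed = zlib.compress(asset, 9)

# ...and decompress "on the fly" at load time, which is the step the
# PS4's dedicated unit performs in hardware.
unpacked = zlib.decompress(packed)

assert unpacked == asset
print(f"{len(asset)} bytes -> {len(packed)} bytes on media")
```

Because the decode cost is paid by a dedicated unit rather than the CPU, developers can default to compressing everything, which is exactly the "zlib compressed on media" baseline Cerny describes.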
There's also another custom chip to put the system in a low-power mode for background downloads. "To make it a more green hardware, which is very important for us, we have the ability to turn off the main power in the system and just have power to that secondary custom chip, system memory, and I/O -- hard drive, Ethernet. So that allows background downloads to happen in a very low power scenario. We also have the ability to shut off everything except power to the RAMs, which is how we leave your game session suspended."
Sounds Good, But... Bottlenecks?
One thing Cerny was not at all shy about discussing is the system's bottlenecks -- because, in his view, he and his engineers have done a great job of devising ways to work around them.
"With graphics, the first bottleneck you’re likely to run into is memory bandwidth. Given that 10 or more textures per object will be standard in this generation, it’s very easy to run into that bottleneck," he said. "Quite a few phases of rendering become memory bound, and beyond shifting to lower bit-per-texel textures, there’s not a whole lot you can do. Our strategy has been simply to make sure that we were using GDDR5 for the system memory and therefore have a lot of bandwidth."
That's one down. "If you're not bottlenecked by memory, it's very possible -- if you have dense meshes in your objects -- to be bottlenecked on vertices. And you can try to ask your artists to use larger triangles, but as a practical matter, it's difficult to achieve that. It's quite common to be displaying graphics where much of what you see on the screen is triangles that are just a single pixel in size. In which case, yes, vertex bottlenecks can be large."
"There are a broad variety of techniques we've come up with to reduce the vertex bottlenecks, in some cases they are enhancements to the hardware," said Cerny. "The most interesting of those is that you can use compute as a frontend for your graphics."
This technique, he said, is "a mix of hardware, firmware inside of the GPU, and compiler technology. What happens is you take your vertex shader, and you compile it twice, once as a compute shader, once as a vertex shader. The compute shader does a triangle sieve -- it just does the position computations from the original vertex shader and sees if the triangle is backfaced, or the like. And it's generating, on the fly, a reduced set of triangles for the vertex shader to use. This compute shader and the vertex shader are very, very tightly linked inside of the hardware."
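As a rough illustration of what such a sieve computes, here is the same idea run on the CPU in Python. The triangle data and the 2D backface test are invented for illustration; the real version runs as a compute shader over the game's actual vertex-shader position math:

```python
# CPU-side sketch of the "triangle sieve": run only the position part
# of the vertex work, drop backfacing triangles, and emit a reduced
# index list for the full vertex shader to consume.

def signed_area(p0, p1, p2):
    """Twice the signed area of a screen-space triangle; negative means
    backfacing under a counter-clockwise front-face convention."""
    return ((p1[0] - p0[0]) * (p2[1] - p0[1])
            - (p2[0] - p0[0]) * (p1[1] - p0[1]))

def sieve(positions, indices):
    """Return a reduced index buffer with backfacing triangles removed."""
    kept = []
    for i in range(0, len(indices), 3):
        tri = indices[i:i + 3]
        p0, p1, p2 = (positions[j] for j in tri)
        if signed_area(p0, p1, p2) > 0:  # front-facing: keep it
            kept.extend(tri)
    return kept

# One CCW (front-facing) triangle and one CW (backfacing) triangle.
positions = [(0, 0), (1, 0), (0, 1), (1, 1)]
reduced = sieve(positions, [0, 1, 2,    # CCW -> kept
                            1, 2, 3])   # CW  -> sieved out
assert reduced == [0, 1, 2]
```

The payoff is that the expensive full vertex shader never runs for triangles that could not have contributed a pixel anyway.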
It's also not a hard solution to implement, Cerny suggested. "From a graphics programmer perspective, using this technique means setting some compiler flags and using a different mode of the graphics API. So this is the kind of thing where you can try it in an afternoon and see if it happens to bump up your performance."
These processes are "so tightly linked," said Cerny, that all that's required is "just a ring buffer for indices... it's the Goldilocks size. It's small enough to fit the cache, it's large enough that it won't stall out based on discrepancies between the speed of processing of the compute shaders and the vertex shaders."
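A software analogue of that bounded index buffer might look like the sketch below. This is purely illustrative -- the real buffer lives in GPU hardware and caches, and `IndexRing` is an invented name -- but it shows how a fixed capacity lets a producer (the sieve) and a consumer (the vertex stage) run at slightly different speeds without unbounded memory:

```python
# Minimal fixed-size ring buffer of indices: the producer (the compute
# "sieve") pushes, the consumer (the vertex stage) pops, and the fixed
# capacity absorbs speed mismatches between the two.

class IndexRing:
    def __init__(self, capacity):
        self.buf = [0] * capacity
        self.head = 0      # next slot to write
        self.tail = 0      # next slot to read
        self.count = 0

    def push(self, idx):
        if self.count == len(self.buf):
            return False           # full: producer would stall here
        self.buf[self.head] = idx
        self.head = (self.head + 1) % len(self.buf)
        self.count += 1
        return True

    def pop(self):
        if self.count == 0:
            return None            # empty: consumer would stall here
        idx = self.buf[self.tail]
        self.tail = (self.tail + 1) % len(self.buf)
        self.count -= 1
        return idx

ring = IndexRing(4)
for i in [10, 11, 12]:
    ring.push(i)
assert [ring.pop(), ring.pop(), ring.pop()] == [10, 11, 12]
```

The "Goldilocks" sizing Cerny mentions is the capacity choice: small enough to stay cache-resident, large enough that neither side stalls on ordinary timing jitter.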
He has also promised Gamasutra that the company is working on a version of its performance analysis tool, Razor, optimized for the PlayStation 4, as well as example code to be distributed to developers. Cerny would also like to distribute real-world code: "If somebody has written something interesting and is willing to post the source for it, to make it available to the other PlayStation developers, then that has the highest value."
A Knack for Development
There's another way Cerny is working to understand what developers need from the hardware.
"When I pitched Sony originally on the idea that I would be lead system architect in late 2007, I had the idea that I'd be mostly doing hardware but still keep doing a bit of software at the time," he said. "And then I got busy with the hardware."
That detachment did not last. "I ended up having a conversation with Akira Sato, who was the chairman of Sony Computer Entertainment for many years. And his strong advice was, 'Don't give up the software, because your value is so much higher to the process, whatever it is -- whether it's hardware design, the development environment, or the tool chain -- as long as you're making a game.'"
That's the birth of Knack, Cerny's PlayStation 4 game, which he unveiled during the system reveal in New York City. And it's his link to understanding the practical problems of developing for the PlayStation 4 in an intimate way.
From a Thanksgiving weekend reading technical documents through a difficult and complex engineering process and finally to the development of a big new IP launch for Sony -- you can't say Mark Cerny isn't a dedicated, passionate, and busy man.
"I have not been this busy in 20 years. It's nice. But, definitely, I'm very busy right now," he said, to laughter from everyone in the room.
I think calling it a game changer is a bit of an exaggeration. The things he describes are very nice, but engine code is a relatively small piece of the entire game development picture. For developers already in the Sony ecosystem, the ease of development should be greatly improved, and hopefully this should result in more polished, more stable games.
However, there's an indie game revolution going on, and those games tend not to push the limits of the hardware so much as the licensing model. Sony seems to be playing their cards right on this as well, but it remains to be seen where the tiny developers end up, whether it be PC, Steambox, Ouya, PS4 or Xbox.
Installed user base will be a huge factor in deciding where the indies end up, and most of that will come down to price and the strength of their launch library. The things described in this article won't get a chance to play out unless the PS4 does well through the first couple of years.
He's talking about the PS4 architecture, you're talking about business models...in any case, five or so years from now indies are going to be stuck between a rock and a hard place thanks to oversaturating the market.
Well, it *is* a game changer. Much like the Cell-powered LittleBigPlanet, this very tight and forward-thinking architecture will enable Minecraft-type games but with breathtaking visuals, massive dynamism in physics, and powerful control of the world, yet with super-simple and intelligent controls.
To be honest, I'm a bit concerned about this new camera being an unnecessary expense on top of the price of the core unit. I have the same concern with the new Xbox, and it's one of the main reasons I haven't invested in a Wii U.
Dario: I don't think they've announced whether the camera will come bundled with every PS4, so there may be a SKU with just the console. We'll probably know more at E3.
Since the bulk of the price of the PS3 came from the brand new Blu-ray drive, I don't really imagine a repeat of that fiasco. I guess it depends on how much 8GB of GDDR5 costs.
They've talked about light bar tracking with the controller as a standard feature, so I think the camera will come with every console. Here's the secret about Kinect, though: it's just two webcams duct-taped together. It doesn't actually cost the $150 they charge for it to make.
Whatever they do, the standard SKU should include all the essential peripherals. There is a very good reason for that.
PS Move and the Eye may look silly at times now, but they are more responsive and reliable than their competitors by miles. If you don't include one of the cardinal pieces, such as the Eye camera, it will have to be sold with games.
Fewer of those games are sold, and the publishers don't have a lot of incentive.
This is a big problem because gaming will always stay the same way in terms of interactivity. I for one DO NOT want that.
But I agree Sony does have to be reasonable, and then some.
That is simply untrue. The Blu-ray drive mechanism was expensive at the time compared to a DVD-ROM unit, but it was a well-established production line with low defect levels.
The bulk of the PS3's cost at launch was the silicon. Yields on the Cell were horrible, which is why the original design of multiple Cells doing everything was scrapped and a dedicated GPU was added late in the process. This wasn't the first time Sony had very expensive silicon at launch. The PS2 was intended to launch with the EE and GS produced at .18 micron; the existing .25 micron chips were only for engineering samples and early developer kits. But Sony couldn't get a .18 micron production line going in time. (Intel was only starting to ship .18 micron parts at the time.) So the PS2 launch in Japan used big .25 micron chips that made those units very expensive for Sony -- but not more expensive than it would have been to delay the launch for several months and give Sega time to draw more consumers to the Dreamcast.
Blu-ray drives cost a lot less now, but by far the biggest cost reduction for Sony on the PS3 has been shrinking and integrating the chipset. This meant better yields, lower cost per unit aside from yield levels, and subsequently lower power and cooling needs, which in turn allowed for more cost reduction in the later models.
Sony previously said they learned from their PS3 launch and indicated that the PS4 price point will be around $400. AMD chips are cheap right now, so that's certainly realistic.
I'm personally interested in how different their chip architecture will be from the new Xbox's, and whether the technical bells and whistles of the PS4 will attract developers to make exclusive titles for Sony.
From what we know so far [with absolutely NO factual basis at all]:
The Xbox may go with 32MB of embedded eDRAM within the APU to compensate for future shader bottlenecks in graphics commands. The APU will use 8GB or more of DDR3 SDRAM. The PS4 also has 18 CUs, whereas the next Xbox is supposed to have only 12 CUs. The cost of the APUs may be pretty much the same.
The benefits of this approach would be twofold.
1. Memory is abundant and cheap, so the box will cost considerably less
2. Windows kernels are already designed for the DDR3 addressing system.
That leaves Microsoft to make up for it by means of their strong software features and a truly next-gen Kinect.
That was a very enlightening read. I was worried that PC would overtake PS4 (and the next Xbox) soon, but it seems Sony has taken a very different take on processing power and computing.
The hardware sounds great and powerful. It's up to the developers to exploit that potential. Unfortunately, the popularity of the console will come down to price. The PS4 will be extremely popular if it's competitively priced and not overpriced like the PS3 was at launch.
@ Dan - I personally prefer my gaming on console (and the PS4 sounds excellent), but there is a relatively small (compared with the console install base) hard core of PC users that isn't going anywhere soon.
@TC, did you not see the latest Windows 8 numbers? I think it's a bit absurd to say people have been saying that for 20 years, when 20 years ago there wasn't an alternative to PCs. That's like equating ARM with 1993's librarians, I guess? Specious to say the least.
Once I got a corded X360 controller for the PC I saw no reason to touch my consoles anymore, other than exclusives, or the newest game that wouldn't run on my PC.
Also, the Windows 8 numbers are just that: Windows 8. You don't need a new desktop or OS to do PC gaming. I think a lot of people are reading too much into this. There is probably more correlation in the fact that people don't "need" a new computer every 3 years, which is why the numbers are really low with every other version of Windows...
@Dan, in the early 90s there were all types of new and exciting gaming options coming out: SNES, Genesis, TurboGrafx, to mention but a few. And yes, people called the PC and PC gaming dead back then as well. I was in this industry back then.
People have been wrong for 20 years, and continue to be, about the PC. With a 1-2 billion person installed user base, PCs and PC gaming aren't going anywhere anytime soon.
@Weidner, you're missing the point. The PC is being driven into obsolescence by competing computing platforms. It doesn't have anything to do with gaming consoles. Today's SNES and Sega Genesis are the PS3 and Xbox 360. The iPad, Nexus 7, smartphones, and hybrid business tablets are a new category altogether.
The PC (personal computer) is not going anywhere, and I'll be quite happy to say I told you so down the line.
The PC may not exist in the same form we use today -- which is a given; my Mac, as an example, is way different from the TI-99 I had in the eighties, or anything I worked on in the nineties -- but even if it's a tablet I have plugged into an external monitor and keyboard/mouse (whatever input; I use a Wacom) down the line, it's still a PC.
There's plenty of room for mobile devices and PCs to coexist. Both platforms complement one another at the moment, and even if and when a convergence happens, it will only be because other devices have become more like PCs.
If a day comes where the PC does go away and you're right, it won't be because we've replaced them; it will be because the "P" in PC has been eliminated. That will be a sad day, when people no longer own their device (computer), let alone their information, as they're tied into and completely reliant on some cloud service with a monthly subscription; only a fool would welcome this.
@Chris
It's easy to say "I told you so" when your argument is just a semantics game. The point is very simple and clear: very few people buy desktop computers anymore. Very few companies even make them anymore. Laptops, tablets, and smartphones are the wave of the future. New hardware architectures will preclude Valve from simply declaring Steam the go-to portal for all games. Yes, there will always be hobbyists and people who love the latest GPU, but they won't dictate overall market trends.
Yes, when you account for less variables and live only in the now, things can be simpler and clearer.
And laptops are part of the wave of the future? Here's some semantics: this isn't the eighties; notebooks have been the now for a long time, at least from my perspective. And even though there are crossovers, notebooks are more closely related to a desktop PC than to a tablet or smartphone; in my case, one is my primary desktop computer.
Your thinking isn't remotely new; it's only more relevant, since the technology is finally reaching a point where PCs can exist in more forms and fill more niches -- but they're still PCs. Unless whatever is trendy can truly live up to the tasks required, it will not replace them; and mobile OSes are by no means a replacement as of yet, while locked-down OSes will never be a replacement.
Anyway, good luck predicting the future with your tEh DOOM outlook on desktop PCs. Maybe generalization is the key?
@Chris
Smartphones are the mediating variable in the conventional PC's death. We've already seen the transition begin to take place with Android and iOS being featured on tablets, so to even categorize these platforms as merely "mobile" is deceptive.
That said, while these platforms being locked down is a topic worth thinking on, the reason it will work towards replacing PCs is that they're dramatically cheaper and more convenient than conventional PCs.
No consumer is going to buy a "Dell X3594" when they can get, say, a "Script Writer" tablet that is very cheap and yet has software completely specialized for the task of word processing, for instance, and is thus actually better at it than a trojan horse PC is capable of being.
A prime example of this actually is the Amazon Kindle, and it also helps refute your concern about Android being a "locked down" platform. By branding its own hardware Amazon is able to proliferate all its digital business channels, and make more money. They're able to curate and shape Android to the liking of their brand. There's no way they could do that sort of thing with a Windows based device---because it depends on Microsoft's legacy of apps, and relationships with Intel, etc.
Now you can raise semantic hell, and tell me that in fact the Amazon Kindle is a "PC" until the cows come home. But can Valve sell Steam games on such a platform? No. Does such a platform threaten the sellability of high end graphics cards that desktop PCs rely on? Yes. The best hope for the future of "PC gaming" is that ARM architecture increases in power exponentially, and essentially resets the field.
*Incidentally, the other future for PC gaming is external hardware designed to interact with mobile and "smart" devices and enhance their graphics capabilities. In that sense, the PC will just turn into what consoles already are. And the PS4 is already simulating this future by bundling in an ARM processor to perform OS functions -- in other words, the "Windows"-like functions.
If the PC goes away, then I guess we will all be modeling and coding and writing design docs on touchscreens... Sounds productive... I may go back to pen and paper.
I love these "platform of the future" arguments, so I'll chip in.
If anything is going to be threatened by mobile devices it's consoles. A few years from now people will think: "sure these consoles can do some nifty stuff, but why should I pay extra for that when my phone can already hook to the TV and let me play full HD 3D games?"
I think that for now PC is relatively safe thanks to being the content creation platform. Once that changes, PC users will need to worry. (The PC will possibly move to hardware more like the PS4 though.)
Windows 8 numbers are Windows 8 numbers. Windows Vista sucked in the eyes of many when it came out (so did its sales), and when Windows 7 rolled around, sales bounced back. Windows 8 sucks in the eyes of many people today (who wants to cripple the desktop with touchscreen controls that can't be used except at a very premium price? Who wants to use Metro on their desktop? Who wants another potential walled garden? Who wants to have to replace most of their expensive software libraries?), and in fact it offers severe incompatibilities with some older software, especially development software. The Windows 8 numbers reflect that the marketplace doesn't buy Windows 8 as a good platform; people are sticking with 7, and even XP in some cases.
I feel rather optimistic about the PS4 hardware, but ultimately it will come down to developers being able to deliver experiences that really distinguish the next gen from the current gen games.
Can I install Linux and call it a Steambox? Because at close to the same price (presumably), an actual computer is about 100x more valuable to me than a console.
The only rumored Steambox so far is the Xi3 Piston, at (an estimated) double the price -- I'm not sure how you'd get the kind of performance that we're likely to see from the PS4 out of a $500 machine.
It doesn't surprise me at all, though. Cerny's been working on some of my favorite games for decades. When he came out on stage at the reveal and said he was the system architect, that turned it into a must-buy system for me.
Not quite -- I have yet to hear that the PS4 tools include Unity. It will work with the system, and it will likely not be that difficult, but that is about as far as it goes.
From what Sony has announced, the Unreal engine will be bundled with the toolkit.
@Duvelle: From what Sony has announced, the Unreal engine will be bundled with the toolkit.
What's your source on that? Epic announced they were supporting it and showed the Elemental demo, but I haven't seen any announcements about Unreal being bundled with the SDK.
I hope the PS4 will change the kind of development games have had since the previous console release.
I got tired of my PS3 "movie" collection; I'd like to see some better games for a change.
A couple of details I'd like filled in:
1) What kind/spec of chip is being used for the dedicated download (and other peripheral) CPUs? PowerPC? ARM? Is this how some bits of cross-platform/backward compatibility will be addressed?
2) Lots of use of the word async makes me ask: Will we be getting a toolchain and library set that is C++11 native? lambdas, atomic, copy elision, etc are all things that high-performance highly-concurrent state machines and simulations will need to maximize the hardware.
3) Will OpenMP be supported by the toolchain for the same reasons? A GCC 4.8-based toolchain would be awesome, but I would settle for a 4.7-based one with enhancements from Google's branches. We see ~20% better raw performance from Google's 4.7 branch versus the FSF 4.7.
4) Will Mono be a first-class citizen, with full C# 5.0 async/concurrency support? Again, a necessity for allowing developers to use language-native constructs for distributing work to the CPU/GPU transparently, without crazy macros that can collide and whatnot. Note that I mean Mono in general, not specifically PSM or Unity.
5) The GPU can only access main memory at 20GB/sec out of a total of 176GB/sec of bandwidth? Why?
As for 1) The 'download assist chip' is a little ARM.
And 5) It seems that's a second bus on the GPU for smaller read/write system-memory data transfers that bypass the GPU's L1/L2 caches to circumvent synchronization issues. In all the documentation so far, the GPU's main-bus memory transfer is listed at the 176GB/sec rate, but there's a degree of graphics/compute sharing to juggle.
@Mike, a "little ARM"? do you have a URL to docs? Is it similar to the individual cores in the Vita, pointing to potential cross-platform execution capabilities?
I'm surprised they didn't use a scaled down PowerPC somewhere in there. Certainly SPU code can be dynamically recompiled on the fly for GPU/CPU, but PowerPC might be more difficult. Then again, if Microsoft was able to do dynamic recompilation+static patches for Xbox1->360 compatibility, I guess it's entirely possible.
Sony doesn't appear to care at all about backwards compatibility in the PS4. They've already demonstrated they strongly favor remakes over BC, with items like the Jak & Daxter and other HD remake collections. The PS3 won't be going away for several more years, and Sony will point people at it for playing PS3 games.
Down the road, probably the simplest way to do an improved remake of a multi-platform PS3 game would be to port from the PC version and see how much the visuals can be improved over the PS3 version without substantial investment. Actually, we might see ports of a lot of PC games that never appeared on any Sony console. It could be Sony's own version of GoG.
With the above in mind, it doesn't really matter what architecture the assist chip uses. It will likely be completely reserved for the system and not accessible by app and game developers.
All of the above sounds excellent -- that Cerny lad really knows what he's on about (no surprise, given his huge amount of experience), is genuinely passionate about it, and is working hard to deliver a great vision. How devs use it is up to the devs (@ Marius - very odd comment there; it sounds like you're unaware of the vast majority of the PS3's game library!), which will, in turn, reflect market demand (to a degree). But the openness to indies, the willingness by Sony to lead with something like Knack (i.e., not a military shooter), and the looks of things like Watch Dogs make it sound very appealing.
Sounds great, though really I wouldn't be surprised by a big compute-power bottleneck down the road. 8 gigs was chosen based on the ratio of compute power to RAM needed for this generation. But minimizing RAM use and streaming assets has gotten really good, and will only get better, and at some point there are only so many models and textures and render targets you need to store in RAM before you run out of compute and can't do anything more with it.
If the high-end titles, which make the most use of the hardware anyway, all use virtualized textures, that's at most a 16K buffer -- maybe two if you really want to get into complex materials. If a good, cheap, crack-free tessellation scheme can be found, you can get models down to almost a tenth the size they are now with displacement maps. Suddenly, in 2015, you have a ton of RAM and nothing to fill it with, because you can't compute that much that fast. I can foresee the new PlayStation and Xbox needing their own set of hacks, caching everything possible, trying to make use of those 8 gigs -- kind of the opposite of last generation's struggle to make every MB count.
Shhh! Don't mention those fancy new techniques. They require too much thinking. The point of new consoles is to make games look better without any new complicated stuff to learn. Keep things just as before, same old trusted displacement mapping, just feed it 16x the memory, fingers crossed.
Being able to cache more things in memory and make load/stream delays nonexistent is one way to use the RAM, obviously. Various geometry/model compression tricks are cool, but most games still do a smokescreen+model replacement for destruction and particles instead of real mutable models. Once people develop models that can be destroyed without a distract+replace, the models will be more complex, and that will also use more RAM.
(Of course, the Killzone demo we saw at the launch event did some distract+replace... not sure why, though.)
Not everyone will be virtualizing, and you can never have enough compute. Even if you have a low GPU profile with cheap virtualized textures, you need lots of compute to decode and encode those textures. Then you can use the rest for particles, physics, AI, sound filters, buffer effects, run-time voxelization for indirect light computation... the list goes on. Trust me: you could have three of these GPUs, not render at all, and still need more if you really want to push the limits of a game design.
I'd far rather the developers have a hardware resource they rarely find reason to fully utilize than for them to constantly struggle to get things done with a resource of inadequate capacity.
Boy, what a change from the disastrous PS3 launch. Okay, it hasn't launched yet, but the devs I've talked to are happy (without going into NDA-violating specifics), and that sure sounds like a nice architecture for doing whatever you want to do.
In another interview, Cerny gives the credit to Kaz Hirai for recognizing that Kutaragi's EE-era method of "Okay, here's some hardware we designed in a windowless tower; now you software guys do whatever you do" wasn't going to be workable anymore, and for going developer-friendly for the PS4 very early on.
I've always thought of Microsoft and Sony consoles as computer wannabes, and this account reinforces it. I hope the different architecture provides some improvement for a while over PCs. The RAM size certainly isn't impressive.
We console gamers do not care whether PC gamers are impressed. That is the whole point of console gaming. They are not PCs. Hardware matters not.
Macs used to use Motorola processors, yet now they use Intel ones. Are they any less Macintosh because of the hardware change? Of course not. They still run OS X and all the other services that Apple provides.
That is what matters in the end: the OS, the programs/games, and the services that Sony and Microsoft provide for the gamers.
The RAM -size- is perhaps less impressive than the RAM -type-.
How many PC gamers do you know with hardware that can selectively dedicate 4-6 GB of GDDR5 solely to the GPU's needs? All those PC gamers with $1,000 graphics cards in 4GB GDDR5 flavors? Plus add-in PCIe RAM modules costing $400? And with those luxury cards you still won't get close to the effective PS4 memory bandwidth, thanks to PCIe bottlenecks.
While your PC tower features gigantic power-hogging add-in cards in all of its PCIe slots.
The RAM is quite impressive on the PS4. At least you can capitalize on it in every sense for a game, unlike the bloated system overhead on PC -- not to mention the fact that few PC games are even optimized for 4GB video cards at this time.
That lovely unified memory and processor architecture should give the PS4 some solid future-proofing.
Maybe the raw spec numbers of the PS4 get overshadowed (they already have) by higher-end PC gaming hardware, but does the PC yet provide as efficient and elegant an architecture for games? Not yet.
I think PC gamers are the least of their concerns. Their concerns are: longtime performance value for DEVELOPERS, and longtime great looking games for CONSOLE GAMERS.
You can't future-proof a system that will be made with the same hardware specs 5-10 years from now. But if the PS4 architecture proves to be very capable, can't AMD create a whole unified computer (including the motherboard) using what they've learned?
I'm sure we'll see the same architecture migrating to PC very soon, especially considering AMD's razor focus on GCN and unified ALU designs.
Even when AMD goes all-in with a consolidated, unified board/chipset in the near future, I wonder if the PC parts industry will be ready to keep up with third-party motherboard variants, or third-party RAM (and would the board feature both DDR3 and GDDR5 slots, or would GDDR5 still be relegated to the realm of add-in cards? And does PCIe bus speed dramatically improve?).
But sure, it's altogether possible that by 2015 or so this form of unified architecture could be widespread in mainstream computers, as we shrink down component sizes, get more efficient, consolidated, leaner and greener across our computing devices.
People seem to forget that this unified memory strategy already exists, even on the PC side.
All CPUs today have a GPU built in, and AMD's Fusion platform is in many ways built for the same purpose described in the article: using the GPU instead of the CPU to perform a lot of tasks, like physics calculations.
The PC has one big GPU for graphics, and a smaller GPU built into the CPU with shared memory.
Of course it is not exactly the same as the PS4, and it's a bit more complex. But the end result would probably be the same for the gamer.
Yes, that may be true, but how many consumers are going to buy these new systems? How many people will get a PS4? More than will upgrade their PC's I would imagine.
Closed hardware environments will still have the upper hand for some time. As you can see on the second page of the article.
They added a second bus from the GPU to system memory, and a bunch of dedicated hardware units, to real benefit.
Let's not forget the main disadvantage of such PCs: Windows OS kernels are the same for every PC. They don't take into account at all whether the system can do this (HSA computing).
But in the future AMD may take things further and bring this tech to the PC, in which case AMD and Sony still benefit enormously.
Not remotely the same. AMD's main line of CPUs do not have integrated graphics on board. Nor do the GPUs in the existing APU products have access to the same memory pool as the CPU. The BIOS carves off a piece of RAM at boot, and that becomes the dedicated RAM for the GPU's operations. The CPU can write to it to feed the GPU.
What AMD is doing with the upcoming systems is creating a true unified pool of memory in which both CPU and GPU not only have access to the entirety of RAM but also use the same addressing scheme to reference a location. This is a big, big change over how things work now and makes using the GPU far simpler.
Interesting. I don't understand most of the tech talk, but it seems that Sony understands what they did wrong with the PS3 architecture. Knack looks pretty cool too. Maybe they have a new mascot on their hands?
So maybe by the middle of next gen I won't have a PC that cost only about 20% more than a console at launch yet shows this current gen up in performance. And the PC has the great advantages of not having to shell out the full price of the system all at once, not being stuck paying top dollar for every title, and being easily serviceable at home.
With all that said I really hope next gen is better than the current one has been. This gen was very pricey, unreliable (durability wise) and lacking in moderate priced software.
Really? Both my PS3 and Xbox 360 have worked perfectly from day one. Both are second generation models because I decided it was worth waiting for the engineering refinement, lower cost, and mature library. I still had tons of material from the preceding generation to hold me over. (Heck, I still have games from the PS1 era I haven't given proper attention yet.)
I have well over a hundred games on each machine. Nearly all were bought new and my average cost was between $10 and $20. Everything turns up cheap if you're patient and pay attention to the deals when they pop up.
I enjoy building PCs, especially if some whizzy new stuff is involved. But I also enjoy being able to just press a button and have the thing work every time when it comes to games. It's enjoyable to read the review of the latest GPU but when it comes time to play I'm perfectly happy to use old hardware if everything just works without a hassle.
Cool, and no doubt another big step up for consoles, but the proof will be in the pudding. Sony's consoles have never lived up to their initial performance/technical hype... But on the other hand, they have plenty of hindsight now with 3 consoles under their belt, so that's only in their favor.
Glad he was able to get this message through and that the lead on the PS4 understands the multiplatform world we live in. Great that they are taking some of the hardware puzzles out. I always felt like the software was incomplete to use all that power, relying on each game studio to reimplement that and essentially waste game focused development time over core tech that could have been more unified and on Sony. PS4 just might be a hit with this philosophy.
I thought there were going to be two PS4 SKUs, one at $425-ish and the other at $525-ish; perhaps it's nothing more than with and without a Plus contract. If MS can do it, why not Sony... I'm not sure if Gaikai will be folded into Plus, but that would keep things from fragmenting.
Basic (no Plus, but Gaikai) or Gaikai (no Plus, basic)... Basic would continue free online, where Plus and Gaikai get dedicated servers.
Basic would blend all online play on non-dedicated servers, so KZ and COD would share servers.
No need for the cattle prod of XBL on top of your ISP when free, discounted, and streamed games over dedicated servers are enough of a carrot to get people to subscribe... IMHO.
However, there's an indie game revolution going on, and those games tend not to push the limits of the hardware so much as the licensing model. Sony seems to be playing their cards right on this as well, but it remains to be seen where the tiny developers end up, whether it be PC, Steambox, Ouya, PS4 or Xbox.
Installed user base will be a huge factor in deciding where the indies end up, and most of that will come down to price and the strength of their launch library. The things described in this article won't get a chance to play out unless the PS4 does well through the first couple of years.
The PS Move and Eye may look silly at times now, but they sure are more responsive and reliable than their competitors by miles. If one of the cardinal pieces, such as the Eye camera, isn't included in the box, these peripherals will have to be sold with games.
Fewer of those games get sold, and the publishers don't have a lot of incentive.
This is a big problem because otherwise gaming will always stay the same in terms of interactivity, and I for one DO NOT want that.
But I agree Sony does have to be reasonable, and then some.
That is simply untrue. The Blu-ray drive mechanism was expensive at the time compared to a DVD-ROM unit, but it was a well-established production line with low defect levels.
The bulk of the PS3's cost at launch was the silicon. Yields on the CELL were horrible, which is why the original multiple-CELLs-doing-everything design was scrapped and a dedicated GPU was added late in the process. This wasn't the first time Sony had very expensive silicon at launch. The PS2 was intended to launch with the EE and GS produced at .18 micron. The existing .25 micron chips were only for engineering samples and early developer kits. But Sony couldn't get a .18 micron production line going in time. (Intel was only starting to ship .18 micron parts at the time.) So the PS2 launch in Japan used big .25 micron chips that made those units very expensive for Sony -- but not more expensive than it would have been to delay the launch for several months and give Sega time to draw more consumers to the Dreamcast.
Blu-ray drives cost a lot less now but by far the biggest cost reduction for Sony on the PS3 is shrinking and integrating the chip set. This meant better yields, lower cost per unit aside from yield levels, and subsequently lesser power and cooling needs, which in turn allowed for more cost reduction in the later models.
I'm personally interested in how different their chip architecture will be from the new xbox, and whether the technical bells and whistles of the PS4 will attract developers to make exclusive titles for Sony.
The Xbox may go with 32MB of embedded eDRAM within the APU to compensate for future shader bottlenecks when it comes to graphics commands. The APU will use 8GB or more of DDR3 SDRAM. The PS4 also has 18 CUs, whereas the next-gen Xbox is supposed to have only 12 CUs. The cost of the APUs may be pretty much the same.
Benefits in this approach would be two-fold.
1. Memory is abundant and cheap, so the box will cost considerably less.
2. Windows kernels are already designed to use the DDR3 addressing system.
Leaving Microsoft to make up for it by means of their strong software features and a truly next-gen Kinect.
As a consumer I couldn't be happier.
The hardware sounds great and powerful. It's up to the developers to exploit that potential. Unfortunately, the popularity of the console will come down to price. The PS4 will be extremely popular if it's competitively priced and not overpriced like the PS3 was at launch.
Also, the Windows 8 numbers are just that: Windows 8. You don't need a new desktop or OS to do PC gaming. I think a lot of people are reading too much into this. The low numbers probably correlate more with people not "needing" a new computer every three years -- which is also why the numbers dip with every other version of Windows...
People have been wrong for 20 years, and continue to be, about the PC. With a one-to-two-billion-person installed user base, PCs and PC gaming aren't going anywhere anytime soon.
The PC (personal computer) is not going anywhere, and I'll be quite happy to say I told you so down the line.
The PC may not exist in the same form we use today -- which is a given; my Mac, for example, is way different from the TI-99 I had in the eighties, or anything I worked on in the nineties -- but even if it's a tablet I have plugged into an external monitor and keyboard/mouse (whatever input; I use a Wacom) down the line, it's still a PC.
There's plenty of room for mobile devices and PCs to coexist. Both platforms complement one another at the moment, and even if and when a convergence happens, it will only be because other devices have become more like PCs.
If a day comes when the PC does go away, and you're right, it won't be because we've replaced them; it will be because the "P" in PC has been eliminated. That will be a sad day, when people no longer own their device (computer), let alone their information, as they're tied into and completely reliant on some cloud service with a monthly subscription; only a fool would welcome this.
It's easy to say "I told you so" when your argument is just a semantics game. The point is very simple and clear: very few people buy desktop computers anymore. Very few companies even make them anymore. Laptops, tablets, and smartphones are the wave of the future. New hardware architectures will preclude Valve from simply declaring Steam the go-to portal for all games. Yes, there will always be hobbyists and people who love the latest GPU, but they won't dictate overall market trends.
Yes, when you account for less variables and live only in the now, things can be simpler and clearer.
And laptops are part of the wave of the future? Here's some semantics: this isn't the eighties; notebooks have been the now for a long time -- at least from my perspective. And even though there are crossovers, notebooks are more closely related to a desktop PC than to a tablet/smartphone; in my case, one is my primary desktop computer.
Your thinking isn't remotely new, it's only more relevant, since the technology is finally reaching a point that PCs can exist in more forms and fill in more niches, but they're still PCs. Unless whatever is trendy can truly live up to the task required, it will not replace; and mobile OSs are by no means a replacement as of yet; and locked down OSs will never be a replacement.
Anyway, good luck predicting the future with your tEh DOOM outlook on desktop PCs. Maybe generalization is the key?
Smartphones are the mediating variable in the conventional PC's death. We've already seen the transition begin to take place with Android and iOS being featured on tablets, so to even pigeonhole these platforms as "mobile" is deceptive.
That said, while these platforms being locked down is a topic worth thinking on, the reason it will work towards replacing PCs is that they're dramatically cheaper and more convenient than conventional PCs.
No consumer is going to buy a "Dell X3594" when they can get, say, a "Script Writer" tablet that is very cheap and yet has software completely specialized for the task of word processing, for instance -- and is thus actually better at it than a trojan-horse PC is capable of being.
A prime example of this actually is the Amazon Kindle, and it also helps refute your concern about Android being a "locked down" platform. By branding its own hardware Amazon is able to proliferate all its digital business channels, and make more money. They're able to curate and shape Android to the liking of their brand. There's no way they could do that sort of thing with a Windows based device---because it depends on Microsoft's legacy of apps, and relationships with Intel, etc.
Now you can raise semantic hell, and tell me that in fact the Amazon Kindle is a "PC" until the cows come home. But can Valve sell Steam games on such a platform? No. Does such a platform threaten the sellability of high end graphics cards that desktop PCs rely on? Yes. The best hope for the future of "PC gaming" is that ARM architecture increases in power exponentially, and essentially resets the field.
*Incidentally the other future for PC gaming is that they create external hardware designed to interact with mobile and "smart" devices and enhance their graphics capabilities. And in this sense, the PC will just turn into what consoles already are. And the PS4 is already simulating this future by bundling in an ARM processor to perform OS functions---in other words the "Windows" like functions.
If anything is going to be threatened by mobile devices it's consoles. A few years from now people will think: "sure these consoles can do some nifty stuff, but why should I pay extra for that when my phone can already hook to the TV and let me play full HD 3D games?"
I think that for now PC is relatively safe thanks to being the content creation platform. Once that changes, PC users will need to worry. (The PC will possibly move to hardware more like the PS4 though.)
Windows 8 numbers are Windows 8 numbers. Windows Vista sucked in the eyes of many when it came out (so did its sales), and when Windows 7 rolled around, sales bounced back. Windows 8 sucks in the eyes of many people today (WHO wants to cripple the desktop with touchscreen controls that can't be used except at a very premium price? Who wants to use Metro on their desktop? Who wants another potential walled garden? Who wants to have to replace most of their expensive software libraries?), and in fact offers severe incompatibilities with some older software, especially development software. Windows 8 numbers reflect that the marketplace doesn't buy Windows 8 as a good platform and they're sticking with 7, and even XP in some cases.
Different strokes for different folks, but for me consoles provide an experience that my PC doesn't quite live up to at times.
From what Sony has announced, the Unreal engine will be bundled with the toolkit.
What's your source on that? Epic announced they were supporting it and showed the Elemental demo, but I haven't seen any announcements about Unreal being bundled with the SDK.
Got tired of my PS3 "movie" collection; I'd like to see some better games for a change.
1) What kind/spec of chip is being used for the dedicated download (and other peripheral) CPUs? PowerPC? ARM? Is this how some bits of cross-platform/backward compatibility will be addressed?
2) Lots of use of the word "async" makes me ask: will we be getting a toolchain and library set that is C++11-native? Lambdas, atomics, copy elision, etc. are all things that high-performance, highly concurrent state machines and simulations will need to maximize the hardware.
3) Will OpenMP be supported by the toolchain, for the same reasons? A GCC 4.8-based toolchain would be awesome, but I would settle for a 4.7-based one with enhancements from Google's branches. We see ~20% better raw performance from Google's 4.7 branch versus the FSF 4.7.
4) Will Mono be a first-class citizen with full C# 5.0 async/concurrency support? Again, a necessity for allowing developers to use language-native constructs for distributing work to the CPU/GPU transparently, without crazy macros that can collide and whatnot. Note that I mean Mono in general, not specifically PSM or Unity.
5) The GPU can only access the main memory at 20GB/sec, out of a total of 176GB/sec bandwidth? Why?
Regarding 5): it seems that's the second bus on the GPU, for smaller read/write system-memory data transfers that bypass the GPU's L1/L2 to circumvent any synchronization issues. In all the documentation so far, the GPU's main-bus memory transfer is listed at the 176GB/sec rate, but there's a degree of graphics/compute sharing to juggle.
I'm surprised they didn't use a scaled down PowerPC somewhere in there. Certainly SPU code can be dynamically recompiled on the fly for GPU/CPU, but PowerPC might be more difficult. Then again, if Microsoft was able to do dynamic recompilation+static patches for Xbox1->360 compatibility, I guess it's entirely possible.
Down the road, probably the simplest way to do an improved remake of a PS3 game that was multi-platform, would be to port from the PC version and see how much the visuals can be improved over the PS3 version without substantial investment. Actually, we might be seeing ports of a lot of PC games that never appeared on any Sony console. It could be Sony's own version of GoG.
With the above in mind, it doesn't really matter what architecture the assist chip uses. It will likely be completely reserved for the system and not accessible by app and game developers.
If the high-end titles making the most use of the hardware all use virtualized textures anyway, that's at most a 16k buffer, maybe two if you really want to get into complex materials. If a good, cheap, crack-free tessellation scheme can be found, you can get models down to almost a tenth the size they are now with displacement maps. Suddenly in 2015 you have a ton of RAM and you don't have anything to fill it with, because you can't compute that much that fast. I can foresee the new PlayStation and Xbox needing their own set of hacks, caching everything possible, trying to make use of those 8 gigs -- kind of the opposite of last generation's struggle to make every MB count.
At least, I can see that as a possibility.
(Of course, the Killzone demo we saw at the launch event did some distract+replace... not sure why, though.)
In another interview, Cerny gives the credit to Kaz Hirai for recognizing that Kutaragi's EE method of 'Okay, here's some hardware we designed in a windowless tower, now you software guys do whatever you do' wasn't going to be workable any more and going developer friendly for PS4 very early on.
No, PC gamers won't be impressed.
Macs used to use Motorola processors, yet now they use Intel ones. Are they less of a Macintosh because of the hardware change? Of course not. They still use OS X and all the other services that Apple provides.
That is what matters in the end. The OS, the programs/games, and the services that Sony and Microsoft provide for the gamers.
How many PC gamers do you know with hardware that can selectively dedicate 4-6 GB of GDDR5 solely to the GPU's needs? All those PC gamers with $1000 graphics cards in 4GB GDDR5 flavors? Plus add-in PCIe RAM modules costing $400? And even with those luxury cards you still won't get close to the effective PS4 memory bandwidth, thanks to PCIe bottlenecks.
While your PC tower features gigantic power-hogging add-in cards in all of its PCIe slots.
Do you accuse the systems in a new car of being 'computer wannabes' because they're specialized and would not lend themselves to most common PC uses?