Both the Xbox 360 and the Wii have
graphics processing units designed by AMD. After the recent roundtable
discussion on the future of console gaming chips, Gamasutra had a chance
to sit down with Bob Feldstein, VP of strategic development at the company,
and Jon Carvill, PR for graphics.
The discussion touched on the
philosophy behind console GPU development, as well as the present and
future of the game industry through the eyes of one of its most important
collaborators.
AMD has two clients in this race:
Microsoft and Nintendo. Obviously, the differences are pretty big between
the Xbox 360 and the Wii. Can you talk about how you perceive those?
Bob Feldstein: I have to be careful here, obviously, but what I can say is that it's interesting.
I think the Xbox 360 is really exciting to the real gamers, and
I think Nintendo is, too. I think people are going to have two
consoles now. Real gamers are going to have the Wii and they're going
to have the Xbox 360-like thing.
And I think the Wii is extending the
market. I really think that now, all of a sudden, we have this new
dynamic, where the ordinary person who is not really adept with
a controller will start to have fun with the games. And, yeah, it is
about fun; it's about immersion, losing yourself in the game. So, I
think both have a place.
I don't think that either suffers from
the other, and I think Nintendo's just going to expand the market. And
everybody's going to try to take advantage of that next time. So
that, you know, that's kind of it. And, as far as the technology inside
of them, they're both interesting. Nintendo's immersive, and it shows
us there are lots of sides to being immersive. You -- do you have both?
Yeah, I buy everything.
BF: Good. (laughs)
To an extent, the technology drives
what the developers are capable of doing with a console, but at the
same time I would imagine that you take into account what developers
might want to do when you're developing a chip. So, how does that work?
BF: It's really a back-and-forth. Sometimes
the developers don't know what -- they know what they want to do, but
they don't really know what's available in chip technology. And
sometimes they presume "not much"; that they have to take
care of everything. If you think about these advanced developers,
they're used to the Intel world, where, say, multi-threading is a problem
really left to them; the hardware doesn't help you much.
Whereas in GPUs, multi-threading has
actually been there forever, since 2002. And the GPUs just make sure that
things work. They don't tell you that they have a lot of parallel threads
running; they make them up on the fly, and they run. And we want to
extend that further, and find the kinds of features, the kinds of play,
the kinds of things besides graphics that are important to game
developers -- I mean, physics, obviously -- and intermix those threads
on GPUs, and create the kind of compute engines that are easy for them
to use, and advantageous.
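To make the hardware-managed threading he describes a little more concrete, here is a minimal sketch -- not AMD's own tools, just an illustrative CUDA-style kernel with hypothetical names -- of a per-particle physics step. The developer writes one small function; the GPU creates and schedules the thousands of parallel threads on its own, alongside its graphics work.

#include <cuda_runtime.h>

// One thread per particle: the GPU spawns and schedules these
// threads itself; the developer never manages them explicitly.
__global__ void integrate(float3* pos, const float3* vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i].x += vel[i].x * dt;   // simple Euler integration step
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

// Host side: launch enough blocks to cover all particles; the
// hardware interleaves these compute threads with rendering.
void step_physics(float3* d_pos, const float3* d_vel, float dt, int n)
{
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    integrate<<<blocks, threads>>>(d_pos, d_vel, dt, n);
}

The point of the sketch is only that the per-thread code stays simple while the scheduling is invisible -- the "libraries and compilers" Feldstein mentions below would sit on top of something like this.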
And this is something, by the way,
that console game developers are used to. They're almost their own operating
system: you put a game on, and it takes care of everything. But the compute
model gets more complicated. Just as they don't want
to do assembly language -- they want to write C and compile it -- I think they're
going to be surprised by the kind of libraries and compilers we supply
to make it easier for them.
But you're going to abstract away the
complication of the hardware; you don't want every developer to
have to figure out everything from the very beginning. That's what
they do now -- and they're very smart, and they do do a good job --
but I think that as the computer architecture gets more complicated,
more parallel, and does more things, it's going to want to be abstracted
away. They want to be involved in the drama of the game, and the beauty
of the game, but not necessarily, all the time, in the deep, dark metal
of how the processors work.