Following Nvidia's acquisition of PhysX developer Ageia, the company is set to announce a partnership with NaturalMotion that will pair the PhysX physics engine with NaturalMotion's Morpheme animation engine, starting with the Morpheme 2.0 release, currently planned for August of this year.
While developers will not be required to license both products, the tight integration between the two tools will allow for a heightened level of control and interplay between animation and physics, according to NaturalMotion CEO Torsten Reil.
Gamasutra talked with Reil to discuss the partnership and what developers can expect from the integrated tools. NaturalMotion's Morpheme is described as "the industry's first graphically-authorable animation engine for PlayStation 3, Xbox 360 and PC", and has a number of licensees.
Reil also touched on how the announcement relates to another, perhaps better-known NaturalMotion tool, Euphoria, which is used in games including Rockstar's Grand Theft Auto IV and LucasArts' The Force Unleashed, and allows for dynamic real-time animation of characters on next-gen platforms.
In the course of the discussion, Reil also demoed and discussed the company's forthcoming football title, Backbreaker, which uses the company's technology to present realistically reacting players and crowds.
What's the timeframe for the release of Morpheme 2.0?
Torsten Reil: We're looking to ship in August of this year. Morpheme 2.0 will have a host of other features, but the big thing we're announcing is that it's going to be integrated with PhysX.
It seems that not all people understand that PhysX [which Nvidia recently acquired with parent company Ageia] is totally separate from the PhysX hardware -- that it's usable the same way that Havok is.
TR: That's a really good point, because that's something that's really important to us. Obviously, our bread and butter is console and PC. It's not just one platform or the other, so we have to support all of them.
What we've been really impressed with over the last few months from Nvidia is the investment they've made in cross-platform support and cross-platform optimization, rather than just focusing on one platform. So I think we've actually seen significant improvements in performance across the range.
For a product to be useful, especially for console development, you can't limit it to one platform -- right now the two biggest install bases globally are the Wii and the 360, and both of those have ATI graphics chips. Restrict yourself to one, and it would not be a useful product.
TR: That's exactly right. You have to use them. You have to have technology that runs on all of them. So Morpheme ships on the Wii, the 360, the PS3, and on the PC, obviously, and PhysX does as well.
Say someone's licensing Morpheme now -- when they get 2.0, how does licensing work? Will they automatically get an upgraded 2.0 with PhysX, or is it all separate?
TR: The way that it works is that all of our current customers get an automatic upgrade to Morpheme 2.0. PhysX is actually offered separately, because we have to be flexible. Not everyone will want to take PhysX straight away, particularly if they are already well down the production chain.
We expect that a lot of people will take PhysX straight away with it. It's going to be very integrated, the way we're actually offering that to people. It's offered both through us, and through Nvidia.
How do PhysX and Morpheme currently interact, and what will change when the products become more tightly integrated?
TR: We actually ship all source code for Morpheme for the run-time engines, so you can have it integrate with whatever you want, essentially. What we're doing is making sure that the integration with PhysX is very tight. We actually have additional code there to make sure that is the case.
In theory it doesn't prevent anyone from doing anything else, because obviously, it's very important that game developers have complete flexibility. But because the integration is so tight, we have PhysX functionality in the Morpheme tool itself, so you can graphically author how your ragdolls interact with your animation, and how supple that animation is. That level of integration is something we can only get with Nvidia PhysX -- it's not easy to achieve with another physics engine.
The transition from animation to ragdoll and back to animation is something you don't see a lot of in games. Usually, characters go to ragdoll because they've died.
TR: That's going to be a part of Morpheme 2.0 with PhysX. It's something that we want to make as easy as possible. Part of the reason it's not used so much in other games is that it has to be set up to get everything right, and then you see problems, and they're not easily debuggable. Very often, you want animators to actually control that, and that's what you can do now.
It also seems quite difficult in that, by the very nature of physics-based ragdolling, you don't know what position characters are going to end up in. If you start up an animation again, they could be upside-down or something.
TR: That's true, but again, we have quite sophisticated algorithms, because we're doing so much of that stuff with Euphoria, basically. It's running in real-time, but Euphoria goes beyond all that, again, because it has behaviors running on top of it.
But yeah, you're right. Those are the problems that you can solve with Morpheme now with the PhysX integration. [Ageia founder and CEO] Manju [Hegde] was saying this before, but it's a really interesting thing, because this technology is now available, but it's really important to make it easy to use. It's used by the content creators, and that's partly in the past what's been holding games back. The technology was there, but it wasn't that easy to integrate.
When you look at a tool like that, it's easy to think, "Oh, that's why it takes three years to make a game." It seems that as we reach these levels of fidelity and complexity, it gets a lot tougher.
TR: It's a very good point. What we're trying to do with Backbreaker, as an example, is to see if it's true that you need these big teams. Where are the bottlenecks?
We think a lot of the bottlenecks are in the tools themselves, and in the fact that you have to create so much content manually. Backbreaker actually has only six people on the core team. That's because we used things like Morpheme and Euphoria to make the development process much faster.
You're not contracting out any of Backbreaker's development?
TR: No, it's internally developed. We have a games team which is separate from our technology team, and the games team essentially gets early versions of Morpheme, for example. In fact, they're essentially prototyping the interaction between Morpheme and PhysX, because both are running in Backbreaker.
That allows us to really stress test the technology. But internally, we treat them as a customer, rather than being part of the technology team, because it allows us to be very disciplined about our support processes.
You want them to serve as a test bed, and give them things that could potentially break, and at the same time, to know how real developers are going to react to these situations. You need to keep them in the dark, to an extent.
TR: Exactly. That's exactly how we're doing it. Sometimes it's tempting to not do it that way, but in the long term, it's good to be prudent on that, because you will benefit from it.
Your technology is not just relevant to games -- it's also relevant to film and animation. It seems that as graphics reach higher fidelity, the lines begin to blur on which tools are relevant to which media.
TR: Absolutely. Graphics costs keep going up, and you can pretty much predict where it's going to be in five or ten years. Animation quality has been lagging far behind. That's obviously a big problem in games right now, where the rendering quality is amazing, but the animation quality isn't quite there.
That really works on two levels. On one level, there's straightforward animation like the stuff you do in Morpheme -- locomotion, for example, needs to be smooth. Then you take it one level further, to true interactivity, because canned animation can't actually be interactive. However good the rendering is, it just breaks down.
One thing that's also interesting is that you can then take things like that one step further. For example, if you talk about accelerated things on the GPU like hair, for example, or cloth, that doesn't look good if the underlying animation doesn't look good. All of these things need to be on top of each other, which is why it makes sense for us and for Nvidia to work together, to bring the level up.
Something that's been a big point of discussion is that games are so interdisciplinary. There are so many different things that have to work together in concert, and people with different kinds of expertise have to work together, so potentially, tools that can bridge those gaps are going to be very useful to developers.
TR: I think that's going to happen more and more. A lot of the requirements are very technical for games, but on the other hand, you need to have people with an aesthetic eye to actually implement them. There's only one way to bridge that, and that's to give tools that simplify the process. But very often, at the end of the day, you have to have real content creators create stuff, rather than just people with lots of technology knowledge.
Another part of the problem is developers can spend so long in development cycles solving technological issues that, by the time they get to the point where they can really build and implement their content, there's not enough time left in the development cycle to really successfully do that.
TR: Exactly. That's the other problem. A lot of that, I think, will be solved over time just with tools and technology. For me, it's two things. One of them is tools. The other one is using the hardware to generate content. I really think procedural content is a massive thing, whether it's animation or textures or things like trees that are physically simulated. Being able to move from manual production, which takes a lot of time, to something that's generated more on the fly, is the key.
But that always has to be done with the animator or content creator in mind. So just because it's generated in real-time doesn't mean that there doesn't have to be an animator, for example, to define what it looks like. That's really the crucial thing. Giving people tools to do that is the crucial thing.
That's the thing. It has to be compatible with the aesthetics and the vision. Backbreaker has to look like football. You've seen football on TV hundreds of times, and you know what to expect from those physical reactions, so it has to match that. But a girl dancing requires... the underlying technology that may drive those two things is quite similar, but the output is quite different, so you have to make that control available.
TR: Exactly. Totally different styles. That's the kind of thing you have to give animators control over. Even if, hopefully, everything in that area will in the end be real-time and generated, you still have to have animators creating your style.