1. Great Expectations
The first few weeks of January, when we were porting our Vector Engine over to run on the Tegra 2 dev hardware, were a bit of a nail-biter.
We had signed up to deliver an in-engine demo for NVIDIA to show at Mobile World Congress in early February. We didn't know what to expect from the hardware in terms of real performance, but we had high hopes. All we had to go on were the paper specs and the expansive claims from NVIDIA and early-adopter developers that Tegra 2 could provide a "console quality" experience on mobile.
Well, as is generally true with performance claims for new hardware, the reality was a bit more complex.
As I mentioned earlier, porting our 360/PC cross-platform engine over to the Android NDK went smoothly. By the end of the second week, Ralf had my rough prototype level up and running on the Tegra 2 dev kit, and in some respects the Tegra 2 hardware exceeded our expectations.
The CPU was amazing -- it ate up our water sim and fluids calculations, munched happily on a straight implementation of Bullet Physics, chewed up my high-poly models, and asked us for more.
The GPU was another story. We were extremely fill-rate bound, which is a particular challenge in a game where almost half of the screen space is covered by a giant translucent water mesh. We quickly realized that the complex pixel shaders we'd relied on so heavily on the 360 just wouldn't allow us the framerates we needed for a fast-paced racing game.
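To make the fill-rate pressure concrete, here's a back-of-the-envelope sketch of why a big translucent layer is so expensive: unlike opaque geometry, it can't be rejected by the depth buffer, so every pixel it covers is shaded on top of whatever was already drawn there. All numbers below are illustrative assumptions, not actual Tegra 2 specs or Riptide GP measurements.

```python
# Back-of-the-envelope fill-rate estimate (illustrative numbers only,
# not actual Tegra 2 specs or Riptide GP measurements).
WIDTH, HEIGHT = 1280, 752    # hypothetical Tegra 2 tablet resolution
TARGET_FPS = 30

opaque_overdraw = 1.5        # avg. times each pixel is shaded by opaque geometry
water_coverage = 0.5         # "almost half of the screen" covered by water

pixels = WIDTH * HEIGHT
opaque_shaded = pixels * opaque_overdraw

# A translucent mesh can't be skipped by depth rejection -- every covered
# pixel is shaded *in addition to* the opaque surface beneath it.
water_shaded = pixels * water_coverage

total_per_frame = opaque_shaded + water_shaded
total_per_second = total_per_frame * TARGET_FPS
print(f"pixels shaded per second: {total_per_second / 1e6:.0f} M")
```

Under these toy numbers the water alone adds a third more shading work per frame, and every extra instruction in the water shader is multiplied across all of those pixels, every frame.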
We streamlined our shaders, moved every per-pixel process we could to the per-vertex level, and switched to low-precision calculations wherever possible. We stripped the normal-mapped ripples off the water and increased mesh density to maintain the complexity we needed to break up reflections. I dusted off my old skool artist toolkit, baking lighting into vertex colors, painting lighting detail into diffuse textures, and ruthlessly managing texture and mesh variation to keep the GPU pipe as open as possible.
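Baking lighting into vertex colors is the kind of precompute step that moves work off the GPU entirely: a tool evaluates a lighting term once per vertex at export time, and the runtime shader just modulates the diffuse texture by the interpolated color. The sketch below shows the idea with a simple Lambert term; the function name and data layout are hypothetical, not Vector Unit's actual pipeline.

```python
# Minimal sketch of baking a Lambert lighting term into per-vertex colors,
# so the runtime shader only multiplies interpolated vertex color by the
# diffuse texture. Names and layout are illustrative assumptions.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def bake_vertex_lighting(normals, light_dir, ambient=0.2):
    """Return one grayscale intensity per vertex: ambient + clamped N.L."""
    L = normalize(light_dir)
    colors = []
    for n in normals:
        n = normalize(n)
        # Clamp the Lambert term so back-facing vertices get ambient only.
        lit = ambient + max(0.0, dot(n, L)) * (1.0 - ambient)
        colors.append(min(lit, 1.0))
    return colors

# A vertex facing the light bakes out bright; one facing away gets ambient only.
print(bake_vertex_lighting([(0, 1, 0), (0, -1, 0)], light_dir=(0, 1, 0)))
```

At runtime this costs one extra vertex attribute and a single multiply per fragment, which is a very good trade on fill-rate-limited hardware.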
NVIDIA provided a ton of helpful performance analysis throughout this process, and in the end, we were able to deliver that first in-game MWC demo with a level of visual quality that, at least to the untrained eye, really does look comparable to something you might see on a modern PC or console. The rate of technological advance on mobile platforms these days is staggering, and I expect that within a few years even savvy gamers will be hard pressed to tell you whether an in-game screenshot was rendered on a high-end console or on a phone.
2. Creative Reuse, or the Lack Thereof
From the start we knew that content creation would be critical path in Riptide GP's development. Unlike the code, which we owned outright, we weren't able to reuse any content -- models, texture libraries, anything -- from Hurricane, so I had to start from scratch.
When I first broke down my 4.5-month art schedule, it looked pretty scary. I had about four weeks for the jet skis, characters, and animations, ten weeks for environments, and five weeks for everything else, including UI, particle effects, polish, optimization, and whatever unforeseen gotchas might lie in wait.
The environment schedule was the hardest. On average I had about six to seven days to finish each of our six main race tracks, with another day or two allocated for creating each reverse variant, which I wanted to be at least a little different, with unique lighting and water features.
The only way this would be possible was through conscientious reuse and instancing, and to some extent I was able to make this work -- the tracks were built from instanced segments, and I looked for every opportunity to reuse and recycle textures and geometry from track to track. But I couldn't quite let go of my desire to try and make each track feel different, and I didn't reuse as many assets as I could have.
In the end, I got it all done the old-fashioned way, through hard work and long hours. Overall I didn't mind; it was creative work, and as I mentioned above, we were energized and heavily invested in delivering a quality game. I was still polishing and tweaking details the night before we released Riptide GP on the Android Market in May. But I will say I was pretty relieved when we finally hit that Publish button.
3. Small Team Challenges
We didn't start out intending to make Riptide GP with just the two of us. I'd thought I might contract out a month or two of artwork, and we had some budget allocated for things like sound design and music.
But because we were partially funding Riptide GP out of our own pockets, we were powerfully motivated to keep costs down. And because our schedule was so tight, we felt we couldn't afford the time it would take to manage outsourcing.
So we just went full steam ahead. My schedule was pretty much filled from start to finish with art tasks, but as it turned out, Ralf's programming schedule started to open up once he got past the graphics engine optimizations. He ended up taking on a ton of miscellaneous content tasks, including sound design and music curation (we sourced most of it from online stock sites like Audiomicro), screen-flow scripting, and even some particle effects.
The problem, as always, with a very small team -- in our case, a very, very small team -- is that there is virtually no wiggle room for unforeseen variables. We went from three-week Scrum sprints at the start of the project, to two-week sprints, to one-week sprints, and toward the very end we tossed Scrum completely and just went with prioritized task lists, squeezing as much as we could into the time we had. As hard as it was, it was creative, rewarding work, and somehow we managed to get it all done on time. But the process wasn't always pretty.