Gamasutra: The Art & Business of Making Games
Art Design Deep Dive: Rendering the player as a form of pure energy in Recompile

by Phi Dinh on 05/23/18 04:01:00 am   Featured Blogs


The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

Who

Hi, I'm Phi Dinh, owner of Phigames and creator/coder/designer of Recompile, our upcoming atmospheric exploration platformer and environmental hacking game. I work closely with animator/VFX artist James Vincent Marshall, and together we have been working on the game for nearly two years. In this post, I'll explain the process we use to generate the visuals for our protagonist, known to our players as The Program.

I have a background in Computer Science and have worked for various technology companies in fields including real estate, gambling and digital advertising. My real passion, however, lies in games, so in 2013 I decided to take control of my career and form my own game development studio, Phigames. My first commercial title was TinyKeep, a hardcore dungeon-survival romp for the most masochistic of gamers. The game had a successful Kickstarter campaign, and I was able to quit my job and move to the North of England, where the cheaper accommodation and idyllic countryside vistas allowed my creative and technical juices to truly thrive.

What

Recompile's humanoid protagonist is rendered as a form of pure energy, using instanced cube meshes with a custom shader.

It works by drawing hundreds of cubes every frame, at the positions where the player's mesh colliders intersect a quantized world-space grid.

This process can be broken down into the following steps. (Note: the terminology and API references are Unity-specific.)

1. Work out which positions in the world-space grid are enclosed within the mesh collider.
2. Create an array of translation matrices from these positions.
3. Pass this array to a Graphics.DrawMeshInstanced call on every Update loop.
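As a language-agnostic illustration of steps 1 and 2, here is a minimal Python sketch of the quantization and de-duplication logic. All names here are hypothetical (this is not the game's actual Unity code): transform_point stands in for Unity's Transform.TransformPoint, and the Graphics.DrawMeshInstanced call of step 3 is omitted.

```python
# Minimal sketch of the per-frame quantization (hypothetical names, not
# the shipped Unity implementation): baked local-space points are
# transformed to world space, snapped to grid-cell centres, and
# de-duplicated so that one cube is drawn per occupied cell.
import math

def quantize(world_pos, cell_size):
    # Snap a world-space position to the centre of its grid cell.
    return tuple(math.floor(c / cell_size) * cell_size + cell_size / 2
                 for c in world_pos)

def occupied_cells(baked_local_points, transform_point, cell_size):
    # The set comprehension de-duplicates points landing in the same cell.
    return {quantize(transform_point(p), cell_size)
            for p in baked_local_points}

# Two baked points falling in the same 0.5-unit cell yield a single cube.
identity = lambda p: p
cells = occupied_cells([(0.1, 0.2, 0.0), (0.3, 0.1, 0.1)], identity, 0.5)
```

Each surviving cell centre would then become one translation matrix in the array handed to the instanced draw call.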

There is a major performance gotcha with step 1: it is very expensive to calculate whether a position in world space actually lies within a specific convex collider hull. We can sidestep this computation by baking in a number of local positions that we know are definitely inside the collider. This is easily done by running OverlapSphere with a very small radius over a sample set of test points. Whilst this is an expensive operation, it doesn't matter, because we offload it to be baked at Editor time. When the game is running, all we have to do is transform these predefined positions and compare them to the quantized grid.
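The bake itself can be sketched as a brute-force scan over a lattice of candidate points, keeping only those that pass the expensive inside-the-collider test. In this hypothetical Python sketch, inside_collider stands in for Unity's tiny-radius Physics.OverlapSphere check; a unit sphere is substituted for the real collider.

```python
# Editor-time bake, sketched (hypothetical names, not the shipped code):
# scan a lattice of candidate local positions once, keeping only those
# that pass the expensive inside-the-collider test. In Unity this test
# would be a Physics.OverlapSphere call with a very small radius.
import itertools

def frange(lo, hi, step):
    # Inclusive float range for one axis of the sample lattice.
    vals, v = [], lo
    while v <= hi + 1e-9:
        vals.append(round(v, 9))
        v += step
    return vals

def bake_interior_points(inside_collider, bounds_min, bounds_max, step):
    axes = [frange(lo, hi, step)
            for lo, hi in zip(bounds_min, bounds_max)]
    return [p for p in itertools.product(*axes) if inside_collider(p)]

# Stand-in collider: a unit sphere centred at the origin.
in_sphere = lambda p: sum(c * c for c in p) <= 1.0
baked = bake_interior_points(in_sphere, (-1, -1, -1), (1, 1, 1), 0.5)
```

None of this cost remains at runtime; the baked list is simply transformed and quantized each frame.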

The cubes themselves never actually move, but instead slip in and out of existence as the collider intersects with the cells from the quantized grid. We also animate the cube scaling as they appear and disappear, giving the impression of smooth movement at a distance.
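The appear/disappear scaling can be sketched as a simple eased ramp on each cube's age. This is our own hypothetical illustration, not the shipped animation code.

```python
# Sketch of the pop-in/pop-out scaling (hypothetical, illustrative only):
# a cube grows from scale 0 to 1 over grow_time seconds after its cell
# becomes occupied, and shrinks back the same way after it empties.
def cube_scale(age, occupied, grow_time=0.1):
    t = min(age / grow_time, 1.0)
    eased = t * t * (3.0 - 2.0 * t)   # smoothstep easing
    return eased if occupied else 1.0 - eased
```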

Even when the colliders rotate, the cubes always remain axis-aligned and fixed to the quantized grid, resulting in a distinctive voxel-style look.

The resolution of the quantized grid can even be modified in real time, so we can change the character detail based on any number of gameplay parameters.
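The effect of changing the grid resolution is easy to see numerically: quantizing the same point cloud with a larger cell size collapses more points into each cell, so fewer, larger cubes are drawn. A quick hypothetical Python illustration:

```python
# Illustration: coarser grids yield fewer occupied cells for the same
# set of world-space points (hypothetical sketch, not engine code).
import math
import random

def cell_of(p, cell_size):
    return tuple(math.floor(c / cell_size) for c in p)

random.seed(1)
points = [(random.random(), random.random(), random.random())
          for _ in range(500)]

fine = {cell_of(p, 0.1) for p in points}    # high-resolution character
coarse = {cell_of(p, 0.5) for p in points}  # low-resolution character
```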

This is a quick test of the player run animation at a medium grid resolution.

We can also pass an array of Colors in a MaterialPropertyBlock to the Graphics.DrawMeshInstanced call; the shader uses these to tint the cubes on a per-instance basis. This is handy for creating scanline effects, and for changing the color of the player based on gameplay factors such as health, level or damage.
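As a sketch of how such a per-instance tint might be computed (hypothetical names and colour values; the real handling lives in the Unity shader):

```python
# Hypothetical sketch of building a per-instance colour array: every cube
# gets the player's base colour, and rows near a moving horizontal
# scanline are brightened. In Unity, this array would go into a
# MaterialPropertyBlock for the Graphics.DrawMeshInstanced call.
def instance_colors(cell_heights, scan_y, base=(0.2, 0.9, 1.0), band=0.25):
    colors = []
    for y in cell_heights:
        boost = 1.5 if abs(y - scan_y) < band else 1.0
        colors.append(tuple(min(c * boost, 1.0) for c in base))
    return colors

cols = instance_colors([0.0, 0.5, 1.0], scan_y=0.5)
```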

Finally, additional VFX and lighting are added to bring the player visuals to the next level of polish.

Why

The story of Recompile is set in a purely digital virtual world inside a mysterious supercomputer complex. The art style for both the environment and the world's inhabitants is largely geometric and low-poly, but for the player character we wanted something a little different. We needed to emphasize a more organic, fuzzy nature, reflecting that this is an external entity created by humans, while still retaining a glitchy, code-like feel.

For an early prototype we used a cube-shaped character, complete with jelly-like animations. Eventually we wanted to move away from this in favour of a humanoid model, as we felt this kind of abstract cube character was becoming something of an indie game trope.

Instead of opting for a traditionally modelled, rigged and textured character design, which we felt would be in danger of looking like other similar games, we went with the idea that the player is made of pure, volatile energy. Our artist has a background in animation and VFX, so to play to our strengths we emphasized pure motion over art fidelity.

We also needed a contrasting color scheme, so that the character would stand out from the dark surroundings, and to create an iconic look that would become instantly recognizable in screenshots and social media.

We also needed the ability for our character to be very dynamic in its rendering, based on gameplay parameters such as health and progression. Being able to change resolution at any time, and morph into different shapes and colors was especially useful in providing vital feedback to the player.

Finally, it was important to us that the player character remains gender ambiguous. A step further from the traditional silent protagonist of videogames, our solution emphasizes player action over identity. We want to tell the story about what the player does, not what they are.

Result

On the surface, I feel we nailed the final aesthetic: it's something unique, rarely seen in games before. It also appears technically innovative and impressive (even though it's actually quite simple from an implementation standpoint), and the character's design perfectly matches the lore and story of the game.

Under the hood, we definitely ran into some performance issues. Our implementation uses instanced meshes drawn every frame, and whilst this is far cheaper than incurring per-GameObject overheads, it's still not perfect.

We are currently prototyping a new system that uses compute shaders on the GPU, which could potentially increase our cube count from what we have now to something in the order of millions. However, there may be some problems getting this to work well on other platforms such as macOS and consoles. There's much more research to be done on our part, but exciting times are ahead!

Recompile is currently being developed by Phigames.

Our core team consists of three multi-disciplinary individuals:

Phi Dinh, Creator/Coder/Designer

James Vincent Marshall, Animator/VFX Artist/Art Director

Richard Evans, Sound Design & Music

