Making 2D Games With Unity
by Josh Sutphin on 05/19/13 02:06:00 pm

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

(This article originally appeared at third-helix.com)

Unity is well-known for being an easy-to-use, cross-platform 3D engine and toolset, but that doesn’t mean you’re forced to make an FPS or third-person action-adventure game. I’ve been creating 2D sprite-based games in Unity for two years now - games like Conquistador and Fail-Deadly - and in this article I’m going to show you the techniques I used to achieve the classic 2D look.

Who This Article Is For

I’m going to present a brief overview of a number of techniques I’ve used to create a classic 2D “pixel art” look in Unity. This article is not a beginners’ tutorial: I’m assuming you already know how to use Unity in a 3D context and are just looking for some pointers on how to make it work for 2D pixel art.

Sprite Setup

The first thing to understand is that even though you’re making something that looks 2D, it’s still technically a 3D scene. Each sprite in the scene is a single, textured quad, positioned in 3D space just like a regular model.

You’ll need to create and import a quad to use as your mesh. I made mine in Modo, my modeling package of choice. It’s just a simple one-sided quad, 1 unit to a side, with its face normal pointing down negative Z. I also applied a planar UV projection to normalize UVs across the face.

Why is it important for the quad to face down negative Z? Because you'll set up your game camera in Unity facing down positive Z, so that world XY corresponds to screen XY; the quad needs to face the opposite direction so that it points at the camera and can be seen.

Incidentally, you may be wondering if you can just use Unity’s built-in Plane primitive instead of modeling your own quad. I don’t recommend this, because the Plane primitive actually consists of a 10x10 quad grid, meaning each sprite will render 100 times the amount of geometry that you actually need!

In Unity, you’ll import your quad and then set up a prefab consisting of a MeshFilter and MeshRenderer, so that the mesh can be seen. You can make prefabs for different game objects - enemies, pickups, effects, etc. - like you would in 3D, just making sure that they all use this quad model.
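
If you'd rather not round-trip through a modeling package, you can also build the quad in code. Here's a minimal sketch (not the exact asset I use): it builds a 1x1 quad facing down negative Z, with a vertex order chosen to match the UV assignment code shown later, and assigns it to the prefab's MeshFilter.

using UnityEngine;

// Sketch: build a unit quad facing down negative Z at runtime,
// as an alternative to importing one from a modeling package.
[RequireComponent(typeof(MeshFilter))]
public class QuadBuilder : MonoBehaviour
{
    void Awake()
    {
        Mesh mesh = new Mesh();

        // 1 unit to a side, centered on the origin
        mesh.vertices = new Vector3[] {
            new Vector3(-0.5f, -0.5f, 0.0f),   // 0: bottom-left
            new Vector3( 0.5f, -0.5f, 0.0f),   // 1: bottom-right
            new Vector3(-0.5f,  0.5f, 0.0f),   // 2: top-left
            new Vector3( 0.5f,  0.5f, 0.0f)    // 3: top-right
        };

        // full 0..1 UVs; the atlas script overwrites these per-sprite
        mesh.uv = new Vector2[] {
            new Vector2(0.0f, 0.0f),
            new Vector2(1.0f, 0.0f),
            new Vector2(0.0f, 1.0f),
            new Vector2(1.0f, 1.0f)
        };

        // wound so the face normal points down negative Z (toward the camera)
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };
        mesh.RecalculateNormals();

        GetComponent<MeshFilter>().mesh = mesh;
    }
}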

Texture Atlasing

To create different sprites you'll need different textures. The simplest way to do this is to assign each sprite prefab its own material containing the image you want, but this has a nasty hidden performance cost: every unique texture in the scene forces additional GPU state changes (and breaks draw call batching) at runtime. The more unique textures you have, the more of these changes happen every frame, and the worse your frame rate.

You can solve this problem by creating a sprite atlas. This is just a single texture with all of your sprites contained in it, in a grid:

Each sprite prefab has the same material assigned (more on the material assignment in a minute). You can write a simple script to handle the atlas lookup: just expose four numbers - min X, min Y, width, height - and then programmatically set the sprite's UVs to match that rectangle. Here's the UV assignment code I used (note that you have to flip the V coordinate when translating from texture space to UV space, otherwise your sprite will be upside-down):

// Compute UVs for the current animation frame's rectangle in the atlas.
// m_currentStrand.frames[m_animFrame] holds the frame's pixel rect
// (x, y, width, height) within the atlas texture.
Vector2[] uvs       = new Vector2[m_mesh.uv.Length];
Texture texture     = m_meshRenderer.sharedMaterial.mainTexture;

// top-left corner of the frame, converted from pixels to UV space
// (V is flipped: texture pixel Y runs top-down, UV space runs bottom-up)
Vector2 pixelMin    = new Vector2(
    (float)m_currentStrand.frames[m_animFrame].x /
        (float)texture.width,

    1.0f - ((float)m_currentStrand.frames[m_animFrame].y /
        (float)texture.height));

// frame size in UV space (the V component is negated for the same reason)
Vector2 pixelDims   = new Vector2(
    (float)m_currentStrand.frames[m_animFrame].width /
        (float)texture.width,

    -((float)m_currentStrand.frames[m_animFrame].height /
        (float)texture.height));


// main mesh
{
    Vector2 min = pixelMin + m_textureOffset;   // m_textureOffset: optional extra UV offset
    uvs[0] = min + new Vector2(pixelDims.x * 0.0f, pixelDims.y * 1.0f);
    uvs[1] = min + new Vector2(pixelDims.x * 1.0f, pixelDims.y * 1.0f);
    uvs[2] = min + new Vector2(pixelDims.x * 0.0f, pixelDims.y * 0.0f);
    uvs[3] = min + new Vector2(pixelDims.x * 1.0f, pixelDims.y * 0.0f);
    m_mesh.uv = uvs;
}

The principle behind this is quite simple. UV space represents a percentage of each dimension of the texture:

Calculating the actual UV values for a particular sprite rectangle by hand is tedious. It's much easier to express the sprite rectangle in pixels, especially since Photoshop's Info panel shows you the cursor's current pixel coordinates and the pixel size of the selection:

So, the code simply divides the pixel coordinates by the overall texture dimensions to get a percentage along each axis, and voila: valid UV coordinates! For example, a sprite rectangle that starts at pixel x = 64 in a 512-pixel-wide atlas has a minimum U of 64 / 512 = 0.125.

(Remember the gotcha, though: the V coordinate has to be flipped!)

My script actually does more than just assign a static set of UVs: it also functions as a simple animation manager. Since you can set UVs programmatically, it's easy to define an array of UV rectangles, one per frame of an animation, then programmatically swap the UVs at the appropriate rate to animate the sprite. My script is simple and requires manually entering pixel coordinates for each frame of each animation strand, which is admittedly tedious… but since I don't have a ton of animation data, it's been acceptable thus far. It would be straightforward (though beyond the scope of this article) to extend the editor to improve the process, for example by visually selecting rectangles directly on the texture in the editor UI.
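
To give a sense of how the frame swapping works at runtime, here's a minimal sketch of the update loop (the names m_frameRate, m_frameTimer, and SetFrameUVs are illustrative rather than my actual script, SetFrameUVs stands in for the UV assignment code above, and it assumes frames is a plain array):

float m_frameRate   = 10.0f;   // animation speed, in frames per second
float m_frameTimer  = 0.0f;

void Update()
{
    m_frameTimer += Time.deltaTime;
    if (m_frameTimer >= 1.0f / m_frameRate)
    {
        m_frameTimer -= 1.0f / m_frameRate;

        // advance to the next frame in the current strand, wrapping around
        m_animFrame = (m_animFrame + 1) % m_currentStrand.frames.Length;

        // re-run the UV assignment shown above for the new frame
        SetFrameUVs();
    }
}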

Sprite Shader

Your sprite still needs a material to reference your texture atlas, and for that you need a shader. The most obvious choice is the default Transparent Diffuse, but even this simple shader does more than you need (such as supporting per-pixel lighting, which you’re probably not using in a traditional 2D sprite-based art style). Unlit Transparent Cutout is simpler, but we can get simpler still. I wrote a custom Sprite shader which is as bare-bones as I could get it:

// Custom sprite shader - no lighting, on/off alpha
Shader "Sprite" {
Properties {
    _MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
}
SubShader {
    Tags {"Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent"}
//    LOD 100
    ZWrite Off
    Blend SrcAlpha OneMinusSrcAlpha
    Lighting Off
    Pass {
        SetTexture [_MainTex] { combine texture }
    }
}
}

(I suspect this can be cheaper still, but my knowledge of ShaderLab is limited at best.)

Texture Filtering

If you’re going for the “pixel art” look, then it’s absolutely critical that you set your sprite textures to use Point filtering mode, not the default Bilinear. Point filtering preserves hard edges in the source texture, keeping your sprites nice and clean:

You'll also want to disable mip map generation (mip maps make faraway textures look better, but this only applies to a 3D perspective view) and check your texture compression settings. If you're building for iOS, the default compression setting is some flavor of PVRTC, which will ruin pixel art. The most accurate setting, but also the most memory-intensive, is RGBA32. Since most pixel art uses a limited palette, you can typically get away with RGBA16 with no visual degradation, cutting the memory footprint of the texture in half. If your sprite doesn't need an alpha channel (perhaps this texture atlases a bunch of background tiles?) then use RGB16 to save additional memory by discarding the alpha component.
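
If you'd rather not set all of this by hand for every sprite sheet, you can enforce it with an editor script. Here's a rough sketch using an AssetPostprocessor (the "Sprites" folder name is just an example, and which format you pick depends on your art):

using UnityEditor;
using UnityEngine;

// Sketch: force pixel-art-friendly import settings on any texture
// imported from a "Sprites" folder. Save this in an Editor folder.
public class SpriteImportSettings : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("/Sprites/"))
            return;

        TextureImporter importer = (TextureImporter)assetImporter;
        importer.filterMode = FilterMode.Point;                   // no bilinear smearing
        importer.mipmapEnabled = false;                           // no mip maps in 2D
        importer.textureFormat = TextureImporterFormat.RGBA16;    // or RGBA32 / RGB16
    }
}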

Camera Setup

For a typical 2D style, you’re going to want to use an orthographic camera. With an orthographic camera setup, objects do not get smaller as they recede into the distance. This allows you to use the Z (depth) axis as a layering mechanism, controlling which sprites draw on top of which while ensuring everything still lines up nicely.

Place your camera at the world origin (0, 0, 0) and orient it to face down positive Z. Take note of the world axis display in the viewport: when you're facing down positive Z, world X corresponds to screen X (increasing to the right) and world Y corresponds to screen Y (increasing from bottom to top). This makes it very easy to a) think of your game in traditional XY coordinates, and b) translate between world space, screen space, and GUI space (more on that in a minute).
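
Setting this up in the inspector is trivial, but for completeness here's what it looks like as a sketch in code:

// Sketch: basic 2D camera setup - orthographic, at the origin, facing down positive Z
Camera cam = Camera.main;
cam.orthographic = true;
cam.transform.position = Vector3.zero;
cam.transform.rotation = Quaternion.identity;   // the default rotation already faces down positive Z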

Orthographic Size

If you’re going for the “pixel art” look then the camera’s orthographic size is of critical importance; this is the trickiest part of nailing 2D in Unity.

The orthographic size expresses how many world units are contained in the top half of the camera projection. For example, if you set an orthographic size of 5, then the vertical extents of the viewport will contain exactly 10 units of world space. (The horizontal extents are dependent on the display aspect ratio.)

Recall that your sprite quad is 1 unit to a side. That means the orthographic size tells you how many sprites you can stack in the top half of the viewport - half the number that fit across the viewport's full height.

To render the pixel-art look cleanly, you need to ensure that each pixel of the sprite’s source texture maps 1:1 to the viewport display. You don’t want source pixels being skipped or doubled-up, or your sprites will look distorted and “dirty”. The trick to ensuring this 1:1 ratio is to set an orthographic size that matches your vertical screen resolution divided by the pixel height of a sprite.

Let’s say you’re running at 960x640, and you’re using 64x64 sprites. Dividing the vertical screen resolution (640) by the pixel height of a sprite (64) yields 10, the number of 64x64 sprites that can be vertically stacked in 640 pixels. Remember that the orthographic size is a half-height, so your target orthographic size in this case is going to be 5 (one-half of 10). It should look like this:

If you set your orthographic size to half or double that target you may still get usable results, because the sprite’s vertical size will still divide evenly into the viewport’s vertical size. But if you set the orthographic size incorrectly, you will see some pixels skipped or doubled, and it will look very bad indeed:

Variable Resolution

You don’t need to be confined to a single, fixed resolution in order to render clean pixel art. The simplest way to handle variable resolutions is to attach a custom script to your camera which sets the orthographic size according to the current vertical resolution and a known (fixed) sprite size:

// set the camera to the correct orthographic size
// (so scene pixels are 1:1)

s_baseOrthographicSize = Screen.height / 64.0f / 2.0f;
Camera.main.orthographicSize = s_baseOrthographicSize;

While that is a simple fix, it does have a drawback: as the screen resolution decreases, you’ll see less and less of the world, and sprites will take up more and more of the screen. That’s the consequence of keeping a 1:1 ratio between source and screen pixels: a 64x64 sprite takes up more apparent space at 640x480 than it does at 1920x1200. Whether this is a problem or not depends on the needs of your specific game.

If you want your sprites to remain the same apparent size regardless of screen resolution, then simply set the orthographic size to a fixed value and leave it there regardless of the screen resolution. The drawback there is that your sprites will no longer have a 1:1 source-to-screen pixel ratio. You can mitigate the ill effects of that by only allowing resolutions which are exactly half or exactly double your target resolution.

GUI Considerations

If you’re using Unity’s immediate-mode GUI, there’s a simple trick you can use to automatically rescale the GUI to fit the current screen resolution, even if you’ve hard-coded all your GUI coordinates. Simply put the following at the top of your OnGUI call:

void OnGUI()
{
    // scale the GUI to the current resolution
    // (the GUI coordinates below are laid out against a 1024x768 reference)
    float horizRatio = Screen.width / 1024.0f;
    float vertRatio = Screen.height / 768.0f;
    GUI.matrix = Matrix4x4.TRS(
        Vector3.zero,
        Quaternion.identity,
        new Vector3(horizRatio, vertRatio, 1.0f)
    );

    // ... the rest of your GUI code goes here ...
}

You may occasionally need to translate between world- and screen-space coordinates. The built-in Camera.WorldToScreenPoint and Camera.ScreenToWorldPoint functions work perfectly well with an orthographic camera, but there is a gotcha: their notion of screen-space, and the GUI system’s notion of screen-space, use inverted Y axes.

When you use Camera.WorldToScreenPoint you’ll get back a point with X increasing to the right and Y increasing from bottom to top, with (0, 0) at the lower-left of the screen. The GUI system expects coordinates with X increasing to the right and Y increasing from top to bottom, with (0, 0) at the upper-left of the screen. So if you’re translating between world space and GUI space you’ll need to invert the Y coordinate:

y = Screen.height - y;
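
Putting that together, going from a world-space position to a GUI-space position looks something like this (a sketch; worldPosition and the label size are just examples):

// convert a world-space position to GUI space, flipping Y as described above
Vector3 screenPos = Camera.main.WorldToScreenPoint(worldPosition);
Vector2 guiPos = new Vector2(screenPos.x, Screen.height - screenPos.y);

GUI.Label(new Rect(guiPos.x, guiPos.y, 100.0f, 20.0f), "Hello");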

Physics in 2D

You can constrain Unity’s physics sim to run in 2D… sort of. Create a physics object and attach a ConfigurableJoint component to it, then set the “ZMotion”, “Angular XMotion”, and “Angular YMotion” properties to “Locked”. This prevents the physics object from moving along the Z (depth) axis, and constrains its rotation to only take place around that same axis (so it can’t pitch or twist “into” the screen). It’s no Box2D, but it’ll get the job done.

Note that you’ll need to set up this kind of ConfigurableJoint on every physics object in your scene. Unfortunately there is no way to globally constrain the entire physics sim to two dimensions; it must be done on a per-object basis.
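
For reference, here's a sketch of that setup done from code (attach something like this to each physics object that has a Rigidbody):

// Sketch: constrain a rigidbody to the XY plane with a ConfigurableJoint
void Awake()
{
    ConfigurableJoint joint = gameObject.AddComponent<ConfigurableJoint>();
    joint.zMotion        = ConfigurableJointMotion.Locked;   // no movement into the screen
    joint.angularXMotion = ConfigurableJointMotion.Locked;   // no pitch
    joint.angularYMotion = ConfigurableJointMotion.Locked;   // no yaw
}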

Particle Systems

You don't generally need to do anything special to use particle systems in 2D. Depending on the desired effect, you may wish to force the Z velocities to zero (for example, if you want a more-or-less even spread of particles in the camera plane for an explosion). Because you're using an orthographic camera, any Z motion in the particles will not be obvious. (If you see particles moving strangely, this is the first thing you should check.)

If you also want your particles to have a clean “pixel art” look just like your sprites, simply assign a material using the Sprite shader (discussed earlier) in the ParticleRenderer component. (Unfortunately I have yet to devise a way to atlas sprites in particle systems.)

Fin

That’s pretty much all there is to it. Best of luck in your 2D Unity endeavors!

(Josh Sutphin is an indie game developer, former lead designer of Starhawk (PS3), and creator of the Ludum Dare-winning RTS/tower-defense hybrid Fail-Deadly. He blogs at third-helix.com and tweets nonsense at @invicticide.)








Comments


Chris Clogg
Wow that was really interesting to read. Haven't dabbled into Unity yet personally.

Also wanted to mention that I really like the look of the game, especially the shadows.

Carlo Delallana
A very timely writeup as i'm teaching myself Unity with a 2D game project. Thanks for sharing!

Jonathan Jennings
I just finished a really basic 2D project with friends a few weeks ago and definitely have to say getting the Sprites to behave correctly took some effort! excellent write up Josh and I agree the game looks great. A game I think looks exceptional in Unity with a 2D setup is " Lovers in a dangerous space time" absolutely gorgeous to me .

Josh Sutphin
"Lovers" did some very clever rendering tricks, as described in one of the more brilliant Gama blogs in recent memory: http://www.gamasutra.com/blogs/MattHammill/20130206/186138/Making
_Two_and_a_half_D_art_for_Lovers_in_a_Dangerous_Spacetime.php

Their approach makes me feel like an amateur, frankly. ^_^

(can I not make an actual link in a comment or wtf O_o)

Christian Nutt
Links are disabled, sorry. People will need to copy/paste. I'll look into that as we visit upgrading the blog system soon.

Thomas Happ
They should just add a dedicated 2D mode.

Rob Graeber
Thanks for the write up, now I know stay away from Unity 2D.

But seriously is making a 2d game in unity as complicated as you're making it seems? Seems like you'd be better off going with a non-unity 2d game framework.

Jonathan Ghazarian
It's a lot of startup work, but once you get the groundwork laid out, you can utilize a lot of great features from unity. Depending on the game you want to make, it's definitely worth giving it a shot.

Daniel Balmert
Unity has a huge userbase (getting help/resources is easier than most 2d solutions). Working in 3d space is actually more intuitive for 2d games.

• Parallax, for example, is attempting to emulate 3d space in 2d. In 3D, you can *actually* manipulate parallax planes in 3d space intuitively.
• It's hard to find 2d lighting solutions, but you can fake a lot of stuff in 3d quickly.
• Walking "behind" objects is way easier with actual 3d space.
• Hundreds more I can't think of...

There's just a bunch of points where 2D tries to emulate 3D, and it's super easy if you're just already in 3d. It's not the perfect solution and many game ideas won't need any of those points, but it's fun to work in 3D for a 2D game.

Mark Grossnickle
Its really not that bad. There are plenty of plugins that take care of a lot of what Josh is doing by hand. Kihon uses 2D Toolkit for a majority of it and then SmoothMoves for bone animations. Highly recommend both to anyone trying 2D in Unity.

Steven Stark
an alternative would be to use 2dtk ( 2D toolkit ) found in the asset store. I have used it a lot, it's very nice.

Great article for those who want to do it themselves, thanks.

Roberto Caldas
Great tutorial, Josh!
A question: why use the ConfigurableJoint? Isn't enough to set the constraints in the RigidBody?
Also, a bit too emphatic to say that we cannot globally constrain the entire physics since that can be achieved programmatically, am I right?

Ted Brown
Superlative! A great file for the bookmarks.

nick ATpainttehDOTcom
looks a bit strange 2d unity. It's like buying Ferrari to drive 50?!? or it's just me.
It remembers me times when ppl were trying to use java for everything :))

Arturo Nereu
Thanks for sharing this. We at Phyne Games are working on a Unity3D - 2D game, we will share our experience soon too.

Livy Stork
I'm confused what to do with the UV assignment code you provided. If I created my mesh programmatically in C#, can I paste that bit of code (or similar code) into that script? Can you explain what the variables represent? Because I've been trying to understand that code for more than an hour now and don't get how it works. (I understand it conceptually though) Thanks!

Greg Back
This is a fantastic start. One thing I was wondering is how you would handle tiling a texture along a single quad. For instance if I had a brick texture somewhere on the atlas and I would like to tile it without generating more quads, how could it be repeated?

Livy Stork
Can someone please explain how the UV code works? I'm on the brink of giving up here.

Matthew Pigram
He opens the atlas image in photoshop then for each different sprite he find its pixel co-ordinates, he then passes them into his function (as pixelDims, which is a 2d vector) then he calculates the UV's based on the pixelDims and applies the UV to the appropriate sprite.

There is code missing, but the general idea is there

