Coloring the world with shaders
by Victor Ochoa on 06/15/17 10:10:00 am

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

Hi!

I'm a game dev student, and some time ago I made a game with a small team for a five-day class game jam. The game is called Color Wars, and it's a very simple 2.5D PvP game where you shoot your opponent, painting them while also painting the scenario.

Here's a link to the itch.io page, and I'll leave my social media at the end of the post.

 

Show, don't tell

Before starting with the technical part, I want to show you what I'm talking about.

<color wars demo gif>

This is basically it, that's the "painting" effect that made the difference in our little game — and how I managed to get that working is what I'm going to talk about.

 

Breakdown

Geometry

Behind all the magic are the raw 3D models that form the scenario, plus the sprites of our little characters. Everything is colored and textured.

The tricky part is that we are going to mask out the color in the alpha channel.

In other words, the whole alpha channel of the screen will be black by default, until we throw in some paint splatters that turn some areas of it white. Then an image effect will blend between color and grayscale based on that.

Like this:

As you can see there, we are using projectors to draw the splatters onto the surfaces and create the color mask. Each projector is instantiated procedurally when a bullet (the white dots flying around) hits a surface. Projectors have a box collider, so bullets that hit an existing splatter don't create another projector, but instead make that one bigger. This way paint splatters grow, and we keep the number of projectors in the scene fairly low.

The problem is that by default, Unity shaders write "1" to the alpha channel for any opaque object. So I replaced the Unity Standard Shader with a custom one.

It's very basic, I just created a Standard Surface Shader and replaced the surface function with this one:

void surf (Input IN, inout SurfaceOutputStandard o) 
{
	// Albedo comes from a texture tinted by color
	fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
	o.Albedo = c.rgb;
	// Metallic and smoothness come from slider variables
	o.Metallic = _Metallic;
	o.Smoothness = _Glossiness;
	o.Alpha = 0;  // I'm just adding this line!
}

 

Also it's really, really important to change the #pragma line to this:

CGPROGRAM
#pragma surface surf Standard fullforwardshadows keepalpha
// adding "keepalpha" will tell Unity not to override our alpha value!

Something important to notice is that this hack won't work in Unity's deferred rendering pipeline, as it overrides the alpha channel of the G-buffer to store occlusion data.

 

Paint projection

As shown above, when a bullet hits a surface, a Unity projector is created procedurally at the impact point. These projectors use a custom material and a custom shader.

The material texture is just a custom splatter texture with an alpha channel. I'm using this one.

The important part is that its Wrap Mode is set to "Clamp" rather than "Repeat" in the Import Settings.

This is the shader I used for the projector material. It's just a variation of the default ProjectorLight provided by Unity:

Shader "Projector/ProjectAlpha"
{
	Properties
	{
		_ShadowTex ("Cookie", 2D) = "gray" {}
	}
	Subshader
	{
		Tags { "Queue"="Transparent"}
		Pass
		{
			ZWrite Off
			Blend Zero One, One One
			Offset -1, -1

			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#pragma multi_compile_fog
			#include "UnityCG.cginc"
			
			struct Input
			{
				float4 vertex : POSITION;
				float3 normal : NORMAL;
			};

			struct v2f
			{
				float4 uvShadow : TEXCOORD0;
				UNITY_FOG_COORDS(2)
				float4 pos : SV_POSITION;
				fixed nv : COLOR0;
			};
			
			float4x4 unity_Projector;
			float4x4 unity_ProjectorClip;
			
			v2f vert (Input v)
			{
				v2f o;
				o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
				o.uvShadow = mul(unity_Projector, v.vertex);
				UNITY_TRANSFER_FOG(o,o.pos);

				// For me, splatters were being projected onto both the front and
				// the back of objects, so I use the projector's forward direction
				// and the surface normal to check which faces point at the projector.
				float3 projDir = normalize(float3(unity_Projector[2][0], unity_Projector[2][1], unity_Projector[2][2]));
				float nv = dot(v.normal, projDir);
				// a negative dot product means the surface is facing the projector
				o.nv = nv < 0 ? 1 : 0;
				
				return o;
			}
			
			sampler2D _ShadowTex;
			sampler2D _FalloffTex;
			
			fixed4 frag (v2f i) : COLOR
			{
				fixed4 texS = tex2Dproj (_ShadowTex, UNITY_PROJ_COORD(i.uvShadow));
				fixed4 res = fixed4(1, 1, 1, texS.a );
				// Multiply by alpha channel to
				// remove back-side projection.
				res.a *= i.nv;

				UNITY_APPLY_FOG_COLOR(i.fogCoord, res, fixed4(1,1,1,1));
				return res;
			}
			ENDCG
		}
	}
}

 

The most important part of it is the blending options.

But how does blending work?

When a shader computes the color of a pixel, that color has to be applied on top of the color already on the screen. By default, the new pixel completely overrides the old one, but you can instead blend between the old and the new color. This is obviously used to make objects transparent or create semi-transparent effects, but it can be used for a bunch of other cool stuff too.

The Blend keyword can go inside a SubShader or a Pass block. You can even blend between passes of the same shader.

After the Blend keyword, you must write the blend factors. There are a bunch of these:

One, Zero
SrcColor, SrcAlpha
OneMinusSrcColor, OneMinusSrcAlpha
DstColor, DstAlpha
OneMinusDstColor, OneMinusDstAlpha

 

Src refers to the color computed by the shader. Dst refers to the pixel color already on the screen.
The color computed by the shader is multiplied by the first factor, and the color on the screen is multiplied by the second factor. Then both colors are added together and the result is written to the screen.

 

So "Blend SrcAlpha One" would multiply the color from the current shader by its own alpha value, leave the color on the screen unaltered, and then add both together.
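To make the arithmetic concrete, here's a tiny Python model of that fixed-function blend equation. This is just an illustration, not Unity's actual implementation: channel values are floats in 0–1 and get clamped the way a framebuffer would.

```python
def blend(src, dst, src_factor, dst_factor):
    # out = src * src_factor + dst * dst_factor, per channel, clamped to 1
    sf = src_factor(src, dst)
    df = dst_factor(src, dst)
    return tuple(min(1.0, s * f + d * g) for s, d, f, g in zip(src, dst, sf, df))

# A few of the factors ShaderLab offers:
ONE       = lambda src, dst: (1.0, 1.0, 1.0, 1.0)
ZERO      = lambda src, dst: (0.0, 0.0, 0.0, 0.0)
SRC_ALPHA = lambda src, dst: (src[3],) * 4

# "Blend SrcAlpha One": scale the new color by its own alpha,
# leave the screen color untouched, then add them together.
src = (1.0, 0.0, 0.0, 0.5)  # semi-transparent red computed by the shader
dst = (0.0, 0.0, 1.0, 1.0)  # blue already on screen
print(blend(src, dst, SRC_ALPHA, ONE))  # (0.5, 0.0, 1.0, 1.0)
```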

Also, you can write 2 sets of factors, separated with a comma — the blending options before the comma work on the color channels, the ones after the comma work on the alpha channel only.

You can find more info about blending modes on the Unity docs. Experiment to get used to it.

So we have this in our projector shader: "Blend Zero One, One One".

"Zero One" discards the splatter texture's color and keeps the color of the surface it's painted on.
"One One" adds the splatter's alpha value to the surface's alpha value.
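Extending the same idea to separate color/alpha factor pairs shows exactly what "Blend Zero One, One One" does: the RGB already on screen survives untouched while the alphas accumulate. Again, this Python snippet is only a sketch of the math, with made-up example values:

```python
def blend_separate(src, dst, color_factors, alpha_factors):
    # RGB uses one factor pair, alpha uses another,
    # like ShaderLab's "Blend Zero One, One One"
    csf, cdf = color_factors
    asf, adf = alpha_factors
    rgb = tuple(min(1.0, s * csf + d * cdf) for s, d in zip(src[:3], dst[:3]))
    a = min(1.0, src[3] * asf + dst[3] * adf)
    return rgb + (a,)

splatter = (1.0, 1.0, 1.0, 0.5)   # projector output: white, alpha from the cookie
surface  = (0.2, 0.6, 0.3, 0.25)  # what's already on screen
# Zero One -> keep the surface color; One One -> add the alphas
print(blend_separate(splatter, surface, (0.0, 1.0), (1.0, 1.0)))  # (0.2, 0.6, 0.3, 0.75)
```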

 

Now, if we spawn projectors with this shader and texture, we should be painting the alpha channel of the scene view white.

 

Color and grayscale

Now we can modify the alpha channel at will, but that doesn't do anything on its own. We now need to create an image effect that makes use of our alpha mask.

First we need the shader our image effect will use. For this I just created a default Image Effect Shader inside Unity, then replaced the fragment function with this:

fixed4 frag (v2f i) : SV_Target
{
	fixed4 col = tex2D(_MainTex, i.uv);

	// This line generates a black & white version of the screen
	fixed3 bnw = dot(col.rgb, float3(0.3, 0.59, 0.11));
	// Switch between B&W and Color based on alpha channel
	col.rgb = lerp(bnw, col.rgb, col.a);

	return col;
}
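The same math is easy to check outside the shader. This Python sketch mimics the fragment function above: a weighted grayscale (the 0.3/0.59/0.11 weights are the classic Rec. 601-style luma approximation), then a lerp back to full color driven by alpha:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def frag(color):
    r, g, b, a = color
    # weighted average approximates perceived brightness
    luma = 0.3 * r + 0.59 * g + 0.11 * b
    # alpha 0 -> grayscale, alpha 1 -> original color
    return tuple(lerp(luma, c, a) for c in (r, g, b)) + (a,)

print(frag((0.8, 0.2, 0.4, 0.0)))  # all three channels equal: grayscale
print(frag((0.8, 0.2, 0.4, 1.0)))  # the original color back
```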

 

You can swap the bnw color for whatever you like, and it will blend nicely. Finally, to get all of this working we need a script to run the image effect. The code is as simple as this:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityStandardAssets.ImageEffects;

[ExecuteInEditMode]
[ImageEffectAllowedInSceneView]
public class AlphaColorSwitch : ImageEffectBase
{
	void OnRenderImage ( RenderTexture source, RenderTexture destination )
	{
		Graphics.Blit ( source, destination, material );
	}
}

Notice I'm using ImageEffectBase. It's part of the Effects package in Unity's Standard Assets. Import the package, attach the script to the camera, and set the public shader variable to the image effect shader we just created.

Be sure to set your camera's rendering path to Forward.

 

And you should be able to paint the scenario with some projectors!

 

Limitations & further improvement

Alpha masking means no alpha channel!

Which means no transparency and no support for deferred rendering. This could probably be solved using stencils, command buffers or even Multiple Render Targets. My implementation is surely not the best, but it worked for our jam.

Too many custom shaders

I like it when things "just work", and having to replace every object's shader with a custom one isn't really good practice. Deferred rendering would solve this, if you could get the whole thing working in deferred.

 

Thanks!

For reading this post! It's my very first one, but I'll try to write more if I find the time. I hope you found some interesting stuff along the way. Let me know any improvements, thoughts or feedback in the comments!

You can find me on Twitter and Artstation.

