Writing Shader Code for the Universal RP

Fragment Shader

The fragment shader is responsible for determining the colour of the pixel output (including alpha). For unlit shaders this can be a fairly simple solid colour or a colour obtained from sampling an input texture. For lit shaders, it’s a bit more complicated but URP provides some handy functions which I’ll be going through in the Lighting section.

The Varyings data that the fragment shader receives is linearly interpolated between the three vertices that make up the triangle the fragment/pixel is a part of. This is why in the vertex shader you might output (1,0,0), (0,1,0) and (0,0,1) for each vertex of a triangle, but in the fragment shader you end up with something like this image:

If this data was for normals (used for lighting/shading), the interpolation can cause the values to no longer be unit vectors. This result is similar to the barycentric coordinate system, so the centre point is (0.33, 0.33, 0.33), which has a magnitude of around 0.577 rather than the magnitude of 1 a unit vector would have. Because of this, normals should be normalised (via the normalize function) in the fragment stage before use. That said, this is an extreme case where the normals of each vertex are very different – in many cases the magnitude is closer to 1, so normalising can sometimes be skipped to reduce the complexity of the shader (e.g. for lower quality settings).
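As a quick sketch of what that normalisation looks like in a lit shader (assuming the Varyings struct passes a world-space normal in a normalWS field – the field name here is just an example):

	// Interpolated normals may have a magnitude below 1, so re-normalise
	// before using them in lighting calculations.
	half3 normalWS = normalize(IN.normalWS);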

For now, since our shader is Unlit, all we need is:

half4 frag(Varyings IN) : SV_Target {
	half4 baseMap = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv);

	return baseMap * _BaseColor * IN.color;
}

This produces a shader which outputs a half4 colour, based on the _BaseMap texture, which is also tinted by the _BaseColor and vertex colour (IN.color).

The SV_Target part is the semantic that goes with the half4 output, telling the shader that it is the colour output. There is also an SV_Depth output, a float used to override the Z buffer (depth) value per pixel. (They can be put into a struct to output both SV_Target and SV_Depth.) In most cases overriding this isn’t needed, and according to this Unity docs page it turns off some depth-buffer-based optimisations on many GPUs, so don’t override it unless you know what you are doing and need to.
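For reference, outputting both semantics via a struct might look something like this sketch (the struct and field names are arbitrary, and the depth written here is just a placeholder):

	struct FragOut {
		half4 color : SV_Target;	// colour output
		float depth : SV_Depth;		// overrides the depth buffer value per pixel
	};
	
	FragOut frag(Varyings IN) {
		FragOut output;
		output.color = half4(1, 1, 1, 1);
		output.depth = 0.5; // placeholder – compute your own depth value here
		return output;
	}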

Our fragment shader samples the _BaseMap texture using the SAMPLE_TEXTURE2D macro provided by the URP ShaderLibrary, which takes the texture, a sampler and the UVs as inputs.
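For context, the texture and sampler would be declared earlier in the HLSLPROGRAM block using macros from the ShaderLibrary, along the lines of this sketch (the CBUFFER contents shown are assumptions based on the properties used so far):

	TEXTURE2D(_BaseMap);
	SAMPLER(sampler_BaseMap);
	
	CBUFFER_START(UnityPerMaterial)
		float4 _BaseMap_ST;
		half4 _BaseColor;
	CBUFFER_END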

Something we might also want to do is discard pixels if their alpha value is below a certain threshold, so that those parts of the mesh aren’t visible – e.g. for grass/leaf textures on quads. This can be done in opaque shaders as well as transparent ones, and it’s usually referred to as alpha clip/cutoff. If you are familiar with Shader Graph, it’s handled with the Alpha Clip Threshold input on the master nodes.

A common way to handle this is to provide a _Cutoff property to control the threshold, and do the following. (This property would have to be added to our ShaderLab Properties block as well as the UnityPerMaterial CBUFFER for SRP Batcher compatibility.)

if (baseMap.a < _Cutoff) {
	discard;
}
// OR
clip(baseMap.a - _Cutoff);
// inside the fragment function, before returning,
// where baseMap is the colour sampled from the texture earlier
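Putting that together with our fragment shader from before, a sketch of the full function (assuming _Cutoff has been added to the CBUFFER as described) could look like:

	half4 frag(Varyings IN) : SV_Target {
		half4 baseMap = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv);
	
		// Discard this pixel if its alpha is below the threshold
		clip(baseMap.a - _Cutoff);
	
		return baseMap * _BaseColor * IN.color;
	}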

That completes the Unlit Shader example – you can find the full code here.