Writing Shader Code for the Universal RP

Summary of Built-in vs URP differences

Here’s a summary of the differences between the built-in pipeline and the Universal RP, focusing mostly on the shader code differences. Sorry if I’ve missed anything out!

  • Use the “RenderPipeline”=”UniversalPipeline” tag on the SubShader block
  • URP uses the following “LightMode” tags :
    • UniversalForward – used to render objects with the Forward Renderer.
    • ShadowCaster – used for casting shadows
    • DepthOnly – seems to be used when rendering the depth texture for the scene view, but not in-game? Some renderer features might make use of it though.
    • Meta – used during lightmap baking only
    • Universal2D – used when the 2D Renderer is enabled, instead of the Forward Renderer.
    • UniversalGBuffer – related to deferred rendering. I think this is in development / testing. URP v10+?
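As a rough sketch of where these tags go (the shader name, pass names and pass contents here are just placeholders, not a complete shader) :

```hlsl
Shader "Custom/ExampleURPShader" {
	SubShader {
		// Marks this SubShader as intended for the Universal RP
		Tags { "RenderPipeline"="UniversalPipeline" }

		Pass {
			Name "ForwardLit"
			// Rendered by the Forward Renderer as the main colour pass
			Tags { "LightMode"="UniversalForward" }
			// HLSLPROGRAM ... ENDHLSL
		}

		Pass {
			Name "ShadowCaster"
			// Used when rendering shadow maps
			Tags { "LightMode"="ShadowCaster" }
			// HLSLPROGRAM ... ENDHLSL
		}
	}
}
```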
  • URP uses a single-pass forward renderer, so only the first supported UniversalForward pass will be rendered. While you can achieve multi-pass shaders by leaving other passes untagged, be aware that this will break batching with the SRP Batcher. It is instead recommended to use separate shaders/materials, either on separate MeshRenderers or via the Render Objects feature on the Forward Renderer :
  • The RenderObjects Forward Renderer feature can be used to re-render objects on a specific layer with an overrideMaterial (similar to replacement shaders, except the values of properties are not retained – unless you use a material property block, but that also breaks batching with the SRP Batcher). You can also override stencil and ztest values on the feature. See here for examples of the RenderObjects feature being used (Toon “inverted hull” style outlines and object xray/occlusion effects).
  • You can also write Custom Forward Renderer features, for example a Blit feature like the one here (Blit.cs and BlitPass.cs), can be used to achieve custom post processing effects (as URP’s post-processing solution currently doesn’t include custom effects).
  • Always use HLSLPROGRAM (or HLSLINCLUDE) and ENDHLSL, not the CG versions, otherwise there will be conflicts with the URP ShaderLibrary.
  • Instead of including UnityCG.cginc, use the URP ShaderLibrary. The main one to include is :
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
  • There’s not really any documentation on the functions the ShaderLibrary includes, but see the source code here and here. I’ll also be mentioning a few commonly used things below.
  • The structs used to pass data in and out of the vertex program are usually called Attributes and Varyings in URP, rather than appdata and v2f. Just a naming convention thing and probably isn’t important though.
  • Your shaders need to have a UnityPerMaterial CBUFFER to be compatible with the SRP Batcher. (A UnityPerDraw is also required but the URP ShaderLibrary handles this). This buffer must be the same for every pass in the shader, so it’s usually a good idea to put it in a HLSLINCLUDE in the Subshader. It should include every exposed Property which will be used in the shader functions, except textures which don’t have to be in there. As an example :
Properties {
	_BaseMap ("Example Texture", 2D) = "white" {}
	_BaseColor ("Example Colour", Color) = (0, 0.66, 0.73, 1)
}

HLSLINCLUDE
	#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

	CBUFFER_START(UnityPerMaterial)
	float4 _BaseMap_ST;
	float4 _BaseColor;
	// any other properties etc.
	CBUFFER_END
ENDHLSL
  • Also shown above, _BaseMap tends to be used for the albedo texture, instead of _MainTex. It’s mostly just a naming convention difference, which isn’t too important unless you include SurfaceInput.hlsl. _MainTex should still be used for post processing Blit and obtaining the sprite from a SpriteRenderer.
  • When defining the texture and sampler, use the following macros :
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
  • And for sampling the texture, use SAMPLE_TEXTURE2D :
half4 baseMap = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv);
  • The TRANSFORM_TEX macro is also included in URP.
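For example, applying the tiling & offset stored in _BaseMap_ST in the vertex shader (assuming the structs have uv fields, similar to the earlier examples) :

```hlsl
// Vertex shader, assuming Attributes has "float2 uv : TEXCOORD0"
// and Varyings has a matching "float2 uv : TEXCOORD0" field :
OUT.uv = TRANSFORM_TEX(IN.uv, _BaseMap);
```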
  • UnityObjectToClipPos has been replaced with TransformObjectToHClip. You can however also use GetVertexPositionInputs, which allows you to obtain the position in clip space (positionCS), world space (positionWS), view space (positionVS) and normalised device coords (positionNDC). Any unused ones won’t be calculated so this is quite a convenient function. For example :
struct Attributes {
	float4 positionOS	: POSITION;
};

struct Varyings {
	float4 positionCS	: SV_POSITION;
	float3 positionWS	: TEXCOORD2;
};

Varyings vert(Attributes IN) {
	Varyings OUT;
	VertexPositionInputs positionInputs = GetVertexPositionInputs(IN.positionOS.xyz);
	OUT.positionCS = positionInputs.positionCS;
	OUT.positionWS = positionInputs.positionWS;
	return OUT;
}

Similarly, there is a GetVertexNormalInputs function to obtain the world space normal (normalWS), as well as the world space tangent (tangentWS) and bitangent (bitangentWS). If you just need the normalWS, you can use TransformObjectToWorldNormal instead.

VertexNormalInputs normalInputs = GetVertexNormalInputs(IN.normalOS, IN.tangentOS);
// or, if you just need the normal :
OUT.normalWS = TransformObjectToWorldNormal(IN.normalOS);
  • URP does not support Surface Shaders – you have to write vertex/fragment ones. If you want to support lighting, there is a Lighting.hlsl file you can include which contains some useful lighting functions.
  • For examples you’ll have to look at the Lighting Introduction and PBR Lighting sections of the post.
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
  • When supporting lighting and shadows using Lighting.hlsl, you should also include the following. If these keywords aren’t defined, the ShaderLibrary will skip calculations :
// Main Light Shadows
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS_CASCADE

// Additional Lights & Shadows
#pragma multi_compile _ _ADDITIONAL_LIGHT_SHADOWS

// Soft Shadows
#pragma multi_compile _ _SHADOWS_SOFT

// Other (Mixed lighting, baked lightmaps, fog)
#pragma multi_compile _ _MIXED_LIGHTING_SUBTRACTIVE
#pragma multi_compile _ DIRLIGHTMAP_COMBINED
#pragma multi_compile _ LIGHTMAP_ON
#pragma multi_compile_fog

// Supporting shadows will also require passing a positionWS, 
// and shadowCoord into the fragment shader, again you'll have 
// to see the Lighting sections for actual examples.
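As a very rough sketch of what that can look like (assuming positionWS and normalWS have been passed in via the Varyings struct – see the Lighting sections of the post for proper examples) :

```hlsl
// Fragment shader, minimal main light sketch only :
float4 shadowCoord = TransformWorldToShadowCoord(IN.positionWS);
Light mainLight = GetMainLight(shadowCoord);
half3 diffuse = mainLight.color * mainLight.shadowAttenuation
	* saturate(dot(normalize(IN.normalWS), mainLight.direction));
```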
  • To handle fog, use ComputeFogFactor and MixFog functions :
#pragma multi_compile_fog

struct Varyings {
	half fogFactor : TEXCOORD5;
	// or whatever unused texcoord
	// if none are unused, pack it together with a half3 or something
};

// In the vertex shader :
half fogFactor = ComputeFogFactor(positionInputs.positionCS.z);

// In the fragment, just before returning the color :
color.rgb = MixFog(color.rgb, IN.fogFactor);

If I’ve missed anything out, feel free to contact me. @Cyanilux on twitter. I also have a discord server. If you have any questions about URP I might be able to answer them there.