Writing Shader Code for the Universal RP

PBR Lighting

Physically Based Rendering (PBR) is the shading/lighting model that Unity’s “Standard” shader uses, as well as the URP’s “Lit” shader and the PBR Master node in Shader Graph.

As mentioned in the previous section, shading/lighting in the built-in pipeline was usually handled by Surface Shaders, where the “Standard” option was a PBR model. These used a surface function which outputted Albedo, Normal, Emission, Smoothness, Occlusion, Alpha and Metallic (or Specular if using the “StandardSpecular” workflow). Unity would take those and generate a vertex and fragment shader behind the scenes, handling certain calculations, such as the PBR shading/lighting and shadows, for you.
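For reference, a minimal built-in pipeline surface shader looked something like this (roughly the default template Unity generates for a new surface shader — shown only for comparison, it won’t compile under URP):

```hlsl
// Built-in pipeline only! Shown for comparison with the URP approach below.
Shader "Example/SurfaceShaderExample" {
	Properties {
		_MainTex ("Albedo", 2D) = "white" {}
		_Glossiness ("Smoothness", Range(0,1)) = 0.5
		_Metallic ("Metallic", Range(0,1)) = 0.0
	}
	SubShader {
		Tags { "RenderType"="Opaque" }
		CGPROGRAM
		// Unity generates the vertex/fragment shaders & PBR lighting for us
		#pragma surface surf Standard fullforwardshadows

		sampler2D _MainTex;
		half _Glossiness;
		half _Metallic;

		struct Input { float2 uv_MainTex; };

		void surf (Input IN, inout SurfaceOutputStandard o) {
			fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
			o.Albedo = c.rgb;
			o.Metallic = _Metallic;
			o.Smoothness = _Glossiness;
			o.Alpha = c.a;
		}
		ENDCG
	}
}
```

In URP we instead have to write the vertex and fragment shaders ourselves, filling in the data those generated shaders used to handle.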

The Universal RP does not support surface shaders; however, the ShaderLibrary does provide functions to handle a lot of the lighting calculations for us. These are contained in Lighting.hlsl. In this section we’ll be focusing on UniversalFragmentPBR :

half4 UniversalFragmentPBR(InputData inputData, half3 albedo,
   half metallic, half3 specular, half smoothness, half occlusion, 
   half3 emission, half alpha)

// There is also a version taking a SurfaceData struct, added in URP v10.x.x.
// For earlier versions we need to use the one above instead.
// (But you can still use the SurfaceData struct to organise/hold the data)
half4 UniversalFragmentPBR(InputData inputData, SurfaceData surfaceData)

// There is also :
half4 UniversalFragmentBlinnPhong(InputData inputData, half3 diffuse,
   half4 specularGloss, half smoothness, half3 emission, half alpha)
// Which replicates the legacy (pre-Standard, non-PBR) lighting model,
// and is used by URP's "SimpleLit" shader.
// Uses the Lambert (diffuse) & BlinnPhong (specular) lighting models.

First we should add some properties that the PBR lighting model uses.

I’m leaving out the metallic/specular and occlusion maps, mainly because the ShaderLibrary doesn’t have convenient functions to handle the sampling for you (unless you copy them out of LitInput.hlsl, which is part of the Lit shader URP provides, not the actual ShaderLibrary), and this section is long and complicated enough already. There’s very little I can actually explain, as it’s mainly just knowing which functions to use where. You can always add them in later using LitInput.hlsl as an example.
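If you do want to add metallic map support later, a sketch based on the SampleMetallicSpecGloss function in LitInput.hlsl could look something like this. (Note this is my simplified version, not the actual LitInput code — it assumes a _MetallicGlossMap texture and _Metallic float have been declared, matching the property names the Lit shader uses, and only handles the metallic workflow) :

```hlsl
// Sketch based on LitInput.hlsl (part of the URP Shaders, not the ShaderLibrary).
// Assumes : TEXTURE2D(_MetallicGlossMap); SAMPLER(sampler_MetallicGlossMap);
// and _Metallic / _Smoothness floats in the UnityPerMaterial CBUFFER.
half4 SampleMetallicGloss(float2 uv) {
	half4 metallicGloss;
#ifdef _METALLICSPECGLOSSMAP
	// r = metallic, a = smoothness (metallic workflow)
	metallicGloss = SAMPLE_TEXTURE2D(_MetallicGlossMap, sampler_MetallicGlossMap, uv);
	metallicGloss.a *= _Smoothness;
#else
	// No map, just use the slider values
	metallicGloss = half4(_Metallic, 0, 0, _Smoothness);
#endif
	return metallicGloss;
}
```

The result would feed surfaceData.metallic (r channel) and surfaceData.smoothness (a channel) in the InitializeSurfaceData function shown later.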

Properties {
	_BaseMap ("Base Texture", 2D) = "white" {}
	_BaseColor ("Example Colour", Color) = (0, 0.66, 0.73, 1)
	_Smoothness ("Smoothness", Float) = 0.5

	[Toggle(_ALPHATEST_ON)] _EnableAlphaTest("Enable Alpha Cutoff", Float) = 0.0
	_Cutoff ("Alpha Cutoff", Float) = 0.5

	[Toggle(_NORMALMAP)] _EnableBumpMap("Enable Normal/Bump Map", Float) = 0.0
	_BumpMap ("Normal/Bump Texture", 2D) = "bump" {}
	_BumpScale ("Bump Scale", Float) = 1

	[Toggle(_EMISSION)] _EnableEmission("Enable Emission", Float) = 0.0
	_EmissionMap ("Emission Texture", 2D) = "white" {}
	_EmissionColor ("Emission Colour", Color) = (0, 0, 0, 0)
}
...
// And need to adjust the CBUFFER to include these too
CBUFFER_START(UnityPerMaterial)
	float4 _BaseMap_ST; // Texture tiling & offset inspector values
	float4 _BaseColor;
	float _BumpScale;
	float4 _EmissionColor;
	float _Smoothness;
	float _Cutoff;
CBUFFER_END

We also need to make a bunch of changes to the Unlit shader code, including adding some multi_compile and shader_feature pragmas and adjusting the Attributes and Varyings structs, as we need normal and tangent data from the mesh, passed through to the fragment stage, in order to use them in the lighting calculations.

The [Toggle(…)] parts in our Properties block allow us to enable/disable the shader_feature keywords from the material inspector. (Alternatively, we could write a custom editor/inspector GUI for the shader, or use the debug inspector).
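To make the pairing explicit, each toggle drawer works together with a matching shader_feature like this (a sketch — the keyword name just has to be identical in both places) :

```hlsl
// In the Properties block — the drawer enables/disables the _NORMALMAP
// material keyword when the checkbox changes :
// [Toggle(_NORMALMAP)] _EnableBumpMap("Enable Normal/Bump Map", Float) = 0.0

// In the HLSLPROGRAM — tells Unity to compile variants with & without the keyword :
// #pragma shader_feature _NORMALMAP

#ifdef _NORMALMAP
	// Code here only exists in the variant used while the toggle is ticked
#endif
```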

If we want to support baked lightmaps, we’ll also need lightmap UVs which are passed in the TEXCOORD1 channel.

I’m also using SurfaceInput.hlsl from the ShaderLibrary to assist with some things. It provides a SurfaceData struct to hold the data needed for PBR, and some functions for sampling the albedo, normal and emission maps. (Note that the struct seems to have moved to SurfaceData.hlsl in URP v10, but that file gets included automatically by SurfaceInput.hlsl).

// Material Keywords
#pragma shader_feature _NORMALMAP
#pragma shader_feature _ALPHATEST_ON
#pragma shader_feature _ALPHAPREMULTIPLY_ON
#pragma shader_feature _EMISSION
//#pragma shader_feature _METALLICSPECGLOSSMAP
//#pragma shader_feature _SMOOTHNESS_TEXTURE_ALBEDO_CHANNEL_A
//#pragma shader_feature _OCCLUSIONMAP

//#pragma shader_feature _SPECULARHIGHLIGHTS_OFF
//#pragma shader_feature _ENVIRONMENTREFLECTIONS_OFF
//#pragma shader_feature _SPECULAR_SETUP
#pragma shader_feature _RECEIVE_SHADOWS_OFF

// URP Keywords
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS_CASCADE
#pragma multi_compile _ _ADDITIONAL_LIGHTS_VERTEX _ADDITIONAL_LIGHTS
#pragma multi_compile _ _ADDITIONAL_LIGHT_SHADOWS
#pragma multi_compile _ _SHADOWS_SOFT
#pragma multi_compile _ _MIXED_LIGHTING_SUBTRACTIVE

// Unity defined keywords
#pragma multi_compile _ DIRLIGHTMAP_COMBINED
#pragma multi_compile _ LIGHTMAP_ON
#pragma multi_compile_fog

// Some added includes, required to use the Lighting functions
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
// And this one for the SurfaceData struct and albedo/normal/emission sampling functions.
// Note : It also defines the _BaseMap, _BumpMap and _EmissionMap textures for us, so we should use those names in our Shaderlab Properties too.
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/SurfaceInput.hlsl"

struct Attributes {
	float4 positionOS   : POSITION;
	float3 normalOS		: NORMAL;
	float4 tangentOS	: TANGENT;
	float4 color		: COLOR;
	float2 uv           : TEXCOORD0;
	float2 lightmapUV   : TEXCOORD1;
};

struct Varyings {
	float4 positionCS				: SV_POSITION;
	float4 color					: COLOR;
	float2 uv    					: TEXCOORD0;
	DECLARE_LIGHTMAP_OR_SH(lightmapUV, vertexSH, 1);
	// Note this macro is using TEXCOORD1
#ifdef REQUIRES_WORLD_SPACE_POS_INTERPOLATOR
	float3 positionWS				: TEXCOORD2;
#endif
	float3 normalWS					: TEXCOORD3;
#ifdef _NORMALMAP
	float4 tangentWS				: TEXCOORD4;
#endif
	float3 viewDirWS 				: TEXCOORD5;
	half4 fogFactorAndVertexLight	: TEXCOORD6;
	// x: fogFactor, yzw: vertex light
#ifdef REQUIRES_VERTEX_SHADOW_COORD_INTERPOLATOR
	float4 shadowCoord				: TEXCOORD7;
#endif
};

//TEXTURE2D(_BaseMap);
//SAMPLER(sampler_BaseMap);
// Removed, since SurfaceInput.hlsl now defines the _BaseMap for us

Our Varyings struct now contains the lightmap UVs, normals and tangents being passed through, as well as a view direction (required for the lighting calculations), fog and vertex lighting support, and a shadow coord for receiving shadows.

Now our vertex shader needs to be updated to handle all these changes, which is mainly just knowing which functions to use :

#if SHADER_LIBRARY_VERSION_MAJOR < 9
	// This function was added in URP v9.x.x
	// If we want to support earlier URP versions, we need to define it ourselves.
	// Computes the world space view direction (pointing towards the viewer).
	float3 GetWorldSpaceViewDir(float3 positionWS) {
		if (unity_OrthoParams.w == 0) {
			// Perspective
			return _WorldSpaceCameraPos - positionWS;
		} else {
			// Orthographic
			float4x4 viewMat = GetWorldToViewMatrix();
			return viewMat[2].xyz;
		}
	}
#endif

Varyings vert(Attributes IN) {
	Varyings OUT;

	// Vertex Position
	VertexPositionInputs positionInputs = GetVertexPositionInputs(IN.positionOS.xyz);
	OUT.positionCS = positionInputs.positionCS;
#ifdef REQUIRES_WORLD_SPACE_POS_INTERPOLATOR
	OUT.positionWS = positionInputs.positionWS;
#endif
	// UVs & Vertex Colour
	OUT.uv = TRANSFORM_TEX(IN.uv, _BaseMap);
	OUT.color = IN.color;

	// View Direction
	OUT.viewDirWS = GetWorldSpaceViewDir(positionInputs.positionWS);

	// Normals & Tangents
	VertexNormalInputs normalInputs = GetVertexNormalInputs(IN.normalOS, IN.tangentOS);
	OUT.normalWS = normalInputs.normalWS;
#ifdef _NORMALMAP
	real sign = IN.tangentOS.w * GetOddNegativeScale();
	OUT.tangentWS = half4(normalInputs.tangentWS.xyz, sign);
#endif

	// Vertex Lighting & Fog
	half3 vertexLight = VertexLighting(positionInputs.positionWS, normalInputs.normalWS);
	half fogFactor = ComputeFogFactor(positionInputs.positionCS.z);
	OUT.fogFactorAndVertexLight = half4(fogFactor, vertexLight);

	// Baked Lighting & SH (used for Ambient if there is no baked)
	OUTPUT_LIGHTMAP_UV(IN.lightmapUV, unity_LightmapST, OUT.lightmapUV);
	OUTPUT_SH(OUT.normalWS.xyz, OUT.vertexSH);

	// Shadow Coord
#ifdef REQUIRES_VERTEX_SHADOW_COORD_INTERPOLATOR
	OUT.shadowCoord = GetShadowCoord(positionInputs);
#endif
	return OUT;
}

Now we can update the fragment shader to actually use the UniversalFragmentPBR function. As it takes an InputData struct, we’ll need to create and fill that in. Instead of doing this in the fragment shader itself, we’ll create another function to help organise things.

Similarly, to handle all the Albedo, Metallic, Specular, Smoothness, Occlusion, Emission and Alpha inputs, we’ll use the SurfaceData struct (provided by the SurfaceInput.hlsl we included earlier) and create another function to handle that.

InputData InitializeInputData(Varyings IN, half3 normalTS){
	InputData inputData = (InputData)0;

#if defined(REQUIRES_WORLD_SPACE_POS_INTERPOLATOR)
	inputData.positionWS = IN.positionWS;
#endif
				
	half3 viewDirWS = SafeNormalize(IN.viewDirWS);
#ifdef _NORMALMAP
	float sgn = IN.tangentWS.w; // should be either +1 or -1
	float3 bitangent = sgn * cross(IN.normalWS.xyz, IN.tangentWS.xyz);
	inputData.normalWS = TransformTangentToWorld(normalTS, half3x3(IN.tangentWS.xyz, bitangent.xyz, IN.normalWS.xyz));
#else
	inputData.normalWS = IN.normalWS;
#endif

	inputData.normalWS = NormalizeNormalPerPixel(inputData.normalWS);
	inputData.viewDirectionWS = viewDirWS;

#if defined(REQUIRES_VERTEX_SHADOW_COORD_INTERPOLATOR)
	inputData.shadowCoord = IN.shadowCoord;
#elif defined(MAIN_LIGHT_CALCULATE_SHADOWS)
	inputData.shadowCoord = TransformWorldToShadowCoord(inputData.positionWS);
#else
	inputData.shadowCoord = float4(0, 0, 0, 0);
#endif

	inputData.fogCoord = IN.fogFactorAndVertexLight.x;
	inputData.vertexLighting = IN.fogFactorAndVertexLight.yzw;
	inputData.bakedGI = SAMPLE_GI(IN.lightmapUV, IN.vertexSH, inputData.normalWS);
	return inputData;
}

SurfaceData InitializeSurfaceData(Varyings IN){
	SurfaceData surfaceData = (SurfaceData)0;
	// Note, we can just use SurfaceData surfaceData; here and not set it.
	// However we then need to ensure all values in the struct are set before returning.
	// By casting 0 to SurfaceData, we automatically set all the contents to 0.
		
	half4 albedoAlpha = SampleAlbedoAlpha(IN.uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));
	surfaceData.alpha = Alpha(albedoAlpha.a, _BaseColor, _Cutoff);
	surfaceData.albedo = albedoAlpha.rgb * _BaseColor.rgb * IN.color.rgb;

	// Not supporting the metallic/specular map or occlusion map
	// for an example of that see : https://github.com/Unity-Technologies/Graphics/blob/master/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl

	surfaceData.smoothness = _Smoothness;
	surfaceData.normalTS = SampleNormal(IN.uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
	surfaceData.emission = SampleEmission(IN.uv, _EmissionColor.rgb, TEXTURE2D_ARGS(_EmissionMap, sampler_EmissionMap));
	surfaceData.occlusion = 1;
	return surfaceData;
}

half4 frag(Varyings IN) : SV_Target {
	SurfaceData surfaceData = InitializeSurfaceData(IN);
	InputData inputData	= InitializeInputData(IN, surfaceData.normalTS);
				
	// In URP v10+ versions we could use this :
	// half4 color = UniversalFragmentPBR(inputData, surfaceData);

	// But for other versions, we need to use this instead.
	// We could also avoid using the SurfaceData struct completely, but it helps to organise things.
	half4 color = UniversalFragmentPBR(inputData, surfaceData.albedo, surfaceData.metallic, 
	  surfaceData.specular, surfaceData.smoothness, surfaceData.occlusion, 
	  surfaceData.emission, surfaceData.alpha);
				
	color.rgb = MixFog(color.rgb, inputData.fogCoord);

	// color.a = OutputAlpha(color.a);
	// Not sure if this is important really. It's implemented as :
	// saturate(outputAlpha + _DrawObjectPassData.a);
	// Where _DrawObjectPassData.a is 1 for opaque objects and 0 for alpha blended.
	// But it was only added in URP v8, and versions before just didn't have it.
	// (And I'm writing this for v7.3.1 currently)
	// We can still saturate the alpha to ensure it stays in the 0-1 range though :
	color.a = saturate(color.a);

	return color;
}

And that’s the PBR lighting… I know that was quite a lot of code! Hopefully it’s not too complicated to see how it all works. As I mentioned earlier, it’s mainly just knowing which functions to use where, so there’s not a lot I can really explain.

Currently, however, while our shader can receive shadows, it doesn’t include a ShadowCaster pass, so it isn’t casting any. This will be handled in the next section.