Vertex Displacement

While fragment shaders provide colours for each fragment/pixel, we can use vertex shaders to offset the vertices of the mesh. In shadergraph, nodes connected to the Master node's Position input will be written into the Vertex stage of the shader (while anything else is typically handled in the Fragment stage). Note that this can affect how nodes can be connected: nodes in the vertex stage may not be able to connect to nodes in the fragment stage, and vice versa. (e.g. The Sample Texture 2D node does not work in the vertex stage. If you need to sample a texture there, use Sample Texture 2D LOD instead!)
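
The reason for this is that regular texture sampling picks a mip level automatically using screen space derivatives, which only exist in the fragment stage, so in the vertex stage the mip level has to be given explicitly. As a rough code equivalent (the _DisplacementTex texture and v.uv input here are just placeholders):

sampler2D _DisplacementTex;   // placeholder texture property

// Vertex stage, built-in pipeline (CG) : the mip level (0 here) must be passed explicitly
float4 disp = tex2Dlod(_DisplacementTex, float4(v.uv, 0, 0));

// Vertex stage, URP (HLSL) equivalent, assuming TEXTURE2D(_DisplacementTex) / SAMPLER(sampler_DisplacementTex) declarations :
// float4 disp = SAMPLE_TEXTURE2D_LOD(_DisplacementTex, sampler_DisplacementTex, v.uv, 0);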

The process of offsetting vertices is usually referred to as Vertex Displacement. To offset the vertices we create a Position node set to Object space. This returns a Vector3 of our vertex position that we can then manipulate. To offset the vertex position we can use Add or Subtract nodes, and we can also scale it using Multiply or Divide nodes. Scaling should be done first, assuming you want to keep the offset in terms of object space units. We can scale/offset each vertex by using a Vector3 node and setting each component separately, or by using a Vector3 property and controlling it from the inspector or a C# script (via materialReference.SetVector("_Ref", vector), where _Ref is the reference string of the property in the shadergraph blackboard).

In terms of code, this is just equal to:
vertex *= scale;
vertex += offset;
But a vertex shader written in code also needs to output a clip space position, so:
o.pos = UnityObjectToClipPos(vertex);
That's only available if you include "UnityCG.cginc". In a URP shader, TransformWorldToHClip is used instead (note it expects a world space position, so use TransformObjectToWorld first, or the TransformObjectToHClip helper); these are included via "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl".
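
As a minimal sketch of how this fits into a full (built-in pipeline) vertex function, assuming it sits inside a CGPROGRAM block with "UnityCG.cginc" included; _Scale and _Offset are placeholder Vector properties and the structs are trimmed down:

struct appdata { float4 vertex : POSITION; };
struct v2f { float4 pos : SV_POSITION; };

float4 _Scale;    // placeholder properties, e.g. set via material.SetVector
float4 _Offset;

v2f vert (appdata v)
{
    v2f o;
    float3 vertex = v.vertex.xyz;
    vertex *= _Scale.xyz;     // scale first, in object space units
    vertex += _Offset.xyz;    // then offset
    o.pos = UnityObjectToClipPos(vertex);
    return o;
}

// URP equivalent of the last transform :
// o.pos = TransformWorldToHClip(TransformObjectToWorld(vertex));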

And here's another example if you want to scale both the X and Z with the same Vector1 property. Since we use a Multiply to adjust the scale, we set the Y value to 1 so there is no change. When offsetting, since we use an Add/Subtract, we would use a value of 0 for no change instead.

vertex *= float3(_Scale, 1, _Scale);

We can also use a Time node to animate the offset over time. For example, creating a swaying effect for grass/plants (shown below), or animating fish or butterflies (shown later in the post).

(Click image to view a larger version)
Right Note : “World space swaying movement : e.g. Swaying for grass, plants, trees to simulate wind.
By using the Position node set to World space, we ensure that the offsetting always occurs along the world space X axis, regardless of the transform’s rotation. The Position input on the Master node needs to be in Object space though, so we use a Transform node. Want the sway on a different axis? Change the Vector3 node inputs. Plug it into both X and Z for some diagonal movement. Or put the Vector3 into a Rotate About Axis node if you want a more specific rotation. (Use an axis of (0,1,0) for rotating about the Y axis)”

Left Note : “Use UVs.y for Quads. Use Position.y (object space) for Models. If the origin is not at the base of the model, use an Add node to offset it. If the model doesn’t use Y as up, use a different output on the Split node.”
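
Reconstructed from the notes above, the sway roughly boils down to the following in code. This is only a sketch: _SwaySpeed and _SwayStrength are placeholder properties, v.uv/v.vertex are the assumed vertex inputs, and URP's Core.hlsl is assumed to be included for the transform functions.

float _SwaySpeed;      // placeholder properties (the actual values aren't shown in the graph)
float _SwayStrength;

float3 positionWS = TransformObjectToWorld(v.vertex.xyz);          // Position node, World space
float mask = saturate(v.uv.y);                                      // mask from UVs.y (quad), or object space Y for models
positionWS.x += sin(_Time.y * _SwaySpeed) * _SwayStrength * mask;   // sway along the world X axis
v.vertex.xyz = TransformWorldToObject(positionWS);                  // Transform node, back to Object space for the Master Position input

Adding the world position into the Sine input as well would stop every plant swaying exactly in sync.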

Note that shaders run on the GPU and will not actually update the mesh itself or affect colliders. This means that while we can scale or offset vertices outside the bounds of the mesh, those bounds won’t be updated. If the original bounds move outside the camera’s view frustum, the renderer will be culled even though the displaced vertices might still be visible. Either keep the scaling/offsetting small, or set the bounds to be larger:

Mesh mesh = GetComponent<MeshFilter>().mesh;   // note : .mesh creates a new mesh instance (see below)
mesh.bounds = new Bounds(origin, size);        // center and size, in the mesh's local space

Note that accessing .mesh creates a new mesh instance, so be sure to Destroy(mesh) in the MonoBehaviour.OnDestroy function! Use .sharedMesh instead if you don’t want to create a mesh instance; however, all objects that use that shared mesh will then share those bounds. As a tip for working out what origin/size to use, add a Box Collider to the object, resize it, and copy the center/size values.

Some uses of vertex displacement:

  • Adding detail to a mesh using displacement textures along with tessellation (which can’t be done in shadergraph yet as far as I know, at least not in LWRP/URP).
  • Animating vertices on a water plane to simulate waves. (See this catlikecoding tutorial for a good example; a very rough single-wave version is also sketched just after this list)
  • Creating various effects such as melting and wobbling.
  • Simulating wind on grass, plants and trees, an example of which was shown above.
  • Adding simple animation to static animal meshes. Specifically, the motion of swimming fish and flapping butterfly/bird wings is fairly simple to simulate. See below for examples of this. The game ABZÛ also uses this a lot; see this ABZÛ GDC Talk.
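
For the waves point above, the simplest possible version is just a single sine wave over the plane, something along the lines of the sketch below (placeholder property names and values, nothing like the Gerstner waves the catlikecoding tutorial builds up to). Using the World space position instead of the object space one would also keep the waves lined up across multiple tiled planes.

float _WaveFrequency;   // placeholder properties
float _WaveSpeed;
float _WaveStrength;

vertex.y += sin((vertex.x + vertex.z) * _WaveFrequency + _Time.y * _WaveSpeed) * _WaveStrength;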

I won’t be explaining the graphs below in detail, but they should give some more examples of vertex displacement.

Swimming Motion (Fish)

A simple way to create a swimming motion for a fish is to offset the R/X position of the fish (left/right) using a Sine wave based on Time added to the position along the fish (forward/back, which is actually G/Y for the model I’m using). This can then be multiplied with a mask created from the G/Y position, so that vertices at the front of the fish move less.

FishGraph (click to open a larger version in a new tab)

In terms of code, this would be something along the lines of :

vertex.x += sin(vertex.y * 2 + _Time.y * 4) * (vertex.y - 1) * 0.25; // sine wave along the fish's length, scaled by (y - 1) so the offset fades to 0 towards the front (y = 1)

Note that values may vary based on the scale of the fish mesh. This was based on a model 2 units in length, with the origin at the center.

Wings Motion (Butterfly, Bird)

For the wing motion of a butterfly, we first create a mask based on the R/X (left/right) axis of the position, which determines how much we offset the wings. The mask uses an Absolute node so that both wings are offset in the same B/Z direction (which is up/down for this model).

We can then offset the B/Z position by a Sine wave based just on Time, which keeps each wing straight as it flaps. If we want the wings to bend (which would be better for a bird), we can Add the mask to the Time output and put this into the Sine node instead.

We can also offset the R/X (left/right) a little, so as the wings go upwards they also move slightly inwards, which will reduce stretching and make the motion feel more realistic.

ButterflyGraph (click to open a larger version in a new tab)

In terms of code, this would be something along the lines of :

float mask = abs(vertex.x) - 0.2; // roughly 0 near the body, increasing towards the wing tips
float s = sin(_Time.y * 15.0); // for butterfly
// or : float s = sin(_Time.y * 10.0 + mask); // for bird (adding the mask makes the wings bend)
vertex.x += saturate(s * 0.6) * saturate(mask) * -1.0 * sign(vertex.x); // pull wings slightly inwards as they rise
vertex.z += (s + 0.5) * 0.75 * mask; // flap wings up/down

Note that values may vary based on the scale of the butterfly mesh. This was based on a model of 2 by 2 units, with the origin being in the center.