Introduction to Shadergraph

Housekeeping

  • Please mute your mic!
  • If you have a question, either use the raise-hand feature on Zoom or just chime in with your voice (hold spacebar to temporarily unmute).
  • I am recording this. If you’d rather not be included in the recording, message me offline and I’ll remove you.
  • If I am going too fast, feel free to slow me down. If I’m going too slow, feel free to jump ahead using this post!

Anatomy of a Shader

In short: a program that runs on the GPU, not the CPU.

  • CPU = one thing at a time, but can talk to many other things.
  • GPU = many things at a time, but sucks at communication.

Note: Recently shaders have grown far beyond their traditional use (see compute, tessellation, and primitive shaders), but for the scope of this workshop we’re focusing on the two OG shader types: vertex and fragment.

The vertex shader takes the geometry information of your scene (defined each frame by your Unity Scene) and smooshes it all down into a layered 2D image, with things in front arranged in front.

The fragment shader then takes that 2D image and colours it in. This happens in parallel for every pixel on the screen.

This is a huge rabbit hole to go down (here’s a great intro), but for the purposes of today there are a few things to keep in mind:

  • The pixels in the fragment shader can’t talk to each other, so things like blur aren’t super straightforward.
  • Each frame, your Scene is described by all the GameObjects and their components (transforms, lights, meshes, scripts, etc). Once fully described, that information is handed to the Renderer to process on the GPU. The renderer draws the image on your screen, but does not save anything for the next frame. So if you use the Vertex Shader to change geometry, that is not handed back to the CPU and is not saved anywhere.

Shader vs Shadergraph

Shadergraph is a node-based editor for creating vertex/fragment shaders. Though it has its own learning curve, the visual basis of it can be less intimidating for newcomers than a wall of code.

Getting Started

To use Shadergraph your project must be set up for one of the scriptable render pipelines. The most straightforward way to do this is to select a “URP” template when creating your project. If you forgot to do that, or wish to convert an existing project, this guide should help you.

Once your project is set up, create a Shader Graph in your assets folder by right clicking, then “Create>Shader>PBR Graph.”

  • PBR is “Physically Based Rendering” and generally responds to lighting and shadows in your scene.
  • Unlit will not respond to lights in your scene, unless you specifically create that functionality. Could be good for particle effects.

Once you have created a shader, you’ll need to apply it to a material. You can do this by dragging the shader onto the material.

A material specifies which shader to use on an object, and exposes certain parameters of that shader. For example, the default shader exposes things like colour and shininess.

Pro Tip: if you right click on the name of your new shader and create a material, the material will automatically be connected to the shader! Neat!


A Simple Shader

Create a new shader graph, apply it to a material, and apply the material to an object in your scene.

Simply the Best

When you double click on the shader graph, a new window will open.

  • Save Asset saves the work you’ve done on this graph.
  • Blackboard shows you which properties of the shader are exposed to your material (more on this later).
  • Main Preview sort of shows how your shader might look. Sometimes.

Pro Tip: With your mouse over the new window, press Shift+Space to make that window full screen. Very useful for larger graphs.

The PBR Master Node

Once you’ve cleared your screen a little, you should see the standard PBR node:

These are the standard inputs for Unity’s physically based renderer. The ones important for today are listed below (the rest you can ignore for now):

  • (Vertex Shader)
  • Vertex Position: The position of the rendered vertex, which defaults to its position as described in the scene. You can use this to change an object’s shape using the GPU.
  • Vertex Normal: The direction the vertex faces. This is used to determine both how colours are interpolated, and whether a piece of geometry is forward- or backward-facing.
  • (Fragment Shader)
  • Albedo: The diffuse colour an object reflects from a light source.
  • Normal: The direction perpendicular to the surface that is hit. This is used to calculate things like specular reflection – the shiny parts of objects. By default this is exactly perpendicular to the geometry, but if we alter this we can create the illusion of things like bumps or divots in the surface.
  • Emission: Regardless of the lighting condition, this is how bright an object is. A high-emission object will seem like it has its own internal light source. Note that current limitations mean an emissive object will not actually cast light into the scene and light other objects, just itself.
  • Metallic: One way to determine the shininess of the object (I’ll talk about other options later).
  • Smoothness: Smoother objects are shinier than less smooth objects.
  • Alpha: If using transparency, how see-through this part of the object is.
  • Alpha Cutoff: Regardless of transparency, the alpha value below which this part of the object will not render.

Adding Nodes

The Spacebar brings up a search menu for all nodes, or you can right click for more node options.

For now, let’s add an Input>Basic>Color node, and connect its output to the albedo input.

Click on the colour bar and choose any colour of your heart’s desire.

Save Asset

This is so important it has its own heading. Shadergraph does not auto update, and Ctrl+S does not save the graph. You have to manually click save asset.

You will forget this at some point and wonder why your shader is not working.

Exposing Properties to the Material

So that’s cool, we’ve made a shader that is basically red (or whatever colour you chose).

Useful if you need a red thing, but useless for anything else. If you open the material that this shader is attached to, in the inspector you might see something like this:

The shader we’ve made can only do one thing right now: make something red. What if we wanted different colours? We can expose certain properties in our graph to the material, which means you can adjust those properties without having to go back into the shader editor. It also means you can adjust those properties dynamically, which we’ll get to in a bit.

Right click on the node you want to expose, and choose “Convert to Property”

Now you should see in the Blackboard section your new property.

The Name is what you see in the Inspector. The Reference is how you change this property in code.

After Saving your Asset, in the inspector window of the material you should now see:

So now you can change the material colour on the fly, or have multiple versions of this material with different colours, or change the colour in a script.
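
For example, here’s a minimal scripting sketch, assuming the property’s Reference is "_Color" (check the Blackboard; yours may differ):

using UnityEngine;

public class TintExample : MonoBehaviour
{
    [SerializeField] Material material;

    void Start()
    {
        // "_Color" is this property's Reference, not its display name.
        material.SetColor("_Color", Color.magenta);
    }
}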


A Glowy Outline Shader

Fine, but why are we merely replicating what the default shader can already do? Let’s make a fun outline shader that can be used for selections, forcefields, highlights, reticules, whatever you please.

Glow Away

First, our shader will be transparent in parts. Let’s flag the shader as transparent, by clicking on the gear icon in the master node:

Note: Transparency in Unity can get weird. Lots of transparent shaders can cause slowness, shadows can cause issues, and depending on how many other layers are in front of/behind the object the outcome might look unexpected.

Now search for and add a Fresnel Effect node. The Fresnel effect is useful enough to have its own node in Unity. Basically, the more glancing the angle at which you view a surface, the lighter it appears. The inverse is also true: the more “head on” you view a surface, the darker it appears. The default Fresnel node shows what this looks like on a sphere:

We can leave Normal and View Dir alone for now.

  • Out is a value between 0 and 1.
  • Power changes the curve of interpolation between 0 and 1: a higher power is a steeper change (see the sketch below).
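
For the curious, here’s a rough C# sketch of what the node computes, based on the formula Unity documents for the Fresnel Effect node:

using UnityEngine;

// Out = pow(1 - saturate(dot(normalize(Normal), normalize(ViewDir))), Power)
static float Fresnel(Vector3 normal, Vector3 viewDir, float power)
{
    float facing = Mathf.Clamp01(Vector3.Dot(normal.normalized, viewDir.normalized));
    return Mathf.Pow(1f - facing, power); // 0 when viewed head on, near 1 at glancing angles
}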

We can feed this into our Alpha input on the master node, but things look kind of weird. Since AlphaClipThreshold defaults to 0.5, any alpha value below that disappears. Set it to 0 and things improve a bit:

That’s better but not perfect. Pretty cool for a transparent glass shader, maybe, but we’re not getting a “glow” effect with this. If we move our Colour input from Albedo to Emission though, it looks a lot better!

One thing to note is that “Power” allows us to adjust the curve of the change. A higher power gives more of an “outliney” feel. Let’s expose “Power” as a property so we can adjust it from the inspector!

To do this, add a Vector1 node, attach it to the Power input, and convert that Vector1 to a property. Rename it to something useful. You may need to adjust the Default value in the Blackboard window.

Vector1 is Shadergraph’s version of a single float value.

Extra Credit: If you have post processing enabled with the Bloom effect, you can get some really cool visuals by changing the colour mode to HDR and using an intensity value greater than 1. You can change the colour mode in the blackboard property:


A Dissolve Shader

Dissolve Me

Here we’ll make a shader that dissolves into view on our command, and set it to be controlled by a slider. To start, I’m going to make sure the PBR Master node is set to Opaque, not Transparent. The reason is that each section of our material is either visible or not; there’s no halfway. Transparency is useful when things are partially see-through, but unnecessary when it’s one or the other.

For this shader we are going to make use of the AlphaClipThreshold. We’ll create a noisy texture on our object, and then increase or decrease the overall brightness of that texture. Sections that are below the AlphaClipThreshold will not render, and sections above will. When the whole texture is bright, the whole object will be visible. When the whole texture is dark, the whole object will be invisible.
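
Before building it in the graph, here’s the gist of that logic as a plain C# sketch (names are illustrative; the real version runs per fragment on the GPU):

// Illustrative only: per-fragment visibility under alpha clipping.
static bool FragmentVisible(float noiseValue, float dissolveOffset, float clipThreshold = 0.5f)
{
    float alpha = noiseValue + dissolveOffset; // the Add node we build below
    return alpha >= clipThreshold;             // below the threshold, the fragment is clipped
}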

It’s Getting Noisy in Here

The first thing to do is add some noise to our shader. There are a few noise nodes for that, and a nice smooth one is Gradient Noise. I’m going to connect the Out of the node to the Emission of the PBR node so that I can see what effect the noise has.

Adding that, we can see there’s a strange seam in our Preview window. We’ll cover that in a later section, but for now it is good enough.

The noise node takes two inputs, a UV and a Scale. For now, let’s ignore UV and create an exposed Vector1 property for Scale.

If you also connect the Out of the noise node to Alpha, and make sure AlphaClipThreshold is 0.5, parts of the preview will disappear! The shader is making all sections of the object with an Alpha below 0.5 invisible.

Pro Tip: Turn on “Two Sided” in the PBR settings (gear icon) to see both sides…

As mentioned earlier, we want to increase and decrease the overall ‘brightness’ of this noise to put more (or less) of the object above the clip threshold. To do this we can add an Add node, which simply adds two values together. Observe what happens when you adjust the B input.

When the second input is -1, the object is completely invisible. When it is 1, the object is completely visible. I’m going to remove the emission and add some colour to my object:

I’m going to add a Vector1 to that B input and expose it as a property. Now in the material’s inspector I can adjust this value and see the results in my scene.

Pro Tip: You can set the “Mode” of your Vector1 property to ‘slider’, which will reveal a handy slider in the inspector

Gradients

We have the mechanics of our dissolve now, but it seems a little lacklustre. What about some pizzazz? Let’s try adding a glowy little outline around the sections that are dissolving in.

The great thing about our setup is that we know that the values near the edges of the dissolve shader have a relatively low alpha – just above 0.5. So if we could find some way to affect the colour/emission of those areas that would give us what we want.

There’s a super handy node called Gradient, which allows us to map values from 0 to 1 to a gradient of colours. By default the Gradient node doesn’t look like much:


All we have is an Out? If you click on the colour bar you can set up your gradient as you would normally in Unity. To harness the power of gradients, we need a Sample Gradient node:

Sample gradient takes an incoming value between 0 and 1 (called “Time”) and maps it to a corresponding point on the gradient, 0 being all the way left and 1 being all the way right.

Recall that we know that our alpha values right near the edge of the dissolve are just above 0.5. So if we set up our gradient to have a burst of colour just above 0.5 then quickly fade back out to black, and give the sampler our alpha values, we should see a neat little outline appear:

The stuff below 50% doesn’t matter because that won’t be visible anyway!
The output from our Sample Gradient can go directly into Emission: the black areas won’t emit anything!
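
As an aside, Unity’s runtime Gradient type works the same way as the Sample Gradient node. An illustrative sketch (the keys and colours here are placeholders):

using UnityEngine;

// Evaluate(t) is the scripting equivalent of the node's "Time" input.
static Color SampleEdgeGlow(Gradient gradient, float alpha)
{
    return gradient.Evaluate(alpha);
}

// Building a gradient like the one described above:
static Gradient MakeEdgeGradient()
{
    var g = new Gradient();
    g.SetKeys(
        new[]
        {
            new GradientColorKey(Color.black, 0.50f), // below the clip threshold: never visible
            new GradientColorKey(Color.cyan,  0.55f), // burst of colour just above 0.5
            new GradientColorKey(Color.black, 0.70f), // fade back out to black (no emission)
        },
        new[] { new GradientAlphaKey(1f, 0f), new GradientAlphaKey(1f, 1f) });
    return g;
}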

To change the colour of the outline, change the colour of the gradient:

Making This Dynamic

That’s all well and good, but we want to be able to dissolve in the material at runtime, not just from the inspector. Earlier I mentioned that to adjust things in code we need the “Reference” of the property, not just the name of it.

To get (and change) the “Reference” of a property, go to the Blackboard window and… look for “Reference”. Make sure you remember/note what your property reference is:

With that done, we can go back to the Unity scene and add a new script component to the object with our dissolve material. I’ve called mine “DissolveHandlerTest” but you do you.

For brevity’s sake, what I’m going to do is make a public function that takes in a float and sets the dissolve property to that float. Then I can fire that function from anywhere, in this case a slider. In our new script:

using UnityEngine;

public class DissolveHandlerTest : MonoBehaviour
{
    [SerializeField] Material dissolveMaterial;

    // We'll fire this function from a slider:
    public void OnSliderChanged(float value)
    {
        // "DissolveAmount" must match the property's Reference, not its display name.
        dissolveMaterial.SetFloat("DissolveAmount", value);
    }
}

Now we can set up a UI slider and connect its OnValueChanged event to the public function we just created (choose the function from the “Dynamic float” section of the dropdown so the slider passes its value through):
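
If you’d rather make that connection in code instead of the inspector, here’s a minimal sketch (class and field names are illustrative):

using UnityEngine;
using UnityEngine.UI;

public class DissolveSliderBinder : MonoBehaviour
{
    [SerializeField] Slider slider;
    [SerializeField] DissolveHandlerTest handler;

    void Start()
    {
        // Equivalent to wiring OnValueChanged in the inspector.
        slider.onValueChanged.AddListener(handler.OnSliderChanged);
    }
}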

Final Shader Graph:


Using the Vertex Shader

At the beginning of this whole debacle I said we can move geometry around with the shader, but so far we haven’t seen any of that. Let’s get to it!

Workshop Notes in Progress!

Vertex Shader

What I want to do is create a funky morphing/bubble kind of effect on a sphere. To begin with, let’s add a Position node to the Vertex Position of our PBR node, just to see what happens:

Don’t forget to SAVE ASSET

Seemingly nothing? Try moving your object around in the scene. There’s some sort of weird offset. As best I can put it, what’s happening here is that the vertex shader expects the position coming in to be in Object (local) space. Each vertex has its own local position. But when you give the input the world-space position of the vertex, you’re taking the local position and adding the transform position of the object itself. So the further you move the object away from the world origin, the larger the offset gets.

If you change the Position Space to “Object” then things should return to normal.
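
If the space conversion feels abstract, here’s a minimal sketch of the same idea using Unity’s Transform API (the shader-side fix is just the Position node’s Space dropdown):

using UnityEngine;

public class SpaceExample : MonoBehaviour
{
    void Start()
    {
        Vector3 localPos = new Vector3(0f, 1f, 0f);            // what Vertex Position expects
        Vector3 worldPos = transform.TransformPoint(localPos); // local position + the object's transform
        // Feeding worldPos where localPos is expected applies the object's
        // transform twice, so the offset grows as the object leaves the origin.
        Vector3 backToLocal = transform.InverseTransformPoint(worldPos);
        Debug.Log(backToLocal); // matches localPos again
    }
}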

Normal

Speaking of normal, let’s try taking each vertex and moving it along its normal. Since the normal is kind of the ‘direction’ of the vertex, this should expand/shrink our geometry?

I’ve added a Normal Vector node, a multiplier, and then added that to the position node. As we move the values of the multiplier we expand/shrink the sphere!

If we scale the multiplier uniformly, the sphere scales uniformly. Perhaps we can use this to our advantage by scaling things uniformly along XYZ, but differently across different sections of the material. That seems clunky to say, so let’s just try it:

By simply adding a noise node, and adjusting the scale, we’ve turned a sphere into some weird amorphous blob. But the amount that each vertex is moving is a little too much for me; I’d like the effect to be more subtle:

The Remap node is a super helpful node that remaps one range of values to another. Here I’ve converted (0-1) into (-0.2)-(0.2).
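
To make that concrete, here’s a rough CPU-side sketch of the whole vertex effect, with Mathf.PerlinNoise standing in for the Gradient Noise node (illustrative only; the graph does this per vertex on the GPU every frame):

using UnityEngine;

public class BlobSketch : MonoBehaviour
{
    [SerializeField] float noiseScale = 5f;

    // Same math as the Remap node: map (inMin..inMax) onto (outMin..outMax).
    static float Remap(float v, float inMin, float inMax, float outMin, float outMax)
        => outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] verts = mesh.vertices;
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < verts.Length; i++)
        {
            // Sample noise, remap (0..1) to (-0.2..0.2), and push along the normal.
            float n = Mathf.PerlinNoise(verts[i].x * noiseScale, verts[i].y * noiseScale);
            verts[i] += normals[i] * Remap(n, 0f, 1f, -0.2f, 0.2f);
        }
        mesh.vertices = verts;
        mesh.RecalculateBounds();
    }
}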

If we add a little colour, we can see that the seam issue from earlier is starting to cause major trouble. At the seams our vertices are tearing apart, creating holes in the mesh.

How textures get applied

A term you might hear in 3D work is “UV Mapping” or “UV Unwrapping”. In short, this is the process of taking a 2D image and wrapping it around a 3D object. When we’ve applied noise up until now, we’ve created a 2D image of noise and wrapped it onto the 3D mesh.

The problem here is that without extreme care there will likely be parts of the model that don’t exactly line up. These ‘seams’ sometimes aren’t a problem, but in this case are.

The way I often get around this when dealing with procedural textures is to use a 3D, rather than 2D, noise pattern. This means that instead of evaluating noise along X and Y (or in Unity’s case, U and V), the algorithm evaluates noise along X, Y, and Z. So we can give it a position in space and the algorithm will return the noise value at that point.
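
To make the signature change concrete, here’s a hypothetical stand-in in C# (real 3D noise like Classic3D is far more sophisticated; this only shows the jump from a 2D UV to a 3D position):

using UnityEngine;

// Hypothetical 3D noise: cheaply average 2D noise across the three axis planes.
static float Noise3DSketch(Vector3 p)
{
    float xy = Mathf.PerlinNoise(p.x, p.y);
    float yz = Mathf.PerlinNoise(p.y, p.z);
    float zx = Mathf.PerlinNoise(p.z, p.x);
    return (xy + yz + zx) / 3f; // no UV seams, because no UVs are involved
}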

The trouble is, Shadergraph does not include this by default, so we have to add our own.

GitHub user Keijiro has compiled a fantastic list of different shadergraph-ready noise algorithms. Today we’ll be using Classic3D.