Welcome to the first chapter of Part 3. Here, we will dive deep into the different graphics and audio systems of Unity to dramatically improve the look and feel of the game. We will start by discussing what a shader is and how to create our own to achieve several custom effects that couldn’t be accomplished using the default Unity Shaders. We will be creating a simple water animation effect using Shader Graph, a visual shader editor included in the Universal Render Pipeline. Also known as URP, this is one of the different rendering pipelines available in Unity, which provides rendering features oriented toward performance. We will be discussing some of its capabilities in this chapter.
In this chapter, we will examine the following shader concepts:
We created Materials in Part 1 of the book, but we never discussed how they internally work and why their Shader property is important. In this first section of this chapter, we will be exploring the concept of a shader as a way to program the video card to achieve custom visual effects. We will also be discussing how URP works with those shaders, and the default shaders it provides.
In this section, we will cover the following concepts related to shaders:
Let’s start by discussing how a shader modifies the Shader Pipeline to achieve effects.
Whenever a video card renders a 3D model, it needs different information to process, such as a Mesh, Textures, the transform of the object (position, rotation, and scale), and the lights that affect that object. With that data, the video card must output the pixels of the object into the back-buffer, an image the video card draws our objects onto but that the user won't see yet. This is done to prevent the user from seeing unfinished results, since we could still be drawing when the monitor refreshes. That image will be shown when Unity finishes rendering all objects (and some effects) to display the finished scene, swapping the back-buffer with the front-buffer, the image that the user actually sees. You can imagine this as having a page with an image that is being shown to the user while you draw a new image on another page; when you finish the new drawing, you just swap the pages and start drawing again on the page the user is not seeing, repeating this every frame.
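The double-buffering scheme described above can be sketched in plain Python; the names here are illustrative only, not a real graphics API:

```python
# Minimal sketch of double buffering: the scene is drawn to a hidden
# back-buffer, and only a finished frame is swapped to the front-buffer
# that the user actually sees.

class DoubleBuffer:
    def __init__(self, width, height):
        self.front = [[(0, 0, 0)] * width for _ in range(height)]  # visible image
        self.back = [[(0, 0, 0)] * width for _ in range(height)]   # work in progress

    def draw_pixel(self, x, y, color):
        # Rendering always happens on the back-buffer, hidden from the user
        self.back[y][x] = color

    def swap(self):
        # Called once the frame is complete: the finished image becomes visible
        self.front, self.back = self.back, self.front


buffers = DoubleBuffer(4, 4)
buffers.draw_pixel(1, 1, (255, 0, 0))     # drawn, but not visible yet
assert buffers.front[1][1] == (0, 0, 0)   # the user still sees the old frame
buffers.swap()                            # now the user sees the red pixel
assert buffers.front[1][1] == (255, 0, 0)
```

The swap is just a pointer exchange, which is why presenting a finished frame is cheap compared to drawing it.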
That’s the usual way to render an object, but what happens between the input of the data and the output of the pixels can be handled in a myriad of different ways and techniques that depend on how you want your object to look; maybe you want it to be realistic or look like a hologram, maybe the object needs a disintegration effect or a toon effect—there are endless possibilities. The way to specify how our video card will handle the render of the object is through a shader.
A shader is a program coded in specific video card languages, such as HLSL, GLSL, or Cg.
Any of those languages can be used to configure different stages of the render process necessary to render a given object, sometimes not only configuring them but also replacing them with completely custom code to achieve the exact effect we want. All of the stages to render an object make up what we call the Shader Pipeline, a chain of modifications applied to the input data until it is transformed into pixels.
Each stage of the pipeline is in charge of different modifications and depending on the video card shader model, this pipeline can vary a lot. In the next diagram, you can find a simplified Render Pipeline, skipping advanced/optional stages that are not important right now:
Figure 10.1: Common Shader Pipeline
Let’s discuss each of the stages:
Figure 10.2: Example of figures being rasterized
Figure 10.3: Rendering the occluded parts of the character
Shader Pipelines is a subject that would require an entire book, but for the scope of this book, the previous description will give you a good idea of what a shader does, and the possible effects that it can achieve. Now that we have discussed how a shader renders a single object, it is worth discussing how Unity renders all of the objects using Render Pipelines.
We have covered how the video card renders an object, but Unity is in charge of asking the video card to execute its Shader Pipeline for each object. To do so, Unity needs to do lots of preparation and calculation to determine exactly how and when each shader needs to be executed. This responsibility belongs to what Unity calls a Render Pipeline.
A Render Pipeline is a way to draw the objects of the scene. At first, it sounds like there should be just one simple way of doing this, for example, iterating over all objects in the scene and executing the Shader Pipeline with the shader specified in each object’s Material, but it can be more complex than that.
Usually, the main difference between one Render Pipeline and another is the way in which lighting and some advanced effects are calculated, but they can differ in other ways.
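As a thought experiment, the simplest possible Render Pipeline described above could be sketched like this; this is a hypothetical CPU-side model for illustration, not Unity's actual API:

```python
def naive_render_pipeline(scene_objects):
    """Naive sketch of the simplest imaginable Render Pipeline: walk every
    object and run the shader its Material specifies. Real pipelines also
    cull, sort, batch, and schedule lighting passes, which is where they
    actually differ from one another."""
    draw_calls = []
    for obj in scene_objects:
        shader = obj["material"]["shader"]
        draw_calls.append((obj["name"], shader))  # stand-in for a GPU draw call
    return draw_calls


scene = [
    {"name": "Player", "material": {"shader": "Simple Lit"}},
    {"name": "Water", "material": {"shader": "Shader Graphs/Water"}},
]
assert naive_render_pipeline(scene) == [
    ("Player", "Simple Lit"),
    ("Water", "Shader Graphs/Water"),
]
```

Everything beyond this loop (lighting strategy, culling, post-processing) is what distinguishes URP from HDRP and the Built-in pipeline.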
In previous Unity versions, there was just one Render Pipeline, now called the Built-in Render Pipeline (also known as BIRP). It was a pipeline that had all of the possible features you would need for all kinds of projects, from mobile 2D graphics and simple 3D to cutting-edge 3D like the ones you can find on consoles or high-end PCs. This sounds ideal, but actually, it isn't. Having one single giant renderer that needs to be highly customizable to adapt to all possible scenarios generates lots of overhead and limitations that cause more headaches than creating a custom Render Pipeline. Luckily, recent versions of Unity introduced the Scriptable Render Pipeline (SRP), a way to create Render Pipelines adapted to your project.
Fortunately, Unity doesn't expect you to create your own Render Pipeline for each project (which is a complex task), so it has created two custom pipelines that are ready to use: URP (formerly called LWRP), which stands for Universal Render Pipeline, and HDRP, which stands for High Definition Render Pipeline. The idea is that you choose one or the other based on your project's requirements (unless you really need to create your own).
URP, the one we selected when creating the project for our game, is a Render Pipeline suitable for most games that don't require lots of advanced graphics features, such as mobile games or simple PC games, while HDRP is packed with lots of advanced rendering features for high-quality games. The latter requires high-end hardware to run, while URP runs on almost every relevant target device. It is worth mentioning that you can swap between the Built-in Renderer, HDRP, and URP whenever you want, including after creating the project (but this is not recommended):
Figure 10.4: Project wizard showing HDRP and URP templates
We can discuss how each one is implemented and the differences between each, but again, this could fill entire chapters; right now, the idea of this section is for you to know why we picked URP when we created our project because it has some restrictions we will encounter throughout this book that we will need to take into account, so it is good to know why we accepted those limitations (to run our game on every relevant hardware).
Also, we need to know that we have chosen URP because it has support for Shader Graph, the Unity tool that we will be using in this chapter to create custom effects. Previous Unity built-in pipelines didn’t provide us with such a tool (aside from third-party plugins). Finally, another reason to introduce the concept of URP is that it comes with lots of built-in shaders that we will need to know about before creating our own to prevent reinventing the wheel. This will allow us to get used to those shaders, because if you came from previous versions of Unity, the shaders you already know won’t work here; actually, this is exactly what we are going to discuss in the next section of this chapter: the difference between the different URP built-in shaders.
Now that we know the difference between URP and other pipelines, let’s discuss which shaders come integrated into URP. Let’s briefly describe the three most important shaders in this pipeline:
This is extremely performant, especially for low-end devices such as mobile phones. You also have other cases such as light tubes or screens, objects that don't receive shadows because they emit light, so they will be seen at their full color even in complete darkness. In the following screenshot, you can see a 3D model using an Unlit Shader. It looks like it's being lit, but it's just the texture of the model, which has lighter and darker colors painted into different parts of the object:
Figure 10.5: Pod using an Unlit effect to simulate cheap lighting
Let’s do an interesting disintegration effect with the Simple Lit Shader to demonstrate its capabilities. You must do the following:
Figure 10.6: Noise texture
Figure 10.7: Generate Alpha From Gray Scale texture setting
The Alpha channel of a color is often associated with transparency, but you will notice that our object won’t be transparent. The Alpha channel is extra color data that can be used for several purposes when creating effects. In this case, we will use it to determine which pixels are being disintegrated first.
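Alpha clipping boils down to a simple per-pixel filter: keep a pixel only if its Alpha is at or above a threshold. The following is a CPU-side Python illustration of that idea, not the actual GPU code:

```python
def alpha_clip(pixels, threshold):
    """Keep only pixels whose Alpha is at or above the clipping threshold.

    'pixels' is a list of (r, g, b, a) tuples; in the real shader this
    decision runs per pixel on the GPU, discarding clipped pixels entirely.
    """
    return [p for p in pixels if p[3] >= threshold]


pixels = [(255, 255, 255, 0.2), (128, 128, 128, 0.6), (0, 0, 0, 0.9)]
# Raising the threshold discards more pixels, "disintegrating" the object
assert len(alpha_clip(pixels, 0.0)) == 3
assert len(alpha_clip(pixels, 0.5)) == 2
assert len(alpha_clip(pixels, 0.95)) == 0
```

Because the noise texture gives each pixel a different Alpha, sliding the threshold up makes the darker (lower-Alpha) areas vanish first, which produces the disintegration look.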
Figure 10.8: Material creation button
Figure 10.9: Cube primitive creation
Figure 10.10: Simple Lit Shader selection
Set the Alpha Clipping Threshold to 0.5:
Figure 10.11: Alpha Clipping Threshold Material slider
Figure 10.12: Disintegration effect with Alpha Clipping
Figure 10.13: Double-sided render face
Figure 10.14: Double-sided Alpha Clipping
The idea of this section is not to give a comprehensive guide of all of the properties of all URP shaders, but to give you an idea of what a shader can do when properly configured and when to use each one of the integrated shaders. Sometimes, you can achieve the effect you need just by using existing shaders, probably in 99% of cases in simple games, so try to stick to them as much as you can. But if you really need to create a custom shader to create a very specific effect, the next section will teach you how to use the URP tool called Shader Graph.
Now that we know how shaders work and the existing shaders in URP, we have a basic notion of when it is necessary to create a custom shader and when it is not necessary. In case you really need to create one, this section will cover the basics of effects creation with Shader Graph, a tool to create effects using a visual node-based editor. This is an easy tool to use when you are not used to coding.
In this section, we will discuss the following concepts of the Shader Graph:
Let’s start by seeing how we can create and use a Shader Graph.
Shader Graph is a tool that allows us to create custom effects using a node-based system. An effect in Shader Graph can look like the one in the following screenshot:
Figure 10.15: Shader Graph with nodes to create a custom effect
We will discuss later what those nodes do, and we will create an example effect step by step, but in the screenshot, you can see how the author created and connected several nodes (the interconnected boxes), each one executing a specific process to achieve the effect. The idea of creating effects with Shader Graph is to learn which specific nodes you need and how to connect them properly. This is similar to the way we code the gameplay of the game, but Shader Graph is adapted and simplified just for effect purposes.
To create and edit our first Shader Graph, do the following:
Figure 10.16: PBR Shader Graph creation
Name the new Shader Graph Water. If you want the opportunity to rename the asset, remember that you can select the asset, right-click, and select Rename:
Figure 10.17: Shader Graph Asset
Create a Material called WaterMaterial and set its Shader to Shader Graphs/Water. If for some reason Unity doesn't allow you to do that, try right-clicking on the Water Graph and clicking Reimport. As you can see, the created Shader Graph now appears as a shader in the Material:
Figure 10.18: Setting a Shader Graph as a Material Shader
Now, you have created your first custom shader and applied it to a Material. So far, it doesn’t look interesting at all—it’s just a gray effect—but now it’s time to edit the graph to unlock its full potential. As the name of the graph suggests, we will be creating a water effect in this chapter to illustrate several nodes of the Shader Graph toolset and how to connect them, so let’s start by discussing the Master node.
When you open the graph by double-clicking the shader asset, you will see the following:
Figure 10.19: Master node with all of the properties needed to calculate object appearance
All nodes will have input pins, the data needed to work, and output pins, the results of its process. As an example, in a sum operation, we will have two input numbers and an output number, the result of the sum. In this case, you can see that the Master node only contains inputs, and that’s because all data that enters the Master node will be used by Unity to calculate the rendering and lighting of the object, things such as the desired object color or texture (Base Color input pin), how smooth it is (Smoothness input pin), or how much metal it contains (Metallic input pin), properties that will affect how the lighting will be applied to the object.
You can see that the Master node is split between a Vertex section and a Fragment section. The first is capable of changing the mesh of the object we are modifying to deform it, animate it, etc., while the latter will change how it will look, which textures to use, how it will be illuminated, etc. Let’s start exploring how we can change that data in the Fragment section by doing the following:
Figure 10.20: Base Color node input pin
Figure 10.21: Color picker
Set Smoothness to 0.9, which will make the object almost completely smooth (90% of the total smoothness possible). This will make our water reflect the sky almost completely:
Figure 10.22: Smoothness PBR Master node input pin
Figure 10.23: Shader Graph saving options
Figure 10.24: Initial Shader Graph results
As you can see, the behavior of the shader varies according to the properties you set in the Master node, but so far, doing this is no different than creating an Unlit Shader and setting up its properties; the real power of Shader Graph is when you use nodes that do specific calculations as inputs of the Master node. We will start looking at the texturing nodes, which allow us to apply Textures to our model.
The idea of using Textures is to have an image applied to the model in a way that we can paint different parts of the models with different colors. Remember that the model has a UV map, which allows Unity to know which part of the Texture will be applied to which part of the model:
Figure 10.25: On the left, a face Texture; on the right, the same texture applied to a face mesh
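That UV-to-texel mapping can be sketched in a few lines; this is nearest-neighbor sampling only, while a real GPU sampler also filters and wraps:

```python
def sample_nearest(texture, u, v):
    """Nearest-neighbor texture sample: map UV coordinates in [0, 1]
    to a texel in a 2D grid of colors. Illustrative sketch of the
    UV-to-texel mapping only."""
    height = len(texture)
    width = len(texture[0])
    x = min(int(u * width), width - 1)   # clamp so u == 1.0 stays in range
    y = min(int(v * height), height - 1)
    return texture[y][x]


# A tiny 2x2 "texture": each entry is a grayscale value
texture = [[0, 64],
           [128, 255]]
assert sample_nearest(texture, 0.0, 0.0) == 0    # top-left texel
assert sample_nearest(texture, 0.9, 0.9) == 255  # bottom-right texel
```

The UV map of a mesh stores exactly these (u, v) pairs per vertex, which is how the video card knows which texel lands on which part of the model.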
We have several nodes to do this task, one of them being Sample Texture 2D, a node that has two main inputs. First, it asks us for the texture to sample or apply to the model, and then for the UV. You can see it in the following screenshot:
Figure 10.26: Sample Texture 2D node
As you can see, the default value of the Texture input pin is None, so there's no texture by default, and we need to specify one manually. For UV, the default value is UV0, meaning that, by default, the node will use the main UV channel of the model; yes, a model can have several UV sets. For now, we will stick with the main one. If you are not sure what that means, UV0 is the safest option. Let's try this node by doing the following:
Figure 10.27: Water tileable Texture
Figure 10.28: Texture Repeat mode
Figure 10.29: Shader Graph Create Node option
Type Sample texture and all of the sampler nodes will show up. Double-click Sample Texture 2D. If for some reason you can't double-click the option, right-click on it first and then try again. There is a known bug in this tool, and this is the workaround:
Figure 10.30: Sample texture node search
Figure 10.31: Sample Texture node with a Texture in its input pin
Figure 10.32: Connecting the results of a Texture sampling with the Base Color pin of the Master node
Figure 10.33: Results of applying a Texture in our Shader Graph
As you can see, the Texture is properly applied to the model, but if you take into account that the default plane has a size of 10x10 meters, the ripples of the water seem too big. So, let’s tile the Texture!
To do this, we need to change the UVs of the model, making them bigger. You may imagine that bigger UVs mean the Texture should also get bigger, but take into account that we are not making the object bigger; we are just modifying the UV.
In the same object area, we will display more of the texture area, meaning that in the bigger texture sample area (achieved by bigger UVs), repetitions of the texture may appear. To do so, follow the next steps:
Figure 10.34: Searching for the UV node
Set the other input of the Multiply node to (4, 4, 4, 4):
Figure 10.35: Multiplying the UVs by 4
Figure 10.36: Using the multiplied UVs to sample the Texture
Figure 10.37: Results of the model’s UV multiplication
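The tiling we just built boils down to simple math: multiplying the UVs by 4 with a Repeat wrap mode amounts to the following, written as a Python illustration of the node math rather than shader code:

```python
def tile_uv(u, v, tiling):
    """Multiply UVs by a tiling factor; with the Repeat wrap mode, the
    texture repeats every time the scaled coordinate crosses an integer
    boundary. The modulo below emulates that wrap."""
    return (u * tiling) % 1.0, (v * tiling) % 1.0


# Multiplying the UVs by 4 makes the texture repeat 4 times across the plane
assert tile_uv(0.25, 0.25, 4) == (0.0, 0.0)   # exactly one repetition in
assert tile_uv(0.125, 0.125, 4) == (0.5, 0.5)  # halfway through the first tile
```

This is why enabling Repeat on the Texture's Wrap Mode was a prerequisite: without it, coordinates above 1 would clamp to the texture's edge instead of wrapping.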
Another interesting effect we can do now is to apply an offset to the Texture to move it. The idea is that even if the plane is not actually moving, we will simulate the flow of the water through it, moving just the Texture. Remember, the responsibility of determining the part of the Texture to apply to each part of the model belongs to the UV, so if we add values to the UV coordinates, we will be moving them, generating a Texture sliding effect. To do so, let’s do the following:
Figure 10.38: Adding values to the UVs
Figure 10.39: Adding time to the UVs
Figure 10.40: Added and multiplied UVs as an input of the sample Texture
Figure 10.41: Enabling Always Refresh to preview the effect
Figure 10.42: Multiplication of time to move the texture slower
Figure 10.43: Hiding the preview from the graph nodes
Figure 10.44: Hiding unused pins from the graph nodes
So, to recap, first we added the time to the UV to move it, and then we multiplied the moved UV to make it bigger and tile the Texture. It is worth mentioning that there's a Tiling and Offset node that does all of this for us, but I wanted to show you how a simple multiplication to scale the UV and an add operation to move it generate a nice effect; just imagine all of the possible effects you can achieve with other simple mathematical nodes! Actually, let's explore other usages of mathematical nodes to combine Textures in the next section.
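The whole add-then-multiply chain can be condensed into one small function; this is a CPU-side sketch of the node math, with illustrative parameter names:

```python
def scroll_and_tile(u, v, time, speed, tiling):
    """Sketch of the graph built above: add time * speed to the UVs to
    scroll the texture, then multiply to tile it; the modulo emulates
    the Repeat wrap mode of the sampler."""
    u = (u + time * speed) * tiling
    v = (v + time * speed) * tiling
    return u % 1.0, v % 1.0


# At time 0 nothing has moved; later frames sample a shifted texture region,
# which reads as flowing water even though the plane itself is static
assert scroll_and_tile(0.5, 0.5, time=0.0, speed=0.1, tiling=4) == (0.0, 0.0)
u_later, _ = scroll_and_tile(0.5, 0.5, time=1.0, speed=0.1, tiling=4)
assert u_later != 0.0
```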
Even though we have used nodes, we haven't created anything that couldn't be created using regular shaders, but that's about to change. So far, we can see the water moving, but it still looks static, and that's because the ripples are always the same. We have several techniques to generate ripples; the simplest one is to combine two water Textures moving in different directions to mix their ripples, and actually, we can simply use the same Texture flipped to save some memory. To combine the Textures, we will sum them and then divide them by 2, so basically, we are calculating the average of the textures! Let's do that with the following steps:
Figure 10.45: Selecting several nodes
Figure 10.46: Duplication of nodes
Set the value of the duplicated Multiply node to (-4, -4, -4, -4). You can see that this flipped the texture. Also set the duplicated time multiplier to -0.1, so the second texture moves in the opposite direction:
Figure 10.47: Multiplication of values
Figure 10.48: Adding two Textures
Set the value of the Multiply node to (0.5, 0.5, 0.5, 0.5), which will divide each resulting color channel by 2, averaging the colors. You can also experiment with what happens when you set different values to each channel if you want, but for our purposes, 0.5 is the proper value for each channel:
Figure 10.49: Dividing the sum of two Textures to get the average
Figure 10.50: Results of texture blending
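The blend we just assembled is nothing more than a per-channel average; here it is in illustrative Python, mirroring the Add and Multiply nodes:

```python
def blend_average(sample_a, sample_b):
    """Average two texture samples channel by channel: add them, then
    multiply by 0.5, exactly what the Add and Multiply nodes compute."""
    return tuple((a + b) * 0.5 for a, b in zip(sample_a, sample_b))


# Two "ripple" samples moving in opposite directions, averaged per channel
assert blend_average((0.25, 0.5, 1.0, 1.0), (0.75, 0.5, 0.0, 1.0)) == (0.5, 0.5, 0.5, 1.0)
```

Because the two samples drift in opposite directions, their average changes every frame, which breaks the repetitive look of a single scrolling texture.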
You can keep adding nodes to make the effect more diverse, such as using Sine nodes (which execute the trigonometric sine operation) to apply non-linear movements, and so on, but I will let you learn that by experimenting by yourself. For now, we will stop here. As always, this topic deserves a full book, and the intention of this chapter is to give you a small taste of this powerful Unity tool. I recommend you look for other Shader Graph examples on the internet to learn other usages of the same nodes and, of course, new nodes. One thing to consider here is that everything we just did is basically applied in the Fragment Shader stage of the Shader Pipeline we discussed earlier. Now, let's use the Blending stage to apply some transparency to the water.
Before declaring our effect finished, a little addition we can do is to make the water a little bit transparent. Remember that the Shader Pipeline has a blending stage, which has the responsibility of blending each pixel of our model into the image being rendered in this frame. The idea is to make our Shader Graph modify that stage to apply Alpha Blending, a blending mode that combines our model and the previously rendered models based on the Alpha value of our model.
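Alpha Blending itself is a simple weighted mix; the blending stage roughly computes the following per pixel (a sketch, not the actual GPU implementation):

```python
def alpha_blend(src, dst, alpha):
    """Classic alpha blending: combine the pixel being drawn (src) with
    what is already in the frame (dst), weighted by the source Alpha.
    alpha=0.5 gives a half-transparent result."""
    return tuple(s * alpha + d * (1.0 - alpha) for s, d in zip(src, dst))


# Blending blue water (src) over a gray background (dst) at 50% Alpha
assert alpha_blend((0.0, 0.0, 1.0), (0.5, 0.5, 0.5), alpha=0.5) == (0.25, 0.25, 0.75)
```

Note that this formula reads what was previously rendered, which is why transparent objects must be drawn after the opaque ones behind them, and why transparency causes the overdraw discussed below.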
To get that effect, take the following steps:
Figure 10.51: Graph Inspector Transparency settings
Set the Alpha of the Master node to 0.5:
Figure 10.52: Setting Alpha of the Master node
Figure 10.53: Shadows from the water being applied to a cube
Figure 10.54: Enabling lights in the Scene View
Figure 10.55: Disabling shadow casting
Adding transparency is a simple process but it has its caveats, like the shadow problem, and in more complex scenarios, it can have other problems, like increasing overdraw, meaning the same pixel needs to be drawn several times (the pixel belonging to the transparent object, and one of the objects behind). I would suggest you avoid using transparency unless it is necessary. Actually, our water can live without transparency, especially when we apply this water to the river basin around the base because we don’t need to see the part under the water, but the idea is for you to know all of your options. In the next screenshot, you can see how we have put a giant plane with this effect below our base, big enough to cover the entire basin:
Figure 10.56: Using our water in the main scene
Now that we have modified how the object looks through the Fragment node section, let’s discuss how to use the Vertex section to apply a mesh animation to our water.
So far, we have applied water textures to our water, but it's still a flat plane. We can go further than that and create ripples not only via textures but also by animating the mesh. To do so, we will use the noise texture from the beginning of the chapter, but instead of adding it as another color to the Base Color of the shader, we will use it to offset the Y position of the vertices of our plane.
Due to the chaotic nature of the noise texture, the idea is that we will apply a vertical offset to different parts of the model, so we can emulate the ripples:
Figure 10.57: Default plane mesh subdivided into a grid of 10x10 with no offset
To accomplish something like this, you can modify the Vertex section of your shader to look like the following:
Figure 10.58: Ripples vertex effect
In the graph, you can see how we are creating a Vector whose y axis depends on the noise Texture we downloaded at the beginning of the chapter. The idea behind that is to create a Vector pointing upward whose length is proportional to the grayscale factor of the texture; the whiter the pixel of the texture, the longer the offset. This texture has an irregular yet smooth pattern so it can emulate the behavior of the tide.
Please notice that here we used Sample Texture 2D LOD instead of Sample Texture 2D; the latter does not work in the Vertex section, so keep that in mind.
Then we multiply the result by 0.3 to reduce the height of the offset, and add the result to the output of the Position node. Notice that the Space property of the Position node is set to Object mode; we need that mode to work with the Vertex section of the Shader Graph (we discussed World and Local spaces before in Chapter 2, Editing Scenes and GameObjects, but you can also search Object vs World Space on the internet for more info about this). Finally, the result is connected to the Position input of the Vertex section.
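The vertex displacement described above reduces to a small per-vertex computation; here it is as a Python illustration, while the real work happens in the Vertex stage on the GPU:

```python
def displace_vertex(position, noise_value, height_scale=0.3):
    """Offset a vertex upward by a noise sample, as the Vertex section does:
    the whiter the noise texel (closer to 1), the larger the Y offset.
    'position' is an (x, y, z) tuple in object space."""
    x, y, z = position
    return (x, y + noise_value * height_scale, z)


# A white noise texel (1.0) raises the vertex by the full 0.3 offset;
# a black texel (0.0) leaves the vertex flat
assert displace_vertex((1.0, 0.0, 2.0), noise_value=1.0) == (1.0, 0.3, 2.0)
assert displace_vertex((1.0, 0.0, 2.0), noise_value=0.0) == (1.0, 0.0, 2.0)
```

Because the noise varies smoothly across the UVs, neighboring vertices get similar offsets, which is what makes the displaced plane read as rolling tide rather than random spikes.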
If you save, you will see something like the following image:
Figure 10.59: Ripples vertex effect applied
Of course, in this case, the ripples are static because we didn’t add any time offset to the UV as we did before. In the following screenshot, you can see how to add that, but before looking at it I recommend you try to resolve it first by yourself as a personal challenge:
Figure 10.60: Animated ripples vertex effect graph
As you can see, we are again taking the original UV and adding the time multiplied by any factor so it will slowly move, the same as we did previously with our water texture. You can keep playing around with this, changing how this looks with different textures, multiplying the offset to increase or reduce the height of the ripples, applying interesting math functions like sine, and so much more, but for now, let’s finish with this.
In this chapter, we discussed how a shader works in the GPU and how to create our first simple shader to achieve a nice water effect. Working with shaders is a complex and interesting job, and in a team, there is usually one or more people in charge of creating all of these effects, in a position called Technical Artist; so, as you can see, this topic can expand up to a whole career. Remember, the intention of this book is to give you a small taste of all the possible roles you can take in the industry, so if you really liked this role, I suggest you start reading shader-exclusive books. You have a long but super interesting road in front of you.
Enough shaders for now! In the next chapter, we will look at how to improve our graphics and create visual effects with particle systems!