Welcome to the first chapter of Part 2! I am excited that you have reached this part of the book because here we will dive deep into the different graphics and audio systems of Unity to dramatically improve the look and feel of the game. We will start with this chapter, where we will discuss what the Shader of a Material is and how to create our own Shaders to achieve custom effects that can't be accomplished using the default Unity Shaders. We will create a simple water animation effect to learn this new concept.
In this chapter, we will examine the following Shader concepts:
We created Materials in the previous chapter, but we never discussed how they work internally and why the Shader property is so important. In this first section of the chapter, we will explore the concept of a Shader as a way to program the video card to achieve custom visual effects.
In this section, we will cover the following concepts related to Shaders:
Let's start by discussing how a Shader modifies the Shader Pipeline to achieve effects.
Whenever a video card renders a 3D model, it needs input data to process, such as a Mesh, Textures, the transform of the object (position, rotation, and scale), and lights that affect that object. With that data, the video card must output the pixels of the object into the Back-Buffer, the image where the video card will be drawing our objects. That image will be shown when Unity finishes rendering all objects (and some effects) to display the finished scene. Basically, the Back-Buffer is the image the video card renders step by step, showing it when the drawing has finished (at that moment, it becomes the Front-Buffer, swapping with the previous one).
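The Back-Buffer/Front-Buffer swap described above is the classic double-buffering technique. The following is a minimal conceptual sketch of the idea in Python; the class and its names are hypothetical and just for illustration:

```python
# A minimal sketch of double buffering: render into a hidden buffer,
# then swap it with the visible one when the frame is finished.

class DoubleBuffer:
    def __init__(self, width, height):
        # Each buffer is just a 2D grid of pixel values here.
        self.front = [[0] * width for _ in range(height)]
        self.back = [[0] * width for _ in range(height)]

    def draw(self, x, y, color):
        # All rendering goes to the Back-Buffer; the screen never
        # sees a half-finished frame.
        self.back[y][x] = color

    def present(self):
        # When the frame is done, the buffers swap roles: the
        # Back-Buffer becomes the Front-Buffer shown on screen.
        self.front, self.back = self.back, self.front

buffers = DoubleBuffer(4, 4)
buffers.draw(1, 2, 255)
buffers.present()
print(buffers.front[2][1])  # 255: the finished frame is now visible
```

Real GPUs manage these buffers in video memory, but the swap-on-finish logic is the same.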
That's the usual way to render an object, but what happens between the input of the data and the output of the pixels can be handled in myriad ways and techniques that depend on how you want your object to look; maybe you want it to be realistic or look like a hologram, maybe the object needs a disintegration effect or a toon effect; there are endless possibilities. The way to specify how our video card will handle the rendering of the object is through a Shader.
A Shader is a program coded in a specific video card language, such as Cg, HLSL, or GLSL, that configures different stages of the render process, sometimes not only configuring them but also replacing them with completely custom code to achieve the exact effect we want. All of the stages of rendering form what we call the Shader Pipeline, a chain of modifications applied to the input data until it's transformed into pixels.
Important note
What we call the Shader Pipeline in this book is sometimes referred to in other literature as the Render Pipeline. While that term is also correct, in Unity, the term Render Pipeline refers to something different, so we will stick with Shader Pipeline here.
Each stage of the pipeline is in charge of different modifications, and depending on the video card's Shader Model, this pipeline can vary a lot. In the next diagram, you can find a simplified Shader Pipeline, skipping the advanced/optional stages that are not important right now:
Let's discuss each of the stages:
The Shader Pipeline is a subject that could fill an entire book, but for the scope of this one, the previous description gives you a good idea of what a Shader does and the effects it can achieve. Now that we have discussed how a Shader renders a single object, it is worth discussing how Unity renders all objects using Render Pipelines.
We have covered how the video card renders an object, but Unity is in charge of asking the video card to execute its Shader Pipeline per object. To do so, Unity needs to do lots of preparations and calculations to determine exactly how and when each Shader needs to be executed. The responsibility of doing this is given to what Unity calls a Render Pipeline.
A Render Pipeline is a way to draw the objects of a scene. At first, it sounds like there should be just one simple way of doing this, such as just iterating over all objects in the scene and executing the Shader Pipeline with the Shader specified in each object's Material, but it can be more complex than that. Usually, the main difference between one Render Pipeline and another is the way in which lighting and some advanced effects are calculated, but they can differ in other ways.
In previous Unity versions, there was just one single Render Pipeline, which is now called the Built-in Render Pipeline. It was a Pipeline that had all of the possible features you would need for all kinds of projects, from mobile 2D graphics and simple 3D graphics to cutting-edge 3D graphics like the ones you can find on consoles or high-end PCs. This sounds ideal, but actually, it isn't; having one single giant renderer that needs to be highly customizable to adapt to all possible scenarios generates lots of overhead and limitations that cause more headaches than creating a custom Render Pipeline. Luckily, the latest versions of Unity introduced the Scriptable Render Pipeline (SRP), a way to create Render Pipelines adapted to your project.
Thankfully, Unity doesn't expect you to create your own Render Pipeline for each project (a complex task), so it created two custom Pipelines for you that are ready to use: URP (formerly called LWRP), which stands for Universal Render Pipeline, and HDRP, which stands for High Definition Render Pipeline. The idea is that you choose one or the other based on your project requirements (unless you really need to create your own). URP, the one we selected when creating the project for our game, is a Render Pipeline suitable for most games that don't require lots of advanced graphics features, such as mobile games or simple PC games, while HDRP is packed with lots of advanced rendering features for high-quality games. The latter requires high-end hardware to run, while URP runs on almost every relevant target device. It is worth mentioning that you can switch between the Built-in Renderer, HDRP, and URP whenever you want, including after creating the project (though that's not recommended):
We could discuss how each one is implemented and the differences between them, but again, this could fill entire chapters. Right now, the idea of this section is for you to know why we picked URP when we created our project. It has some restrictions that we will encounter throughout this book and will need to take into account, so it is good to know why we accepted those limitations (to run our game on all relevant hardware). We also chose URP because it supports Shader Graph, the Unity tool that we will be using in this chapter to create custom effects; the previous Unity Built-in Pipeline didn't provide us with such a tool (aside from third-party plugins). Finally, another reason to introduce the concept of URP is that it comes with lots of built-in Shaders that we will need to know about before creating our own, both to avoid reinventing the wheel and to adapt ourselves to those Shaders, because if you come from previous versions of Unity, the Shaders you know won't work here. That is exactly what we are going to discuss in the next section of this book: the different URP Built-in Shaders.
Now that we know the difference between URP and other pipelines, let's discuss which Shaders come integrated into URP. Let's briefly describe the three most important Shaders in this Pipeline:
Let's do an interesting disintegration effect with the Simple Lit Shader to demonstrate its capabilities. You must do the following:
Important note
The alpha channel of a color is often associated with transparency, but you will notice that our object won't be transparent. The Alpha channel is extra color data that can be used for several purposes when creating effects. In this case, we will use it to determine which pixels are disintegrated first.
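The note above can be illustrated with a small, hypothetical sketch of how an alpha value could drive a disintegration effect: pixels whose alpha falls below a rising cutoff are discarded first. The texel values and the cutoff threshold below are invented for illustration:

```python
def dissolve(pixels, cutoff):
    """Keep a pixel's RGB only while its alpha is >= cutoff.

    `pixels` is a list of (r, g, b, a) tuples; a returned value of
    None stands for a discarded (fully disintegrated) pixel.
    """
    return [(r, g, b) if a >= cutoff else None for (r, g, b, a) in pixels]

texels = [(200, 50, 50, 0.9), (60, 180, 60, 0.4), (30, 30, 220, 0.7)]
# Raising the cutoff over time makes low-alpha pixels vanish first.
print(dissolve(texels, 0.5))  # the 0.4-alpha pixel is discarded
```

Animating the cutoff from 0 to 1 makes the object disintegrate gradually, with the lowest-alpha pixels disappearing first.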
The idea of this section is not to give a comprehensive guide to all of the properties of all URP Shaders, but to give you an idea of what a Shader can do when properly configured and when to use each of the integrated Shaders. Sometimes, you can achieve the effect you need just by using existing Shaders. In fact, you can do so in probably 99% of the cases in simple games, so try to stick to them as much as you can. But if you really need to create a custom Shader for a very specific effect, the next section will teach you how to use the URP tool called Shader Graph.
Now that we know how Shaders work and which Shaders come with URP, we have a basic notion of when it is necessary to create a custom Shader and when it is not. In case you really need to create one, this section will cover the basics of effect creation with Shader Graph, a tool for creating effects using a visual node-based editor that is easy to use even if you are not used to coding.
In this section, we will discuss the following concepts of the Shader Graph:
Let's start by seeing how we can create and use a Shader Graph.
Shader Graph is a tool that allows us to create custom effects using a node-based system. An effect in Shader Graph can look like the following screenshot, where you can see the nodes needed to create a hologram effect:
We will discuss later what those nodes do and will do a step-by-step effect example, but in the screenshot, you can see how the author created and connected several nodes, which are those interconnected boxes, each one doing a specific process to achieve the effect. The idea of creating effects with Shader Graph is to learn which specific nodes you need and how to connect them properly, to create an "algorithm" or a series of ordered steps to achieve a specific result. This is similar to the way we code the gameplay of the game, but this Graph is adapted and simplified just for effect purposes.
To create and edit our first Shader Graph asset, do the following:
Now, you have created your first custom Shader and applied it to a Material. So far, it doesn't look interesting at all—it's just a gray effect, but now it's time to edit the graph to unlock its full potential. As the name of the Graph suggests, we will be creating a water effect in this chapter to illustrate several nodes of the Shader Graph toolset and how to connect them, so let's start by discussing the Master node. When you open the graph by double-clicking it, you will see the following:
All nodes have input pins, the data they need to work, and output pins, the results of their process. As an example, a sum operation has two input numbers and one output number, the result of the sum. In this case, you can see that the Master node just has inputs, and that's because all the data that enters the Master node is used by Unity to calculate the Rendering and Lighting of the object: things such as the desired object color or texture (the Albedo input pin), how smooth it is (the Smoothness input pin), or how much metal it contains (the Metallic input pin). They are all the properties that will affect how lighting is applied to the object. In a sense, the inputs of this node are the output data of the entire graph, the ones we need to fill.
Let's start exploring how we can change that output data by doing the following:
As you can see, the behavior of the Shader varies according to the properties you set in the Master node, but so far, doing this is no different than creating an Unlit Shader and setting up its properties; the real power of Shader Graph comes when you use nodes that do specific calculations as inputs of the Master node. We will start with the texturing nodes, which allow us to apply Textures to our model.
The idea of using Textures is to apply an image to the model so that we can paint different parts of the model with different colors. Remember that the model has a UV map, which allows Unity to know which part of the Texture will be applied to which part of the model:
We have several nodes to do this task, one of them being Sample Texture 2D, a node that has two main inputs. First, it asks us for the texture to sample or apply to the model and then the UV. You can see it in the following screenshot:
As you can see, the default value of the Texture input is None, so there's no texture by default, and we need to specify one manually. For UV, the default value is UV0, meaning that, by default, the node will use the main UV channel of the model (yes, a model can have several UV channels, but for now, we will stick with the main one). Let's try this node by doing the following:
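Conceptually, what a node like Sample Texture 2D does is use the UV coordinates to look up a color in the Texture. A rough, simplified sketch of that lookup (nearest-neighbor sampling with Repeat-style wrapping; the tiny 2x2 texture and its values are invented for the example) might look like this:

```python
def sample_texture(texture, u, v):
    # Wrap the coordinates into [0, 1) -- the "Repeat" wrap mode --
    # then map them to a texel index (nearest-neighbor for simplicity;
    # real GPUs usually interpolate between neighboring texels).
    u, v = u % 1.0, v % 1.0
    height, width = len(texture), len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A tiny hypothetical 2x2 "texture" of grayscale values.
texture = [[10, 20],
           [30, 40]]
print(sample_texture(texture, 0.75, 0.75))  # 40
print(sample_texture(texture, 1.75, 0.75))  # wraps to u = 0.75, also 40
```

The wrapping behavior in the first two lines is what makes UV values greater than 1 repeat the Texture, which is exactly what we will exploit next to tile it.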
As you can see, the texture is properly applied to the model, but if you take into account that the default plane is 10x10 meters, the ripples of the water seem too big, so let's tile the Texture! To do that, we need to scale up the UVs of the model. Bigger UVs might sound like the Texture should also get bigger, but remember that we are not making the object bigger; we are just modifying the UVs, so the same object surface now reads a larger area of the Texture. Because the sampled area wraps around, the Texture repeats, and those repetitions are compressed into the same model area. To do so, follow these steps:
Another interesting effect we can apply now is an Offset to the Texture to move it. The idea is that even if the plane is not actually moving, we will simulate the flow of the water across it by moving just the Texture. Remember, the UVs determine which part of the Texture is applied to each part of the model, so if we add values to the UV coordinates, we shift them, generating a Texture-sliding effect. To do so, let's do the following:
So, to recap: first we added the time to the UV to move it, and then we multiplied the moved UV to make it bigger and tile the Texture. It is worth mentioning that there's a Tiling And Offset node that does all of this for us, but I wanted to show you how a simple multiplication to scale the UV and an add operation to move it generate a nice effect; you can't imagine all of the effects you can achieve with other simple mathematical nodes! Let's explore other usages of mathematical nodes to combine Textures in the next section.
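The math behind that scale-and-move combination (the same math a Tiling And Offset node performs) can be sketched as follows; the tiling, offset, and speed values are arbitrary examples:

```python
def tiling_and_offset(uv, tiling, offset):
    # Scale the UV to repeat (tile) the texture, then add an offset
    # to shift it; feeding time into the offset scrolls the texture.
    u, v = uv
    return (u * tiling[0] + offset[0], v * tiling[1] + offset[1])

# Tile the texture 4x4 times and scroll it horizontally over time.
time, speed = 1.0, 0.25
print(tiling_and_offset((0.5, 0.5), (4, 4), (time * speed, 0)))  # (2.25, 2.0)
```

Because the sampler wraps UVs back into the 0-1 range, a steadily increasing offset produces an endless sliding motion even though the geometry never moves.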
Even though we have used nodes, we haven't created anything that can't be created using regular Shaders, but that's about to change. So far, we can see the water moving, but it still looks static, and that's because the ripples are always the same. We have several techniques to generate ripples, and the simplest one is to combine two water Textures moving in different directions to mix their ripples; actually, we can simply use the same Texture, just flipped, to save some memory. To combine the Textures, we will sum them and then divide them by 2, so basically, we are calculating the average of the textures! Let's do that by doing the following:
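The averaging step described above is simple enough to sketch directly; the sample colors below are invented for illustration:

```python
def blend_average(sample_a, sample_b):
    # Summing two texture samples can push colors above the 0-1 range,
    # so dividing by 2 brings the result back into range: an average.
    return tuple((a + b) / 2 for a, b in zip(sample_a, sample_b))

ripples_a = (0.5, 0.75, 1.0)   # first water sample, scrolling one way
ripples_b = (0.25, 0.25, 1.0)  # flipped copy, scrolling the other way
print(blend_average(ripples_a, ripples_b))  # (0.375, 0.5, 1.0)
```

Because the two samples scroll in different directions, their bright ripples line up differently every frame, and the average constantly changes, which is what breaks the static look.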
You can keep adding nodes to make the effect more diverse, such as using Sine nodes to apply non-linear movements and so on, but I will let you learn that by experimenting with this yourself. For now, we will stop here. As always, this topic deserves a full book, and the intention of this chapter is to give you a small taste of this powerful Unity tool. I recommend you look for other Shader Graph examples on the internet to learn other usages of the same nodes and, of course, new nodes. One thing to consider here is that everything we just did is applied to the Fragment Shader stage of the Shader Pipeline we discussed earlier. Now, let's use the Blending stage to apply some transparency to the water.
Before declaring our effect finished, a little addition we can make is to render the water a little bit transparent. Remember that the Shader Pipeline has a Blending stage, which has the responsibility of blending each pixel of our model into the image being rendered this frame. The idea is to make our Shader Graph modify that stage to apply Alpha Blending, a blending mode that combines our model with the previously rendered models based on the Alpha value of our model. To get that effect, do the following steps:
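Conceptually, the Alpha Blending performed in the Blending stage mixes our water pixel with the pixel already in the Back-Buffer, weighted by the water's alpha. A sketch of the standard formula follows; the colors are made-up examples:

```python
def alpha_blend(src, dst):
    # Standard alpha blending: result = src * alpha + dst * (1 - alpha).
    # `src` is the pixel being drawn (with alpha); `dst` is the pixel
    # already in the Back-Buffer behind it.
    sr, sg, sb, sa = src
    dr, dg, db = dst
    return (sr * sa + dr * (1 - sa),
            sg * sa + dg * (1 - sa),
            sb * sa + db * (1 - sa))

water = (0.0, 0.5, 1.0, 0.5)   # half-transparent bluish water pixel
riverbed = (0.5, 0.25, 0.0)    # pixel already rendered behind it
print(alpha_blend(water, riverbed))  # (0.25, 0.375, 0.5)
```

With alpha at 1 the water fully covers what's behind it, and with alpha at 0 it is invisible; any value in between lets the riverbed show through, which is why transparent objects must be drawn after the opaque ones behind them.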
Adding transparency is a simple process but has its caveats, such as the shadow problem, and in more complex scenarios, it can have other problems, so I would suggest that you avoid using transparency unless it is necessary. Actually, our water can live without transparency, especially when we apply this water to the river basin around the base, because we don't need to see what's under the water, but the idea is for you to know all of your options. In the next screenshot, you can see how we have put a giant plane with this effect below our base, big enough to cover the entire basin:
In this chapter, we discussed how a Shader works on the GPU and how to create our first simple Shader to achieve a nice water effect. Using Shaders is a complex and interesting job; in a team, there are usually one or more people in charge of creating all of these effects, in a position called technical artist, so as you can see, this topic can expand into a whole career. Remember, the intention of this book is to give you a small taste of all the possible roles you can take on in the industry, so if you really liked this role, I suggest you start reading Shader-focused books. You have a long but super-interesting road in front of you.
But enough about Shaders for now; let's move on to the next topic: improving graphics and creating visual effects with particle systems!