10

Materials and Effects with URP and Shader Graph

Welcome to the first chapter of Part 3. Here, we will dive deep into the different graphics and audio systems of Unity to dramatically improve the look and feel of the game. We will start by discussing what a shader is and how to create our own to achieve several custom effects that couldn’t be accomplished using the default Unity shaders. We will create a simple water animation effect using Shader Graph, a visual shader editor included in the Universal Render Pipeline (also known as URP), one of the several render pipelines available in Unity, which provides rendering features oriented toward performance. We will discuss some of its capabilities in this chapter.

In this chapter, we will examine the following shader concepts:

  • Introducing shaders and URP
  • Creating shaders with Shader Graph

Introducing shaders and URP

We created Materials in Part 1 of the book, but we never discussed how they work internally and why their Shader property is important. In this first section, we will explore the concept of a shader as a way to program the video card to achieve custom visual effects. We will also discuss how URP works with those shaders, and the default shaders it provides.

In this section, we will cover the following concepts related to shaders:

  • Shader Pipeline
  • Render Pipeline and URP
  • URP built-in shaders

Let’s start by discussing how a shader modifies the Shader Pipeline to achieve effects.

Shader Pipeline

Whenever a video card renders a 3D model, it needs different pieces of information to process, such as a Mesh, Textures, the transform of the object (position, rotation, and scale), and the lights that affect it. With that data, the video card outputs the pixels of the object into the back-buffer, an image the video card draws our objects onto but that the user doesn’t see yet. This is done to prevent the user from seeing unfinished results, given that we may still be drawing when the monitor refreshes. Once Unity finishes rendering all objects (and some effects), the finished scene is displayed by swapping the back-buffer with the front-buffer, the image the user actually sees. You can imagine this as having a page with an image being shown to the user while you draw a new image on another page; when you finish the new drawing, you swap the pages and start drawing again on the page the user is not seeing, repeating this every frame.

That’s the usual way to render an object, but what happens between the input of the data and the output of the pixels can be handled in a myriad of different ways and techniques, depending on how you want your object to look; maybe you want it to be realistic or to look like a hologram, maybe the object needs a disintegration effect or a toon effect; there are endless possibilities. The way to specify how our video card will handle the rendering of the object is through a shader.

A shader is a program coded in specific video card languages, such as:

  • HLSL: The DirectX shading language, DirectX being a graphics library.
  • GLSL: The OpenGL shading language, OpenGL also being a graphics library.
  • Cg: A language that can output either HLSL or GLSL, depending on which graphics library our game uses.
  • Shader Graph: A visual language that will be automatically converted into one of the previously mentioned languages according to our needs. This is the one we will be using given its simplicity (more on that later).

Any of those languages can be used to configure different stages of the render process necessary to render a given object, sometimes not only configuring them but also replacing them with completely custom code to achieve the exact effect we want. All of the stages to render an object make up what we call the Shader Pipeline, a chain of modifications applied to the input data until it is transformed into pixels.

Each stage of the pipeline is in charge of different modifications, and depending on the video card’s shader model, this pipeline can vary a lot. In the next diagram, you can find a simplified Shader Pipeline, skipping the advanced/optional stages that are not important right now:

Figure 10.1: Common Shader Pipeline

Let’s discuss each of the stages:

  • Input Assembler: Here is where all of the mesh data, such as vertex position, UVs, and normals, is assembled to be prepared for the next stage.
  • Vertex Shader: This stage used to be limited to applying the transform of the object, the position and perspective of the camera, and simple lighting calculations. In modern GPUs, this stage is fully programmable and you are in charge of what it does. It receives each one of the vertices of the object to render and outputs a modified one, so you have the chance to modify the geometry of the object here. The usual code here applies the transform of the object, but you can also apply several effects, such as inflating the object along its normals to apply the old toon outline technique, or applying distortion by adding random offsets to each vertex to recreate a hologram. There’s also the opportunity to calculate data for the next stages.
  • Culling: For most models, you will never see the back side of a face; in a cube, there’s no way to look at its inner sides. Given that, rendering both sides of each face makes no sense, and this stage takes care of that. Culling determines whether a face needs to be rendered based on its orientation, saving lots of pixel calculations for hidden faces. You can change this behavior for specific cases; as an example, we can create a glass box that needs to be transparent so that all sides of the box are visible.
  • Rasterizer: Now that we have the modified and visible geometry of our model calculated, it’s time to convert it into pixels. The rasterizer creates all of the pixels for the triangles of our mesh. Lots of things happen here, but again, we have very little control over it; the usual way to rasterize is just to create all of the pixels inside the edges of the mesh triangles. There are other modes that render only the pixels on the edges to produce a wireframe effect, but these are usually used for debugging purposes:

Figure 10.2: Example of figures being rasterized

  • Fragment Shader: This is one of the most customizable stages of all. Its purpose is simple: determine the color of each one of the fragments (pixels) that the rasterizer has generated. Lots of things can happen here, from simply outputting a plain color or sampling a texture to applying complex lighting calculations such as normal mapping and PBR. You can also use this stage to create special effects such as water animations, holograms, distortions, disintegrations, and any effect that requires you to modify what the pixels look like. We will explore how to use this stage in the next sections of this chapter (a small code sketch of the vertex and fragment stages follows this list).
  • Depth Testing: Before showing a pixel on the screen, we need to check whether it can be seen. This stage checks whether the pixel’s depth is behind or in front of the previous pixel rendered in the same position, guaranteeing that regardless of the rendering order of the objects, the pixels nearest to the camera are always drawn on top of the others. Usually, this stage is left in its default state, prioritizing pixels nearer to the camera, but some effects require different behavior. (Nowadays, we also have Early-Z testing, which performs this same test before the Fragment Shader, but let’s keep things simple for now.) As an example, in the next screenshot, you can see an effect that allows you to see objects that are behind other objects, like the one used in Age of Empires when a unit is behind a building:

Figure 10.3: Rendering the occluded parts of the character

  • Blending: Once the color of the pixel is determined and we are sure the pixel is not occluded by a previous pixel, the final step is to put it in the back-buffer (the frame or image you are drawing). Usually, we just override whatever pixel was in that position (because our pixel is nearer to the camera), but if you think about transparent objects, we need to combine our pixel with the previous one to make the transparent effect. Transparencies have other things to take into account aside from the blending, but the main idea is that blending controls exactly how the pixel will be combined with the previously rendered pixel in the back-buffer.
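To make the two programmable stages more tangible, here is a minimal hand-written Unity shader, expressed in HLSL inside a ShaderLab wrapper. Take it as an illustrative sketch rather than anything we will need later (Shader Graph will generate this kind of code for us), but it shows the vertex stage doing the classic object-to-screen transform and the fragment stage deciding each pixel’s color:

    // Minimal URP shader sketch: the vertex stage transforms positions,
    // and the fragment stage decides each pixel's color (no lighting here).
    Shader "Custom/MinimalPipelineExample"
    {
        Properties
        {
            _BaseMap ("Base Map", 2D) = "white" {}
        }
        SubShader
        {
            Tags { "RenderPipeline" = "UniversalPipeline" }
            Pass
            {
                HLSLPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

                TEXTURE2D(_BaseMap);
                SAMPLER(sampler_BaseMap);

                struct Attributes  // what the Input Assembler feeds the vertex stage
                {
                    float4 positionOS : POSITION;  // vertex position in object space
                    float2 uv         : TEXCOORD0; // the model's main UV channel
                };

                struct Varyings    // what the rasterizer interpolates per fragment
                {
                    float4 positionHCS : SV_POSITION;
                    float2 uv          : TEXCOORD0;
                };

                Varyings vert(Attributes IN)
                {
                    Varyings OUT;
                    // The classic vertex-stage job: object space to clip space
                    OUT.positionHCS = TransformObjectToHClip(IN.positionOS.xyz);
                    OUT.uv = IN.uv;
                    return OUT;
                }

                half4 frag(Varyings IN) : SV_Target
                {
                    // The fragment-stage job: pick this pixel's color
                    return SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv);
                }
                ENDHLSL
            }
        }
    }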

The Shader Pipeline is a subject that would require an entire book, but for the scope of this book, the previous description will give you a good idea of what a shader does and the possible effects it can achieve. Now that we have discussed how a shader renders a single object, it is worth discussing how Unity renders all of the objects using Render Pipelines.

Render Pipeline and URP

We have covered how the video card renders an object, but Unity is in charge of asking the video card to execute its Shader Pipeline for each object. To do so, Unity needs to do lots of preparations and calculations to determine exactly how and when each shader needs to be executed. This is the responsibility of what Unity calls a Render Pipeline.

A Render Pipeline is a way to draw the objects of the scene. At first, it sounds like there should be just one simple way of doing this, for example, iterating over all objects in the scene and executing the Shader Pipeline with the shader specified in each object’s Material, but it can be more complex than that.

Usually, the main difference between one Render Pipeline and another is the way in which lighting and some advanced effects are calculated, but they can differ in other ways.

In previous Unity versions, there was just one Render Pipeline, now called the Built-in Render Pipeline (also known as BIRP). It was a pipeline with all of the possible features you would need for all kinds of projects, from mobile 2D graphics and simple 3D to cutting-edge 3D like you can find on consoles or high-end PCs. This sounds ideal, but actually, it isn’t: one giant renderer that needs to be highly customizable to adapt to all possible scenarios generates lots of overhead and limitations that cause more headaches than creating a custom Render Pipeline. Luckily, recent versions of Unity introduced the Scriptable Render Pipeline (SRP), a way to create Render Pipelines adapted to your project.

Luckily, Unity doesn’t want you to create your own Render Pipeline for each project (a complex task), so it has created two custom pipelines that are ready to use: URP (formerly called LWRP), which stands for Universal Render Pipeline, and HDRP, which stands for High Definition Render Pipeline. The idea is that you choose one or the other based on your project’s requirements (unless you really need to create your own).

URP, the one we selected when creating the project for our game, is a Render Pipeline suitable for most games that don’t require lots of advanced graphics features, such as mobile games or simple PC games, while HDRP is packed with lots of advanced rendering features for high-quality games. The latter requires high-end hardware to run, while URP runs on almost every relevant target device. It is worth mentioning that you can swap between the Built-in Render Pipeline, HDRP, and URP whenever you want, including after creating the project (but this is not recommended):

Figure 10.4: Project wizard showing HDRP and URP templates

We could discuss how each one is implemented and the differences between them, but again, this could fill entire chapters. Right now, the idea of this section is for you to know why we picked URP when we created our project: it has some restrictions that we will encounter throughout this book and will need to take into account, so it is good to know why we accepted those limitations (to run our game on every relevant piece of hardware).

We also chose URP because it supports Shader Graph, the Unity tool that we will be using in this chapter to create custom effects; the previous built-in pipeline didn’t provide us with such a tool (aside from third-party plugins). Finally, another reason to introduce the concept of URP is that it comes with lots of built-in shaders that we will need to know about before creating our own, to avoid reinventing the wheel. It’s worth getting used to those shaders because, if you came from previous versions of Unity, the shaders you already know won’t work here. Actually, that is exactly what we are going to discuss in the next section of this chapter: the different URP built-in shaders.

URP built-in shaders

Now that we know the difference between URP and other pipelines, let’s discuss the shaders that come integrated into URP, briefly describing the three most important ones in this pipeline:

  • Lit: This is the replacement for the old Standard Shader. This shader is useful for creating all kinds of realistic materials such as wood, rubber, metal, and skin, as well as combinations of them (such as a character with skin and metal armor). It supports features such as Normal Mapping and Occlusion, different lighting workflows such as Metallic and Specular, and transparencies.
  • Simple Lit: This is the replacement for the old Mobile/Diffuse Shader. As the name suggests, this shader is a simpler version of Lit, meaning that its lighting calculations are simpler approximations of how light works, offering fewer features than its counterpart. Basically, when you have simple graphics without realistic lighting effects, this is the best choice.
  • Unlit: This is the replacement for the old Unlit/Texture Shader. Sometimes, you need objects with no lighting whatsoever, and in that case, this is the shader for you. No lighting doesn’t mean an absence of light or complete darkness; it actually means that the object is not shaded at all and is fully visible without any shading. Some simplistic graphics can work with this, relying on the shading being baked into the texture, meaning that the texture comes with the shadows already painted in.

This is extremely performant, especially for low-end devices such as mobile phones. There are also other cases, such as light tubes or screens, objects that can’t receive shadows because they emit light, so they will be seen at their full color even in complete darkness. In the following screenshot, you can see a 3D model using an Unlit Shader. It looks like it is being lit, but it’s just the texture of the model, which has lighter and darker colors applied to different parts of the object:

Figure 10.5: Pod using an Unlit effect to simulate cheap lighting

Let’s do an interesting disintegration effect with the Simple Lit Shader to demonstrate its capabilities. To do so, follow these steps:

  1. Download and import a Cloud Noise texture from any search engine:

Figure 10.6: Noise texture

  2. Select the recently imported texture in the Project panel.
  3. In the Inspector, set the Alpha Source property to From Gray Scale. This will make the alpha channel of the texture be calculated based on the grayscale of the image:

Figure 10.7: Generate Alpha From Gray Scale texture setting

The Alpha channel of a color is often associated with transparency, but you will notice that our object won’t be transparent. The Alpha channel is extra color data that can be used for several purposes when creating effects. In this case, we will use it to determine which pixels are being disintegrated first.

  4. Click the + icon in the Project view and select Material:

Figure 10.8: Material creation button

  5. Create a cube by going to GameObject | 3D Object | Cube:

Figure 10.9: Cube primitive creation

  6. Drag the Material from the Project window to the cube in the Scene window.
  7. Click the drop-down menu to the right of the Shader property in the Inspector and look for the Universal Render Pipeline | Simple Lit option. We could also work with the default shader (Lit), but Simple Lit is going to be easier on performance, and we won’t use the advanced features of Lit:

Figure 10.10: Simple Lit Shader selection

  8. Select the Material and drag the downloaded cloud Texture to the rectangle to the left of Base Map.
  9. Check the Alpha Clipping checkbox and set the Threshold slider to 0.5:

Figure 10.11: Alpha Clipping Threshold Material slider

  10. As you move the Threshold slider, the object will start to disintegrate. Alpha Clipping discards the pixels that have an Alpha intensity lower than the Threshold value (we will peek at what this looks like in code right after these steps):

Figure 10.12: Disintegration effect with Alpha Clipping

  11. Finally, set Render Face to Both to see both sides of the cube’s faces:

Figure 10.13: Double-sided render face

  12. Take into account that the artist who creates the texture can configure the Alpha channel manually instead of calculating it from the grayscale, in order to control exactly how the disintegration effect should look regardless of the texture’s color distribution:

Figure 10.14: Double-sided Alpha Clipping
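As promised, here is what Alpha Clipping boils down to under the hood: a single discard instruction in the fragment stage. This is a hedged sketch in the same HLSL style as before, not URP’s exact generated code (the _Cutoff name mirrors the Threshold slider):

    // Illustrative fragment function showing the idea behind Alpha Clipping
    TEXTURE2D(_BaseMap);
    SAMPLER(sampler_BaseMap);
    half _Cutoff; // the Threshold slider value

    half4 frag(Varyings IN) : SV_Target
    {
        half4 color = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, IN.uv);
        clip(color.a - _Cutoff); // discards this fragment entirely when alpha < threshold
        return color;
    }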

The idea of this section is not to give a comprehensive guide to all of the properties of all of the URP shaders, but to give you an idea of what a shader can do when properly configured and when to use each of the integrated shaders. Sometimes, you can achieve the effect you need just by using existing shaders (probably in 99% of cases in simple games), so try to stick to them as much as you can. But if you really need a custom shader to create a very specific effect, the next section will teach you how to use the URP tool called Shader Graph.

Creating shaders with Shader Graph

Now that we know how shaders work and which shaders already exist in URP, we have a basic notion of when it is and isn’t necessary to create a custom shader. In case you really need to create one, this section will cover the basics of effect creation with Shader Graph, a tool for creating effects using a visual node-based editor, which is easy to use when you are not used to coding.

In this section, we will discuss the following concepts of the Shader Graph:

  • Creating our first Shader Graph
  • Using textures
  • Combining textures
  • Applying transparency
  • Creating Vertex effects

Let’s start by seeing how we can create and use a Shader Graph.

Creating our first Shader Graph

Shader Graph is a tool that allows us to create custom effects using a node-based system. An effect in Shader Graph can look like the following screenshot:

Figure 10.15: Shader Graph with nodes to create a custom effect

We will discuss later what those nodes do, and we will create an example effect step by step, but in the screenshot, you can see how the author created and connected several nodes (the interconnected boxes), with each one executing a specific process to achieve the effect. The idea of creating effects with Shader Graph is to learn which specific nodes you need and how to connect them properly. This is similar to the way we code the gameplay of the game, but Shader Graph is adapted and simplified just for effect purposes.

To create and edit our first Shader Graph, do the following:

  1. In the Project window, click the + icon and find the Shader Graph | URP | Lit Shader Graph option. This will create a Shader Graph using the PBR mode, meaning that this shader will support lighting effects (unlike Unlit Graphs):

Figure 10.16: PBR Shader Graph creation

  2. Name it Water. If you miss the opportunity to name the asset when creating it, remember that you can select the asset, right-click, and select Rename:

Figure 10.17: Shader Graph Asset

  3. Create a new Material called WaterMaterial and set Shader to Shader Graphs/Water. If for some reason Unity doesn’t allow you to do that, try right-clicking on the Water Graph and clicking Reimport. As you can see, the created Shader Graph now appears as a shader in the Material:

Figure 10.18: Setting a Shader Graph as a Material Shader

  4. Create a plane with the GameObject | 3D Object | Plane option.
  5. Drag the Material to the Plane to apply it.

Now, you have created your first custom shader and applied it to a Material. So far, it doesn’t look interesting at all—it’s just a gray effect—but now it’s time to edit the graph to unlock its full potential. As the name of the graph suggests, we will be creating a water effect in this chapter to illustrate several nodes of the Shader Graph toolset and how to connect them, so let’s start by discussing the Master node.

When you open the graph by double-clicking the shader asset, you will see the following:

Figure 10.19: Master node with all of the properties needed to calculate object appearance

All nodes have input pins, the data they need to work, and output pins, the results of their process. As an example, a sum operation would have two input numbers and one output number, the result of the sum. In this case, you can see that the Master node only contains inputs, and that’s because all of the data that enters the Master node will be used by Unity to calculate the rendering and lighting of the object: things such as the desired object color or texture (the Base Color input pin), how smooth it is (the Smoothness input pin), or how metallic it is (the Metallic input pin), properties that will affect how lighting is applied to the object.

You can see that the Master node is split into a Vertex section and a Fragment section. The former is capable of changing the mesh of the object to deform it, animate it, and so on, while the latter changes how the object looks: which textures to use, how it will be illuminated, and so on. Let’s start exploring how we can change that data in the Fragment section by doing the following:

  1. Double-click the Shader Graph asset in Project View to open its editor.
  2. Click in the gray rectangle at the left of the Base Color input pin:

Figure 10.20: Base Color node input pin

  3. In the color picker, select a light blue color, like water. Select the bluish part of the circle and then a shade of that color in the middle rectangle:

Figure 10.21: Color picker

  4. Set Smoothness to 0.9, which will make the object almost completely smooth (90% of the total smoothness possible). This will make our water reflect the sky almost completely:

Figure 10.22: Smoothness PBR Master node input pin

  5. Click the Save Asset button at the top left of the window:

Figure 10.23: Shader Graph saving options

  6. Go back to the Scene view and check that the plane is light blue, with the sun reflected on it:

Figure 10.24: Initial Shader Graph results

As you can see, the behavior of the shader varies according to the properties you set in the Master node, but so far, doing this is no different from creating an Unlit Shader and setting up its properties; the real power of Shader Graph appears when you use nodes that perform specific calculations as inputs to the Master node. We will start by looking at the texturing nodes, which allow us to apply Textures to our model.

Using Textures

The idea of using Textures is to apply an image to the model in a way that allows us to paint different parts of the model with different colors. Remember that the model has a UV map, which lets Unity know which part of the Texture will be applied to which part of the model:

Figure 10.25: On the left, a face Texture; on the right, the same texture applied to a face mesh

We have several nodes to do this task, one of them being Sample Texture 2D, a node that has two main inputs. First, it asks us for the texture to sample or apply to the model, and then for the UV. You can see it in the following screenshot:

Figure 10.26: Sample Texture 2D node

As you can see, the default value of the Texture input pin is None, so no texture is applied by default; we need to specify one manually. For UV, the default value is UV0, meaning that, by default, the node will use the main UV channel of the model (and yes, a model can have several UV channels). For now, we will stick with the main one; if you are not sure what that means, UV0 is the safest option. Let’s try this node by doing the following:

  1. Download and import a tileable water texture from the internet:

Figure 10.27: Water tileable Texture

  2. Select the Texture and make sure that its Wrap Mode property is set to Repeat, which will allow us to repeat the Texture as we did with the terrain, because the idea is to use this shader to cover large water areas:

Figure 10.28: Texture Repeat mode

  3. In the Water Shader Graph, right-click in an empty area of the Shader Graph and select Create Node:

Figure 10.29: Shader Graph Create Node option

  4. In the search box, write Sample Texture and all of the sampler nodes will show up. Double-click Sample Texture 2D. If for some reason you can’t double-click the option, right-click on it first and then try again; there is a known bug in this tool, and this is the workaround:

Figure 10.30: Sample texture node search

  5. Click the circle to the left of the Texture input pin of the Sample Texture 2D node. It will allow us to pick a Texture to sample; just select the water one. You can see that the Texture is previewed in the bottom part of the node:

Figure 10.31: Sample Texture node with a Texture in its input pin

  6. Drag the RGBA output pin of the Sample Texture 2D node to the Base Color input pin of the Master node:

Figure 10.32: Connecting the results of a Texture sampling with the Base Color pin of the Master node

  7. Click the Save Asset button at the top-left part of the Shader Graph editor and see the changes in the Scene view:

Figure 10.33: Results of applying a Texture in our Shader Graph

As you can see, the Texture is properly applied to the model, but if you take into account that the default plane has a size of 10x10 meters, the ripples of the water seem too big. So, let’s tile the Texture!

To do this, we need to change the UVs of the model, making them bigger. You might imagine that bigger UVs mean the Texture should also get bigger, but remember that we are not making the object bigger; we are just modifying the UVs.

Within the same object area, we will display a larger area of the texture, and because that larger sampled area (achieved with bigger UVs) extends past the texture’s bounds, repetitions of the texture will appear. To do so, follow these steps:

  1. Right-click in any empty space and select Create Node to search for the UV node:

Figure 10.34: Searching for the UV node

  2. Using the same method, create a Multiply node.
  3. Drag the Out pin of the UV node to the A pin of the Multiply node to connect them.
  4. Set the B input pin of the Multiply node to (4,4,4,4):

Figure 10.35: Multiplying the UVs by 4

  5. Drag the Out pin of the Multiply node to the UV pin of the Sample Texture 2D node to connect them:

Figure 10.36: Using the multiplied UVs to sample the Texture

  6. If you save the graph and go back to the Scene view, you can see that the ripples are now smaller because we have tiled the UVs of our model. You can also see this in the preview of the Sample Texture 2D node:

Figure 10.37: Results of the model’s UV multiplication

Another interesting effect we can do now is to apply an offset to the Texture to move it. The idea is that even if the plane is not actually moving, we will simulate the flow of the water through it, moving just the Texture. Remember, the responsibility of determining the part of the Texture to apply to each part of the model belongs to the UV, so if we add values to the UV coordinates, we will be moving them, generating a Texture sliding effect. To do so, let’s do the following:

  1. Create an Add node to the right of the UV node.
  2. Connect the Out pin of the UV to the A pin of the Add node:

Figure 10.38: Adding values to the UVs

  3. Create a Time node to the left of the Add node.
  4. Connect the Time node to the B pin of the Add node:

Figure 10.39: Adding time to the UVs

  5. Connect the Out pin of the Add node to the A input pin of the Multiply node:

Figure 10.40: Added and multiplied UVs as an input of the sample Texture

  6. Save and see the water moving in the Scene view. If you don’t see it moving, click the layers icon in the top bar of the Scene view and check Always Refresh:

Figure 10.41: Enabling Always Refresh to preview the effect

  7. If you feel the water is moving too fast, try using a Multiply node to make the time a smaller value. I recommend you try it by yourself before looking at the next screenshot, which has the answer:

Figure 10.42: Multiplication of time to move the texture slower

  8. If you feel the graph is too big, try hiding some of the node previews by clicking the up arrow (^) that appears over a preview when you move the mouse over it:

Figure 10.43: Hiding the preview from the graph nodes

  9. Also, you can hide unused pins by selecting the node and clicking the arrow at its top right:

Figure 10.44: Hiding unused pins from the graph nodes
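Incidentally, the whole node chain we just built boils down to a couple of lines of HLSL. The following is a hedged sketch of what the graph computes per fragment (property names such as _ScrollSpeed and _Tiling are illustrative, not names Shader Graph generates):

    // Equivalent of our UV -> Add (Time) -> Multiply -> Sample Texture 2D chain
    TEXTURE2D(_BaseMap);
    SAMPLER(sampler_BaseMap);
    float _ScrollSpeed; // the small factor we multiplied Time by (for example, 0.1)
    float _Tiling;      // the 4 we multiplied the UVs by

    half4 frag(Varyings IN) : SV_Target
    {
        // Add time to slide the UVs, then multiply to tile them
        float2 uv = (IN.uv + _Time.y * _ScrollSpeed) * _Tiling;
        return SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv);
    }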

So, to recap, first we added the time to the UV to move it and then multiplied the result of the moved UV to make it bigger to tile the Texture. It is worth mentioning that there’s a Tiling and Offset node that does all of this process for us, but I wanted to show you how a simple multiplication to scale the UV and an add operation to move it generates a nice effect; you can’t imagine all of the possible effects you can achieve with other simple mathematical nodes! Actually, let’s explore other usages of mathematical nodes to combine Textures in the next section.

Combining Textures

Even though we have used nodes, we haven’t created anything that couldn’t be created with regular shaders, but that’s about to change. So far, we can see the water moving, but it still looks static, and that’s because the ripples are always the same. We have several techniques to generate varying ripples; the simplest one is to combine two water Textures moving in different directions to mix their ripples. Actually, we can simply use the same Texture, just flipped, to save some memory. To combine the Textures, we will sum them and then divide them by 2, so basically, we are calculating the average of the Textures! Let’s do that by doing the following:

  1. Select all of the nodes between Time and Sample Texture 2D (inclusive) by creating a selection rectangle: click in any empty space in the graph, hold and drag, and release when all of the target nodes are covered:

Figure 10.45: Selecting several nodes

  2. Right-click and select Copy, and then again right-click and select Paste, or use the classic Ctrl + C, Ctrl + V commands (Command + C, Command + V on Mac).
  3. Move the copied nodes below the original ones:

Figure 10.46: Duplication of nodes

  4. For the copied nodes, set the B pin of the Multiply node connected to the Sample Texture 2D node to (-4,-4,-4,-4). You can see that this flips the texture.
  5. Also, set the B pin of the Multiply node connected to the Time node to -0.1:

Figure 10.47: Multiplication of values

  6. Create an Add node to the right of both Sample Texture 2D nodes and connect their outputs to the A and B input pins of the Add node:

Figure 10.48: Adding two Textures

  7. You can see that the resulting combination is too bright because we have summed the intensities of both textures, so let’s fix that by multiplying the Out pin of the Add node by (0.5,0.5,0.5,0.5), which will divide each resulting color channel by 2, averaging the colors. You can also experiment with what happens when you set different values for each channel, but for our purposes, 0.5 is the proper value for all of them:

Figure 10.49: Dividing the sum of two Textures to get the average

  8. Connect the Out pin of the Multiply node to the Base Color pin of the Master node to apply all of those calculations to the color of the object.
  9. Save the Asset and see the results in the Scene view:

Figure 10.50: Results of texture blending
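For reference, the blend we just assembled can be expressed in a few lines of HLSL. Again, this is a hedged sketch of the logic, using the same illustrative naming as before:

    // Two samples of the same texture scrolling in opposite directions, averaged
    TEXTURE2D(_BaseMap);
    SAMPLER(sampler_BaseMap);

    half4 frag(Varyings IN) : SV_Target
    {
        float2 uvA = (IN.uv + _Time.y *  0.1) * float2( 4,  4); // the original sheet
        float2 uvB = (IN.uv + _Time.y * -0.1) * float2(-4, -4); // flipped, reversed sheet
        half4 a = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uvA);
        half4 b = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uvB);
        return (a + b) * 0.5; // Add, then Multiply by 0.5: the average of the two
    }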

You can keep adding nodes to make the effect more diverse, such as using Sine nodes (which execute the trigonometric sine operation) to apply non-linear movements, and so on, but I will let you learn that by experimenting by yourself. For now, we will stop here. As always, this topic deserves a full book, and the intention of this chapter is to give you a small taste of this powerful Unity tool. I recommend you look for other Shader Graph examples on the internet to learn other usages of the same nodes and, of course, new nodes. One thing to consider here is that everything we just did applies to the Fragment Shader stage of the Shader Pipeline we discussed earlier. Now, let’s use the Blending stage to apply some transparency to the water.

Applying transparency

Before declaring our effect finished, one small addition we can make is to render the water slightly transparent. Remember that the Shader Pipeline has a Blending stage, which is responsible for blending each pixel of our model into the image being rendered this frame. The idea is to make our Shader Graph modify that stage to apply Alpha Blending, a blending mode that combines our model with the previously rendered models based on the Alpha value of our model.
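Before we configure it, it helps to see what those settings amount to in the Blending stage. In ShaderLab terms, classic Alpha Blending is one render state line plus the equation it implies; take this as a conceptual sketch, not URP’s exact generated code:

    // ShaderLab render state that Surface Type: Transparent + Blending Mode: Alpha set up
    Blend SrcAlpha OneMinusSrcAlpha
    ZWrite Off // transparent objects typically don't write depth

    // For every pixel, the blend above computes:
    // finalColor = ourColor * ourAlpha + backBufferColor * (1 - ourAlpha)
    // where backBufferColor is the pixel previously rendered in that position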

To get that effect, take the following steps:

  1. Look for the Graph Inspector window floating around. If you don’t see it, click the Graph Inspector button at the top-right part of the Shader Graph editor.
  2. Click the Graph Settings tab.
  3. Set the Surface Type property to Transparent.
  4. Set the Blending Mode property to Alpha if it isn’t already at that value:

Figure 10.51: Graph Inspector Transparency settings

  5. Set the Alpha input pin of the Master node to 0.5:

Figure 10.52: Setting Alpha of the Master node

  6. Save the Shader Graph and see the transparency being applied in the Scene view. If you can’t see the effect, just put a cube into the water to make the effect more evident:

Figure 10.53: Shadows from the water being applied to a cube

  7. You can see the shadows that the water is casting on our cube; this happens because Unity doesn’t know the object is transparent, and hence it casts shadows. Click on the water plane and look for the Mesh Renderer component in the Inspector. If you don’t see the shadow, click the lightbulb icon at the top of the Scene view to enable lighting:

Figure 10.54: Enabling lights in the Scene View

  8. In the Lighting section, set Cast Shadows to Off; this will disable shadow casting from the plane on the parts of the cube that are underwater:

Figure 10.55: Disabling shadow casting

Adding transparency is a simple process, but it has its caveats, like the shadow problem, and in more complex scenarios, it can cause other problems, like increased overdraw, meaning the same pixel needs to be drawn several times (the pixel belonging to the transparent object, plus the ones behind it). I would suggest avoiding transparency unless it is necessary. Actually, our water could live without transparency, especially when we apply it to the river basin around the base, because we don’t need to see what’s under the water, but the idea is for you to know all of your options. In the next screenshot, you can see how we have put a giant plane with this effect below our base, big enough to cover the entire basin:

Figure 10.56: Using our water in the main scene

Now that we have modified how the object looks through the Fragment node section, let’s discuss how to use the Vertex section to apply a mesh animation to our water.

Creating Vertex Effects

So far, we have applied water Textures to our water, but it’s still a flat plane. We can go further and create the ripples not only via textures but also by animating the mesh. To do so, we will use the noise texture from the beginning of the chapter in the shader, but instead of using it as another color to add to the Base Color, we will use it to offset the Y position of the vertices of our plane.

Due to the chaotic nature of the noise texture, the idea is that we will apply a vertical offset to different parts of the model, so we can emulate the ripples:

Figure 10.57: Default plane mesh subdivided into a grid of 10x10 with no offset

To accomplish something like this, you can modify the Vertex section of your shader to look like the following:

Figure 10.58: Ripples vertex effect

In the graph, you can see how we create a Vector whose Y component depends on the noise Texture we downloaded at the beginning of the chapter. The idea is to create a Vector pointing upward whose length is proportional to the grayscale value of the texture: the whiter the pixel of the texture, the longer the offset. This texture has an irregular yet smooth pattern, so it can emulate the behavior of the tide.

Please notice that here we used Sample Texture 2D LOD instead of Sample Texture 2D; the latter does not work in the Vertex section, so keep that in mind.

Then we multiply the result by 0.3 to reduce the height of the offset, and we add the result to the position given by the Position node. Note that the Space property of the Position node is set to Object mode; we need that mode to work with the Vertex section of the Shader Graph (we discussed World and Local spaces in Chapter 2, Editing Scenes and GameObjects, but you can also search for Object versus World space on the internet for more info). Finally, the result is connected to the Position input pin of the Vertex section.

If you save, you will see something like the following image:

Figure 10.59: Ripples vertex effect applied

Of course, in this case, the ripples are static because we didn’t add any time offset to the UVs as we did before. In the following screenshot, you can see how to add that, but before looking at it, I recommend you try to solve it by yourself first as a personal challenge:

Figure 10.60: Animated ripples vertex effect graph

As you can see, we are again taking the original UVs and adding the time multiplied by a small factor so that they move slowly, the same as we did previously with our water texture. You can keep playing around with this, changing how it looks with different textures, multiplying the offset to increase or reduce the height of the ripples, applying interesting math functions such as sine, and much more, but for now, let’s finish here.
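To close the loop with the Shader Pipeline section, here is a hedged HLSL sketch of what this Vertex section computes (names such as _NoiseMap, _Amplitude, and _ScrollSpeed are illustrative):

    // Vertex-stage displacement equivalent to our graph. The vertex stage cannot pick
    // mip levels automatically, so we sample with an explicit LOD of 0 (this is why the
    // graph needs Sample Texture 2D LOD instead of Sample Texture 2D).
    TEXTURE2D(_NoiseMap);
    SAMPLER(sampler_NoiseMap);
    float _Amplitude;   // the 0.3 multiplier from the graph
    float _ScrollSpeed; // the small time factor that animates the ripples

    float3 DisplaceVertex(float3 positionOS, float2 uv)
    {
        float2 movedUV = uv + _Time.y * _ScrollSpeed;
        half noise = SAMPLE_TEXTURE2D_LOD(_NoiseMap, sampler_NoiseMap, movedUV, 0).r;
        // Offset straight up in object space; whiter noise pixels push vertices higher
        return positionOS + float3(0, noise * _Amplitude, 0);
    }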

Summary

In this chapter, we discussed how a shader works on the GPU and how to create our first simple shader to achieve a nice water effect. Working with shaders is a complex and interesting job, and in a team, there are usually one or more people in charge of creating all of these effects, in a position called Technical Artist; so, as you can see, this topic can expand into a whole career. Remember, the intention of this book is to give you a small taste of all the possible roles you can take on in the industry, so if you really liked this role, I suggest you start reading shader-exclusive books. You have a long but super interesting road ahead of you.

Enough shaders for now! In the next chapter, we will look at how to improve our graphics and create visual effects with particle systems!
