It’s finally time to make our first shader! If you’ve ever followed a tutorial to learn a new programming language, then your first program was probably logging “Hello, World!” to the console or the screen. It’s a bit harder to do that with shaders, so we’ll settle for displaying a mesh and applying a basic color to it. In this chapter, I will show you how to set up your scene and explain which components need to be attached to objects. Then, we will see how shaders in Unity work, and we will write our very first vertex and fragment shader functions. Along the way, we will explore the differences between shaders in each of Unity’s render pipelines. Finally, I will cover the basic shader syntax that you will be seeing throughout the book.
Shader code isn’t the only way to write shaders in Unity. If you prefer to avoid programming, then the next chapter will introduce Shader Graph, Unity’s visual shader editor. However, you will still find the first part of this chapter useful. To prepare for writing our very first shader, let’s set up a new Unity project.
Project Setup
The built-in render pipeline was Unity’s only render pipeline until 2017. Most older learning resources about shaders use the built-in pipeline, although some are very outdated.
The Universal Render Pipeline, URP, is a more “lightweight” pipeline aimed at lower-end machines and cross-platform development. It used to be named the “Lightweight Render Pipeline,” or LWRP.
The High-Definition Render Pipeline, HDRP, is intended for games that require high-fidelity rendering and lighting quality. A restricted number of platforms support HDRP.
If you’re just starting out with Unity and are here to learn shaders without having a specific project in mind, I recommend starting with URP. Unity intends to make URP the default pipeline for new projects in the future, so an increasing proportion of learning resources will move away from the built-in pipeline and toward URP.
If you are working on a multi-platform project targeting high-end consoles (e.g., PS5, PS4, Xbox One, Xbox Series X/S) and PC, then you can choose any pipeline, subject to other requirements. These are the platforms that currently support HDRP, which is the best choice if you want to use cutting-edge graphics.
If you plan to target mobile, web, or other consoles (e.g., Nintendo Switch), then do not use HDRP.
If you plan only to use shader code and not Shader Graph, then I recommend not using HDRP. Although code-based shaders are possible in HDRP, learning resources are lacking compared with the other two pipelines; Unity themselves recommend using Shader Graph when working in HDRP. That’s what I’ll be doing throughout the book!
If you are left with a choice between the built-in pipeline and URP, it can be difficult to come down on one side of the fence. I still recommend URP because it is going to receive the most active development in the future, but if you do pick the built-in pipeline, then rest assured almost all the book’s examples will still work!
Once you have picked a pipeline, let’s create a new project via the Unity Hub.
Creating a Project
If you’re using the built-in pipeline, pick the template simply called “3D.”
In URP, pick the template called “3D Sample Scene (URP).” This template contains an example scene with a few assets already set up for you.
Similarly, in HDRP, pick “3D Sample Scene (HDRP).”
Type a project name and save location into the fields on the right-hand side of the screen and click the Create project button. Unity will create a new folder and populate it with the template files. Then the Unity Editor will open. We can now set up a scene ready to create and test our shaders.
Setting Up the Scene
Add the Unity primitive sphere to your scene, which you can do via the toolbar through GameObject ➤ 3D Object ➤ Sphere. You may find it useful at times to use other meshes, but a sphere is perfect for quickly testing most effects.
We use materials in Unity to attach shaders to objects. Materials are contained in the Assets folder alongside other assets like textures, scripts, and meshes. Create one by right-clicking in the Project View and selecting Create ➤ Material. Name it something like “HelloWorld”, since we’ll be making our very first shader.
Unity automatically uses a default shader: the Standard shader in the built-in pipeline and the Lit shader in URP and HDRP.
Drag the material onto the sphere mesh you added. The appearance of the sphere may change slightly when you do so.
At this point, if you are using HDRP or want to use only Shader Graph, skip to the next chapter to make your first shader. Writing shaders with code is possible in HDRP, but it is an order of magnitude more difficult due to a lack of learning resources and the increased complexity of using HDRP.
Create a new shader file by right-clicking in the Project View and choosing Create ➤ Shader ➤ Unlit Shader, which copies a template shader into the new file with a .shader file extension. Name it “HelloWorld.shader” since it’s our first shader. You don’t need to type the extension.
Select the HelloWorld material we created previously, and in the Shader drop-down at the top of the Inspector window, find “Unlit/HelloWorld”. See Figure 3-3 for an example material in Unity.
You can change how the preview window on a material behaves using the row of buttons just above the preview. Click the play button to animate the material over time. Use the next button along to change to a different preview mesh. Use the button with yellow dots on it to tweak how many light sources are simulated. Use the drop-down with the half-blue sphere icon to specify a reflection probe for the preview. And use the menu on the right-hand side to dock and undock the preview window.
The template shader is already a completed shader, but it’s no use to start off with a completed shader since we’re here to learn. Open the shader file by double-clicking it in the Project View and delete the file contents – any material using the shader will turn magenta, which happens whenever the shader fails to compile properly. Now that our scene is set up, we can focus on writing the shader file. We will start by discussing shader syntax.
If you installed Unity with the default settings, you most likely have Visual Studio or Visual Studio Code installed. Therefore, when you double-click a shader file, Unity will open it in one of those. You can customize which editor is used via Preferences ➤ External Tools.
Writing a Shader in Unity
The OpenGL API, a popular cross-platform graphics library, uses a shading language called GLSL (for OpenGL Shading Language).
DirectX, which is designed for use on Microsoft’s platforms, uses HLSL (High-Level Shading Language).
Cg, a deprecated shading language developed by Nvidia, uses the common feature set of GLSL and HLSL and can cross-compile to each depending on the target hardware.
The shading language is what you write shader code in. A game or game engine will compile shaders written in one of those languages to run on the GPU. Although it is possible to write shaders using any one of GLSL, HLSL, or Cg, modern Unity shaders are written in HLSL.
In the past, Unity shaders used Cg as the primary shading language. Over time, the default has switched to HLSL. Unity will automatically cross-compile your shader code for the target platform.
ShaderLab provides ways to communicate between the Unity Editor, C# scripts, and the underlying shader language.
It provides an easy way to override common shader settings. In other game engines, you might need to delve into settings windows or write graphics API code to change blend, clipping, or culling settings, but in Unity, we can write those commands directly in ShaderLab.
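For example, render-state commands can be written directly inside a SubShader or Pass. The following is a hypothetical sketch of such commands – we won’t need any of these settings for our first shader:

```shaderlab
SubShader
{
    Cull Off        // Render both sides of each face.
    ZWrite On       // Write to the depth buffer.
    Blend SrcAlpha OneMinusSrcAlpha // A common transparent blending mode.
}
```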
ShaderLab provides a cascading system that allows us to write several shaders in the same file, and Unity will pick the first compatible shader to run. This means we can write shaders for different hardware or render pipelines and the one that matches up with the user’s hardware and your project’s chosen render pipeline will get picked.
It’ll become a lot easier to understand how this all works with a practical example, so let’s start writing some ShaderLab.
Writing ShaderLab Code
In this example, we will write a shader to display an object with a single color, and we’ll add the option to change that color from within the Unity Editor. Most of the code required for this shader is the same between the built-in and Universal render pipelines, but there are a few differences, which I will explain when we reach them.
When there is a difference between the code required for each pipeline, I will present you with two code blocks labeled with the pipeline they are intended for. Choose only the one for your pipeline.
Open the HelloWorld.shader file. Inside the file, we’ll start by naming the shader using the Shader keyword. This name will appear when viewing any material in the Inspector if you use the Shader drop-down at the top of the material (see Figure 3-3). After declaring the name, the rest of the shader is enclosed within a pair of curly braces, so we will put any subsequent code inside these braces.
You can include folders inside the name – for example, naming the shader “Examples/HelloWorld” places the shader under the folder “Examples” alongside any other shaders that use that folder in their path.
Beginning a shader file
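A minimal sketch of the opening of the file might look like this, using the folder-style name mentioned above (any name works):

```shaderlab
Shader "Examples/HelloWorld"
{
    // All subsequent code goes between these braces.
}
```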
Declaring properties in ShaderLab
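A sketch of the Properties block described below – the default value (1, 1, 1, 1) is opaque white:

```shaderlab
Properties
{
    _BaseColor ("Base Color", Color) = (1, 1, 1, 1)
}
```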
Conventionally, shader property names start with an underscore, and we capitalize the start of each word. In this case, _BaseColor is the computer-readable name of the property, and we will refer to it by that name in shader code later.
Inside the parentheses, we first specify a human-readable name in double quotes, which Unity uses in the Inspector. In this case, the name we chose is “Base Color”, which is similar to the code-readable name anyway.
Next comes the type of the variable, which is Color. We could also have types like Texture2D, Cubemap, Float, and so on – these are all types we’ll see later.
Finally, we give the property a default value after the equals sign, which is used when you create a new material with this shader.
It seems strange to have two different names for each property, but it’s useful to have both types of names like this because we might want to use certain technical names within the code, but another person working with this shader to create materials in the Inspector might not understand (or need to know) what the code name means. A human-readable name makes it clearer what the property is for.
Colors are often stored as unsigned (positive) integers between 0 and 255. This requires 8 bits of storage space. 0 means no color, and 255 means full color.
Colors are made up of a mix of red, green, and blue. Plus we have an “alpha” value that represents transparency. Therefore, we use four channels of 8 bits each for colors.
In Unity, especially in shaders, we instead use floating-point values between 0 and 1 to represent each color channel. A floating-point number has a fractional part. A color value of (1, 1, 1, 1) means all four channels use the maximum value, which appears as fully opaque white.
We use this representation in shaders for higher precision. All you need to remember is that a regular color value is between 0 and 1.
Now that we’ve dealt with the Properties block, we will add a SubShader.
Adding a SubShader
Adding a SubShader in ShaderLab
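The SubShader block sits inside the shader’s outer braces, below the Properties block. A bare-bones sketch:

```shaderlab
SubShader
{
    // Tags, settings, and Pass blocks go here.
}
```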
If you define multiple SubShader blocks, Unity picks the first one that works on your combination of hardware and render pipeline. When your hardware is incompatible with every SubShader (and no fallback is available), Unity will display the error material, which is magenta.
Always put the SubShader with the highest requirements first. There doesn’t seem to be a hard limit on the number of SubShaders you can include in one file, but you’ll find it difficult to maintain the file if you add too many.
Specifying the Unlit/Color shader as a fallback
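The fallback is a single statement placed after all the SubShader blocks, near the bottom of the file. Unity uses it when no SubShader is supported:

```shaderlab
Fallback "Unlit/Color"
```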
Inside the SubShader, we will start to add settings that control how the shader will operate – there are a lot of possible options, but we’ll add only one for now: Tags.
SubShader Tags
The Tags block lets us specify whether the shader is opaque or transparent, set whether this object is rendered after others, and specify which render pipeline this SubShader works with. Each tag is a key-value pair of two strings, where the first string is the name of the tag and the second string is its value. Let’s add a RenderType tag to specify we want to use opaque rendering for this object.
We can add code comments in ShaderLab in a similar manner to C-style languages: single-line comments start with a double forward slash //, and multiline comments are enclosed between /* and */.
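Both comment styles in action:

```shaderlab
// This is a single-line comment.

/* This is a
   multiline comment. */
```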
Adding Tags inside a SubShader in ShaderLab
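A sketch of the Tags block with the RenderType tag we just described:

```shaderlab
SubShader
{
    Tags
    {
        "RenderType" = "Opaque"
    }
}
```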
- Background = 1000
- Geometry = 2000
- AlphaTest = 2450
- Transparent = 3000
- Overlay = 4000
Setting the rendering queue in the Tags block
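A sketch of the Tags block with an explicit Queue tag added; Geometry is the default queue for opaque objects:

```shaderlab
Tags
{
    "RenderType" = "Opaque"
    "Queue" = "Geometry"
}
```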
For URP, the tag value is “UniversalPipeline”.
In HDRP, the tag value is “HDRenderPipeline”.
In the built-in pipeline, there is no corresponding tag value. Place any SubShader blocks for the built-in pipeline at the bottom of the list.
Adding a RenderPipeline tag in the Tags block in URP
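For URP, the Tags block might now look like this:

```shaderlab
Tags
{
    "RenderType" = "Opaque"
    "Queue" = "Geometry"
    "RenderPipeline" = "UniversalPipeline"
}
```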
With the Tags block out of the way, the last bit of ShaderLab we need to add is a Pass block.
Adding a Pass
Creating a shader pass inside the SubShader
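The Pass block goes inside the SubShader, below the Tags block:

```shaderlab
Pass
{
    // Shader code for this pass goes here.
}
```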
Using the UniversalForward LightMode tag in URP
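In URP, a Pass can carry its own Tags block specifying the LightMode:

```shaderlab
Pass
{
    Tags
    {
        "LightMode" = "UniversalForward"
    }
}
```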
Inside the Pass, we will also specify which shading language we are using. In the past, Unity used the Cg language for its shaders, but the language has since been discontinued, and Unity shaders now use HLSL (although it is also possible to write GLSL shaders too). I’m mentioning this here because we are going to use two enclosing keywords to wrap our shader code – HLSLPROGRAM and ENDHLSL.
You might find tutorials online that still use the Cg language, which requires code to be enclosed in CGPROGRAM and ENDCG. Most of the syntax is identical between Cg and HLSL, but we’re going to exclusively do things the modern way in HLSL.
Specifying the shading language
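These keywords wrap the shader code inside the Pass:

```shaderlab
Pass
{
    HLSLPROGRAM
    // All HLSL code goes between these two keywords.
    ENDHLSL
}
```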
We’re finally ready to write some HLSL code. How exciting! From this point, all code will be written between the HLSLPROGRAM and ENDHLSL keywords. We’ll no longer be writing in Unity’s proprietary ShaderLab language and will instead be writing in the HLSL shading language. Next, let’s do some setup for our shader.
Pragma Directives and Includes
#pragma statements for declaring vertex and fragment functions
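These directives tell Unity which function acts as the vertex shader and which acts as the fragment shader. The names vert and frag are conventional – we will define both functions later in this chapter:

```hlsl
#pragma vertex vert
#pragma fragment frag
```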
- In the built-in pipeline, it’s not easy to access these include files within the engine directly. They can be found at [Unity root installation folder]/Editor/Data/CGIncludes.
The most important and frequently used file is UnityCG.cginc. Don’t let the cginc file extension confuse you – it’s still compatible with HLSL.
- In URP and HDRP, include files can be accessed in-Editor. In the Project View, scroll down to the Packages section and find the following folders:
Core RP Library/ShaderLibrary contains core shader files common to both pipelines.
Universal RP/ShaderLibrary contains URP’s shader files.
High Definition RP/Runtime/Material contains HDRP’s shader files in a series of subfolders.
Including Unity’s standard shader library in the built-in pipeline
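In the built-in pipeline, we include the UnityCG file mentioned above:

```hlsl
#include "UnityCG.cginc"
```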
Including Unity’s standard shader library in URP
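In URP, include files are referenced by their package path. The Core file, which itself pulls in the common library files we need, lives at the following path (correct at the time of writing, though package layouts can change between versions):

```hlsl
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
```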
The first step of the graphics pipeline involves collecting all the data from the scene to pass to the shader, so we need to devise some way of obtaining the data here on the shader side. We’ll do that via structs.
Controlling Data Flow with Structs
We pass data between shader stages via containers called structs, which contain a bunch of variables. The first struct contains all the data we want to pull from the mesh and pass to the vertex shader.
The appdata Struct
We usually name this struct appdata, VertexInput, or Attributes; I will stick with the name appdata throughout the book because Unity’s built-in structs are named similarly, although you can name this whatever you want. Each instance of appdata contains data about one vertex of the mesh, and for now, all we need is the position of the vertex. Vertex positions are defined in object space, where each position is relative to the origin point of the mesh (for a refresher on object space, see Figure 2-10).
HLSL requires us to add what’s called a semantic to each variable. It’s just a bit of added information that tells Unity what each variable will be used for in the next shader stage – for example, vertex positions need to use the POSITION semantic. Semantic names don’t need to be capitalized, although most documentation will use capitalized names. We will make it clear that the vertex position is in object space by naming the variable positionOS.
A full list of semantics can be found on the Microsoft HLSL website. At the time of writing, it can be found here: https://docs.microsoft.com/en-us/windows/win32/direct3dhlsl/dx-graphics-hlsl-semantics.
The appdata struct for passing data to the vertex shader
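Putting the preceding description together, the struct might look like this:

```hlsl
struct appdata
{
    float4 positionOS : POSITION; // Vertex position in object space.
};
```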
Take note of the semicolon after the closing brace! The type of the positionOS variable is float4 because we are using floating-point values to represent each component of the position, and there are four components. We will cover the core types in HLSL and how to use them later in the chapter. While we are thinking about structs, we will also write the struct for data being passed between the vertex and fragment shaders.
The v2f Struct
The v2f struct for passing data from the vertex shader to the fragment shader
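A sketch of the v2f struct; the name positionCS (for clip space) is one common convention:

```hlsl
struct v2f
{
    float4 positionCS : SV_POSITION; // Vertex position in clip space.
};
```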
You’ll notice that the semantic is different here. HLSL makes a distinction between a position being input to and output by the vertex shader, so we use the SV_POSITION semantic instead. As with all semantics, other learning resources might choose not to capitalize the name. Next, we will deal with variables.
Variables in HLSL
Although we declared the _BaseColor property back in the Properties block, we need to declare it again inside HLSL. It’s also possible to declare variables here that are not specified in the Properties block – in that case, we would need to use C# scripting to set the values of those variables rather than modifying values in the material’s Inspector. _BaseColor is, obviously, a color, which doesn’t have a special type in HLSL. It’s just a four-element vector of floating-point numbers, for which we use the float4 type. We declare these variables just below the structs we just wrote.
Unity may also generate certain shader variables for us. We need to declare some of them inside HLSL, but we won’t need to include them in Properties or pass the data to the shader ourselves with scripting. An example of this kind of variable is _CameraDepthTexture, which we will see later.
Declaring variables in HLSL in the built-in pipeline
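In the built-in pipeline, the declaration is a single line placed below the structs:

```hlsl
float4 _BaseColor; // Must match the name declared in the Properties block.
```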
The rules regarding variables are slightly different when using URP. This code will still work, but I’m including a section once we’ve finished the shader explaining how to tweak the code to make use of features exclusive to URP and HDRP. With that change aside, everything is set up for us to start writing the two shader functions, vert and frag.
The Vertex Shader
- In the built-in pipeline, this function is called UnityObjectToClipPos:
The name is long, but it is intended to clarify what the function is doing: it carries out the object-to-clip transformation, and it operates on positions.
There are similarly named functions in the built-in pipeline, such as UnityObjectToWorldDir, which performs the object-to-world transformation and operates on direction vectors.
- In URP, this function is called TransformObjectToHClip:
Similarly, the name is meant to tell you what the function is doing, and other functions in the core shader library are named using similar conventions.
In both pipelines, the respective function takes the object-space position as input and returns the clip-space position as output.
The vertex shader in the built-in pipeline
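A sketch of the vert function for the built-in pipeline, assuming the appdata and v2f structs described earlier:

```hlsl
v2f vert (appdata v)
{
    v2f o;
    // Transform the vertex position from object space to clip space.
    o.positionCS = UnityObjectToClipPos(v.positionOS);
    return o;
}
```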
The vertex shader in URP
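The URP version is almost identical, but it uses TransformObjectToHClip, which takes a three-component position:

```hlsl
v2f vert (appdata v)
{
    v2f o;
    // Transform the vertex position from object space to clip space.
    o.positionCS = TransformObjectToHClip(v.positionOS.xyz);
    return o;
}
```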
Finally, we will write the fragment shader function, frag.
The Fragment Shader
The fragment shader
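The frag function is short – it just outputs the _BaseColor variable we declared earlier. The SV_Target semantic marks the return value as the final color output:

```hlsl
float4 frag (v2f i) : SV_Target
{
    return _BaseColor;
}
```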
You can see the Base Color property on the material, which we can tweak to change the color of the preview at the bottom of the window and any object that uses this material.
It is also possible to override the Queue we defined within the shader – instead of using the Geometry queue, which has a value of 2000, we can set any integer value here to modify how Unity renders the object. There may be edge cases where this is necessary, but I usually leave this field alone and let it inherit the value from the shader.
If we tick the Double Sided Global Illumination option, then Unity will account for both sides of each face of the mesh while lightmapping, even if only one side of each face is rendered normally.
We have successfully written a shader that renders an object in a single color with no lighting, which is about as “Hello World” as you can get. Congratulations for making it to this stage! Now that we’ve written our first shader, let’s revisit one of the key differences between writing shaders for the built-in pipeline and URP.
The SRP Batcher and Constant Buffers
As we have seen, there are sometimes differences between shaders designed for each of Unity’s render pipelines. Some of these differences amount to changing a specific function name because the core libraries differ slightly between pipelines, and other differences represent a fundamental change in how the pipelines operate. In this section, I want to provide an overview of one difference in particular: the SRP Batcher.
If you are just starting out with Unity, you may find it useful to stick with the built-in pipeline for now and come back to the Universal Render Pipeline later, since a lot of tutorials out there were written for the built-in pipeline. However, if you are planning on using Shader Graph, then you will require URP or HDRP. Eventually, URP will become the default for new projects in Unity, and future learning materials will focus on it.
The SRP Batcher is a system supported by all Scriptable Render Pipelines (including URP and HDRP) to render objects more efficiently than traditional methods, but our shaders need to conform to a handful of rules to be compatible with it. Namely, we need to include most of our variable declarations inside a special structure called a constant buffer.
The CBUFFER for declaring variables in URP
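In URP, we wrap the variable declarations in the UnityPerMaterial constant buffer using Unity’s CBUFFER macros:

```hlsl
CBUFFER_START(UnityPerMaterial)
    float4 _BaseColor;
CBUFFER_END
```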
We don’t need to change the variable names or types in any way – we just need to enclose them in the constant buffer. By making this change, Unity can batch together objects that use the same shader and the same mesh and render them using a single draw call. I’ll be going into much more detail about optimizations like this in Chapter 13, but this is one SRP-exclusive feature I want to keep in mind throughout the book! Now that we have written our first shader, let’s cover basic shader features that we didn’t see in that shader.
Common Shader Syntax
Shaders come with many types, operators, and functions, which we’ll be seeing a lot throughout the book, so I will introduce the most important ones here. By the end of this section, you should understand the difference between similar variable types and what they are used for, as well as operators that work on those types.
Scalar Types
The float, half, and double data types
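A sketch of the three floating-point types:

```hlsl
float a = 1.5;  // 32-bit floating point – the most common choice in shaders.
half b = 1.5;   // 16-bit on supporting hardware (e.g., many mobile GPUs).
double c = 1.5; // 64-bit – rarely used and not supported on all GPUs.
```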
The int and uint types
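The two integer types in action:

```hlsl
int i = -3;  // Signed 32-bit integer.
uint u = 7u; // Unsigned 32-bit integer – cannot store negative values.
```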
Valid math operations in HLSL
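The basic arithmetic operators work as you would expect:

```hlsl
float a = 4.0 + 2.0; // Addition: 6.0.
float b = 4.0 - 2.0; // Subtraction: 2.0.
float c = 4.0 * 2.0; // Multiplication: 8.0.
float d = 4.0 / 2.0; // Division: 2.0.
float e = 5.0 % 2.0; // Modulo: 1.0 (unlike C, % works on floats in HLSL).
```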
Vector Types
Vector types in HLSL are made by combining the scalar types we just covered. The way we construct these types is simple: take the name of the scalar type we are basing the vector type on and add a number to the end that represents the number of elements. If we need a two-element vector of floating-point numbers, we use the float2 type. A three-element vector of integers? That’s an int3.
Personally, I wish conventional programming languages commonly supported these kinds of types out of the box.
Vector types in HLSL
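Some examples of vector declarations:

```hlsl
float2 uv = float2(0.5, 0.5);
float3 position = float3(1.0, 2.0, 3.0);
int3 index = int3(0, 1, 2);
float4 color = float4(1.0, 0.0, 0.0, 1.0);
```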
Vector math operations
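Arithmetic on vectors operates componentwise, and a scalar operand is applied to every component:

```hlsl
float3 a = float3(1.0, 2.0, 3.0);
float3 b = float3(4.0, 5.0, 6.0);

float3 sum = a + b;      // Componentwise: (5.0, 7.0, 9.0).
float3 product = a * b;  // Componentwise: (4.0, 10.0, 18.0).
float3 scaled = a * 2.0; // Scalar applies to each component: (2.0, 4.0, 6.0).
```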
Accessing vector components
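Individual components can be accessed with the xyzw names (or, equivalently, rgba for colors):

```hlsl
float4 v = float4(1.0, 2.0, 3.0, 4.0);

float x = v.x; // 1.0 – we could also write v.r.
float w = v.w; // 4.0 – we could also write v.a.
```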
Accessing multiple vector components
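One way to grab several components is to build a new vector one component at a time:

```hlsl
float4 v = float4(1.0, 2.0, 3.0, 4.0);

float2 xy = float2(v.x, v.y); // (1.0, 2.0).
```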
Using swizzling to access multiple vector components
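Swizzling is a far more compact way to do the same thing – we can pick components in any order, with repetition:

```hlsl
float4 v = float4(1.0, 2.0, 3.0, 4.0);

float2 xy = v.xy;     // (1.0, 2.0).
float3 zyx = v.zyx;   // Any order: (3.0, 2.0, 1.0).
float4 xxxx = v.xxxx; // Repetition is allowed: (1.0, 1.0, 1.0, 1.0).
```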
Combining parts of multiple vectors into new vectors
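Swizzles can also be mixed inside a constructor to build new vectors from pieces of existing ones:

```hlsl
float4 a = float4(1.0, 2.0, 3.0, 4.0);
float4 b = float4(5.0, 6.0, 7.0, 8.0);

float4 combined = float4(a.xy, b.zw); // (1.0, 2.0, 7.0, 8.0).
```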
Matrix Types
Constructing the 3 × 3 identity matrix in HLSL
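Matrix types follow the same naming pattern as vectors, with two dimensions. The constructor fills the matrix row by row:

```hlsl
float3x3 identity = float3x3
(
    1.0, 0.0, 0.0,
    0.0, 1.0, 0.0,
    0.0, 0.0, 1.0
);
```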
Accessing matrix elements in HLSL
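Individual elements can be accessed with the zero-based _mRC notation or with array indexing:

```hlsl
float3x3 m = float3x3(1, 2, 3, 4, 5, 6, 7, 8, 9);

float a = m._m00;  // Row 0, column 0: 1.
float b = m[2][1]; // Array syntax – row 2, column 1: 8.
```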
Swizzling matrix elements in HLSL
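The _mRC names can be chained to swizzle several matrix elements into a vector:

```hlsl
float3x3 m = float3x3(1, 2, 3, 4, 5, 6, 7, 8, 9);

float2 diag = m._m00_m11; // (1, 5).
```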
The * operator on matrices
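Beware: the * operator multiplies matrices componentwise. For true matrix multiplication, use the mul function:

```hlsl
float2x2 a = float2x2(1, 2, 3, 4);
float2x2 b = float2x2(5, 6, 7, 8);

float2x2 c = a * b;     // Componentwise multiplication, NOT matrix multiplication.
float2x2 d = mul(a, b); // Proper matrix multiplication.
```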
Included Variables
Unity includes several variables to aid your shader programming. While it is possible to send arbitrary data to a shader ourselves through C# scripting, Unity sends a lot of data to the shader automatically, such as time-based variables, transformation matrices, and camera properties. Some of these will be explored in detail in their respective chapters, so we will cover only a selection of the variables here.
Transformation Matrices
Matrices provided by Unity
| Matrix name | Description |
|---|---|
| UNITY_MATRIX_M / unity_ObjectToWorld | The model matrix that transforms from object space to world space. These two names are aliases for one another. |
| UNITY_MATRIX_I_M / unity_WorldToObject | The inverse model matrix that transforms from world space to object space. These two names are aliases for one another only in URP; in the built-in pipeline, only unity_WorldToObject exists. |
| UNITY_MATRIX_V | The view matrix that transforms from world space to view/camera space. |
| UNITY_MATRIX_P | The projection matrix that transforms from view space to clip space. |
| UNITY_MATRIX_MV | The model-view matrix that transforms from object space directly to view space. |
| UNITY_MATRIX_VP | The view-projection matrix that transforms from world space to clip space. This can be considered the “camera matrix” since both view and projection depend on the camera properties. |
| UNITY_MATRIX_MVP | The model-view-projection matrix that transforms from object space directly to clip space. This matrix is often used in the vertex shader. |
Time-Based Variables
Shaders can be animated over time without requiring us to write external time data to the shader. Unity already provides plenty of time variables for our shaders, covering the time since the level was loaded and the time since the last frame execution. Let’s see these variables in action.
The _Time variable
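_Time is a float4 containing the time in seconds since the level was loaded, t, at four different scales:

```hlsl
// _Time = (t/20, t, t*2, t*3).
float t = _Time.y; // Time in seconds.
```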
Creating a looping timer using _Time
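Combining _Time with the frac function gives a value that ramps from 0 to 1 and wraps around every second:

```hlsl
float loopingTime = frac(_Time.y); // Fractional part – loops from 0 to 1.
```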
Using the sine of _Time. The following two statements are equivalent
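Unity precomputes _SinTime = (sin(t/8), sin(t/4), sin(t/2), sin(t)), so these two statements produce the same value:

```hlsl
float s1 = sin(_Time.y);
float s2 = _SinTime.w; // Precomputed by Unity – identical to sin(_Time.y).
```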
Using delta time in shaders
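Delta time – the time since the last frame – is available through unity_DeltaTime:

```hlsl
// unity_DeltaTime = (dt, 1/dt, smoothDt, 1/smoothDt).
float dt = unity_DeltaTime.x; // Seconds since the last frame.
```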
Summary
Shaders must be attached to a material to be applied to a mesh.
HLSL, GLSL, and Cg are examples of shader languages. The standard shader language in Unity is HLSL, since Cg has been deprecated.
Unity’s proprietary language, ShaderLab, wraps around shader code and provides an interface between the shader and the rest of Unity.
Properties are shader variables that we can edit on a material.
A ShaderLab file can contain many SubShaders, and Unity picks the first one that is valid on the hardware.
Tags can be used to customize the rendering order and render pipeline for a specific SubShader or Pass.
Unity provides helpful macros and functions that are commonly used in shaders.
URP shaders must declare most variables inside a constant buffer.
HDRP uses Shader Graph instead of code for most user-generated shaders.
There are several core variable types in HLSL that represent scalars, vectors, and matrices of different dimensions.
Swizzling can be used as a shorthand to access vector components in any order or combination, with possible repetition.