What are shaders?

GPUs first came into use thanks to none other than the video game industry. Arcade cabinets in the 1970s used dedicated graphics chips, separate from the main CPU, to handle the specialized visual needs of games compared with other computing applications of the time. The demand for 3D graphics in games in the mid-1990s eventually led to the modern GPU architecture we have now. Shaders themselves were first introduced in 1988 by Pixar, back when the company was owned by Apple's cofounder Steve Jobs. Shaders are small programs we write to run directly on the GPU to process vertex and pixel data. Originally, APIs such as OpenGL ES 1.0 didn't support shader processing but instead were what's known as fixed-function APIs: the programmer could only issue a fixed set of rendering commands to the GPU. As GPUs evolved and took more work away from the CPU, the use of shaders increased. Although a more advanced way to traverse the graphics pipeline than the fixed-function methodology, shaders allow for much deeper customization of what the GPU draws to the screen, and game developers and 3D artists continue to push visual effects in games with them.

From OpenGL 2.0 onwards, shaders were written in the API's C-like language, GLSL. In the Apple Metal API, we build shaders with the Metal Shading Language, which is based on C++11. Shader code lives in files of type .metal, and we can drive the rendering pipeline from either Objective-C or Swift in our view controllers.
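
As a taste of the host side, here is a minimal Swift sketch, assuming a default Metal device, that loads the compiled .metal shaders and wires them into a render pipeline. The function names basic_vertex and basic_fragment match the shader sample later in this section; the pixel format is an assumption for a typical view setup.

import Metal

// A minimal sketch: load compiled .metal shaders and build a render pipeline.
guard let device = MTLCreateSystemDefaultDevice(),
      let library = device.makeDefaultLibrary() else {
    fatalError("Metal is not supported on this device")
}

// Look up the shader functions by the names used in the .metal file.
let vertexFunction = library.makeFunction(name: "basic_vertex")
let fragmentFunction = library.makeFunction(name: "basic_fragment")

// Describe the pipeline: which shaders run and the output pixel format.
let descriptor = MTLRenderPipelineDescriptor()
descriptor.vertexFunction = vertexFunction
descriptor.fragmentFunction = fragmentFunction
descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm // assumed drawable format

// Force-try for brevity in this sketch; real code would handle the error.
let pipelineState = try! device.makeRenderPipelineState(descriptor: descriptor)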

Types of shaders

Shaders come in a number of types, and new ones continue to appear as 3D games and animation progress. The most commonly used are Vertex shaders and Fragment shaders. Vertex shaders transform 3D coordinates into the 2D coordinates the screen displays; in short, they handle the positioning data of our graphics. Fragment shaders, also known as Pixel shaders, compute the colors and other visual attributes of each pixel on the screen. These other attributes can include bump mapping, shadows, and specular highlights as well. We emphasized the word attributes because that's usually the name given to the properties, or inputs, of our shader programs.

Here is a code sample of a simple Vertex and Fragment shader written in the Metal Shading Language.

//Shaders.metal
//(1)
#include <metal_stdlib>
using namespace metal;
//(2)
vertex float4 basic_vertex(                           
//(3)
  const device packed_float3* vertex_array [[ buffer(0) ]], 
//(4)
  unsigned int vertexID [[ vertex_id ]]) {       
//(5)          
  return float4(vertex_array[vertexID], 1.0);              
}
//(6)
fragment half4 basic_fragment() {
  return half4(1.0);
}

The code here is a bit different from what we've seen throughout the course of the book. Let's go over it line by line.

  1. The Metal Shading Language is a C++11-like language, so we see that the Metal Standard Library is imported into the shader file with the line #include <metal_stdlib> in addition to using namespace metal;.
  2. The next line creates our Vertex shader using the keyword vertex. This shader returns a float4, a vector of four floats. Why four floats when 3D space only deals with x, y, and z coordinates? In short, 3D matrix math involves a fourth component, w, to accurately handle the calculations of 3D space: if w = 0, the x, y, and z coordinates describe a vector; if w = 1, they describe a point. The purpose of this shader is to draw simple points to the screen, so w will be 1.0.
  3. Here, we create a pointer to an array of float3 type (holders for our x, y, and z coordinates) and set it to the very first buffer with the [[ buffer(0) ]] declaration. The [[ ]] syntax is used to declare inputs/attributes for our shaders.
  4. The unsigned integer vertexID is the name we give to the vertex_id attribute, which indexes into this particular array of vertices.
  5. This is where the float4 type is returned, in this case the final position of the vertex. The return value is built from two parts: the entry in the vertex array indexed by the vertex_id attribute (our x, y, and z coordinates), and the w value of 1.0, to represent that these are points in space.
  6. This line is where we create the Fragment shader, using the fragment keyword. This shader returns the data type half4, a vector of four 16-bit floats, which in this case is ultimately used to create the colored pixels. Each of the four components stores a 16-bit value for one of the R, G, B, and alpha channels. This shader is going to simply show pure white pixel shading with no transparency, so we simply write return half4(1.0);. This sets all four components to 1.0, which is equivalent to rgba(1, 1, 1, 1).

When we create a buffer object, which can be as simple as a struct or array of floating-point positions, we pass that data through these shaders, and out pops a white triangle, or a set of triangle shapes, on the screen.
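
As a rough illustration of that hand-off, here is a minimal Swift sketch that packs three vertex positions into an MTLBuffer laid out to match the packed_float3* input of the vertex shader above. The triangle's coordinates are made-up example values.

import Metal

// A minimal sketch: three (x, y, z) positions for one triangle,
// laid out to match the shader's packed_float3* input.
let vertexData: [Float] = [
     0.0,  1.0, 0.0,   // top
    -1.0, -1.0, 0.0,   // bottom left
     1.0, -1.0, 0.0    // bottom right
]

let device = MTLCreateSystemDefaultDevice()!
let dataSize = vertexData.count * MemoryLayout<Float>.size

// Bound at index 0 at draw time, matching [[ buffer(0) ]] in the shader.
let vertexBuffer = device.makeBuffer(bytes: vertexData,
                                     length: dataSize,
                                     options: [])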

Looking back at the graphics pipeline diagram, we see that after the Vertex shader runs, the GPU performs what's known as Primitive Assembly. This is essentially where the points and vectors defined in the vertex shader are mapped to coordinates in screen space. The Rasterizer step, in simple terms, then determines from the vertex data which pixels on the screen can and can't be colored, and the Fragment shader is run to color them. The GPU then uses the fragment shader's output for the blending of that pixel data. Finally, the output is committed to the frame buffer, where the player sees it. This all happens in a single draw call in the render cycle. Having all of your game's lights, pixels, effects, physics, and other graphics cycle through this in roughly 0.0167 seconds, one frame at 60 FPS, is the name of the game.
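
To make the draw call itself concrete, here is a minimal Swift sketch of one pass through the render cycle. It assumes an MTKView supplies the render pass descriptor and drawable, and it reuses the pipelineState and vertexBuffer names from the earlier sketches.

import MetalKit

// A minimal sketch of a single draw call, typically run once per frame
// (for example, inside MTKViewDelegate's draw(in:) callback).
func drawFrame(in view: MTKView,
               commandQueue: MTLCommandQueue,
               pipelineState: MTLRenderPipelineState,
               vertexBuffer: MTLBuffer) {
    guard let descriptor = view.currentRenderPassDescriptor,
          let drawable = view.currentDrawable,
          let commandBuffer = commandQueue.makeCommandBuffer(),
          let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
    else { return }

    encoder.setRenderPipelineState(pipelineState)
    // Attach the vertex data at index 0, matching [[ buffer(0) ]].
    encoder.setVertexBuffer(vertexBuffer, offset: 0, index: 0)
    // One triangle: three vertices, assembled and rasterized by the GPU.
    encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    encoder.endEncoding()

    // Show the result and hand the encoded work to the GPU.
    commandBuffer.present(drawable)
    commandBuffer.commit()
}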

We'll go over some more Metal code later, but understand for now that shaders are like little instruction factories for the data we send to them from our Swift/Objective-C code. Other shader types that have arisen over the years are Geometry shaders and Tessellation shaders.

Note

Both the Vertex and Fragment shaders are present in this single .metal file, but typically shaders are written in separate files. Xcode and Metal combine all of the .metal files in your project, so it doesn't matter whether the shaders live in one file or several. OpenGL's GLSL, by contrast, for the most part forces the separation of shader types into different files.

For years, OpenGL worked well across many different GPUs, but as we'll see, Apple Metal allows us to perform draw calls up to 10 times faster than OpenGL ES.
