Adding material and lighting

The most important properties of a mesh, other than the vertices themselves, are its material and lighting. By specifying the texture, the color, and the way the surface interacts with light, we can generate much more realistic scenes.

In the previous recipe, we already added two of the three input vectors needed to calculate lighting:

  • View direction (calculated from the camera's location)
  • Surface normal

The final input vector is light direction. For this sample, we are going to create a simple directional light, that is, a light that has a constant direction anywhere in the scene (like the sun). We will define this light with a color and direction.

We also introduce the following material properties:

  • Ambient light color: The ambient reflection is a constant ambient-lighting value. It is a simple, approximate representation of light that has bounced around the room and is lighting the back of the object, providing indirect light. This value is not based on any of the three input vectors and will be provided by our material properties.
  • Diffuse reflection color: Diffuse reflection is the reflection of light from a diffuse surface, where a ray is scattered from the surface in random directions. In real-time 3D rendering, we only approximate this lighting model; we will instead assume that light is reflected from the surface equally in all directions (Luna 2012, p. 280). This means that regardless of the viewer's angle, the amount of light reflected from a point on the surface is constant.
  • Material diffuse color: This is multiplied with the vertex color within the vertex shader to give the final pixel diffuse color. The intensity of the diffuse reflection represents the amount of direct light that the light source provides to the surface and is determined from the light direction and surface normal using Lambert's cosine law.
  • Specular reflection color and power (shininess of the surface): Specular reflection represents the amount of perfectly reflected light that bounces off the surface of an object. The material's specular color represents the color of this reflected light, and the specular power represents the exponent of the equation that determines how shiny the surface is (the higher the value, the shinier the surface, and therefore, the smaller the specular highlight). The specular amount is calculated using all three input vectors.
  • Emissive light color: The emissive lighting value is a constant that represents the emitted light from the surface. This value is not based on any of the input vectors and is not affected by the light color. The value of this constant is controlled via the material properties.

From the three input vectors and the previous material properties, we determine the four lighting output components: the ambient reflection, diffuse reflection, specular reflection, and emissive lighting.
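The way these components are finally combined in the pixel shaders later in this recipe can be sketched numerically. The following is an illustrative Python sketch of that combination (the actual implementation is in HLSL):

```python
def saturate(x):
    """Clamp a value to the 0..1 range, as in HLSL saturate()."""
    return max(0.0, min(1.0, x))

def combine(ambient, diffuse, specular, emissive, sample, light_color):
    # Per channel: clamp ambient + diffuse to avoid over-brightening the
    # texture sample, modulate by the light color, then add the unlit
    # emissive term: (saturate(a + d) * sample + specular) * light + emissive
    return [(saturate(a + d) * s + sp) * lc + e
            for a, d, sp, e, s, lc in zip(ambient, diffuse, specular,
                                          emissive, sample, light_color)]

# Ambient 0.2 plus diffuse 0.5 on a white sample under a white light -> 0.7
color = combine([0.2] * 3, [0.5] * 3, [0.0] * 3, [0.0] * 3,
                [1.0] * 3, [1.0] * 3)
```

Note how the emissive term is added after the light color is applied, so it is unaffected by the light, matching its description above.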

Getting ready

For this recipe, we need the vertices to include a normal vector and the supporting changes from the previous recipe.

The completed project can be found in the companion code as Ch02_02MaterialAndLighting.

How to do it…

The first thing we will do is make changes to Shaders\Common.hlsl to include a new per material constant buffer and to add a directional light to the per frame constant buffer, storing the light's color and direction.

  1. First, add a new structure for the directional light; it must be placed before the PerFrame structure.
    // A simple directional light (e.g. the sun)
    struct DirectionalLight
    {
        float4 Color;
        float3 Direction;
    };
  2. Now we will update the PerFrame structure to include the light.
    cbuffer PerFrame: register (b1)
    {
        DirectionalLight Light;
        float3 CameraPosition;
    };
  3. Finally, we add a new constant buffer for the material properties (note that the slot number used is 2).
    cbuffer PerMaterial : register (b2)
    {
        float4 MaterialAmbient;
        float4 MaterialDiffuse;
        float4 MaterialSpecular;
        float MaterialSpecularPower;
        bool HasTexture;
        float4 MaterialEmissive;
        float4x4 UVTransform;
    };

    Note

    A good practice would be to render objects sorted by their material, saving on pipeline changes and constant buffer updates.

  4. Next, we will combine the vertex color and material diffuse, passing the result to the pixel shader, by modifying the vertex shader. Find the appropriate line in Shaders\VS.hlsl and add the highlighted code:
    result.Diffuse = vertex.Color * MaterialDiffuse;

    Note

    If we do not set a vertex color or material diffuse, the result will be black because the colors are multiplied together; therefore, it is important that both are given a value. If the diffuse color should have no impact on the final color (for example, because the texture sample provides all the necessary surface color), set both the vertex and material diffuse colors to white (1.0f, 1.0f, 1.0f, 1.0f). Alternatively, if the vertex color is to be ignored, set it to white and let the material diffuse provide the color (or use a grayscale value to control brightness), and vice versa.
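    The behavior described in the preceding note can be verified with a quick sketch of the component-wise multiply (shown here in Python purely for illustration):

```python
def modulate(vertex_color, material_diffuse):
    # Component-wise multiply, equivalent to:
    #   result.Diffuse = vertex.Color * MaterialDiffuse;
    return [v * m for v, m in zip(vertex_color, material_diffuse)]

white = [1.0, 1.0, 1.0, 1.0]
red = [1.0, 0.0, 0.0, 1.0]
unset = [0.0, 0.0, 0.0, 0.0]

pass_through = modulate(white, red)   # white vertex color keeps the material color
forced_black = modulate(unset, red)   # an unset (zero) color forces black output
```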

  5. Also within the vertex shader, we apply the material's UVTransform matrix to the UV coordinates. The following code shows how this is done:
    // Apply material UV transformation
    result.TextureUV = mul(float4(vertex.TextureUV.x, 
        vertex.TextureUV.y, 0, 1), (float4x2)UVTransform).xy;
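    The mul call above treats the UV as a four-component row vector and uses only the first two columns of the material's 4x4 transform (the (float4x2) cast). A Python sketch of the same arithmetic, using a hypothetical transform_uv helper:

```python
def transform_uv(uv, m):
    # Equivalent of mul(float4(u, v, 0, 1), (float4x2)UVTransform).xy,
    # where m is a row-major 4x4 matrix: rows 0 and 1 scale/rotate the
    # UV and row 3 contributes the translation (because w = 1).
    u, v = uv
    return (u * m[0][0] + v * m[1][0] + m[3][0],
            u * m[0][1] + v * m[1][1] + m[3][1])

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
# v' = 1 - v: flips the texture vertically
flip_v = [[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, 1, 0], [0, 1, 0, 1]]
```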
  6. Within ConstantBuffers.cs, we need to update the PerFrame structure to include the directional light and create a new structure, PerMaterial, keeping in mind the HLSL 16-byte alignment.
    [StructLayout(LayoutKind.Sequential, Pack = 1)]
    public struct DirectionalLight
    {
        public SharpDX.Color4 Color;
        public SharpDX.Vector3 Direction;
        float _padding0;
    }
    [StructLayout(LayoutKind.Sequential, Pack = 1)]
    public struct PerFrame
    {
        public DirectionalLight Light;
        ...
    }
    [StructLayout(LayoutKind.Sequential, Pack = 1)]
    public struct PerMaterial
    {
        public Color4 Ambient;
        public Color4 Diffuse;
        public Color4 Specular;
        public float SpecularPower;
        public uint HasTexture; // Has texture 0 false, 1 true
        Vector2 _padding0;
        public Color4 Emissive;
        public Matrix UVTransform; // Support UV transforms
    }
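    The _padding0 field exists purely to satisfy the HLSL 16-byte register packing. The resulting offsets can be checked with a quick sketch (Python, sizes in bytes):

```python
# Field sizes (bytes) of PerMaterial with Pack = 1; the Vector2 padding
# after HasTexture keeps Emissive on a 16-byte boundary, matching the
# HLSL constant buffer layout.
fields = [("Ambient", 16), ("Diffuse", 16), ("Specular", 16),
          ("SpecularPower", 4), ("HasTexture", 4), ("_padding0", 8),
          ("Emissive", 16), ("UVTransform", 64)]

offsets, total = {}, 0
for name, size in fields:
    offsets[name] = total
    total += size
```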
  7. Within our D3DApp class, create a new private member field, perMaterialBuffer, and initialize the constant buffer in CreateDeviceDependentResources. As the material constant buffer will be used by both the vertex and pixel shaders, assign the buffer to each of them using SetConstantBuffer with the slot 2.

    We're now ready to update the render loop.

  8. Update the per material constant buffer with the following lines of code.
    var perMaterial = new ConstantBuffers.PerMaterial();
    perMaterial.Ambient = new Color4(0.2f);
    perMaterial.Diffuse = Color.White;
    perMaterial.Emissive = new Color4(0);
    perMaterial.Specular = Color.White;
    perMaterial.SpecularPower = 20f;
    perMaterial.HasTexture = 0;
    perMaterial.UVTransform = Matrix.Identity;
    context.UpdateSubresource(ref perMaterial, perMaterialBuffer);
  9. Now update the perFrame variable with the following lines of code.
    perFrame.Light.Color = Color.White;
    var lightDir = Vector3.Transform(new Vector3(1f, -1f, -1f), 
        worldMatrix);
    perFrame.Light.Direction = new Vector3(lightDir.X, 
        lightDir.Y, lightDir.Z);
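    Note that a direction, unlike a position, is only meaningfully affected by the rotation part of the world matrix, and it should end up normalized (the shaders also normalize it per pixel). A minimal Python sketch of rotating a direction by the upper-left 3x3 of a world matrix (illustrative helper names, not SharpDX code):

```python
import math

def rotate_direction(d, m3):
    # Row-vector convention: d' = d * M. Translation is deliberately
    # excluded, as it does not apply to directions.
    out = [sum(d[i] * m3[i][c] for i in range(3)) for c in range(3)]
    length = math.sqrt(sum(c * c for c in out))
    return tuple(c / length for c in out)

# 90-degree rotation about the Y axis (row-vector convention)
rot_y_90 = [[0, 0, -1],
            [0, 1, 0],
            [1, 0, 0]]
```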
  10. Compile and run (F5) the code. The output should still be the same as the previous recipe; however, it is worth double checking that the shaders are compiling correctly.

Tip

The shader compilation will throw an exception with line numbers and the description of any syntax errors. Depending on the error, it may also provide correct examples of usage.

We will now implement three lighting shaders: diffuse (using Lambert's cosine law), Phong, and Blinn-Phong.

Implementing diffuse shaders

Follow the given steps for implementing diffuse shaders:

  1. In Common.hlsl, add the following function to determine the diffuse reflection:
    float3 Lambert(float4 pixelDiffuse, float3 normal, float3 toLight) 
    {
    // Calculate diffuse color (Lambert's Cosine Law - dot 
    // product of light and normal). Saturate to clamp the 
    // value within 0 to 1.
        float3 diffuseAmount = saturate(dot(normal, toLight));
        return pixelDiffuse.rgb * diffuseAmount;
    }
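    The same math, sketched in Python to make Lambert's cosine law concrete (illustrative only; the HLSL function above is what ships):

```python
def saturate(x):
    return max(0.0, min(1.0, x))

def lambert(diffuse_rgb, normal, to_light):
    # Lambert's cosine law: reflected light is proportional to the cosine
    # of the angle between the surface normal and the to-light direction.
    # saturate() clamps back-facing (negative) contributions to zero.
    amount = saturate(sum(n * l for n, l in zip(normal, to_light)))
    return [c * amount for c in diffuse_rgb]
```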
  2. Create a new shader file Shaders\DiffusePS.hlsl (again, remember the encoding), add the include directive #include "Common.hlsl", and then declare a texture and texture sampler:
    Texture2D Texture0 : register(t0);
    SamplerState Sampler : register(s0);
  3. Within a PSMain function, include this code:
    // After interpolation the values are not necessarily 
    // normalized
    float3 normal = normalize(pixel.WorldNormal);
    float3 toEye = normalize(CameraPosition -
        pixel.WorldPosition);
    float3 toLight = normalize(-Light.Direction);
        
    // Texture sample (use white if no texture)
    float4 sample = (float4)1.0f;
    if (HasTexture)
        sample = Texture0.Sample(Sampler, pixel.TextureUV);
    
    float3 ambient = MaterialAmbient.rgb;
    float3 emissive = MaterialEmissive.rgb;
    float3 diffuse = Lambert(pixel.Diffuse, normal, toLight);
    
    // Calculate final color component
    float3 color = (saturate(ambient+diffuse) * sample.rgb) * Light.Color.rgb + emissive;
    // We saturate ambient+diffuse to ensure there is no over-
    // brightness on the texture sample if the sum is greater 
    // than 1 (we would not do this for HDR rendering)
        
    // Calculate final alpha value
    float alpha = pixel.Diffuse.a * sample.a;
    return float4(color, alpha);
  4. Within D3DApp.CreateDeviceDependentResources, create the pixel shader as per the simple and depth pixel shaders.

Implementing Phong shaders

Follow the given steps for implementing Phong shaders:

  1. In Common.hlsl, we will now add a function for determining specular reflection using the Phong reflection model.
    float3 SpecularPhong(float3 normal, float3 toLight, float3 toEye)
    {
        // R = reflect(i,n) => R = i - 2 * n * dot(i,n)
        float3 reflection = reflect(-toLight, normal);
    
        // Calculate the specular amount (smaller specular power =
        // larger specular highlight). Cannot allow a power of 0,
        // otherwise the model will appear black and white
        float specularAmount = pow(saturate(dot(reflection,toEye)), max(MaterialSpecularPower,0.00001f));
        return MaterialSpecular.rgb * specularAmount;
    }
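    A Python sketch of the same Phong computation, including the behavior of HLSL's reflect intrinsic (illustrative helper names):

```python
def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    # HLSL reflect intrinsic: r = i - 2 * n * dot(i, n)
    d = dot3(i, n)
    return [i[k] - 2.0 * n[k] * d for k in range(3)]

def specular_phong(normal, to_light, to_eye, specular_power):
    # Reflect the from-light vector about the normal, then raise the
    # clamped reflection/eye agreement to the specular power.
    r = reflect([-c for c in to_light], normal)
    amount = max(0.0, min(1.0, dot3(r, to_eye)))
    return amount ** max(specular_power, 1e-5)
```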
  2. Create a new shader file Shaders\PhongPS.hlsl. After the include directive, create the PSMain function with the same contents as the diffuse shader, except for the following two changes for calculating the color component:
    float3 specular = SpecularPhong(normal, toLight, toEye);
    float3 color = (saturate(ambient+diffuse) * sample.rgb + specular) * Light.Color.rgb + emissive;
  3. Within D3DApp.CreateDeviceDependentResources, create the pixel shader as per the simple and depth pixel shaders.

Implementing Blinn-Phong shaders

Follow the given steps for implementing Blinn-Phong shaders:

  1. This time in Common.hlsl, we will create a specular function that uses the Blinn-Phong shading model. This is similar to the Phong reflection model; however, instead of the costly per-pixel reflection calculation, we use a halfway vector.
    float3 SpecularBlinnPhong(float3 normal, float3 toLight, float3 toEye) {
      // Calculate the half vector
      float3 halfway = normalize(toLight + toEye);
      // Saturate is used to prevent backface light reflection
      // Calculate specular (smaller power = larger highlight)
      float specularAmount = pow(saturate(dot(normal, 
        halfway)), max(MaterialSpecularPower,0.00001f));
      return MaterialSpecular.rgb * specularAmount;
    }
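    The equivalent Blinn-Phong sketch in Python (again, illustrative only), showing the halfway vector standing in for the reflection:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def specular_blinn_phong(normal, to_light, to_eye, specular_power):
    # The halfway vector between the to-light and to-eye directions
    # replaces Phong's reflection vector; its agreement with the normal
    # drives the highlight.
    halfway = normalize([l + e for l, e in zip(to_light, to_eye)])
    amount = max(0.0, min(1.0, sum(n * h for n, h in zip(normal, halfway))))
    return amount ** max(specular_power, 1e-5)
```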
  2. Create the shader file, Shaders\BlinnPhongPS.hlsl. After the include directives, Texture2D and SamplerState, add the PSMain function as per the Phong shader, except in this case, call SpecularBlinnPhong instead of SpecularPhong. Again, create the pixel shader object in your D3DApp class.
  3. The final output of each material/lighting shader is shown in sequence in the following screenshot. The downloadable sample code binds the number keys 1, 2, 3, and 4 to each of these in order.

    Material and lighting output comparison – None, diffuse, Phong, and Blinn-Phong

How it works…

We first added the light's color and direction to our per frame constant buffer. This groups the camera location and light together as in most situations they will not change between the start and end of a frame. The new per material buffer on the other hand could change many times per frame but not necessarily for each object.

We have set the light's direction in world space as we are performing all our light calculations in this space.

We have added a new structure for storing material properties. These properties are based on the information that we will be loading from the Visual Studio graphics content pipeline CMO file.

UV mapping

By adding the UV transformation matrix to the per material constant buffer, we have completed support for UV mapping for Visual Studio CMO meshes. The UV transform is used to rotate or flip the vertex UV coordinates, which is necessary depending on how the mesh has been converted by the Visual Studio graphics content pipeline. The UV transform can also be useful when changing a mesh's vertex winding order.

UV mapping is the process of unwrapping a mesh and assigning 2D texture coordinates to vertices in such a way that when rendered in 3D space, the texture wraps around the object. This process is performed within the 3D modeling software and looks something like the following screenshot. From a rendering point of view, we are interested in the UV coordinates assigned and the UV transform applied to the mesh.


UV mapping within Blender (www.blender.org)

When performing the UV unwrapping process, it is important to consider the impact that mip-mapping will have on the final render result, as it can lead to color bleeding. Mip-mapping is the process of sampling from lower-resolution versions of a texture to control the level of detail for objects that are further away. For example, if a UV coordinate sits close to the border between two colors, at a lower mip level the two colors may be blended together.

Tip

Mipmaps can be created for a DDS texture within the Visual Studio graphics editor or at runtime when loading a texture by configuring the texture description with the ResourceOptionFlags.GenerateMipMaps flag.

var desc = new Texture2DDescription();
...
desc.OptionFlags = ResourceOptionFlags.GenerateMipMaps;

Lighting

In our common shader file, we have implemented three lighting formulas. The Lambert function calculates the diffuse reflection using Lambert's cosine law, while the SpecularPhong and SpecularBlinnPhong functions calculate the specular reflection using the Phong and Blinn-Phong models, respectively. Our pixel shaders then combine the ambient, emissive, and diffuse components and, in the case of the two specular models, the specular reflection.

Blinn-Phong is a modification of the Phong reflection model that produces a slightly larger specular highlight when using the same specular power as Phong. This can easily be corrected by increasing the specular power when using the Blinn-Phong shader. The Blinn-Phong shader generally produces more desirable results than the Phong method (see http://people.csail.mit.edu/wojciech/BRDFValidation/index.html).


Unit vectors used in the calculation of Phong and Blinn-Phong specular reflection

Both specular formulas use the normal, light, and view unit vectors shown in the previous figure (note that the Light and View vectors point towards the light and the viewer). The Phong model requires the calculation of a reflection vector, using the normal and the from-light vector (not the to-light vector). The dot product of the reflection vector and the view (or eye) vector is then used, along with the specular power material constant, to determine the specular amount.

float3 reflection = -Light - 2 * Normal * dot(-Light, Normal);

Blinn-Phong uses the vector halfway between the view and the light instead; this is then used in the dot product between the normal and halfway vectors.

float3 halfway = normalize(Light + View);

Although Blinn-Phong is less efficient than pure Phong (as the normalization contains a square root calculation), for directional lights the halfway vector can be calculated outside of the pixel shader because it does not rely on the normal vector. In most other cases, where lights are not treated as being at infinity (as directional lights are), the Phong model will be faster.

float3 normalizedV = v / sqrt(dot(v, v));
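The claim that Blinn-Phong produces a larger highlight at the same specular power can be checked numerically. The following Python sketch compares both models for a viewer slightly off the mirror direction (illustrative values):

```python
import math

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot3(v, v))  # the square root that makes Blinn-Phong costlier
    return [c / length for c in v]

def phong(n, to_light, to_eye, power):
    i = [-c for c in to_light]
    r = [i[k] - 2.0 * n[k] * dot3(i, n) for k in range(3)]
    return max(0.0, min(1.0, dot3(r, to_eye))) ** power

def blinn_phong(n, to_light, to_eye, power):
    h = normalize([l + e for l, e in zip(to_light, to_eye)])
    return max(0.0, min(1.0, dot3(n, h))) ** power

n = [0.0, 1.0, 0.0]
to_light = normalize([-1.0, 1.0, 0.0])
to_eye = [0.0, 1.0, 0.0]          # viewer off the mirror direction
p = phong(n, to_light, to_eye, 20.0)
bp = blinn_phong(n, to_light, to_eye, 20.0)
```

At the same power, the Blinn-Phong term falls off more slowly away from the mirror direction, which is exactly the larger highlight described above.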

There's more…

Currently, we are saturating the result of the lighting operations to keep the value within the 0.0 to 1.0 range. If we were instead supporting high-dynamic-range (HDR) rendering, we could choose a render target format that supports more than 8 bits per component (for example, Format.R16G16B16A16_Float) and store a larger range of values. To display the result, this technique requires a further processing step called tone mapping, which maps colors from HDR down to a lower dynamic range matching the capabilities of the display device.
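As a concrete illustration, one of the simplest tone-mapping curves is the Reinhard operator; the following Python sketch shows it (Reinhard is just one common choice, used here only as an example of the mapping):

```python
def reinhard(c):
    # Map an HDR channel value in [0, inf) into [0, 1); values well above
    # 1.0 compress smoothly instead of clipping at white.
    return c / (1.0 + c)
```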

Included in the companion source code is an additional project (Ch03_02WithCubeMapping.csproj) that demonstrates how to sample a texture cube (or cube map) using float4 sample = CubeMap.Sample(SamplerClamp, normal);.

The support for sampling the texture cube was added by the following methods:

  • Creating a cube map texture (within a DDS) by importing each of the six 512 x 512 textures into the DirectX Texture Tool (DxTex.exe) found in the June 2010 DirectX SDK
  • Adding the texture and sampler state in the cube and sphere renderers
  • Calling the TextureCube's Sample method in the pixel shader, passing the normal vector as the UVW coordinate

    Cube mapping with diffuse and specular highlights

Note

Texture cube UVW coordinates are unit vectors pointing from the center of the object to the surface where it is mapped to one of the six textures of the cube map by the sampler within the pixel shader.

See also

  • Chapter 6, Adding Surface Detail with Normal and Displacement Mapping, demonstrates how to add support for normal mapping
  • For more information about implementing other types of lights, including volumetric lights, see Chapter 10, Implementing Deferred Rendering