Prerequisites for texture drawing

Implementing textures requires only a few steps. Let's take a quick overview of these first; then we will dive into the details:

  1. Texture coordinates: Textures are mapped onto geometry surfaces using texture coordinates. Each vertex has a corresponding texture coordinate attached to it. In our implementation, the vertices and texture coordinates are specified in an interleaved form.
  2. The shader stage: The vertex and fragment shaders are modified to bind the texture resources. The shader stage allows the fragment shader to access the texture resource and paint the fragments. Textures are exposed to the shader stage in the form of a sampler.
  3. Loading the image files: Parse the image files and load the raw image data into a local data structure. This data is used to produce the Vulkan image resources that are shared with the shader stage.
  4. Local image data structure: The TextureData local data structure stores all the image-specific attributes.

Specifying the texture coordinates

The geometry coordinates (x, y, z, w) are interleaved with texture coordinates (u, v) in the VertexWithUV structure defined in MeshData.h:

struct VertexWithUV 
{   
    float x, y, z, w;   // Vertex Position 
    float u, v;         // Texture format U,V 
}; 

In the present sample, we will render a cube with textured faces. The following code shows one of the cube faces; each vertex consists of four position components followed by two texture coordinates. Refer to MeshData.h for the complete code:

static const VertexWithUV geometryData[] = { 
   { -1.0f,-1.0f,-1.0f, 1.0f, 0.0f, 1.0f },  // -X side 
   { -1.0f,-1.0f, 1.0f, 1.0f, 1.0f, 1.0f }, 
   { -1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 0.0f }, 
   { -1.0f, 1.0f, 1.0f, 1.0f, 1.0f, 0.0f }, 
   { -1.0f, 1.0f,-1.0f, 1.0f, 0.0f, 0.0f }, 
   { -1.0f,-1.0f,-1.0f, 1.0f, 0.0f, 1.0f }, 
. . . . 
// Similarly, specify the +X, -Y, +Y, -Z, and +Z faces 
}; 
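Since the position and UV data are interleaved in a single vertex buffer, the graphics pipeline must also be told the vertex stride and the format and offset of each attribute. The following is a minimal sketch of the corresponding vertex input description; the variable names (viBinding, viAttribs) are illustrative and not taken from the sample's source code:

// Describe the interleaved VertexWithUV layout to the pipeline
VkVertexInputBindingDescription viBinding = {};
viBinding.binding   = 0;
viBinding.stride    = sizeof(VertexWithUV);   // 4 position floats + 2 UV floats
viBinding.inputRate = VK_VERTEX_INPUT_RATE_VERTEX;

VkVertexInputAttributeDescription viAttribs[2] = {};

// Attribute at location 0: vec4 position (x, y, z, w)
viAttribs[0].binding  = 0;
viAttribs[0].location = 0;
viAttribs[0].format   = VK_FORMAT_R32G32B32A32_SFLOAT;
viAttribs[0].offset   = 0;

// Attribute at location 1: vec2 texture coordinate (u, v)
viAttribs[1].binding  = 0;
viAttribs[1].location = 1;
viAttribs[1].format   = VK_FORMAT_R32G32_SFLOAT;
viAttribs[1].offset   = 4 * sizeof(float);

With this description, location 0 in the vertex shader receives the position and location 1 receives the texture coordinate.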

Updating the shader program

In addition to the vertex coordinates, our vertex shader now also takes the texture coordinates into consideration. The input texture coordinates are received at layout location 1 in the inUV attribute and written to the outUV output, which passes them on to the fragment shader stage. The following code shows the modified vertex shader:

// Vertex Shader
#version 450
layout (std140, binding = 0) uniform bufferVals {
    mat4 mvp;
} myBufferVals;

layout (location = 0) in vec4 pos;
layout (location = 1) in vec2 inUV;

layout (location = 0) out vec2 outUV;

void main() {
   outUV          = inUV;
   gl_Position    = myBufferVals.mvp * pos;
   gl_Position.z = (gl_Position.z + gl_Position.w) / 2.0;
}  

The following code implements the fragment shader, where the combined image sampler is received at layout binding index 1. The sampler is used with the interpolated texture coordinates to fetch the fragment colors:

// Fragment Shader
#version 450
layout(binding = 1) uniform sampler2D tex;

layout (location = 0) in vec2 uv;
layout (location = 0) out vec4 outColor;

void main() {
   outColor = texture(tex, uv);
}
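For the shader bindings above to work, the descriptor set layout must expose binding 0 as a uniform buffer (for the MVP matrix) and binding 1 as a combined image sampler. The following minimal sketch shows how such a layout could be created; the variable names are illustrative, and device is assumed to be an already created VkDevice:

VkDescriptorSetLayoutBinding layoutBindings[2] = {};

// Binding 0: uniform buffer (bufferVals) consumed by the vertex shader
layoutBindings[0].binding         = 0;
layoutBindings[0].descriptorType  = VK_DESCRIPTOR_TYPE_UNIFORM_BUFFER;
layoutBindings[0].descriptorCount = 1;
layoutBindings[0].stageFlags      = VK_SHADER_STAGE_VERTEX_BIT;

// Binding 1: combined image sampler (tex) consumed by the fragment shader
layoutBindings[1].binding         = 1;
layoutBindings[1].descriptorType  = VK_DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER;
layoutBindings[1].descriptorCount = 1;
layoutBindings[1].stageFlags      = VK_SHADER_STAGE_FRAGMENT_BIT;

VkDescriptorSetLayoutCreateInfo layoutInfo = {};
layoutInfo.sType        = VK_STRUCTURE_TYPE_DESCRIPTOR_SET_LAYOUT_CREATE_INFO;
layoutInfo.bindingCount = 2;
layoutInfo.pBindings    = layoutBindings;

VkDescriptorSetLayout descSetLayout;
vkCreateDescriptorSetLayout(device, &layoutInfo, NULL, &descSetLayout);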

Loading the image files

The image files are loaded in our sample application using the GLI library. OpenGL Image (GLI) is a header-only C++ image library that supports loading KTX and DDS image files in graphics applications. It provides various features, such as texture loading and creation, texture compression, access to texture texels, texture sampling and conversion, mipmap handling, and more.

You can download this library from http://gli.g-truc.net/0.8.1/index.html. In order to use this library, perform the following changes:

  • CMakeLists.txt: Add GLI support by adding the following lines to the project's CMakeLists.txt file:
      # GLI SETUP 
      set (EXTDIR "${CMAKE_SOURCE_DIR}/../../external/gli") 
      set (GLIINCLUDES "${EXTDIR}") 
      get_filename_component(GLIINC_PREFIX "${GLIINCLUDES}" ABSOLUTE) 
      if(NOT EXISTS ${GLIINC_PREFIX}) 
          message(FATAL_ERROR "Necessary gli headers do not exist: " ${GLIINC_PREFIX}) 
      endif() 
      include_directories( ${GLIINC_PREFIX} ) 
  • Header files: Include the GLI header files in the Headers.h file:
      /*********** GLI HEADER FILES ***********/ 
      #include <gli/gli.hpp> 

Using the GLI library

The following code shows the minimal usage of the GLI library in our application. It demonstrates image loading, querying the image dimensions and mipmap levels, and retrieving the raw image data:

   // Load the image  
   const char* filename = "../VulkanEssentials.ktx"; 
   gli::texture2D image2D(gli::load(filename)); 
   assert(!image2D.empty()); 
 
   // Get the image dimensions of the ith sub-resource (mipmap level) 
   uint32_t i              = 0;  // sub-resource (mipmap level) index 
   uint32_t textureWidth   = image2D[i].dimensions().x; 
   uint32_t textureHeight  = image2D[i].dimensions().y; 
 
   // Get number of mip-map levels 
   uint32_t mipMapLevels   = image2D.levels(); 
 
   // Retrieve the raw image data 
   void* rawData           = image2D.data(); 
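As a rough illustration of how this data is typically consumed, the following sketch prepares one VkBufferImageCopy region per mipmap level, reusing the same GLI accessors as the snippet above. It assumes that <vector> and the Vulkan headers are included and that the image data is packed contiguously, level after level, into a single staging buffer; the variable names are illustrative:

   std::vector<VkBufferImageCopy> copyRegions; 
   uint32_t bufferOffset = 0; 
   for (uint32_t level = 0; level < image2D.levels(); ++level) { 
       VkBufferImageCopy region = {}; 
       region.bufferOffset                    = bufferOffset; 
       region.imageSubresource.aspectMask     = VK_IMAGE_ASPECT_COLOR_BIT; 
       region.imageSubresource.mipLevel       = level; 
       region.imageSubresource.baseArrayLayer = 0; 
       region.imageSubresource.layerCount     = 1; 
       region.imageExtent.width               = uint32_t(image2D[level].dimensions().x); 
       region.imageExtent.height              = uint32_t(image2D[level].dimensions().y); 
       region.imageExtent.depth               = 1; 
       copyRegions.push_back(region); 
 
       // The next level starts right after this level's data in the staging buffer 
       bufferOffset += uint32_t(image2D[level].size()); 
   } 

These regions can later be passed to vkCmdCopyBufferToImage() to upload all mipmap levels from the staging buffer into the Vulkan image resource.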

Local image data structure

The wrapper.h file contains a user-defined structure, TextureData, which holds the image attributes and various pieces of image-specific information used by the application. The following is its syntax:

struct TextureData{ 
   VkSampler               sampler; 
   VkImage                 image; 
   VkImageLayout           imageLayout; 
   VkMemoryAllocateInfo    memoryAlloc; 
   VkDeviceMemory          mem; 
   VkImageView             view; 
   uint32_t                mipMapLevels; 
   uint32_t                layerCount; 
   uint32_t                textureWidth, textureHeight; 
   VkDescriptorImageInfo   descsImgInfo; 
}; 

The following list describes the various fields of the user-defined structure, TextureData:

  • sampler: This is the VkSampler object associated with the image object.
  • image: This is the VkImage object.
  • imageLayout: This specifies the implementation-dependent layout of the image resource object.
  • memoryAlloc: This stores the memory allocation information (VkMemoryAllocateInfo) bound to the associated image object (VkImage).
  • mem: This refers to the physical device memory allocated for this image resource.
  • view: This is the image view (VkImageView) of the image object.
  • mipMapLevels: This refers to the number of mipmap levels in the image resource.
  • layerCount: This refers to the number of array layers in the image resource.
  • textureWidth, textureHeight: These are the dimensions of the image resource.
  • descsImgInfo: This is the descriptor image information (VkDescriptorImageInfo) containing the image view, the sampler, and the image layout.
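As a brief sketch of how these fields fit together, assuming a populated TextureData instance named texture, descsImgInfo is filled from the sampler, image view, and image layout and is later supplied through vkUpdateDescriptorSets() for binding 1:

   // Fill the descriptor image info from the other TextureData fields 
   texture.descsImgInfo.sampler     = texture.sampler; 
   texture.descsImgInfo.imageView   = texture.view; 
   texture.descsImgInfo.imageLayout = texture.imageLayout; 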

In the next section, we will start implementing our image resource and see it in action.
