Time for action — examining the ray traced scene

  1. Open the file ch10_Raytracing.html in an HTML5 browser. You should see a scene with a simple lit, bobbing sphere like the one shown in the following screenshot:
  2. First, in order to trigger the shader for every pixel, we need to draw a fullscreen quad. Luckily for us, the class from the post-processing example earlier in this chapter does exactly that! Since we don't have a scene to post-process, we can cut out a large part of the rendering code, and the entirety of our JavaScript drawing code becomes:
    function render() {
      gl.viewport(0, 0, c_width, c_height);
      gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

      // Check whether the framebuffer needs to be resized to match the canvas
      post.validateSize();
      post.bind();

      // Render the fullscreen quad
      post.draw();
    }
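    The PostProcess class takes care of the framebuffer bookkeeping here. As a rough, hypothetical sketch (the real class also recreates the WebGL texture and renderbuffer attachments, which is omitted below), validateSize only does work when the canvas dimensions have changed:

    ```javascript
    // Hypothetical, simplified sketch of the resize check inside a
    // PostProcess-style class. A real implementation would also call
    // texImage2D/renderbufferStorage to rebuild the attachments.
    function makePostProcess(canvas) {
      return {
        canvas: canvas,
        width: -1,
        height: -1,
        resized: false,
        validateSize: function () {
          // Only rebuild the framebuffer when the canvas size has changed
          if (this.width !== this.canvas.width || this.height !== this.canvas.height) {
            this.width = this.canvas.width;
            this.height = this.canvas.height;
            this.resized = true; // stand-in for the WebGL resize calls
          } else {
            this.resized = false;
          }
        }
      };
    }
    ```

    Calling validateSize every frame is cheap because the comparison short-circuits when nothing changed.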
    
  3. That's it. The remainder of our scene will be built in the fragment shader.
  4. At the core of our shader are two functions: one that determines whether a ray intersects a sphere, and one that determines the normal of a point on the sphere's surface. We're using spheres because they're typically the easiest type of geometry to raycast, and they also happen to be a type of geometry that is difficult to represent accurately with polygons.
    // ro is the ray origin, rd is the normalized ray direction,
    // and s is the sphere (s.xyz is the center, s.w is the radius)
    float sphereInter( vec3 ro, vec3 rd, vec4 s ) {
      // Transform the ray into object space
      vec3 oro = ro - s.xyz;

      float a = dot(rd, rd);
      float b = 2.0 * dot(oro, rd);
      float c = dot(oro, oro) - s.w * s.w; // w is the sphere radius
      float d = b * b - 4.0 * a * c;

      if(d < 0.0) { return d; } // No intersection
      return (-b - sqrt(d)) / (2.0 * a); // Distance to the nearest intersection
    }

    vec3 sphereNorm( vec3 pt, vec4 s ) {
      return ( pt - s.xyz ) / s.w;
    }
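    The quadratic at the heart of sphereInter is easy to sanity-check on the CPU. The following standalone JavaScript mirrors the shader math (plain arrays stand in for GLSL vec3/vec4; this is a verification sketch, not part of the example's source):

    ```javascript
    // CPU mirror of the GLSL sphereInter/sphereNorm functions.
    // Vectors are plain arrays; s = [cx, cy, cz, radius].
    function dot3(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

    function sphereInter(ro, rd, s) {
      // Transform the ray into object space
      const oro = [ro[0] - s[0], ro[1] - s[1], ro[2] - s[2]];
      const a = dot3(rd, rd);
      const b = 2.0 * dot3(oro, rd);
      const c = dot3(oro, oro) - s[3] * s[3];
      const d = b * b - 4.0 * a * c;
      if (d < 0.0) { return d; } // No intersection (negative discriminant)
      return (-b - Math.sqrt(d)) / (2.0 * a); // Nearest intersection distance
    }

    function sphereNorm(pt, s) {
      return [(pt[0] - s[0]) / s[3], (pt[1] - s[1]) / s[3], (pt[2] - s[2]) / s[3]];
    }

    // A ray fired down the -z axis at a unit sphere 5 units away hits the
    // near surface at distance 4, with the normal pointing back at the ray.
    const t = sphereInter([0, 0, 0], [0, 0, -1], [0, 0, -5, 1]); // 4
    const n = sphereNorm([0, 0, -4], [0, 0, -5, 1]);             // [0, 0, 1]
    ```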
    
  5. Next, we will use those two functions to determine where the ray intersects a sphere (if at all) and what the normal and color of the sphere are at that point. In this case, the sphere information is hardcoded into a couple of global variables to make things easier, but it could just as easily be provided as uniforms from JavaScript.
    vec4 sphere1 = vec4(0.0, 1.0, 0.0, 1.0); // xyz is the center, w is the radius
    vec3 sphere1Color = vec3(0.9, 0.8, 0.6);

    float maxDist = 1024.0;

    float intersect( vec3 ro, vec3 rd, out vec3 norm, out vec3 color ) {
      float dist = maxDist;

      float interDist = sphereInter( ro, rd, sphere1 );
      if ( interDist > 0.0 && interDist < dist ) {
        dist = interDist;
        vec3 pt = ro + dist * rd; // Point of intersection
        norm = sphereNorm(pt, sphere1); // Get normal for that point
        color = sphere1Color; // Get color for the sphere
      }

      return dist;
    }
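    GLSL out parameters have no direct JavaScript analogue, so a CPU sketch of intersect can return the hit data in an object instead. The following self-contained sketch (plain arrays for vectors, sphere hardcoded as in the shader) illustrates the nearest-hit pattern:

    ```javascript
    // Self-contained CPU sketch of the shader's intersect() function.
    const sphere1 = [0.0, 1.0, 0.0, 1.0]; // xyz center, w radius
    const sphere1Color = [0.9, 0.8, 0.6];
    const maxDist = 1024.0;

    function dot3(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

    // Same quadratic as the GLSL sphereInter
    function sphereInter(ro, rd, s) {
      const oro = [ro[0] - s[0], ro[1] - s[1], ro[2] - s[2]];
      const a = dot3(rd, rd);
      const b = 2.0 * dot3(oro, rd);
      const c = dot3(oro, oro) - s[3] * s[3];
      const d = b * b - 4.0 * a * c;
      return d < 0.0 ? d : (-b - Math.sqrt(d)) / (2.0 * a);
    }

    // Returns { dist, norm, color }; dist === maxDist means "no hit".
    // The returned object stands in for the GLSL out parameters.
    function intersect(ro, rd) {
      const hit = { dist: maxDist, norm: null, color: null };
      const interDist = sphereInter(ro, rd, sphere1);
      if (interDist > 0.0 && interDist < hit.dist) {
        hit.dist = interDist;
        const pt = [ro[0] + interDist * rd[0],
                    ro[1] + interDist * rd[1],
                    ro[2] + interDist * rd[2]];
        hit.norm = [(pt[0] - sphere1[0]) / sphere1[3],
                    (pt[1] - sphere1[1]) / sphere1[3],
                    (pt[2] - sphere1[2]) / sphere1[3]];
        hit.color = sphere1Color;
      }
      return hit;
    }
    ```

    Keeping dist initialized to maxDist means the caller only needs a single comparison to detect a miss.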
    
  6. Now that we can determine the normal and color of a point with a ray, we need to generate the rays to test with. We do this by determining the pixel that the current fragment represents and creating a ray that points from the desired camera position through that pixel. To aid in this, we will utilize the uInverseTextureSize uniform that the PostProcess class provides to the shader.
    vec2 uv = gl_FragCoord.xy * uInverseTextureSize;
    float aspectRatio = uInverseTextureSize.y / uInverseTextureSize.x;

    // Cast a ray out from the eye position into the scene.
    // The eye sits slightly up and back from the scene origin.
    vec3 ro = vec3(0.0, 1.0, 4.0);

    // The ray is tilted slightly downward to give a better view of the scene
    vec3 rd = normalize(vec3(-0.5 + uv * vec2(aspectRatio, 1.0), -1.0));
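    The fragment-to-ray mapping can be reproduced in plain JavaScript, which is handy for debugging the camera setup. This sketch (the function name is an assumption for illustration, not part of the example code) computes the ray direction for a given pixel:

    ```javascript
    // CPU sketch of the shader's ray setup for one fragment.
    // fragX/fragY are pixel coordinates; width/height is the canvas size.
    function rayForPixel(fragX, fragY, width, height) {
      const uv = [fragX / width, fragY / height]; // gl_FragCoord.xy * uInverseTextureSize
      const aspectRatio = width / height;         // (1/h) / (1/w)

      // Same expression as the shader: vec3(-0.5 + uv * vec2(aspect, 1.0), -1.0)
      const d = [-0.5 + uv[0] * aspectRatio, -0.5 + uv[1], -1.0];

      // normalize()
      const len = Math.hypot(d[0], d[1], d[2]);
      return [d[0] / len, d[1] / len, d[2] / len];
    }
    ```

    On a square canvas the center pixel maps to uv = (0.5, 0.5), so its ray points straight down the -z axis.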
    
  7. Finally, using the ray that we just generated, we call the intersect function to get the information about the sphere intersection and then apply the same diffuse lighting calculations that we've been using all throughout the book! We're using directional lighting here for simplicity, but it would be trivial to convert to a point light or spotlight model if desired.
    // Default color if we don't intersect with anything
    vec3 rayColor = vec3(0.2, 0.2, 0.2);

    // Direction the lighting is coming from
    vec3 lightDir = normalize(vec3(0.5, 0.5, 0.5));

    // Ambient light color
    vec3 ambient = vec3(0.05, 0.1, 0.1);

    // See if the ray intersects with any objects.
    // Provides the normal and color of the nearest intersection point
    vec3 objNorm, objColor;
    float t = intersect(ro, rd, objNorm, objColor);

    if ( t < maxDist ) {
      float diffuse = clamp(dot(objNorm, lightDir), 0.0, 1.0); // Diffuse factor
      rayColor = objColor * diffuse + ambient;
    }

    gl_FragColor = vec4(rayColor, 1.0);
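    The shading step itself is a small amount of vector math, so it can be checked numerically on the CPU. This standalone sketch reimplements the diffuse calculation in JavaScript (as a verification aid, not the shader itself):

    ```javascript
    // CPU sketch of the shader's final shading step for a known hit.
    function dot3(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

    function normalize3(v) {
      const len = Math.hypot(v[0], v[1], v[2]);
      return [v[0] / len, v[1] / len, v[2] / len];
    }

    // objNorm is the surface normal at the hit point, objColor the sphere color
    function shade(objNorm, objColor) {
      const lightDir = normalize3([0.5, 0.5, 0.5]);
      const ambient = [0.05, 0.1, 0.1];

      // clamp(dot(norm, lightDir), 0.0, 1.0)
      const diffuse = Math.min(Math.max(dot3(objNorm, lightDir), 0.0), 1.0);

      // objColor * diffuse + ambient, per channel
      return [
        objColor[0] * diffuse + ambient[0],
        objColor[1] * diffuse + ambient[1],
        objColor[2] * diffuse + ambient[2]
      ];
    }
    ```

    A normal facing away from the light clamps the diffuse factor to zero, leaving only the ambient term.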
    
  8. Rendering with the preceding code will produce a static, lit sphere. That's great, but we'd also like to add a bit of motion to the scene to give us a better sense of how fast the scene renders and how the lighting interacts with the sphere. To add a simple looping circular motion to the sphere, we use the uTime uniform to modify its X and Z coordinates at the beginning of the shader:
    sphere1.x = sin(uTime);
    sphere1.z = cos(uTime);
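    This keeps the sphere's center on a unit circle around the Y axis in the XZ plane, since sin² + cos² = 1 for any time value. A quick CPU check (a verification sketch, not part of the example code) confirms the orbit radius never drifts:

    ```javascript
    // The animated center follows (sin t, 1, cos t): a unit circle in XZ.
    function sphereCenterAt(uTime) {
      return [Math.sin(uTime), 1.0, Math.cos(uTime)];
    }

    // Distance from the Y axis; should always be exactly 1.
    function orbitRadius(uTime) {
      const c = sphereCenterAt(uTime);
      return Math.hypot(c[0], c[2]);
    }
    ```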
    

What just happened?

We've just seen how we can construct a scene, lighting and all, completely in a fragment shader. It's a simple scene, certainly, but also one that would be nearly impossible to render using polygon-based rendering. Perfect spheres can only be approximated with triangles.

Have a go hero — multiple spheres

For this example, we've kept things simple by having only a single sphere in the scene. However, all of the pieces needed to render several spheres in the same scene are in place! See if you can set up a scene with three or four spheres, all with different coloring and movement.

As a hint: the main shader function that needs editing is intersect.
