Time for Action: Examining the Ray Traced Scene

Let's cover an example showcasing the power of ray tracing:

  1. Open the ch10_04_ray-tracing.html file in your browser. You should see a scene with a simple lit, bobbing sphere like the one shown in the following screenshot:

  2. In order to trigger the shader, we need a way to draw a full-screen quad. Fortunately, we have a class from our post-processing examples earlier in this chapter to help us do just that. Since we don't have a scene to process, we can omit a large part of the rendering code and simplify the JavaScript draw function:
function draw() {
  gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

  // Check whether the framebuffer needs to be resized to match the canvas
  post.validateSize();
  post.bind();

  // Render the fullscreen quad
  post.draw();
}
  3. That's it. The remainder of our scene will be built into the fragment shader.
  4. There are two functions at the core of our shader: one that determines whether a ray intersects a sphere and one that computes the normal of a point on the sphere. We're using spheres because they're typically the easiest type of geometry to raycast, and they also happen to be a type of geometry that is difficult to represent accurately with polygons:
// ro is the ray origin
// rd is the ray direction
// s is the sphere (xyz is the center, w is the radius)
float sphereIntersection(vec3 ro, vec3 rd, vec4 s) {
  // Transform the ray into object space
  vec3 oro = ro - s.xyz;

  float a = dot(rd, rd);
  float b = 2.0 * dot(oro, rd);
  float c = dot(oro, oro) - s.w * s.w;

  float d = b * b - 4.0 * a * c;

  // No intersection
  if (d < 0.0) return d;

  // Nearest intersection: the smaller root of the quadratic
  return (-b - sqrt(d)) / (2.0 * a);
}

vec3 sphereNormal(vec3 pt, vec4 s) {
  return (pt - s.xyz) / s.w;
}
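The quadratic-root math is easy to sanity-check outside of the shader. The following is a hypothetical CPU-side port of sphereIntersection in JavaScript, with vectors as plain [x, y, z] arrays and the sphere as [cx, cy, cz, r]; none of this ships with the example, it is just a way to verify the logic:

```javascript
// CPU-side sketch of the shader's ray-sphere intersection, for sanity checks.
// Vectors are [x, y, z] arrays; the sphere is [cx, cy, cz, r], mirroring vec4.
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

function sphereIntersection(ro, rd, s) {
  // Transform the ray into object space
  const oro = sub(ro, s);

  const a = dot(rd, rd);
  const b = 2.0 * dot(oro, rd);
  const c = dot(oro, oro) - s[3] * s[3]; // s[3] is the radius

  const d = b * b - 4.0 * a * c;

  // No intersection
  if (d < 0.0) return d;

  // Nearest hit along the ray (the smaller quadratic root)
  return (-b - Math.sqrt(d)) / (2.0 * a);
}

// A ray fired from z = 5 straight down -z at a unit sphere at the origin
// should hit the near surface at distance 4
sphereIntersection([0, 0, 5], [0, 0, -1], [0, 0, 0, 1]); // 4
```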
  5. Next, we will use these two functions to determine where the ray intersects the sphere (if at all), along with the normal and color of the sphere at that point. To keep things simple, the sphere information is hardcoded as global variables, but it could just as easily be provided as uniforms from JavaScript:
vec4 sphere = vec4(1.0);
vec3 sphereColor = vec3(0.9, 0.8, 0.6);
float maxDistance = 1024.0;

float intersect(vec3 ro, vec3 rd, out vec3 norm, out vec3 color) {
  float distance = maxDistance;

  // If we wanted multiple objects in the scene, we would loop
  // through them here and return the normal and color of the
  // closest intersection point (lowest distance)
  float intersectionDistance = sphereIntersection(ro, rd, sphere);

  if (intersectionDistance > 0.0 && intersectionDistance < distance) {
    distance = intersectionDistance;
    // Point of intersection
    vec3 pt = ro + distance * rd;
    // Get the normal for that point
    norm = sphereNormal(pt, sphere);
    // Get the color for the sphere
    color = sphereColor;
  }

  return distance;
}
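The comment inside intersect mentions looping over multiple objects. As a sketch of what that extension might look like, here is a hypothetical JavaScript version of the closest-hit loop over an array of spheres; the object layout ({ sphere, color }) and the CPU-side helpers are invented for illustration:

```javascript
// Hypothetical CPU version of the closest-hit loop hinted at in intersect().
// Each object is { sphere: [cx, cy, cz, r], color: [r, g, b] }.
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

function sphereIntersection(ro, rd, s) {
  const oro = sub(ro, s);
  const a = dot(rd, rd);
  const b = 2.0 * dot(oro, rd);
  const c = dot(oro, oro) - s[3] * s[3];
  const d = b * b - 4.0 * a * c;
  return d < 0.0 ? d : (-b - Math.sqrt(d)) / (2.0 * a);
}

const maxDistance = 1024.0;

function intersect(ro, rd, objects) {
  let distance = maxDistance;
  let norm = null;
  let color = null;
  for (const obj of objects) {
    const t = sphereIntersection(ro, rd, obj.sphere);
    // Keep the nearest positive hit seen so far
    if (t > 0.0 && t < distance) {
      distance = t;
      const pt = [ro[0] + t * rd[0], ro[1] + t * rd[1], ro[2] + t * rd[2]];
      norm = sub(pt, obj.sphere).map(v => v / obj.sphere[3]);
      color = obj.color;
    }
  }
  return { distance, norm, color };
}
```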
  6. Now that we can determine the normal and color of a point hit by a ray, we need to generate the rays to cast. We can do this by determining which pixel the current fragment represents and then creating a ray that points from the camera position through that pixel. To do so, we will utilize the uInverseTextureSize uniform that the PostProcess class provides to the shader:
// Pixel coordinate of the fragment being rendered
vec2 uv = gl_FragCoord.xy * uInverseTextureSize;
float aspectRatio = uInverseTextureSize.y / uInverseTextureSize.x;

// Cast a ray out from the eye position into the scene
vec3 ro = eyePos;

// The ray we cast is tilted slightly downward to give a better
// view of the scene
vec3 rd = normalize(vec3(-0.5 + uv * vec2(aspectRatio, 1.0), -1.0));
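To see what this setup produces for a given fragment, here is a hypothetical JavaScript mirror of the same arithmetic. The rayForFragment name and the canvas-size parameters are assumptions for this sketch, and the shader's eyePos is omitted since only the direction is computed:

```javascript
// Hypothetical CPU mirror of the shader's ray construction.
// 1/width and 1/height play the role of the uInverseTextureSize uniform.
function normalize(v) {
  const len = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

function rayForFragment(fragX, fragY, width, height) {
  const inv = [1 / width, 1 / height];          // uInverseTextureSize
  const uv = [fragX * inv[0], fragY * inv[1]];  // gl_FragCoord.xy * inv
  const aspectRatio = inv[1] / inv[0];          // equals width / height

  // Same construction as the shader: offset by -0.5,
  // aspect-correct x, and point down -z
  return normalize([
    -0.5 + uv[0] * aspectRatio,
    -0.5 + uv[1],
    -1.0
  ]);
}
```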
  7. Using the ray we just generated, we call the intersect function to retrieve information about the sphere intersection. Then, we apply the same diffuse lighting calculations we've been using all along! To keep things simple, we're using directional lighting here, but it would be easy enough to extend the lighting model to point or spot lights:
// Default color if we don't intersect with anything
vec3 rayColor = backgroundColor;

// See if the ray intersects with any objects.
// Provides the normal of the nearest intersection point and color
vec3 objectNormal, objectColor;
float t = intersect(ro, rd, objectNormal, objectColor);

if (t < maxDistance) {
  // Diffuse factor
  float diffuse = clamp(dot(objectNormal, lightDirection), 0.0, 1.0);
  rayColor = objectColor * diffuse + ambient;
}

fragColor = vec4(rayColor, 1.0);
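The lighting step can likewise be checked on the CPU. This is a hypothetical JavaScript sketch of the same clamp(dot(N, L), 0.0, 1.0) diffuse term; it assumes a scalar ambient value added to every channel, whereas the shader's ambient uniform may well be a vec3:

```javascript
// Hypothetical CPU sketch of the diffuse shading step.
// Assumes ambient is a single scalar added to every channel.
const clamp = (x, lo, hi) => Math.min(Math.max(x, lo), hi);
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

function shade(objectColor, objectNormal, lightDirection, ambient) {
  // Diffuse factor: clamp(dot(N, L), 0.0, 1.0), as in the shader
  const diffuse = clamp(dot(objectNormal, lightDirection), 0.0, 1.0);
  return objectColor.map(c => c * diffuse + ambient);
}
```

Note that the clamp keeps surfaces facing away from the light at zero diffuse contribution, so only the ambient term remains on the sphere's dark side.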
  8. Thus far, our example is a static lit sphere. To get a better sense of how fast the scene renders and how the lighting interacts with the sphere, let's add a bit of motion: a simple looping circular path, created by using the uTime uniform to modify the sphere's x and z coordinates at the beginning of the shader:
// Wiggle the sphere back and forth a bit
sphere.x = 1.5 * sin(uTime);
sphere.z = 0.5 * cos(uTime * 3.0);

What just happened?

We covered how to construct a 3D scene, lighting and all, entirely in a fragment shader. It's a simple scene, of course, but also one that would be nearly impossible to render with polygon-based techniques, because a perfect sphere can only be approximated with triangles.

Shader Toy
Now that you've seen how to construct 3D scenes entirely in fragment shaders, you will find the demos on ShaderToy.com both beautiful and inspiring.
