ch10_Raytracing.html in an HTML5 browser. You should see a scene with a simple lit, bobbing sphere like the one shown in the following screenshot:

```javascript
function render() {
    gl.viewport(0, 0, c_width, c_height);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    // Check to see if the framebuffer needs to be resized to match the canvas
    post.validateSize();
    post.bind();
    // Render the fullscreen quad
    post.draw();
}
```
```glsl
// ro is the ray origin, rd is the ray direction, and s is the sphere
float sphereInter( vec3 ro, vec3 rd, vec4 s ) {
    // Transform the ray into object space
    vec3 oro = ro - s.xyz;
    float a = dot(rd, rd);
    float b = 2.0 * dot(oro, rd);
    float c = dot(oro, oro) - s.w * s.w; // w is the sphere radius
    float d = b * b - 4.0 * a * c;
    if (d < 0.0) { return d; } // No intersection
    return (-b - sqrt(d)) / (2.0 * a); // Distance to the nearest intersection
}

vec3 sphereNorm( vec3 pt, vec4 s ) {
    return ( pt - s.xyz ) / s.w;
}
```

Note that the nearest root of the quadratic is `(-b - sqrt(d)) / (2.0 * a)`; since `rd` is normalized, `a` is 1.0 and dividing by plain `2.0` happens to work, but using `2.0 * a` keeps the math correct for any ray.
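Because the intersection test is just the quadratic formula, it can be sanity-checked outside WebGL. Below is a rough plain-JavaScript port of the same math (`dot3` is a helper invented here, and spheres are plain arrays rather than `vec4`s):

```javascript
// Plain JavaScript port of the shader's ray/sphere test, for
// sanity-checking distances on the CPU. dot3 is a hypothetical helper.
function dot3(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// ro: ray origin [x,y,z], rd: ray direction, s: sphere [cx,cy,cz,radius]
function sphereInter(ro, rd, s) {
  const oro = [ro[0] - s[0], ro[1] - s[1], ro[2] - s[2]]; // Object space
  const a = dot3(rd, rd);
  const b = 2.0 * dot3(oro, rd);
  const c = dot3(oro, oro) - s[3] * s[3];
  const d = b * b - 4.0 * a * c;
  if (d < 0.0) return d;                  // Negative: no intersection
  return (-b - Math.sqrt(d)) / (2.0 * a); // Distance to the nearest hit
}

// A ray fired from z = 4 straight down -Z at a unit sphere on the
// origin should hit the near surface at distance 3.
console.log(sphereInter([0, 0, 4], [0, 0, -1], [0, 0, 0, 1])); // 3
```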
```glsl
vec4 sphere1 = vec4(0.0, 1.0, 0.0, 1.0);
vec3 sphere1Color = vec3(0.9, 0.8, 0.6);
float maxDist = 1024.0;

float intersect( vec3 ro, vec3 rd, out vec3 norm, out vec3 color ) {
    float dist = maxDist;
    float interDist = sphereInter( ro, rd, sphere1 );
    if ( interDist > 0.0 && interDist < dist ) {
        dist = interDist;
        vec3 pt = ro + dist * rd;       // Point of intersection
        norm = sphereNorm(pt, sphere1); // Get normal for that point
        color = sphere1Color;           // Get color for the sphere
    }
    return dist;
}
```
uInverseTextureSize uniform that the PostProcess class provides to the shader.

```glsl
vec2 uv = gl_FragCoord.xy * uInverseTextureSize;
float aspectRatio = uInverseTextureSize.y / uInverseTextureSize.x;

// Cast a ray out from the eye position into the scene.
// The eye is slightly up and back from the scene origin.
vec3 ro = vec3(0.0, 1.0, 4.0);

// The ray we cast is tilted slightly downward to give a better view of the scene
vec3 rd = normalize(vec3(-0.5 + uv * vec2(aspectRatio, 1.0), -1.0));
```
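To see concretely what this mapping does, here is the same pixel-to-ray arithmetic sketched in plain JavaScript; `rayForPixel` and its parameters are illustrative names, not part of the chapter's code:

```javascript
// Sketch of the shader's pixel-to-ray mapping. width/height stand in
// for the canvas size; uInverseTextureSize in the real shader would be
// [1/width, 1/height].
function rayForPixel(px, py, width, height) {
  const uv = [px / width, py / height];       // gl_FragCoord * uInverseTextureSize
  const aspectRatio = (1 / height) / (1 / width); // = width / height
  const ro = [0.0, 1.0, 4.0];                 // Eye position
  // Mirror of normalize(vec3(-0.5 + uv * vec2(aspectRatio, 1.0), -1.0))
  const d = [-0.5 + uv[0] * aspectRatio, -0.5 + uv[1], -1.0];
  const len = Math.hypot(d[0], d[1], d[2]);
  return { ro: ro, rd: [d[0] / len, d[1] / len, d[2] / len] };
}

// The center pixel of a square canvas maps to a ray pointing straight
// down the -Z axis.
console.log(rayForPixel(50, 50, 100, 100).rd); // [ 0, 0, -1 ]
```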
intersect function to get the information about the sphere intersection, and then apply the same diffuse lighting calculations that we've been using throughout the book! We're using directional lighting here for simplicity, but it would be trivial to convert to a point light or spotlight model if desired.

```glsl
// Default color if we don't intersect with anything
vec3 rayColor = vec3(0.2, 0.2, 0.2);

// Direction the lighting is coming from
vec3 lightDir = normalize(vec3(0.5, 0.5, 0.5));

// Ambient light color
vec3 ambient = vec3(0.05, 0.1, 0.1);

// See if the ray intersects with any objects.
// Provides the normal and color of the nearest intersection point.
vec3 objNorm, objColor;
float t = intersect(ro, rd, objNorm, objColor);
if ( t < maxDist ) {
    float diffuse = clamp(dot(objNorm, lightDir), 0.0, 1.0); // Diffuse factor
    rayColor = objColor * diffuse + ambient;
}

gl_FragColor = vec4(rayColor, 1.0);
```
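The diffuse term is plain Lambert shading; the small JavaScript sketch below reproduces the same arithmetic (`shade` is a hypothetical helper operating on arrays instead of `vec3`s):

```javascript
// Sketch of the shader's diffuse shading: Lambert's cosine law (N · L),
// clamped to [0, 1], plus a constant ambient term.
function shade(objNorm, objColor, lightDir, ambient) {
  const clamp01 = (v) => Math.min(Math.max(v, 0), 1);
  const nDotL = objNorm[0] * lightDir[0] + objNorm[1] * lightDir[1] +
                objNorm[2] * lightDir[2];
  const diffuse = clamp01(nDotL); // Diffuse factor
  return objColor.map((ch, i) => ch * diffuse + ambient[i]);
}

// A surface facing the light head-on gets the full object color plus
// ambient; a surface facing away gets ambient only.
console.log(shade([0, 1, 0], [0.9, 0.8, 0.6], [0, 1, 0], [0.05, 0.1, 0.1]));
```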
uTime uniform to modify the X and Z coordinates at the beginning of the shader.

```glsl
sphere1.x = sin(uTime);
sphere1.z = cos(uTime);
```
We've just seen how we can construct a scene, lighting and all, completely in a fragment shader. It's a simple scene, certainly, but also one that would be nearly impossible to render using polygon-based rendering. Perfect spheres can only be approximated with triangles.
For this example, we've kept things simple by having only a single sphere in the scene. However, all of the pieces needed to render several spheres in the same scene are in place! See if you can set up a scene with three or four spheres, each with different coloring and movement.
As a hint: the main shader function that needs editing is intersect.
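One possible shape for that change, sketched in plain JavaScript rather than GLSL (the names and sphere-object layout are illustrative; in the shader, intersect would repeat the sphereInter test once per sphere and keep the closest hit):

```javascript
// Nearest-hit bookkeeping for several spheres.
const dot3 = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];

// Hit distance along the ray, or a negative number on a miss.
function sphereInter(ro, rd, center, radius) {
  const oro = [ro[0] - center[0], ro[1] - center[1], ro[2] - center[2]];
  const a = dot3(rd, rd);
  const b = 2.0 * dot3(oro, rd);
  const c = dot3(oro, oro) - radius * radius;
  const d = b * b - 4.0 * a * c;
  return d < 0.0 ? d : (-b - Math.sqrt(d)) / (2.0 * a);
}

// Each sphere: { center, radius, color }. Whichever valid hit is
// closest to the ray origin wins, exactly as in the shader's intersect.
function intersect(ro, rd, spheres, maxDist) {
  let nearest = { dist: maxDist, color: null };
  for (const s of spheres) {
    const d = sphereInter(ro, rd, s.center, s.radius);
    if (d > 0.0 && d < nearest.dist) {
      nearest = { dist: d, color: s.color }; // New closest hit wins
    }
  }
  return nearest; // dist === maxDist means the ray hit nothing
}

// Two spheres along -Z; the nearer one (distance 3) should win.
const hit = intersect([0, 0, 4], [0, 0, -1],
  [{ center: [0, 0, -4], radius: 1, color: 'red' },
   { center: [0, 0, 0],  radius: 1, color: 'blue' }], 1024.0);
console.log(hit.dist, hit.color); // 3 blue
```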