Ray tracing in fragment shaders

A common (if somewhat impractical) technique used to show how powerful shaders can be is using them to ray trace a scene. Thus far, all of our rendering has been done with polygon rasterization, which is the technical term for the triangle-based rendering that WebGL uses. Ray tracing is an alternative rendering technique that traces the path of light through a scene as it interacts with mathematically defined geometry.

Ray tracing has several advantages over polygonal rendering, the primary one being that it can create more realistic scenes: its more accurate lighting model easily accounts for effects such as reflection and indirect lighting. However, ray tracing also tends to be far slower than polygonal rendering, which is why it's rarely used for real-time applications.

Ray tracing a scene is done by creating a series of rays (each represented by an origin and a direction) that start at the camera's location and pass through each pixel in the viewport. Each ray is then tested against every object in the scene to determine whether there are any intersections; if so, the intersection closest to the ray origin is returned. That intersection is then used to determine the color of the pixel.


There are many algorithms that can be used to determine the color of the intersection point, ranging from simple diffuse lighting to tracing additional rays as they bounce off other objects to simulate reflection, but we'll keep things simple in our case. The key thing to remember is that everything about our scene will be entirely a product of the shader code.
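The simple case described above, an intersection test followed by diffuse lighting, can be sketched as follows. This JavaScript version mirrors the math a fragment shader would perform; the sphere scene, the function names, and the grayscale material are all illustrative assumptions, not the chapter's code.

```javascript
// Small vector helpers for 3-component arrays.
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const sub = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const add = (a, b) => [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
const scale = (a, s) => [a[0] * s, a[1] * s, a[2] * s];
const normalize = (a) => scale(a, 1 / Math.hypot(a[0], a[1], a[2]));

// Ray-sphere intersection. Returns the distance t along the ray to the
// nearest hit in front of the origin, or null on a miss.
// Assumes `direction` is normalized, so the quadratic simplifies to
// t^2 + 2*b*t + c = 0.
function intersectSphere(origin, direction, center, radius) {
  const oc = sub(origin, center);
  const b = dot(oc, direction);
  const c = dot(oc, oc) - radius * radius;
  const disc = b * b - c;
  if (disc < 0) return null;      // ray misses the sphere
  const t = -b - Math.sqrt(disc); // nearer of the two roots
  return t > 0 ? t : null;        // ignore hits behind the ray origin
}

// Shade one ray: black background on a miss, Lambert diffuse on a hit.
function shade(origin, direction, center, radius, lightDir) {
  const t = intersectSphere(origin, direction, center, radius);
  if (t === null) return [0, 0, 0];               // background color
  const hitPoint = add(origin, scale(direction, t));
  const normal = normalize(sub(hitPoint, center));
  // Lambert term: surfaces facing the light are brightest.
  const diffuse = Math.max(dot(normal, lightDir), 0);
  return [diffuse, diffuse, diffuse];             // grayscale sphere
}
```

In a shader, `shade` would be the body of the fragment program: the returned triple becomes the fragment's RGB output, computed independently for every pixel's ray.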
