3D rendering

Before we get into light estimation for AR, let's step back and review how a 3D model is rendered. Take a look at the following diagram, which explains the rendering process at a high level:



Typical rendering process for a 3D model

Note that the diagram is only a visual summary of the rendering process. The geometry and vertex shaders never actually render a wireframe model; they only position (and optionally color) vertices and surfaces. That geometry is then rasterized, meaning it is converted into fragments covering the pixels of the output image, and the pixel/fragment and lighting shaders color each fragment in the final step, when the 2D image is generated.
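To make those stages concrete, here is a minimal, hypothetical software sketch of the same pipeline in Python (the names vertex_shader, rasterize, and fragment_shader are illustrative only, not a real graphics API): the vertex stage transforms vertices with a model-view-projection matrix, rasterization turns the projected triangle into fragments, and the fragment stage colors each covered pixel.

# Minimal, illustrative software version of the vertex -> rasterize -> fragment pipeline.
import numpy as np

WIDTH, HEIGHT = 16, 16

def vertex_shader(vertex, mvp):
    # Transform a 3D vertex into clip space, then map it to screen coordinates.
    clip = mvp @ np.append(vertex, 1.0)            # homogeneous transform
    ndc = clip[:3] / clip[3]                       # perspective divide
    x = (ndc[0] * 0.5 + 0.5) * (WIDTH - 1)         # viewport mapping
    y = (ndc[1] * 0.5 + 0.5) * (HEIGHT - 1)
    return np.array([x, y])

def edge(a, b, p):
    # Signed-area test used to decide whether point p lies inside the triangle.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def fragment_shader(bary):
    # Color a fragment by interpolating per-vertex colors (red, green, blue).
    colors = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    return bary @ colors

def rasterize(screen_verts, framebuffer):
    # Turn the projected triangle into fragments and shade each one.
    v0, v1, v2 = screen_verts
    area = edge(v0, v1, v2)
    for py in range(HEIGHT):
        for px in range(WIDTH):
            p = (px + 0.5, py + 0.5)
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            inside = (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0)
            if inside:
                bary = np.array([w0, w1, w2]) / area
                framebuffer[py, px] = fragment_shader(bary)

# One triangle in model space, with an identity MVP matrix for simplicity.
triangle = [np.array([-0.5, -0.5, 0.0]), np.array([0.5, -0.5, 0.0]), np.array([0.0, 0.5, 0.0])]
framebuffer = np.zeros((HEIGHT, WIDTH, 3))
rasterize([vertex_shader(v, np.eye(4)) for v in triangle], framebuffer)
print("shaded fragments:", int((framebuffer.sum(axis=2) > 0).sum()))

On a real GPU, of course, the same steps run massively in parallel in DirectX, OpenGL, or a similar API, but the division of labor between the vertex stage, rasterization, and the fragment stage is the same.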

The rendering process we are talking about here is standard real-time rendering on a device's GPU using DirectX or OpenGL. Keep in mind that there are other rendering approaches, such as voxel rendering for real-time use and ray tracing for non-real-time use.

Euclideon has developed a voxel-like rendering technology, which they describe, in their own words, as follows:

"The First Truly Applicable Hologram Tech is Here."
- Euclideon

This sounds very promising and like a game changer for AR and VR. However, the technology has come under intense scrutiny for making what some feel are outlandish claims of rendering trillions of points without frame-rate loss.
