Visual perception

The rendered image needs to satisfy the expectations of our visual perception insofar as the goal of AR is to display virtual objects so that they realistically appear to reside in our physical environment. If the AR is just an overlay or annotation of the real world, then this may not be as important.

When rendering objects for stereoscopic 3D viewing, the views from the left and right eyes are offset slightly, based on your interpupillary distance (the distance between your eyes); the resulting difference between the two images is called parallax. This is a solved problem, handled by every VR and wearable AR device, but it's still worth mentioning.
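
To make the geometry concrete, here is a minimal sketch in Swift (using Apple's simd library) of how per-eye view matrices could be derived by offsetting a head pose half the IPD to each side. The `headTransform` parameter and the 63 mm default IPD are illustrative assumptions; real devices supply calibrated per-eye poses directly.

```swift
import simd

// A sketch, not a device API: derive left/right eye views from a head pose.
func eyeViewMatrices(headTransform: simd_float4x4,
                     ipd: Float = 0.063) -> (left: simd_float4x4, right: simd_float4x4) {
    let half = ipd / 2
    // Translate half the IPD along the head's local x-axis for each eye.
    var leftOffset = matrix_identity_float4x4
    leftOffset.columns.3.x = -half
    var rightOffset = matrix_identity_float4x4
    rightOffset.columns.3.x = half
    // A view matrix is the inverse of the eye's world transform.
    return (simd_inverse(headTransform * leftOffset),
            simd_inverse(headTransform * rightOffset))
}
```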

Virtual AR objects coexist with the real world, and a virtual object in front of a real one should hide what's behind it. That's easy; just draw the object on top. The opposite is not as simple. When the virtual object is behind a real-world one, say your virtual pet runs under a table or behind a sofa, it should be partially or completely hidden. This requires a spatial map of the environment; its mesh is used to occlude the computer graphics when rendering the scene.
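
As one concrete example, on Apple's ARKit with RealityKit, occlusion against the reconstructed environment mesh can be switched on with a few lines. This is a minimal sketch assuming a LiDAR-equipped iOS device:

```swift
import ARKit
import RealityKit

// A sketch: enable scene reconstruction so the real-world mesh
// occludes virtual content behind tables, sofas, and walls.
func configureOcclusion(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene reconstruction requires a LiDAR-equipped device.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    // Ask RealityKit to depth-test virtual geometry against the mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.session.run(configuration)
}
```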

An even more difficult problem comes up with photorealistic rendering of virtual objects. Ideally, you'd want the lighting on the object to match the lighting in the room itself. Suppose in the real world, the only light is a lamp in the corner of the room, but your AR object is lit from the opposite side. That would be conspicuously inconsistent and artificial. Apple ARKit and Google ARCore address this issue by capturing the ambient light color, intensity, and direction and then adjusting the virtual scene lighting accordingly, even offering the ability to calculate shadows from your virtual objects. This provides a more realistic render of your objects in the real world.
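
On the ARKit side, for instance, the per-frame ambient estimate can be read and copied onto a light you manage. This is a minimal sketch assuming a SceneKit-based scene with your own light node; light estimation is enabled by default in a world-tracking session:

```swift
import ARKit
import SceneKit

// A sketch: match a virtual light to ARKit's ambient light estimate.
func applyLightEstimate(from frame: ARFrame, to light: SCNLight) {
    guard let estimate = frame.lightEstimate else { return }
    // ambientIntensity is in lumens; ~1000 is a neutral, well-lit scene.
    light.intensity = estimate.ambientIntensity
    // ambientColorTemperature is in kelvin; 6500 K is pure white.
    light.temperature = estimate.ambientColorTemperature
}
```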
