Meshing and the environment

So, being able to identify features or corners of objects is really just the start of what we would like to know about the user's environment. What we really want to do is use those feature points to help us identify planes, surfaces, or known objects and their pose. ARCore identifies planes or surfaces automatically for us through a technique called meshing. We have already seen how meshing works numerous times in the advanced samples, when ARCore tracks surfaces. Now, before we get ahead of ourselves, let's picture what a point cloud and mesh look like in 3D, with the following diagram:




Point cloud and mesh in 3D

If you look closely at the diagram, you will see an inset figure showing a polygon and the ordered set of vertices that comprise it. Note how the points are ordered counterclockwise. Yes, the order in which we join points determines which way a surface faces when a mesh is lit and shaded. When a scene is rendered, we only see surfaces that face the camera; surfaces pointing away from the camera are removed, or back-face culled. The order in which we join points is called winding, and it isn't something you have to worry about unless you plan to create meshes manually.
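To make winding concrete, here is a small stand-alone sketch. It is not part of the ARCore sample, and all names in it are illustrative. It uses the 2D signed area of a screen-space triangle to tell counterclockwise from clockwise ordering, which is essentially the test a renderer performs when it back-face culls:

```java
// Illustrative sketch (not from the ARCore sample): the signed area of a
// triangle in screen space reveals its winding. A positive area means the
// vertices run counterclockwise, which OpenGL treats as front-facing by
// default; a negative area means clockwise, so the face would be culled.
public class WindingCheck {
    static float signedArea(float x0, float y0,
                            float x1, float y1,
                            float x2, float y2) {
        // Half the 2D cross product of edge vectors (v1 - v0) and (v2 - v0).
        return 0.5f * ((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0));
    }

    public static void main(String[] args) {
        // Counterclockwise ordering: positive area, front-facing.
        System.out.println(signedArea(0, 0, 1, 0, 0, 1)); // 0.5
        // The same points joined clockwise: negative area, back-face culled.
        System.out.println(signedArea(0, 0, 0, 1, 1, 0)); // -0.5
    }
}
```

Notice that nothing about the points themselves changed between the two calls; only the order we joined them in flipped the sign, and with it, which side of the surface faces the camera.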

Meshing is the process of taking a collection of feature points and constructing a mesh from it. The generated mesh is then often shaded and rendered into the scene. If we run the sample right now and watch, we will see the surfaces or plane meshes being generated and placed by ARCore. How about we open up the Android sample project again in Android Studio to see where this meshing occurs:

  1. Ensure that your code is open to where we left off last time. You should be looking at the lines with mPointCloud.
  2. Scroll down just a little until you see this block of code:
if (messageSnackbar != null) {
  for (Plane plane : session.getAllTrackables(Plane.class)) {
    if (plane.getType() == com.google.ar.core.Plane.Type.HORIZONTAL_UPWARD_FACING
        && plane.getTrackingState() == TrackingState.TRACKING) {
      hideLoadingMessage();
      break;
    }
  }
}
  3. This block of code loops through the trackables of type Plane (a flat mesh) identified in the session. When it finds a tracked plane of the correct type, it hides the loading message and breaks out of the loop.
  4. Then, it renders any planes it identifies with this line:
planeRenderer.drawPlanes(
    session.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
  5. The planeRenderer helper class handles drawing planes. Its drawPlanes method renders every plane the ARCore session has identified, using the view and projection matrices. Notice that all the planes are passed in through a call to getAllTrackables(Plane.class).
  6. Put your cursor on drawPlanes and press Ctrl + B (Command + B on Mac) to go to the definition.
  7. Now you should see the drawPlanes method in the PlaneRenderer.java file. Don't panic; yes, there is a lot of scary code here, which, thankfully, is already written for us. As an exercise, just scroll through and read the code. We don't have time to go through it in depth, but reading it will give you more insight into the rendering process.
  8. From the menu, select Run - Run 'HelloArActivity'. As the app runs, pay special attention to the way the surfaces are rendered and how you can interact with them.
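The plane-filtering logic from step 3 can also be exercised outside of Android. The following stand-alone sketch substitutes simple stub types for ARCore's Plane and TrackingState classes (every name below is a stand-in, not the real SDK), but it keeps the same control flow: scan the trackables and stop at the first tracked, upward-facing horizontal plane:

```java
import java.util.Arrays;
import java.util.List;

// Stand-alone sketch of the plane-filtering loop from step 3. The enums and
// the Plane class below are stubs standing in for ARCore's types; only the
// control flow mirrors the real sample.
public class PlaneFilter {
    enum Type { HORIZONTAL_UPWARD_FACING, HORIZONTAL_DOWNWARD_FACING, VERTICAL }
    enum TrackingState { TRACKING, PAUSED, STOPPED }

    static class Plane {
        final Type type;
        final TrackingState state;
        Plane(Type type, TrackingState state) { this.type = type; this.state = state; }
    }

    // Returns true as soon as one tracked, upward-facing horizontal plane is
    // found -- the condition under which the sample hides its loading message.
    static boolean hasTrackedFloorPlane(List<Plane> planes) {
        for (Plane plane : planes) {
            if (plane.type == Type.HORIZONTAL_UPWARD_FACING
                    && plane.state == TrackingState.TRACKING) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<Plane> planes = Arrays.asList(
            new Plane(Type.VERTICAL, TrackingState.TRACKING),
            new Plane(Type.HORIZONTAL_UPWARD_FACING, TrackingState.PAUSED),
            new Plane(Type.HORIZONTAL_UPWARD_FACING, TrackingState.TRACKING));
        System.out.println(hasTrackedFloorPlane(planes)); // true
    }
}
```

The early break matters: once one suitable plane is tracked, there is no reason to keep scanning, so the sample hides its loading message and moves on.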

Okay, now we understand how surfaces are created and rendered. What we also need to understand is how we interact with those surfaces or other objects in the environment.
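As a recap of how a surface mesh is assembled from points, here is one more minimal stand-alone sketch (illustrative names, not ARCore code) of the kind of raw data behind a simple plane mesh: four corner points joined into two triangles via an index buffer, with both triangles wound counterclockwise so they face the viewer:

```java
// Illustrative sketch (not ARCore code): the raw data behind a simple plane
// mesh. Four 2D corner points are joined into two triangles by an index
// buffer; both triangles are wound counterclockwise so they face the viewer.
public class QuadMesh {
    // Corner points (x, y) of a unit quad.
    static final float[] VERTICES = {
        0f, 0f,  // 0: bottom-left
        1f, 0f,  // 1: bottom-right
        1f, 1f,  // 2: top-right
        0f, 1f,  // 3: top-left
    };

    // Two triangles, three vertex indices each, both counterclockwise.
    static final short[] INDICES = { 0, 1, 2,  0, 2, 3 };

    // Signed area of triangle t (positive means counterclockwise winding).
    static float triangleArea(int t) {
        int a = INDICES[3 * t] * 2, b = INDICES[3 * t + 1] * 2, c = INDICES[3 * t + 2] * 2;
        return 0.5f * ((VERTICES[b] - VERTICES[a]) * (VERTICES[c + 1] - VERTICES[a + 1])
                     - (VERTICES[c] - VERTICES[a]) * (VERTICES[b + 1] - VERTICES[a + 1]));
    }

    public static void main(String[] args) {
        // Both triangles have positive signed area, so both face the viewer.
        System.out.println(triangleArea(0) > 0 && triangleArea(1) > 0); // true
    }
}
```

An index buffer like this is why meshing scales well: vertices shared between triangles are stored once and referenced by index, rather than duplicated for every triangle that touches them.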
