Chapter 2: Your First AR Scene

Creating a simple Augmented Reality (AR) scene is straightforward with Unity AR Foundation. The steps involved might only take a page or two. However, when we create a scene together in this chapter, each step will be explained in context so that you can gain a full understanding of what comprises an AR scene using AR Foundation.

But before we do that, we'll take a look at some AR examples provided by Unity, including the AR Foundation Samples project, and build their example scenes for your device. And because that project contains some useful assets, we'll export those as an asset package for reuse in our own projects.

In this chapter, we will cover the following topics:

  • Building and running the AR Foundation Samples project
  • Exporting and importing sample assets
  • Constructing a new Unity AR scene
  • Introduction to C# programming and the MonoBehaviour class
  • Using AR raycast to place an object on a plane
  • Instantiating a GameObject
  • Creating and editing prefabs

Technical requirements

To implement the project provided in this chapter, you will need Unity installed on your development computer, connected to a mobile device that supports augmented reality applications (see Chapter 1, Setting Up for AR Development, for instructions). The completed project can be found in this book's GitHub repository at https://github.com/PacktPublishing/Augmented-Reality-with-Unity-AR-Foundation.

Exploring the AR Foundation example projects from Unity

A great way to learn about how to create AR projects with Unity AR Foundation is to explore the various example projects from Unity. These projects include example scenes, scripts, prefabs, and other assets. By cloning a project and opening an example scene, you can learn how to use AR Foundation, experiment with features, and see some best practices. In particular, consider these projects:

  • AR Foundation Samples (https://github.com/Unity-Technologies/arfoundation-samples)
  • AR Foundation Demos (https://github.com/Unity-Technologies/arfoundation-demos)
  • XR Interaction Toolkit Examples (https://github.com/Unity-Technologies/XR-Interaction-Toolkit-Examples)

Please look through the README file for each of these projects (found on the GitHub project's home page) to gain an understanding of what the project does, any dependencies it has, and other useful information about the project.

Each of these repositories contains a full Unity project. That is, they are not simply Unity asset packages you can import into an existing project. Rather, you'll clone the entire repository and open it as its own project. This is typical for demo projects that may have other package dependencies and require preset settings to build and run properly.

The AR Foundation Samples project is my go-to project for learning various AR Foundation features. It contains many example scenes demoing individual features, often in place of detailed documentation elsewhere (see https://github.com/Unity-Technologies/arfoundation-samples/tree/main/Assets/Scenes).

Each scene is extremely simple (almost to a fault) as it has the atomic purpose of illustrating a single feature. For example, there are separate scenes for plane detection, plane occlusion, and feathered planes. Notably, the project also contains a main menu scene (Assets/Scenes/ARFoundationMenu/Menu) that launches the other scenes when you build them all into a single executable. I recommend starting with the scene named SimpleAR, which we'll review in a moment.

Another is the AR Foundation Demos project, which contains some more complex user scenarios and features not covered in the Samples project. For example, it demonstrates the Unity Onboarding UX assets, which we'll introduce you to in Chapter 4, Creating an AR User Framework. It also covers image tracking, mesh placement, language localization, and some useful shaders (for example, wireframe, shadows, and fog).

The XR Interaction Toolkit Examples repository contains two separate Unity projects: one for VR and another for AR. It is largely a placeholder (in my opinion) for things to come.

Information – XR Interaction Toolkit

The XR Interaction Toolkit from Unity is not covered in this book. It provides components and other assets for developing interactive scenes using hand controllers and device-supported hand gestures. At the time of writing, XR Interaction Toolkit is focused on Virtual Reality (VR) applications (evidenced by its Examples project, which contains seven scenes for VR and just one for AR, which only supports mobile AR) but I believe it is a key part of Unity's XR strategy and architecture for the future. If you are interested in XR Interaction Toolkit for VR, check out my other book, Unity 2020 Virtual Reality Projects – Third Edition, from Packt Publishing.

Let's get a copy of the AR Foundation Samples project and take a look at the SimpleAR scene.

Building and running the Samples project

In this section, you are going to build the AR Foundation Samples project and run it on your device. First, please clone the project from its GitHub repository and open it in Unity, as follows:

  1. Clone a copy of the project from GitHub to your local machine. The project can be found at https://github.com/Unity-Technologies/arfoundation-samples. Please use whatever cloning method you prefer; for example, GitHub Desktop (https://desktop.github.com/) or the command line (https://git-scm.com/download/).
  2. Add the project to Unity Hub by selecting Projects | Add, navigating to the cloned project's root folder, and pressing Select Folder.
  3. Open the project in Unity. In the Unity Hub projects list, if you see a yellow warning icon, the cloned project's Unity version is not installed on your system. Use the Unity Version selector to choose a version you do have installed, preferably a newer one from the same major release (for example, 20XX).
  4. Open the project by selecting it from the Unity Hub projects list.
  5. If your version of Unity is newer than the one the project was last saved with, you will see a prompt asking, "Do you want to upgrade your project to a newer version of Unity?" Press Confirm.

One of the scenes, SimpleAR, is a basic AR example scene. When it runs, the user scans their room with the device's camera and the app detects horizontal planes, rendering them on the screen. When the user taps one of these planes, a small red cube is placed in the environment. You can walk around the room and the cube will remain where it was placed. If you tap another location, the cube will be moved there. Let's briefly review this SimpleAR scene's GameObjects:

  1. Open the SimpleAR scene from the Project window by navigating to the Scenes/SimpleAR/ folder and double-clicking the SimpleAR scene file.
  2. In the Hierarchy window, you will find two GameObjects of particular interest: AR Session and AR Session Origin.
  3. Select the AR Session Origin object and examine its components in the Inspector window. These include AR Plane Manager, AR Point Cloud Manager, AR Raycast Manager, and a Place On Plane script. We'll explain all of this later in this chapter.

Now, let's try to build and run the project:

  1. Switch to your target platform if necessary. To do this, go to File | Build Settings, choose your device's platform from the Platform list (for example, Android or iOS), and click Switch Platform.
  2. Most likely, the cloned project's settings have already been configured, but let's make sure. From the Build Settings window, click the Player Settings button to open that window and confirm the necessary settings mentioned in Chapter 1, Setting Up for AR Development. For example, Android ARCore does not support Vulkan graphics and requires Nougat (API Level 24) as a minimum.
  3. In the Build Settings window again, notice that the list of scenes in Scenes in Build starts with the Menu scene and contains all the demo scenes from this project (the first in the list will be the first scene to load when the app loads). You can leave these alone or just pick the one you want in the build.
  4. Ensure your mobile device is plugged into a USB port on your computer.
  5. Press the Build And Run button to build the project and install it on your device. It will prompt you for a file folder location; I like to create a folder in my project root, named Builds/. Give it a filename (if required) and press Save. It may take a while to complete this task.

If all goes well, the project will build, be installed on your device, and launch.

If you encounter errors while building the project, look at the Console window in the Unity Editor for messages (in the default layout, it's a tab behind the Project window). Read the messages carefully, generally starting from the top. If the fix is not obvious, do an internet search for the message's text; you are almost certainly not the first person to run into it!

Tip – "Failed to generate ARCore reference image library" error

If you receive an error when attempting to build the project that says something like Failed to generate ARCore reference image library, please make sure there are no spaces in the pathname of your project folder! See https://github.com/Unity-Technologies/arfoundation-samples/issues/119 for more information.

The main menu will be displayed, as shown in the following screen capture (left panel):

Figure 2.1 – Screenshot of my phone running the arfoundation-samples app and SimpleAR scene

A cool thing about AR Foundation (and this project) is that it can detect the capabilities of the device it is running on at runtime. This means that the buttons in the main menu will be disabled when AR Foundation detects that the features demoed in that scene are not supported on the device. (The device I'm using in the preceding screen capture is an Android phone, so some iOS-only feature scenes are disabled).

Click the Simple AR button to open that scene. You should see a camera video feed on your device's screen. Move your phone slowly in different directions and closer/away. As it scans the environment, feature points and planes will be detected and rendered on the screen. Tap one of the planes to place a cube on the scene, as shown in the right-hand panel of the preceding screen capture.

Some of the assets and scripts in the Samples project can be useful for building our own projects. I'll show you how to export them now.

Exporting the sample assets for reuse

Unity offers the ability to share assets between projects using .unitypackage files. Let's export the assets from the AR Foundation Samples project for reuse. One trick I like to do is move all the sample folders into a root folder first. With the arfoundation-samples project open in Unity, please perform the following steps:

  1. In the Project window, create a new folder under Assets named ARF-samples by clicking the + icon (top left of the window) and selecting Folder.
  2. Drag the following folders into the ARF-samples one: Materials, Meshes, Prefabs, Scenes, Scripts, Shaders, and Textures. That is, move all of them but leave the XR folder at the root.
  3. Right-click on the ARF-samples folder and select Export Package.
  4. The Exporting Package window will open. Click Export.
  5. Choose a directory outside this project's root, name the file (for example, arf-samples), and click Save.

The Assets/ARF-samples/ folder in the Project window is shown in the following screenshot:

Figure 2.2 – The Samples assets folder being exported to a .unitypackage file

You can close the arfoundation-samples project now if you want. You now have an asset package you can use in other projects.

Tip – Starting a New Project by Copying the Samples Project

An alternative to starting a new Unity AR project from scratch is to duplicate the arfoundation-samples project as the starting point for new AR projects. To do that, from your Windows Explorer (or macOS Finder), duplicate the entire project folder and then add it to Unity Hub. This way, you get all the example assets and demo scenes in one place, and it's set up with reasonable default project settings. I often do this, especially for quick demos and small projects.

Next, we are going to import the Samples assets into your Unity project and build the given SimpleAR scene.

Building the SimpleAR scene in your own project

As you will see later in this chapter, the Samples project includes some assets you can use in your own projects, saving you time and effort, especially at the start. We will import the .unitypackage file we just exported and then build the given SimpleAR scene as another test to verify that you're set up to build and run AR applications.

Creating a new project

If you already have a Unity project set up for AR development, as detailed in Chapter 1, Setting Up for AR Development, you can open it in Unity and skip this section. If not, perform the following steps, which have been streamlined for your convenience. If you require more details or explanations, please revisit Chapter 1, Setting Up for AR Development.

To create and set up a new Unity project with AR Foundation, Universal Render Pipeline, and the new Input System, here are the abbreviated steps:

  1. Create a new project by opening Unity Hub, selecting Projects | New, choosing Universal Render Pipeline, specifying a Project Name, such as MyARProject, and clicking Create.
  2. Open your project in the Unity Editor by selecting it from Unity Hub's Projects list.
  3. Set your target platform by going to File | Build Settings, choosing Android or iOS from the Platform list, and clicking Switch Platform.
  4. Set up the Player Settings according to Chapter 1, Setting Up for AR Development, and/or your device's documentation by going to the Edit | Project Settings | Player window. For example, Android ARCore does not support Vulkan graphics and requires Nougat (API Level 24) as a minimum.
  5. Install an XR plugin by going to Edit | Project Settings | XR Plug-in Management | Install XR Plugin Management. Then, check the checkbox for your device's Plug-in Provider.
  6. Install AR Foundation by going to Window | Package Manager, choosing Unity Registry from the filter list at the top left, searching for ar using the search input field, selecting the AR Foundation package, and clicking Install.
  7. Install the Input System package by going to Window | Package Manager, choosing Unity Registry from the filter list at the top left, searching for input using the search input field, selecting the Input System package, and clicking Install.

    When prompted to enable the input backend, you can say Yes, but we'll actually change this setting to Both in the next topic when we import the Sample assets into the project.

  8. Add the AR Background Renderer to the URP Forward renderer by locating the ForwardRenderer data asset, usually in the Assets/Settings/ folder. In its Inspector window, click Add Renderer Feature and select AR Background Renderer Feature.

You might want to bookmark these steps for future reference. Next, we'll import the Sample assets we exported from the AR Foundation Samples project.

Importing the Sample assets into your own project

Now that you have a Unity project set up for AR development, you can import the sample assets into your project. With your project open in Unity, perform the following steps:

  1. Import the package from the main menu by selecting Assets | Import Package | Custom Package.
  2. Locate the arf-samples.unitypackage file on your system and click Open.
  3. The Import Unity Package window will open. Click Import.
  4. If you created your project using the Universal Render Pipeline (as we did) or HDRP, rather than the built-in render pipeline, you need to convert the imported materials. Select Edit | Render Pipeline | URP | Upgrade Project Materials to URP Materials. Then, when prompted, click Proceed.
  5. Then, go to Player Settings using Edit | Project Settings | Player, select Configuration | Active Input Handling, and choose Both. Then, when prompted, click Apply.
    Note that we will use the new Input System for the projects in this book. However, some demo scenes in the Samples project use the old Input Manager. If you choose Input System Package (New) for Active Input Handling, those demo scenes may not run.

Hopefully, all the assets will import without any issues. However, there may be some errors while compiling the Samples scripts. This could happen if the Samples project is using a newer version of AR Foundation than your project and it is referencing API functions for features your project does not have installed. The simplest solution is to upgrade the version of AR Foundation to the same or later version as the Samples project. To do so, perform the following steps:

  1. To see error messages, open the Console window using its tab or selecting Window | General | Console.
  2. Suppose that, in my project, I have additional errors because I have installed AR Foundation 4.0.12 but the Samples project uses version 4.1.3 features, which are not available in my version. Here, I'll go to Window | Package Manager, select the AR Foundation package, click See Other Versions, select the 4.1.3 version, and then click the Update to 4.1.3 button.
  3. The project also might be using preview versions of packages. Enable preview packages by selecting Edit | Project Settings | Package Manager | Enable preview packages.
  4. Ensure the ARCore XR Plugin and/or ARKit XR Plugin version matches the version of the AR Foundation package the project is using.
  5. Another message you might see is that some Samples scripts require that you enable "unsafe" code in the project. Go to Project Settings | Player | Script Compilation | Allow 'unsafe' code and check the checkbox.

    This is not as threatening as it may sound. "Unsafe" code usually means a script uses pointers or calls into native (C/C++) code, which the C# compiler cannot verify for memory safety. Enabling unsafe code in Unity is usually not a problem unless you are publishing to a platform that restricts it, such as the web, which we are not.

Finally, you can verify your setup by building and running the SimpleAR scene, this time from your own project. Perform the following steps:

  1. Open the SimpleAR scene from the Project window by navigating to the ARF-samples/Scenes/SimpleAR/ folder and double-clicking the SimpleAR scene file.
  2. Open the Build Settings window by going to File | Build Settings.
  3. For the Scenes in Build list, click the Add Open Scenes button and uncheck all the scenes in the list other than the SimpleAR one.
  4. Ensure your device is connected via USB.
  5. Press the Build And Run button to build the project and install it on your device. It will prompt you for a location; I like to create a folder in my project root named Builds/. Give it a filename (if required) and press Save. It may take a while to complete this task.

The app should successfully build and run on your device. If you encounter any errors, please review each of the steps detailed in this chapter and Chapter 1, Setting Up for AR Development.

When the app launches, as described earlier, you should see a camera video feed on your screen. Move your phone slowly in different directions and closer/away. As it scans the environment, feature points and planes will be detected and rendered on the screen. Tap one of these planes to place a cube on the scene.

Your project is now ready for AR development!

Starting a new, basic AR scene

In this section, we'll create a scene very similar to SimpleAR (actually, more like the Samples scene named InputSystem_PlaceOnPlane) but we will start with a new empty scene. We'll add AR Session and AR Session Origin objects provided by AR Foundation to the scene hierarchy, and then add trackable feature managers for planes and point clouds. In the subsequent sections of this chapter, we'll set up an Input System action controller, write a C# script to handle any user interaction, and create a prefab 3D graphic to place in the scene.

So, start the new scene by performing the following steps:

  1. Create a new scene by going to File | New Scene.
  2. If prompted, choose the Basic (Built-in) template. Then, click Create.

    Unity allows you to use a Scene template when creating a new scene. The one named Basic (Built-in) is comparable to the default new scene in previous versions of Unity.

  3. Delete Main Camera from the Hierarchy window by using right-click | Delete (or the Del key on your keyboard).
  4. Add an AR Session by selecting GameObject from the main menu, then XR | AR Session.
  5. Add an AR Session Origin by selecting GameObject from the main menu, then XR | AR Session Origin.
  6. Unfold AR Session Origin and select its child; that is, AR Camera. In the Inspector window, use the Tag selector at the top left to set it as our MainCamera. (This is not required but it is a good practice to have one camera in the scene tagged as MainCamera.)
  7. Save the scene using File | Save As, navigate to the Assets/Scenes/ folder, name it BasicARScene, and click Save.

Your scene Hierarchy should now look as follows:

Figure 2.3 – Starting a scene Hierarchy

We can now take a closer look at the objects we just added, beginning with the AR Session object.

Using AR Session

The AR Session object is responsible for enabling and disabling augmented reality features on the target platform. When you select the AR Session object in your scene Hierarchy, you can see its components in the Inspector window, as shown in the following screenshot:

Figure 2.4 – The AR Session object's Inspector window

Each AR scene must include one (and only one) AR Session. It provides several options. Generally, you can leave these as their default values.

The Attempt Update option instructs the AR Session to try to install the underlying AR support software on the device if it is missing. This is not required for all devices. iOS, for example, does not require any additional updates if the device supports AR. On the other hand, to run AR apps on Android, the device must have the ARCore services installed. Most AR apps will install these for you if they are missing, and that is what the Attempt Update feature of AR Session does. If support is missing or needs an update when your app launches, AR Session will attempt to install Google Play Services for AR (see https://play.google.com/store/apps/details?id=com.google.ar.core). If the required software cannot be installed, then AR will not be available on the device. You could choose to disable automatic updates and implement them yourself to customize the user onboarding experience.
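
To illustrate what AR Session automates for you, here is a minimal sketch (a hypothetical component, following the availability-checking pattern from the AR Foundation manual):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportCheck : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the device whether AR is supported (this may take a moment).
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.NeedsInstall)
        {
            // This is what Attempt Update does: install or update the AR
            // support software (for example, Google Play Services for AR).
            yield return ARSession.Install();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not available on this device.");
        }
    }
}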

Note

The Match Frame Rate option in the Inspector window is obsolete. Ordinarily, you would want the frame updates of your apps to match the frame rate of the physical device, and generally, there is no need to tinker with this. If you need to tune it, you should control it via scripting (see https://docs.unity3d.com/ScriptReference/Application-targetFrameRate.html).
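
For example, a one-line sketch of controlling the frame rate from a script (60 is an arbitrary choice here):

void Start()
{
    // Request a target rendering rate of 60 frames per second.
    Application.targetFrameRate = 60;
}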

Regarding Tracking Mode, you will generally leave it set to Position and Rotation, as this specifies that your AR device tracks in physical 3D space using both its XYZ position and its rotation around each axis. This is referred to as 6DOF, for six-degrees-of-freedom tracking, and is probably the behavior that you expect. But for face tracking, for example, we'll set it to Rotation Only, as you'll see in Chapter 9, Selfies: Making Funny Faces.
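
If you ever do need to change the mode from a script, a minimal sketch (a hypothetical component, assumed to live on the AR Session object):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class RotationOnlyTracking : MonoBehaviour
{
    void Start()
    {
        // Request rotation-only (3DOF) tracking, as used for face tracking.
        GetComponent<ARSession>().requestedTrackingMode = TrackingMode.RotationOnly;
    }
}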

The AR Session GameObject also has an AR Input Manager component that manages our XR Input Subsystem for tracking the device's pose in a physical 3D space. It reads input from the AR Camera's AR Pose Driver (discussed shortly). There are no options for the component, but this is required for device tracking.

We also added an AR Session Origin GameObject to the Hierarchy. Let's look at that next.

Using AR Session Origin

The AR Session Origin will be the root object of all trackable objects. Having a root origin keeps the camera and any trackables in the same coordinate space, so their positions remain correct relative to each other. This session (or device) space includes the AR Camera and any trackable features that the AR software has detected in the real-world environment. Without it, detected features, such as planes, would not appear in the correct place relative to the camera.

Tip – Scaling Virtual Scenes in AR

If you plan to scale your AR scene, place your game objects as children of AR Session Origin and then scale the parent AR Session Origin transform rather than the child objects themselves. For example, consider a world-scale city map or game court resized to fit on a tabletop. Don't scale the individual objects in the scene; instead, scale everything by resizing the root session origin object. This ensures that the other Unity systems, especially physics and particles, retain their scale relative to the camera space. Otherwise, things such as gravity (calculated in meters per second squared) and particle rendering could behave incorrectly.
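
As a rough sketch of this idea (a hypothetical component and scale value):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TabletopScale : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;

    void Start()
    {
        // Scaling the origin by 20 makes all AR content appear at
        // one-twentieth of its authored size relative to the real world.
        sessionOrigin.transform.localScale = Vector3.one * 20f;
    }
}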

When you select the AR Session Origin object in your scene Hierarchy, you can see its components in the Inspector window, as shown in the following screenshot:

Figure 2.5 – The AR Session Origin object's Inspector window

At the time of writing, the default AR Session Origin object simply has an AR Session Origin component. We'll want to build out its behavior by adding more components in a moment.

The Session Origin's Camera property references its own child AR Camera GameObject, which we'll look at next.

Using the AR Camera

The AR Camera object is a child of AR Session Origin. Its Inspector window is shown in the following screenshot:

Figure 2.6 – The AR Camera object's Inspector window

During setup, we tagged the AR Camera as our MainCamera. This is not required, but it is a good practice to have one camera in the scene tagged as MainCamera; for example, for any code that uses Camera.main, which is a shortcut for finding the camera tagged MainCamera.

As its name implies, the AR Camera object includes a Camera component, required in all Unity scenes, which determines what objects to render on your screen. The AR one has mostly default values. The Near and Far Clipping planes have been adjusted for typical AR applications to (0.1, 20) meters. In AR apps, it's not unusual to place the device within inches of a virtual object, so we wouldn't want it to be clipped. Conversely, in an AR app, if you walk more than 20 meters away from an object that you've placed in the scene, you probably don't need it to be rendered at all.

Importantly, rather than using a Skybox, as you'd expect in non-AR scenes, the camera's Background is set to a solid black color, which is then replaced by the device camera's video feed. This is controlled by the AR Camera Background component of the AR Camera. In an advanced application, you can even customize how the video feed is rendered using a custom video material (a topic outside the scope of this book). Similarly, a wearable AR device requires a black camera background, but with no video feed, so your virtual 3D graphics are mixed atop the see-through view of the real world.

The video feed source is controlled using the AR Camera Manager component. You can see, for example, that Facing Direction can be changed from World to User for a selfie face tracking app (see Chapter 9, Selfies: Making Funny Faces).

The Light Estimation options are used when you want to emulate real-world lighting when rendering your virtual objects. We'll make use of this feature later in this chapter.

You also have the option to disable Auto Focus if you find that the camera feature is inappropriate for your AR application.

Tip – When to Disable Camera Auto Focus for AR

Ordinarily, I disable Auto Focus for AR applications. When the software uses the video feed to help detect planes and other features in the environment, it needs a clear, consistent, and detailed feed, not one that is continually refocusing. A continually changing focus makes it harder for the tracking algorithms to interpret the feed accurately. On the other hand, a selfie face tracking app may be fine with Auto Focus enabled; it could even improve the user experience when the area behind the user loses focus due to depth of field.
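
For reference, a minimal sketch of disabling auto focus from a script (a hypothetical component, assumed to be attached to the AR Camera, which carries the AR Camera Manager component):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FixedFocus : MonoBehaviour
{
    void Start()
    {
        // Request fixed focus; the platform may ignore the request.
        GetComponent<ARCameraManager>().autoFocusRequested = false;
    }
}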

The AR Pose Driver component is responsible for updating the AR Camera's transform as it tracks the device in the real world. (There are similar components for VR headsets and hand controllers, for instance.) This component relies on the XR plugin and the Input XR Subsystem to supply the positional tracking data (see https://docs.unity3d.com/Manual/XRPluginArchitecture.html).

Our next step is to add Plane and Point Cloud visualizers to the scene.

Adding Plane and Point Cloud managers

When your application runs, you'll ask the user to scan the room for the AR software to detect features in the environment, such as depth points and flat planes. Usually, you'll want to show these to the user as they're detected. We do this by adding the corresponding feature managers to the AR Session Origin game object. For example, to visualize planes, you'll add an AR Plane Manager to the AR Session Origin object, while to visualize point clouds, you'll add an AR Point Cloud Manager.

AR Foundation supports detecting and tracking the following features:

  • Anchor: A fixed pose (consisting of location and rotation) in the physical environment (controlled by the AR Anchor Manager component). This is also known as a Reference Point.
  • Reflection Probe: Environment reflection probes for rendering shiny surface materials (controlled by the AR Environment Probe Manager component).
  • Face: A human face detected by the AR device (controlled by the AR Face Manager component).
  • Human Body: A trackable human body and the body's skeleton (controlled by the AR Human Body Manager component).
  • Image: A 2D image detected and tracked in the environment (controlled by the AR Tracked Image Manager component).
  • Participant: Another user (device) in a collaborative session (controlled by the AR Participant Manager component).
  • Plane: A flat plane, usually horizontally or vertically inferred from the point cloud (controlled by the AR Plane Manager component).
  • Point Cloud: A set of depth points detected by the AR device (controlled by the AR Point Cloud Manager component).
  • Object: A 3D object detected and tracked in the environment (controlled by the AR Tracked Object Manager component).

Not all of these are supported on every platform. See the documentation for your current version of AR Foundation (for example, visit https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.1/manual/index.html#platform-support and select your version at the top left). We will be using many of these in various projects throughout this book. Here, we will use the Plane and Point Cloud trackables. Please perform the following steps to add them:

  1. Select the AR Session Origin object from the Hierarchy window.
  2. Add a Point Cloud Manager by selecting Add Component, searching for ar in the search input field, then clicking AR Point Cloud Manager.
  3. Add a Plane Manager by selecting Add Component, searching for ar in the search input field, and clicking AR Plane Manager.
  4. On the AR Plane Manager, change Detection Mode to only horizontal planes by selecting Nothing (to clear the list), then selecting Horizontal.
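
These managers do more than drive visualizers; each one raises events as its trackables are added, updated, and removed. As a minimal sketch (a hypothetical logging component, assumed to be on the same GameObject as the AR Plane Manager), you could subscribe to the plane manager's planesChanged event like this:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // The args also carry .updated and .removed lists.
        foreach (var plane in args.added)
            Debug.Log($"New plane detected: {plane.trackableId}");
    }
}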

You'll notice that the Point Cloud Manager has an empty slot for the Point Cloud Prefab visualizer and that the Plane Manager has an empty slot for the Plane Prefab visualizer. We'll use prefabs from the Samples project, as follows:

  1. In the Inspector window, go to AR Point Cloud Manager | Point Cloud Prefab and press the doughnut icon on the right-hand side of the field to open the Select GameObject dialog box.
  2. Click the Assets tab and double-click the AR Point Cloud Visualizer prefab.

    There are alternative point cloud visualizer prefabs you might like to try out also, such as AR Point Cloud Debug Visualizer and AllPointCloudPointsPrefab.

  3. Likewise, for AR Plane Manager | Plane Prefab, press the doughnut icon on the right-hand side of the field to open the Select GameObject dialog box.
  4. Click the Assets tab and double-click AR Feathered Plane.

    There are alternative plane visualizer prefabs to try out also, such as AR Plane Debug Visualizer, AR Feathered Plane Fade, and CheckeredPlane.

  5. Save the scene by going to File | Save.

We're using the visualizer prefabs we got from the Samples project. Later in this chapter, we'll talk more about prefabs, take a closer look at the visualizer ones, and learn how to edit them to make our own custom visualizers. First, we'll add the AR Raycast Manager to the scene.

Adding AR Raycast Manager

There's another component we're going to need soon: AR Raycast Manager. It will be used by our scripts to determine whether a user's screen touch corresponds to a 3D trackable feature detected by the AR software. We're going to use it in our script to place an object on a plane. Perform the following steps to add it to the scene:

  1. Select the AR Session Origin object from the Hierarchy window.
  2. Click Add Component, search for ar in the search input field, and click AR Raycast Manager.

The AR Session Origin GameObject with the manager components we added now looks like this in the Inspector window:

Figure 2.7 – AR Session Origin with various manager components

One more thing that's handy to include is light estimation, which helps with rendering your virtual objects more realistically.

Adding Light Estimation

By adding a Light Estimation component to your Directional Light source, the light can use estimates from the AR camera to match the scene's lighting more closely to the real-world environment (a simplified sketch of how this works follows the steps below).

To add light estimation, perform the following steps:

  1. In the Hierarchy window, select the Directional Light object.
  2. In the Inspector, click Add Component, search for light estimation, and add the Basic Light Estimation component.
  3. In the Hierarchy window, find AR Camera (child of AR Session Origin), drag it into the Inspector window, and drop it onto the Light Estimation | Camera Manager slot.
  4. In the Hierarchy window, select AR Camera, then set AR Camera Manager | Light Estimation to Everything. Note that not all platforms support all light estimation capabilities, but using the Everything flags will have them use all of the ones that are available at runtime.
  5. Save your work by going to File | Save.
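
Under the hood, the Basic Light Estimation component reads per-frame estimates supplied by the AR Camera Manager. Here is a simplified sketch of that pattern (a hypothetical component; the actual Samples script also applies the estimates to the light):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightEstimationReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()
    {
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable()
    {
        cameraManager.frameReceived -= OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Each estimate is nullable; platforms only fill in what they support.
        if (args.lightEstimation.averageBrightness.HasValue)
            Debug.Log($"Brightness: {args.lightEstimation.averageBrightness.Value}");
    }
}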

Good! I think we should try to build and run what we have done so far and make sure it's working.

Building and running the scene

Currently, the scene initializes an AR Session, enables the AR camera to scan the environment, detects points and horizontal planes, and renders these on the screen using visualizers. Let's build the scene and make sure it runs:

  1. Open the Build Settings window by going to File | Build Settings.
  2. For the Scenes in Build list, click the Add Open Scenes button and uncheck all the scenes in the list other than this current scene (mine is named BasicARScene).
  3. Ensure your device is connected to your computer via USB.
  4. Press the Build And Run button to build the project and install it on your device. It will prompt you for a location; I like to create a folder in my project root named Builds/. Give it a filename (if required) and press Save. It may take a while to complete this task.

The app should successfully build and run on your device. If you encounter any errors, please read the error messages carefully in the Console window. Then, review each of the setup steps detailed in this chapter and Chapter 1, Setting Up for AR Development.

When the app launches, you should see a video feed on your screen. Move the device slowly in different directions and closer/away. As it scans the environment, feature points and planes will be detected and rendered on the screen using the visualizers you chose.

Next, let's add the ability to tap on one of the planes to instantiate a 3D object there.

Placing an object on a plane

We will now add the ability for the user to tap on a plane and place a 3D virtual object in the scene. There are several parts to implementing this:

  • Setting up a Place Object input action when the user taps the screen.
  • Writing a PlaceObjectOnPlane script that responds to the input action and places an object on the plane.
  • Determining which plane and where to place the object using AR Raycast Manager.
  • Importing a 3D model and making it a prefab for placing in this scene.

Let's begin by creating an input action for a screen tap.

Setting up a PlaceObject input action

We are going to use the Unity Input System package for user input. If the Input System is new to you, the steps in this section may seem complicated, but only because of its great versatility.

The Input System lets you define Actions that separate the logical meaning of the input from the physical means of the input. Using named actions is more meaningful to the application and programmers.
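
To make this concrete, the same kind of action could be defined entirely in code. The following is only a sketch (in this chapter, we'll create ours with the editor tooling instead):

using UnityEngine.InputSystem;

public class CodeDefinedAction
{
    // A touch-position action equivalent to the PlaceObject action we
    // create in the editor below.
    public InputAction placeObject = new InputAction(
        name: "PlaceObject",
        type: InputActionType.Value,
        binding: "<Touchscreen>/primaryTouch/position",
        expectedControlType: "Vector2");
}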

Note – Input System Tutorial

For a more complete tutorial on using the Input System package, see https://learn.unity.com/project/using-the-input-system-in-unity.

Here, we will define a PlaceObject action that is bound to screen tap input data. We'll set this up now, and then use this input action in the next section to find the AR plane that was tapped and place a virtual object there.

Before we begin, I will assume you have already imported the Input System package via Package Manager and set Active Input Handling to Input System Package (or Both) in Player Settings. Now, follow these steps:

  1. In the Project window, create a new folder named Inputs using right-click | Create | Folder (or use the + button at the top left of the window). I put mine under my _App/ folder.
  2. Create an input action controller asset by right-clicking inside the Inputs folder, then selecting Create | Input Actions (or using the + button at the top left of the window). Rename it AR Input Actions.
  3. Click Edit Asset to open its editor window.
  4. In the leftmost Action Maps panel, click the + button and name the new map ARTouchActions.
  5. In the middle Actions panel, rename the default action to PlaceObject using right-click | Rename.
  6. In the right-hand side Properties panel, set Action Type to Value.
  7. Set its Control Type to Vector 2.
  8. In the middle Actions panel, click the child <No Binding> item to add a binding.
  9. In the right-hand side Properties panel, under Binding, using the Path select list, choose TouchScreen | Primary Touch | Position.
  10. At the top of the window, click Save Asset (unless the Auto-Save checkbox is checked).

With that, we've created a data asset named AR Input Actions that contains an action map named ARTouchActions, which has one action, PlaceObject, that detects a screen touch. It returns the touch position as a 2D vector (Vector2) with the X, Y values in pixel coordinates. The input action asset is shown in the following screenshot:

Figure 2.8 – Our AR Input Actions set up for screen taps

Now, we can add the input actions to the scene. This can be done via a Player Input component. For our AR scene, we'll add a Player Input component to the AR Session Origin, as follows:

  1. In the Hierarchy window, select the AR Session Origin object.
  2. In its Inspector window, click Add Component | Input | Player Input.
  3. From the Project window, drag the AR Input Actions asset from your Inputs/ folder into the Player Input | Actions slot in the Inspector window.
  4. Leave Behavior set to Send Messages.

    Information – Input System Behavior Types

    Unity and C# provide different ways for objects to signal other objects. The Player Input component lets you choose how you want input actions to be communicated, via its Behavior setting. The options are as follows:

    Send Messages: Will send action messages to any components on the same GameObject (https://docs.unity3d.com/ScriptReference/GameObject.SendMessage.html). As we'll see, your message handler must be named with the "On" prefix (for example, OnPlaceObject) and receives an InputValue argument (https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/api/UnityEngine.InputSystem.InputValue.html).

    Broadcast Messages: Like Send Messages, Broadcast Messages will send messages to components on this GameObject and all its children (https://docs.unity3d.com/ScriptReference/Component.BroadcastMessage.html).

    Invoke Unity Events: You can set event callback functions using the Inspector or in scripts (https://docs.unity3d.com/Manual/UnityEvents.html). The callback function receives an InputAction.CallbackContext argument (https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/api/UnityEngine.InputSystem.InputAction.CallbackContext.html).

    Invoke C# Events: You can set event listeners in scripts (https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/events/).

    To learn more about the Player Input component, see https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/api/UnityEngine.InputSystem.PlayerInput.html.

I've decided to use Send Messages here, so we'll need to write a script with an OnPlaceObject function, which we'll do next. But first, I'll provide a quick introduction to Unity C# programming.

Introducing Unity C# programming and the MonoBehaviour class

Writing C# scripts is an essential skill for every Unity developer. You don't need to be an expert programmer, but you cannot avoid writing some code to make your projects work. If you are new to coding, you can simply follow the instructions provided here, and over time, you'll get more comfortable and proficient. I also encourage you to go through some of the great beginner tutorials provided by Unity (https://learn.unity.com/) and others.

Given that, I will offer some brief explanations as we work through this section. But I'll assume that you have at least a basic understanding of C# language syntax, common programming vocabulary (for example, class, variable, and function), using an editor such as Visual Studio, and how to read error messages that may appear in your Console window due to typos or other common coding mistakes.

We're going to create a new script named PlaceObjectOnPlane. Then, we can attach this script as a component to a GameObject in the scene. It will then appear in the object's Inspector window. Let's begin by performing the following steps:

  1. In the Project window, locate your Scripts/ folder (mine is Assets/_App/Scripts/), right-click it, and select Create | C# Script.
  2. Name the file PlaceObjectOnPlane (no spaces or other special characters are allowed in the name, and it should start with a capital letter).

    This creates a new C# script with the .cs file extension (although you don't see the extension in the Project window).

  3. Double-click the PlaceObjectOnPlane file to open it in your code editor. By default, my system uses Microsoft Visual Studio.

As you can see in the following initial script content of the template, the PlaceObjectOnPlane.cs file declares a C# class, PlaceObjectOnPlane, with the same name as the .cs file (the names must match; otherwise, it will cause compile errors in Unity):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PlaceObjectOnPlane : MonoBehaviour
{
    // Start is called before the first frame update
    void Start()
    {
    }

    // Update is called once per frame
    void Update()
    {
    }
}

The first three lines in this script have a using directive, which declares an SDK library, or namespace, that will be used in the script. When a script references external symbols, the compiler needs to know where to find them. In this case, we're saying that we'll potentially be using standard .NET system libraries for managing sets of objects (collections). And here, we are using the UnityEngine API.

One of the symbols defined by UnityEngine is the MonoBehaviour class. You can see that our PlaceObjectOnPlane class is declared as a subclass of MonoBehaviour. (Beware its British spelling, "iour"). Scripts attached to a GameObject in your scene must be a subclass of MonoBehaviour, which provides a litany of features and services related to the GameObject it is attached to.

For one, MonoBehaviour provides hooks into the GameObject life cycle and the Unity game loop. When a GameObject is created at runtime, for example, its Start() function will automatically be called. This is a good place to add some initialization code.

The Unity game engine's main purpose is to render the current scene view every frame, perhaps 60 times per second or more. Each time the frame is updated, your Update() function will automatically be called. This is where you put any runtime code that needs to be run every frame. Try to keep the amount of work that's done in Update() to a minimum; otherwise, your app may feel slow and sluggish.

You can learn more about the MonoBehaviour class here: https://docs.unity3d.com/ScriptReference/MonoBehaviour.html. To get a complete picture of the GameObject and MonoBehaviour scripts' life cycles, take a look at this flowchart here: https://docs.unity3d.com/Manual/ExecutionOrder.html.

We can now write our script. Since this is the first script in this book, I'll present it slowly.

Writing the PlaceObjectOnPlane script

The purpose of the PlaceObjectOnPlane script is to place a virtual object on the AR plane when and where the user taps. We'll outline the logic first (in C#, any text after // on the same line is a comment):

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

public class PlaceObjectOnPlane : MonoBehaviour
{
    void OnPlaceObject(InputValue value)
    {
        // get the screen touch position
        // raycast from the touch position into the 3D scene looking for a plane
        // if the raycast hit a plane then
        //      get the hit point (pose) on the plane
        //      if this is the first time placing an object,
        //          instantiate the prefab at the hit position and rotation
        //      else
        //          change the position of the previously instantiated object
    }
}

As it turns out, this script doesn't need an Update function: Update is for per-frame work, and this script only responds to input events.

This script implements OnPlaceObject, which is called when the user taps the screen. As we mentioned previously, the Player Input component we added to the AR Session Origin uses the Send Messages behavior and thus expects our script to implement an OnPlaceObject handler for the PlaceObject action. It receives an InputValue. Notice that I also added the line using UnityEngine.InputSystem;, which provides the InputValue class.

First, we need to get the screen touch position from the input value we passed in. Add the following code, which declares and assigns it to the touchPosition local variable:

        // get the screen touch position
        Vector2 touchPosition = value.Get<Vector2>();

The next step is to figure out if the screen touch corresponds to a plane that was detected in the AR scene. AR Foundation provides a solution by using the AR Raycast Manager component that we added to the AR Session Origin GameObject earlier. We'll use it in our script now. Add these lines to the top of your script:

using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

Then, inside the OnPlaceObject function, add the following code:

        // raycast from the touch position into the 3D scene looking for a plane
        // if the raycast hit a plane then
        ARRaycastManager raycaster = GetComponent<ARRaycastManager>();
        List<ARRaycastHit> hits = new List<ARRaycastHit>();
        if (raycaster.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
        {
            //
        }

Firstly, we get a reference to the ARRaycastManager component, assigning it to raycaster. We declare and initialize a list of ARRaycastHit, which will be populated when the raycast finds something. Then, we call raycaster.Raycast(), passing in the screen's touchPosition, and a reference to the hits list. If it finds a plane, it'll return true and populate the hits list with details. The third argument instructs raycaster.Raycast on what kinds of trackables can be hit. In this case, PlaneWithinPolygon filters for 2D convex-shaped planes.
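
Because TrackableType is a flags enum, you can combine targets with a bitwise OR. For example, a hypothetical variation that also accepts feature point hits:

        if (raycaster.Raycast(touchPosition, hits,
                TrackableType.PlaneWithinPolygon | TrackableType.FeaturePoint))
        {
            // hits may now include feature points as well as planes
        }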

Information – For More Information on AR Raycasting

For more information on using ARRaycastManager, see https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.1/manual/raycast-manager.html.

For a list of trackable types you can pass in, see https://docs.unity3d.com/Packages/com.unity.xr.arsubsystems@4.1/api/UnityEngine.XR.ARSubsystems.TrackableType.html.

The code inside the if statement will only be executed if raycaster.Raycast returns true; that is, if the user had tapped a location on the screen that casts to a trackable plane in the scene. In that case, we must create a 3D GameObject there. In Unity, creating a new GameObject is referred to as instantiating the object. You can read more about it here: https://docs.unity3d.com/Manual/InstantiatingPrefabs.html.

First, let's declare a variable, placedPrefab, to hold a reference to the prefab that we want to instantiate on the selected plane. Using the [SerializeField] attribute permits the private field to be visible and settable in the Unity Inspector. We'll also declare a private variable, spawnedObject, that holds a reference to the instantiated object. Add the following code to the top of the class:

public class PlaceObjectOnPlane : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;
    GameObject spawnedObject;

Now, inside the if statement, we will instantiate a new object if this is the first time the user has tapped the screen, and then assign it to spawnedObject. If the object had already been spawned and the user taps the screen again, we'll move the object to the new location instead. Add the following highlighted code:

    public void OnPlaceObject(InputValue value)
    {
        // get the screen touch position
        Vector2 touchPosition = value.Get<Vector2>();

        // raycast from the touch position into the 3D scene looking for a plane
        // if the raycast hit a plane then
        ARRaycastManager raycaster = GetComponent<ARRaycastManager>();
        List<ARRaycastHit> hits = new List<ARRaycastHit>();
        if (raycaster.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
        {
            // get the hit point (pose) on the plane
            Pose hitPose = hits[0].pose;

            // if this is the first time placing an object,
            if (spawnedObject == null)
            {
                // instantiate the prefab at the hit position and rotation
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            }
            else
            {
                // change the position of the previously instantiated object
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            }
        }
    }

Raycast populates a list of hit points, as there could be multiple trackable planes in line where the user has tapped the screen. They're sorted closest to furthest, so in our case, we're only interested in the first one, hits[0]. From there, we get the point's Pose, a simple structure that includes 3D position and rotation values. These, in turn, are used when placing the object.

After that, save the script file.

Now, back in Unity, we'll attach our script as a component to AR Session Origin by performing the following steps:

  1. First, check the Console window (using the Console tab or Window | General | Console) and ensure there are no compile errors from the script. If there are, go back to your code editor and fix them.
  2. In the Hierarchy window, select the AR Session Origin object.
  3. In the Project window, drag the PlaceObjectOnPlane script into the Inspector window so that when you drop it, it is added as a new component.

    You'll notice that there is a Placed Prefab property in the component's Inspector window. This is the placedPrefab variable we declared in the script. Let's populate it with the red cube prefab provided by the Samples assets.

  4. In the Project window, navigate to the ARF-samples/Prefabs/ folder.
  5. Drag the AR Placed Cube prefab into the Inspector window, on the Place Object On Plane | Placed Prefab slot.
  6. Save the scene by going to File | Save.

Our script, as a component on the AR Session Origin GameObject, should now look as follows:

Figure 2.9 – PlaceObjectOnPlane as a component with its Placed Prefab slot populated

Let's try it! We're now ready to build and run the scene.

Building and running the scene

If you've built the scene before, in the previous section, you can go to File | Build And Run to start the process. Otherwise, perform the following steps to build and run the scene:

  1. Open the Build Settings window by going to File | Build Settings.
  2. For the Scenes in Build list, click the Add Open Scenes button and uncheck all the scenes in the list other than this one (mine is named BasicARScene).
  3. Ensure your device is connected via USB.
  4. Press the Build And Run button to build the project and install it on your device. It will prompt you for a location; I like to create a folder in my project root named Builds/. Give it a filename (if required) and press Save. It may take a while to complete this task.

The app should successfully build and run on your device. As usual, if you encounter any errors, please read the error messages carefully in the Console window. When the app launches, you should see a video feed on your screen. Move your device slowly in different directions and closer/away. As it scans the environment, feature points and planes will be detected and rendered on the screen. If you tap the screen on a tracked plane, the red cube should be placed at that location.

Refactoring your script

Refactoring is reworking a script to make the code cleaner, more readable, more organized, more efficient, or otherwise improved without changing its behavior or adding new features. We can now refactor our little script to make the following improvements:

  • Move initialization code that only needs to run once out of OnPlaceObject() into Start() (for example, initialize the raycaster variable).
  • Avoid allocating new memory on every call to OnPlaceObject(), which causes memory fragmentation and garbage collection (for example, initialize the hits list once as a class variable).

The modified script is shown in the following code block. The changed code is highlighted, beginning with the top part, which contains the new class variables and the Start() function:

public class PlaceObjectOnPlane : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;
    GameObject spawnedObject;
    ARRaycastManager raycaster;
    List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Start()
    {
        raycaster = GetComponent<ARRaycastManager>();
    }

Now, modify the OnPlaceObject function, as follows:

    public void OnPlaceObject(InputValue value)
    {
        // get the screen touch position
        Vector2 touchPosition = value.Get<Vector2>();

        // raycast from the touch position into the 3D scene looking for a plane
        // if the raycast hit a plane then
        // REMOVE NEXT TWO LINES
        // ARRaycastManager raycaster = GetComponent<ARRaycastManager>();
        // List<ARRaycastHit> hits = new List<ARRaycastHit>();
        if (raycaster.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
        {
            // ... the rest of the function is unchanged

Please save the script, then build and run it one more time to verify it still works.

Information – Public versus Private and Object Encapsulation

One of the driving principles of object-oriented programming is encapsulation, whereby an object keeps its internal variables and functions private, exposing only the properties (public variables) and methods (public functions) that other objects are intended to access. C# provides the public and private access modifiers for this purpose, and any class member not declared public is private by default. In Unity, a script component's public variables are also serialized and visible in the Inspector window, while private variables ordinarily are not. The [SerializeField] attribute makes a private variable visible and editable in the Inspector without exposing it to other scripts.
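For example, here is a minimal illustrative class (the class and field names are hypothetical, not part of our project's script):

using UnityEngine;

public class EncapsulationExample : MonoBehaviour
{
    // public: accessible from other scripts and shown in the Inspector
    public float speed = 1.0f;

    // private, but still editable in the Inspector, thanks to [SerializeField]
    [SerializeField] GameObject target;

    // private: hidden from other scripts and from the Inspector
    int counter;
}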

Congratulations! It's not necessarily a brilliant app, and it's modeled after the example scenes found in the Samples project, but you started from File | New Scene and built it up all on your own. Now, let's have some fun with it and find a 3D model that's more interesting than a plain red cube.

Creating a prefab for placing

The prefab object we've been placing on the planes in this chapter is the one named AR Placed Cube, which we imported from the AR Foundation Samples project. Let's find a different, more interesting model to use instead. In the process, we'll learn a bit more about GameObjects, Transforms, and prefabs.

Understanding GameObjects and Transforms

I think a good place to start is by taking a closer look at the AR Placed Cube prefab we've been using. Let's open it in the Editor by performing the following steps:

  1. In the Project window, navigate to the ARF-samples/Prefabs/ folder.
  2. Double-click the AR Placed Cube prefab.

We are now editing the prefab, as shown in the following screenshot (I have rearranged my windows differently from the default layout):

Figure 2.10 – Editing the AR Placed Cube prefab

The Scene window now shows the isolated prefab object, and the Hierarchy window is the hierarchy for just the prefab itself. At its root is an "empty" GameObject named AR Placed Cube; it has only one component – Transform, which is required of all GameObjects. Its Transform is reset to Position (0, 0, 0), Rotation (0, 0, 0), and Scale (1, 1, 1).

Beneath the AR Placed Cube is a child Cube object, as depicted in the preceding screenshot. This cube is scaled to (0.05, 0.05, 0.05). These units are in meters (0.05 meters is about 2 inches per side). And that's its size when it's placed in the physical environment with our app.

You'll also notice that the child Cube's X-Y-Z Position is (0, 0.025, 0), where Y in Unity is the up-axis. As 0.025 is half of 0.05, we've raised the cube half its height above the zero X-Z plane.

A Cube's origin is at its center, so the root AR Placed Cube's origin corresponds to the bottom of the child Cube. In other words, when we place this prefab in the scene, the cube's bottom face rests on the pose position determined by the raycast hit.

Parenting a model with an empty GameObject to normalize its scale and adjust its origin is a common pattern in Unity development.
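If you were to set up this kind of wrapper from a script rather than in the Editor, a minimal sketch might look like the following (ModelWrapper and its model field are hypothetical names, and the values mirror the AR Placed Cube above, assuming a model that is one unit in size):

using UnityEngine;

public class ModelWrapper : MonoBehaviour
{
    // hypothetical: assign the imported model's GameObject here
    public GameObject model;

    void Start()
    {
        // create an empty parent; its origin becomes the placement point
        GameObject wrapper = new GameObject("AR Placed Model");

        // parent the model, scale it to 0.05 meters per side, and raise it
        // half its height so its bottom rests on the wrapper's origin
        model.transform.SetParent(wrapper.transform);
        model.transform.localScale = new Vector3(0.05f, 0.05f, 0.05f);
        model.transform.localPosition = new Vector3(0f, 0.025f, 0f);
    }
}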

Now, let's find a different model for our app and normalize its Transform as we make it a prefab.

Finding a 3D model

To find a 3D model, feel free to search the internet for one you like. If you're a 3D artist, you may already have models of your own. You will want a relatively simple, low-poly model (that is, one without many polygons). Look for files in .FBX or .OBJ format, as they will import into Unity without conversion.

I found a model of a virus microbe on cgtrader.com here: https://www.cgtrader.com/free-3d-models/science/medical/microbe. It is a free download and royalty-free, has 960 polygons, and is available in FBX format. My file is named uploads_files_745381_Microbe.fbx.

Once you've found a file and downloaded it to your computer, perform the following steps to import it into Unity:

  1. In the Project window, create a folder named Models under your _App folder (this step is optional).
  2. Drag the model from your Windows File Explorer or macOS Finder into the Models folder to import it into the project. Alternatively, you can use the main menu by clicking Assets | Import New Asset.
  3. When you select the model in the Project window, you can review it in the Inspector window. While there, take a look at the many Import Settings. Generally, you can keep their default values.

Now, we'll make a prefab of the model and make sure it's been scaled to a usable size. I like to use a temporary Cube object to measure it:

  1. In the Project window, create a folder named Prefabs under your _App folder (this step is optional).
  2. Right-click inside the Prefabs folder, select Create | Prefab, and give it a name (I named mine Virus).
  3. Double-click the new prefab, or click its Open Prefab button in the Inspector window.
  4. For measurement purposes, add a temporary Cube by selecting GameObject | 3D Object | Cube from the main menu (or use the + button at the top left, or right-click directly in the Hierarchy window).
  5. Assuming you want your model to appear in the scene at the same size as the red cube we've been using, set the measuring cube's Scale to (0.05, 0.05, 0.05) and its Position to (0, 0.025, 0).
  6. Drag the 3D model you imported from your Project Models folder into the Hierarchy window as a child of the root object.
  7. Use the Scene edit toolbar and gizmos to scale and position your model so that it's about the same size and position as the Cube. For my model, these values work: Scale (0.05, 0.05, 0.05), Position (0, 0.04, 0), Rotation (0, 0, 0).
  8. Delete or disable the Cube. To disable it, select the Cube and uncheck the enable (active) checkbox at the top left of its Inspector window.
  9. Save the prefab by clicking the Save button at the top of the Scene window.

The model I found did not come with a material, so let's create one for it now. With the prefab we're working on still open for editing, perform the following additional steps:

  1. In the Project window, create a folder named Materials under your _App folder (this step is optional).
  2. Right-click inside the Materials folder, select Create | Material, and give it a name. I named mine Virus Material.
  3. Drag Virus Material onto the model object (uploads_files_745381_Microbe) in the Hierarchy window.
  4. With the microbe model selected in the Hierarchy window, you can modify its material in the Inspector window. For example, you can change its color by clicking the Base Map color chip and choosing a new one. I'll also make mine shinier by setting its Metallic Map value to 0.5.
  5. Again, Save your prefab.
  6. Exit back to scene editing by clicking the < button at the top left of the Hierarchy window.
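As an aside, the same material setup could also be done from a script at runtime. Here is a minimal sketch assuming the project uses URP's Lit shader (_BaseColor and _Metallic are that shader's property names; the class name and color value are just examples):

using UnityEngine;

public class VirusMaterialSetup : MonoBehaviour
{
    void Start()
    {
        // create a URP Lit material, then set its base color and metallic value
        Material virusMaterial = new Material(Shader.Find("Universal Render Pipeline/Lit"));
        virusMaterial.SetColor("_BaseColor", new Color(0.8f, 0.2f, 0.2f));
        virusMaterial.SetFloat("_Metallic", 0.5f);

        // assign the material to this object's renderer
        GetComponent<Renderer>().material = virusMaterial;
    }
}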

My prefab now looks like this while open for editing (I have rearranged my windows so that they're different from the default layout):

Figure 2.11 – Editing my Virus prefab

We're now ready to add this prefab to the scene. After, we will build and run the finished project.

Completing the scene

We now have our own prefab to place in the AR scene. Let's add it to the Place Object On Plane component, as follows:

  1. Ensure you've exited the prefab edit mode and are now editing BasicARScene.
  2. Select the AR Session Origin object in the Hierarchy window.
  3. From the Project window, drag your prefab (mine is _App/Prefabs/Virus) into the Inspector window, onto the Place Object On Plane | Placed Prefab slot.
  4. Save the scene with File | Save.
  5. Build and run the scene by going to File | Build And Run.

As shown in the following screenshot, I have infected my desk with a virus!

Figure 2.12 – Running the project shows a virus on my keyboard

There it is. You've successfully created an augmented reality scene that places a virtual 3D model in the real world. Perhaps you wouldn't have chosen a virus, but it's a sign of the times!

You're now ready to proceed with creating your own AR projects in Unity.

Summary

In this chapter, we examined the core structure of an augmented reality scene using AR Foundation. We started with the AR Foundation Samples project from Unity, built it to run on your device, and exported its assets into an asset package for reuse. Then, we imported these sample assets into our own project, took a closer look at the SimpleAR scene, and built that to run on your device.

Then, starting from a new empty scene, we built our own basic AR demo from scratch that lets the user place a virtual 3D object in the physical world environment. For this, we added AR Session and AR Session Origin game objects and added components for tracking and visualizing planes and point clouds. Next, we added user interaction, first by creating an Input Action controller that responds to screen touches, and then by writing a C# script to receive the OnPlaceObject action message. This function performs a raycast from the screen touch position to find a pose point on a trackable horizontal plane, and then instantiates an object on the plane at that location. We concluded this chapter by finding a 3D model on the internet, importing it into the project, creating a scaled prefab from the model, and using it as the virtual object placed into the scene. Several times along the way, we did a Build And Run of the project to verify that our work at that point ran as expected on the target device.

In the next chapter, we will look at tools and practices to facilitate developing and troubleshooting AR projects, which will help improve the developer workflow, before moving on to creating more complete projects in subsequent chapters.
