In Chapter 1, you left off running the sample scene prepackaged with VRTK. Although the VRTK demo scene is pretty nifty, it might not have fully captured what you imagined the immersive power of VR could create.
This chapter will reacquaint you with the Unity and VRTK setup process and show you how to move beyond blocks in a virtual space. We're going to set up a 3D, VR-enabled Unity project; connect the VRTK interface; link our head-mounted display to a virtual camera; and create an application that puts our user into a photorealistic, immersive space!
In this chapter, you will:

- Learn the meaning and importance of high dynamic range (HDR) images to VR design.
- Change the settings for a default Unity Skybox asset.
- Download and import HDR images into a Unity project.
- Place a virtual camera in a photorealistic 360-degree environment.
- Place and manipulate a 3D object within an HDRI Skybox.
Lights, Camera, Render!
I came to Unity and VR with the experience of a filmmaker. Movies and TV were much more familiar to me than 3D graphics or computer programming. However, it wasn’t long before I felt comfortable creating my own VR apps in Unity. The key to Unity’s accessibility to professional developers and novices alike, in my opinion, is its similarity to movies. You don’t need a master’s degree in fine arts to understand that movies and TV shows come about from pointing cameras and lights at subjects. With that intuitive understanding under your belt, you’re already well on your way to becoming a Unity developer.
By using the language and concepts of a medium so familiar to many of us, Unity makes it possible for almost anyone to get their first VR scene up and running in minutes. Here’s how simple the premise of Unity is: Every project starts with a scene. Every scene starts with a light and a camera. Instead of actors, we call the subjects of our scenes game objects. That’s it! Everything else is dressing on the mixed green salad.
Of course, things can get much more complicated very quickly if you’d like. But presumably you’re not here to learn how to make a AAA game for an Xbox in a weekend. Although Unity can indeed help you accomplish such a task, you and I are going to focus on VR applications beyond gaming. Yes, we will use elements and principles of game design in our exercises, but Unity and VRTK are so powerful in concert precisely because they make creating VR accessible to everyone, not just tech-savvy gamers or programming savants.
Not to worry, though, because by the end of this book you will have the knowledge you need to jump-start an even deeper dive into the promises of Unity as a game engine. If that’s not your aim, that’s fine, too! You’ll still reach the last page of this book with the skills you need to confidently prototype whatever VR experience you can imagine. That’s a lofty promise. I beg you to hold me to it!
Getting Started
To begin, first make sure you have completed the ‘getting started’ guide and are able to run the VRTK demo scene. When you are comfortable with the steps in that process, proceed.
Open a new 3D project in Unity. If you have not already, and you intend to test your application on a VR headset of your own, download the necessary files as per the instructions from your headset provider. Most popular headset manufacturers have a section of their web site where they explain how to integrate their tools into Unity. Make sure you have gone through the setup requirements for your device of choice. For example, creating scenes for the Oculus platform might require you to download the Oculus features from the Unity Package Manager, or the Oculus Integration package from the Unity Asset Store. Similarly, developing a scene for SteamVR might require you to install the necessary OpenVR files. Refer to the documentation on the web site of your headset provider for platform-specific requirements. If you prefer to develop and play-test your VR scenes without the use of a head-mounted display, remember that VRTK conveniently provides a simulated camera rig that allows us to test our VR project without a headset connected to our computers.
To set up our first exercise of this chapter, I’ll review the procedure for creating a VR project in Unity and importing the VRTK files through your operating system’s Git Bash. You can find a more detailed, illustrated description in Chapter 1.
Review: Setting Up a VR Scene in Unity and Importing VRTK
- 1. Create a new 3D Unity project and navigate to Edit ➤ Project Settings ➤ Player. Under XR, select the Virtual Reality Supported check box. If the headset on which you plan to test your scene is not already listed, click the + icon and select the headset system of your choice.
- 2. Next, in File Explorer or Finder, navigate to and open the Assets folder of your project. Make sure you have installed Git on your computer. If you have, right-click in the Assets folder and select Open Git Bash Here. A terminal or command prompt window will open. Copy and paste the following text:
git clone --recurse-submodules https://github.com/ExtendRealityLtd/VRTK.git
- 3. Press Enter and navigate to the newly created VRTK directory by typing:
cd VRTK
- 4. Press Enter. You will see a progress bar countdown in your terminal window. Your computer is downloading the files required to run VRTK in your Unity application.
- 5. When the download is complete, return to Unity. As soon as you are back in Unity, another progress bar should appear. This progress bar tells you that Unity is processing the files you downloaded from Git. Click OK on the prompt that appears.
- 6. Because VRTK and the most recent 2019 edition of Unity are both still relatively new, there's one more step we need to complete. Navigate to Window ➤ Package Manager. On the bottom left of the Package Manager window, click the XR Legacy Input Helpers entry. On the bottom right, click Install. This action downloads the remaining files required to facilitate the connection between Unity and VRTK.
Why Is My UnityXRCameraRig Game Object Blue?
In the Scene Hierarchy, you might have noticed that the UnityXRCameraRig is blue, and the Main Camera object is not (Figure 2-3). Blue game objects indicate the object is an instance of a prefab.
In Unity a prefab is a game object that has been designed independent of the scene to which you apply it. If you create a custom game object consisting of its own unique materials, textures, and child game objects, for example, you can drag the parent object into the Assets folder in your Project window to save it as a prefab. A prefab acts as a template from which you can create copies of a game object to populate your scenes.
The UnityXRCameraRig object is a prefab created by the makers of VRTK. Unlike the Main Camera object that appears in a default scene, the UnityXRCameraRig prefab is not a native Unity game object.
Let’s give our scene a test! Press the play button at the top of the Unity application. Make sure the Unity Camera Rig button is toggled in the top right corner of your game screen, if you plan to use your own head-mounted display. If not, select the simulated camera rig.
If an error about audio listeners appears in your Console window, it means more than one game object with an audio listener is active in the Scene Hierarchy. Each camera game object has an audio listener component attached, and hosting more than one active camera in a scene prompts Unity to warn you that only one audio listener can exist in a scene. We can prevent this message from appearing in our console by hosting only one camera object in the Scene Hierarchy.

In your Scene Hierarchy, delete the default main camera provided by Unity. You can either select the object and press Delete, or right-click/Ctrl+click the object and select Delete from the shortcut menu that appears. It is okay to keep both the UnityXRCameraRig and the SimulatedCameraRig in your hierarchy; however, only one camera object can be toggled on in the scene at a time. If you plan to test your scene using the simulated camera, toggle off the UnityXRCameraRig. If you plan to test your scene with the UnityXRCameraRig attached to a headset of your choice, toggle off the SimulatedCameraRig.
Enabling the Clear on Play feature for your Console window will clear your Console window of errors and warnings every time you stop and restart your application for testing. Of course, the errors and warnings will persist on each restart of the scene if you leave them unaddressed.
Run your scene again. If no errors regarding duplicate audio listeners appear in your Console window, then you’re ready to move on to play-testing your app. Press the Play triangle at the top of the Unity application window to start your scene. Press the Stop button when you are finished.
Congratulations, you did it! You integrated a camera into your VR scene. Now, let’s take this exercise one step further to introduce you to the tools you need to create an experience of your own.
Because this book is meant to introduce you to VR experiences beyond games, each exercise focuses on a single use-case that might have an application in your daily life. Each exercise addresses a scenario in a particular industry to demonstrate the versatility of VR. It is the aim of these exercises to provide you with the knowledge of Unity and VRTK to quickly prototype whatever immersive experience you can imagine.
Exercise 1: GlobeHopper
In this exercise, we are going to address the following use-case.
Priyanka just returned from a vacation to the U.S. Southwest, where she took a 360-degree panoramic photograph on her camera phone. She was so enamored with the scenery she hopes to share it with her friends. Unfortunately, seeing the photo on a phone doesn’t capture the emotion she felt standing on the peak in the desert. Let’s pretend we’re Priyanka and that we want to create a 360-degree immersive experience from the photo we took on our trip.
The Panorama
As with preparing a delicious meal, the success of our immersive experience depends almost entirely on the quality of our ingredients. Some camera phones allow users to capture a 360-degree panoramic photo natively, whereas others require additional hardware, software, or both. My phone, for example, only allows me to capture 180-degree panoramic images. Of course, in VR, where our canvas is completely immersive, presenting a user with only half of a 360-degree environment isn’t a successful recipe for good VR content.
I have included three different 360-degree images in the Assets folder for this project, which you can download by following the link at http://www.apress.com/source-code. You are free to use these images for this exercise, but if you plan to use them for your own purposes, please review the licensing terms provided by the original artists.
The 360-Degree Image vs. the HDRI
A 360-degree image simply describes the orientation of our photo asset. What truly makes a 360-degree image useful for inclusion in our GlobeHopper application is its quality. In our current context, we define the quality of an image as the amount of information an image file contains to help the Unity Engine render a surrounding as similar to reality as possible. We achieve this high quality through the use of high dynamic range images (HDRIs).
The dynamic range of an image measures the span between the darkest and brightest values it can represent. Digital cameras create HDRIs by snapping several frames of a subject at different exposures, a technique called bracketing. When your parents took a photograph with an analog, manual camera, for example, they set a single exposure level for the subject based on the available light. Today, digital cameras, some by default, capture several photos at exposure levels ranging from darkest to brightest at a single press of your finger. Although these files are larger than traditional JPEG images, the color information they pack is essential to creating the illusion of reality required for an immersive experience.
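To make bracketing concrete, here is a minimal Python sketch of how frames shot at different exposure times can be merged into a single high-dynamic-range estimate. This is an illustration on my part, not code from Unity or any camera's firmware; the tent-shaped weighting function and the sample pixel values are assumptions for demonstration.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge bracketed frames (pixel values in [0, 1]) into one HDR
    radiance estimate using an exposure-weighted average."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        # Trust mid-range pixels most; near-black and near-white pixels
        # are clipped or noisy and carry little reliable information.
        weight = 1.0 - np.abs(2.0 * img - 1.0)
        num += weight * (img / t)  # divide by exposure time to recover radiance
        den += weight
    return num / (den + 1e-6)

# Three hypothetical shots of the same scene at different exposure times.
dark   = np.array([0.05, 0.10, 0.20])  # underexposed, 1/4 second
mid    = np.array([0.20, 0.40, 0.80])  # normal, 1 second
bright = np.array([0.80, 0.95, 1.00])  # overexposed, 4 seconds
hdr = merge_exposures([dark, mid, bright], [0.25, 1.0, 4.0])
print(hdr)  # one consistent radiance estimate per pixel
```

Notice that each bracketed frame individually crushes or clips some pixels, yet the weighted merge recovers a consistent brightness value for every pixel. That recovered information is exactly what an HDRI carries over a single JPEG.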
The steps in this exercise assume you are working with the images provided in the Assets folder for this book. If you are using an original image, please follow the instructions from your camera manufacturer on how to create HDRIs. The image we import into our Unity project as an asset will be 4096 × 2048 pixels, a size some refer to as 4K resolution. As of this writing, 4096 × 2048 is the largest resolution Unity accepts for skybox images. The Unity best practices guide suggests using an image no larger than 2048 × 2048 because that is the maximum size handled by many graphics cards.
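A full 360 × 180-degree equirectangular panorama always has a 2:1 aspect ratio, which is why the asset above is 4096 × 2048. As a quick sanity check for panoramas from your own phone, that relationship can be tested in a couple of lines of Python (the helper name is my own, not a Unity API):

```python
def is_full_equirect(width, height):
    """A full 360 x 180-degree equirectangular panorama has a 2:1
    aspect ratio; anything narrower covers less than a full sphere."""
    return width == 2 * height

print(is_full_equirect(4096, 2048))  # the book's asset: True
print(is_full_equirect(2048, 2048))  # a square 180-degree capture: False
```

If your phone's panorama fails this check, it likely covers only part of the sphere and will leave gaps when used as a Skybox.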
The Cubemap
Let’s start from the very top, as if we’re going to make a recipe from scratch, so we can reinforce what we’ve already covered about integrating VRTK with our Unity project.
Create a new Unity 3D project. Name the project GlobeHopper and save it in a file destination of your choice. It is important that you are able to navigate to the saved destination of your project later in this exercise so note the file path if you must.
Now that we have our 3D project created, let’s make it interactive by importing VRTK. Navigate to the Assets folder of your GlobeHopper project in your File Explorer or Finder. Double-click the Assets folder to make sure you are within the directory GlobeHopper/Assets. If you see a folder labeled Scenes, you are in the right place.
After you’re sure you’ve installed the Git client on your computer and have navigated to the inside of your GlobeHopper Assets folder, right-click in File Explorer and select Open Git Bash Here.
For future reference, you can always find the steps to import VRTK at the VRTK GitHub page: https://github.com/ExtendRealityLtd/VRTK.
Return to Unity and wait as Unity compiles the VRTK scripts you just cloned from GitHub.
If a pop-up menu appears that says “The VRTK Example Scene requires additional Unity Input Axes to be defined,” then click the Add Input Mappings button at the bottom of the menu. If you did not install the VRTK input mappings at this point, it’s okay. We review them in more detail in Chapter 7.
If you haven’t already, also be sure to update the Legacy XR Inputs in the Unity Package Manager to make sure VRTK is fully compatible with your current version of Unity.
Navigate to Edit ➤ Project Settings ➤ Player. Under XR Settings, select the Virtual Reality Supported check box. If you do not see your Virtual Reality SDK listed, click the + button on the bottom right of the Virtual Reality SDK menu and select the device for which you intend to build your experience. If the SDK for your headset does not appear in the list, visit your headset manufacturer's web site for specific information on how to connect your headset's SDK to Unity.

If you continue to have trouble finding your headset listed as an option, confirm the build settings of your project (File ➤ Build Settings, or Ctrl+Shift+B) are set to PC, Mac & Linux Standalone; Universal Windows Platform; or even Android if targeting mobile VR devices like the Oculus Go. If you are targeting Universal Windows Platform in your Build Settings, Unity might only give you the option of IL2CPP as your scripting back end, which is fine. Check the Build Settings section of the Unity documentation for further reference.

When this step is complete, you can close the Player Settings window. You might see some yellow exclamation marks in your Console window. As long as none of them are red, you can click Clear at the top left of your Console window and proceed.
Now, we drag our UnityXRCameraRig object from our Project window into our Scene Hierarchy and delete our default camera object. As always, if you do not have a headset mounted to your computer you can select VRTK’s SimulatedCameraRig instead of the UnityXRCameraRig prefab.
Finally, press the Play triangle at the top of your Unity project. If you are using an external headset, move it around to verify that the Unity Game Window tracks your headset’s movement. If you are using the VRTK simulated camera rig, you can confirm your setup works by following the onscreen instructions for manipulating the viewfinder on your virtual camera.
In the Project window of your GlobeHopper project, right-click the Assets folder and select Create ➤ Folder. Let's name this new folder Images. Double-click the Images folder to open it. If you've downloaded the assets from the repo associated with this exercise, navigate to the location of the rocky_dawn_4k.hdr file. Drag and drop rocky_dawn_4k.hdr into the Images folder you created in your Unity project. If you downloaded HDRI content from the Unity Asset Store, the file extensions of your assets might be *.exr and *.mat. Although the file formats are different, the data they represent are the same.
Uh-oh. What happened? Nothing? Good! When you dragged your Rocky Dawn image into the Lighting window, Unity gave you the “not allowed” symbol of a circle with a line through it. What gives?
Well, remember when we set the Texture property of our Rocky Dawn image to a Cube? Unity, as a 3D engine, uses a graphics pipeline common to all 3D asset programs, and in that pipeline there are three unique (although interdependent) categories that inform the way an object looks in 3D space. One is the Texture property, which helps us render the difference between surfaces such as sand and brick, for example. Another is the Shader property, which primarily defines the way our objects interact with light in our virtual space. Different shaders can transform the same object from a highly reflective mirror to a muddy, matte puddle of sludge. The third property is an object's Material. Personally, I find it helpful to imagine an object's Material as the fabric pulled over its chicken wire skeleton, called its mesh. It is the material to which we add a texture to transform a cylinder from an aluminum pipe into a bratwurst. We'll play much more with these parameters in later exercises. For now, though, let's create a Material to replace our current default Skybox material, a new material onto which we will project our Rocky Dawn texture asset.
Let’s repeat the process we performed to create a folder for our images to create a folder called Materials in our project’s Asset folder. Navigate into the Materials folder you created, right-click in the empty folder space of your Project window, and select Create ➤ Material. Let’s name our new material mySkybox. After renaming your new Material asset, select it and turn your attention to the Inspector window to view its properties. The very first property listed in the Inspector is labeled Shader. Unity defaults to its Standard shader, but let’s see if we can find something more specific.
Close the Lighting settings window. If you’re play-testing on an external headset make sure the UnityXR Camera Rig is the only camera in your Scene Hierarchy on the top-left of the Unity Default Layout. Further, verify that the check box to the left of the UnityXR Camera Rig game object is selected in the Inspector when you highlight the UnityXR Camera Rig in the hierarchy. If you’re using the VRTK SimulatedCameraRig, then make sure it’s the only camera game object activated in your Scene Hierarchy. Ready? Press Play and enter your immersive, photorealistic world.
Welcome back! How did it go? Oh no, it’s not right? I’m sorry for leading you astray (not really). We’ll iron out all the wrinkles right now, I promise.
Confirm your camera settings are to your preference in your Scene Hierarchy, and press Play.
Texture mapping is one of the many tools Unity offers us as a 3D game engine to quickly create photorealistic, immersive experiences. It took humans until the Renaissance to master the complicated mathematics that create the illusion of not only depth, but also curved space. Unity does the same for us in a matter of seconds and a few clicks of a mouse. For further practice, try swapping out our desert setting at dawn for the other HDRIs included in the GlobeHopper repo.
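If you are curious about the kind of math the engine performs when it wraps an equirectangular texture around your point of view, here is a rough Python sketch of the core mapping from a normalized image coordinate to a 3D view direction. The axis conventions here are an illustrative assumption and do not necessarily match Unity's internals.

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coordinates (u, v) in [0, 1]
    to a unit direction vector (x, y, z). u wraps around the horizon
    (longitude); v runs from the bottom pole to the top (latitude)."""
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. pi around the horizon
    lat = (v - 0.5) * math.pi         # -pi/2 .. pi/2 pole to pole
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center of the image faces straight ahead:
print(equirect_to_direction(0.5, 0.5))  # -> (0.0, 0.0, 1.0)
```

Every pixel of the flat 2:1 image is assigned a direction on the sphere this way, which is why a single photograph can surround the camera without visible seams.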
Summary
A convenient way to imagine the makeup of a default Unity scene is through the language of movie-making. Each Unity scene begins with a camera, a light, and a background, called a Skybox. By importing the VRTK UnityXRCameraRig prefab or the SimulatedCameraRig prefab into our scene, we gain the ability to connect the virtual camera in our scene to our user's movements.
Because of its intuitive layout and drag-and-drop features, Unity makes it easy for developers to quickly jump into manipulating the appearance of our scenes. A quick, straightforward process like changing the Skybox material of a Unity scene can transform a mundane experience into something novel.
In this chapter you learned:

- How to import a virtual camera into a new Unity scene
- How to toggle the state of a virtual camera object in the Inspector window
- The definition of HDRIs and their value to immersive experiences
- The importance of the Mapping parameter on an HDRI texture to the creation of an original Skybox
- How to transform an HDRI from a 2D texture into a cubemap through the image's Texture Shape property
- That textures like HDRIs must exist on a Material object for Unity to recognize them as a potential Skybox
- How to access the Lighting Settings window
- How to swap the Skybox material for a scene
Before we move on to learning about creating truly interactive experiences with Unity and the VRTK, let’s try one more thing in our GlobeHopper app to whet our appetite for things to come. As is, our GlobeHopper project doesn’t feel very dynamic. The only thing that moves in the scene is our user’s point of view. The next chapter introduces the concepts of game objects, components, and scripting in Unity. Together they provide a rich foundation from which we can create immersive experiences that combine the real with the virtual.