What a mouthful, huh? One-dimensional axis actions. It sounds more complicated than it is. Hopefully, by the end of this chapter you will agree.
So far in this book we’ve introduced virtual cameras, 3D objects, tracking and spatial anchors, interactors, and button events. Together, those things can make up a valid virtual experience. However, one element integral to VR—so much so that it’s in the name of its controllers—remains undiscussed: touch.
In the context of the human mind—the medium through which you and I interact every day—touch is the sensation we feel through our skin. The world outside us, reality, stimulates our nerves, creating the pressure of touch, the weight of touch, the resistance of touch, and the warmth of touch. As of this writing, we are not yet able to re-create the psychophysical sensation of touch through a consumer-grade VR system. If the sensation of touch isn’t even within the boundaries of achievable VR, what then do we mean when we discuss touch?
In this chapter, we will explore:
The role touch plays in VR immersive design.
The purpose of the VRTK 1D Axis Action component.
The value of lighting design to VR scenes.
The different kinds of lighting options provided by Unity.
How to navigate through a scene in the Unity Editor using the 3D viewport and Unity hot-keys.
How to connect VRTK’s Input Mappings to VR controllers and Unity Actions.
The Psychology of Touch
One practice designed to treat phantom limb syndrome leverages the patients’ way of thinking to ease their discomfort. For example, to treat patients experiencing phantom limb pain in a left hand they lost in a motor vehicle accident, the exercise requires them to place both arms inside opaque containers. The container for the left hand has an opening through which the patients can see. What the patients see, however, is not actually there. What they see is a mirror image of their right hand broadcast through a video screen, for example. What they see appears to be the return of their left hand. As the patients move their right hand, their “left” hand moves in concert. In a short period of time, the patients’ mental models of their bodies conform to their perceived reality. Once the patients’ mindset shifts toward the presence of the phantom limb, therapists might lead the patients through rehabilitation exercises to reduce the presence of pain. Although physicians don’t fully understand the mechanisms in the brain responsible for the patients’ shift in perception, there are still lessons from the treatment relevant to our decision making as immersive experience developers.
Body and Mind
Other mixed reality systems, such as the HoloLens 2, forgo representation completely. Inside-out tracking, which relocates infrared sensors from external towers into the headset itself, provides the ability to track a user’s hands without controllers. It likely won’t be long until both MR and VR headsets interpret the presence of a user’s body without controllers at all. Unconscious association of the self with a virtual avatar might become more seamless and more natural over time.
This is the next level of media to which VR evangelists refer when they speak of this great, new technology. The most powerful, most useful feature of VR is the synthesis it catalyzes between the outer and inner self. The moment of transcendence, the moment that clicks within a virtual experience when your body leaves your mind—that is the wonder of VR. It is also the foundation of what we explore in this chapter.
The Unified Whole
If the core of VR is immersion, then its antithesis is disruption. Feedback determines the quality of an immersive experience. Does what the users see conform to their expectations? We can orchestrate these expectations through visual and audio cues in our experiences. For example, a highly stylized environment might allow a user to loosen their expectations of realistic physics. A cartoonish, modularly designed, voxel-based world, however, might not prepare a user for a gritty, violent, dramatic experience. At a more fundamental level, however, we find the success of immersion within the synchrony of the brain.
Enter the VRTK Axis Action
Unity Axis Actions tie into the Unity Input Manager and emit events when a defined Input axis changes value. There are two kinds of Unity Axis Actions:
Unity 1D Axis: listens for changes on a single axis and emits a float value for the axis changes.
Unity 2D Axis: listens for changes on two axes and emits a Vector2 value combining both axis changes.
A Unity Axis Action is derived from a Zinnia.Unity Action and therefore can be injected into any VRTK prefab that requires varying float data (e.g., touchpad movement).
VRTK will ask via a popup window on first load of the Unity project whether to attempt to auto-create some Unity axis input mappings. If this is accepted, default axis data for the left and right controller touchpad/thumbstick, grip, and trigger axes will be created.
The most important information from the VRTK documentation regarding Unity Axis Actions explains how the VRTK framework can help us introduce touch controller action into our scenes beyond simply acknowledging when a user presses a button. Triggers and joysticks are common input features on many VR touch controllers. Because these input systems emit float (numeric decimal) values rather than simple on/off states, they are powerful tools in our toolkit for the creation of immersive VR experiences.
The Unity Input Manager exists in every Unity project in the Edit ➤ Project Settings window from the menu bar. It maps the relationships between user inputs and their impact on our applications. The Input Manager contains a list of axes and buttons available for us to connect events to in our application. The Unity Input Manager can overwhelm developers new to Unity design. Changes to its settings can have far-reaching impact (both intended and unintended) on the development of our experiences. By tying into the Unity Input Manager, the VRTK framework’s Unity Axis Action components streamline the development process for prototyping user interaction within our VR experiences.
As with our Unity Button Action and Unity Boolean Action components, VRTK’s Unity Axis Action component uses the TrackedAlias VRTK game object to accept input from our VR touch controllers to produce a change in our program. The only distinction between a Unity Axis Action and its Button and Boolean Action siblings is that it tracks a gradient of input through numbers, not just an on/off state. For a simple analogy, let’s look at an automobile. The ignition in our car, fundamentally, has two states: on and off. Our accelerator, however, allows us to change speed across a continuum of values. The function of a car’s accelerator is like the function of a Unity Axis Action component through VRTK.
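To make the ignition/accelerator analogy concrete, here is a minimal, hypothetical sketch contrasting a boolean button read with a continuous axis read through Unity’s legacy Input Manager. The input names "Fire1" and "RightTrigger" are assumptions for illustration, not mappings defined by this book; an axis name must exist in Edit ➤ Project Settings ➤ Input before Input.GetAxis can read it.

```csharp
using UnityEngine;

// Hypothetical example: a button yields only true/false (the ignition),
// while an axis yields a continuous float (the accelerator).
public class InputComparison : MonoBehaviour
{
    void Update()
    {
        // Button: two states, like a car's ignition.
        bool firePressed = Input.GetButton("Fire1");

        // Axis: a continuum of values, like a car's accelerator.
        // "RightTrigger" is an assumed axis name defined in the
        // Unity Input Manager.
        float triggerValue = Input.GetAxis("RightTrigger");

        Debug.Log($"Button: {firePressed}, Trigger: {triggerValue}");
    }
}
```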
To see the purpose a Unity Axis Action component can serve in our VR apps, let’s try our hand at a prototype.
Exercise: Location Scouting
Annie Svenstrup has a passion for theater. Rather than in front of the lights, however, her place is behind them. She works as a dental hygienist, but she devotes her free time to the community theater in the town where she lives. One day, Elgin Sommers, the eccentric director of this season’s big production, approaches Annie to help him execute his vision for the play’s stage design. He’d like to build the facade of an abandoned, decrepit inn. Before he commits the theater’s budget to building the set, however, he’d like Annie to help him visualize the set’s lighting to convince the donors to cut the big check.
In this exercise, as Annie, we will import a 3D model of Elgin’s set. We will place virtual lights into the scene to illuminate the set. Finally, we will connect user input to the control of the lights so Elgin can impress the theater’s board of directors with the veracity of his vision.
The model we will be using in this exercise is available from the project GitHub repo, labeled by chapter. The model is called “Team America 2004 Miniature (photogrammetry)” by Austin Beaulier. You can find the original and more of Austin’s work on their Sketchfab profile at https://sketchfab.com/Austin.Beaulier.
I’m using the model under the Creative Commons licensing agreement (CC BY 4.0). If you plan to incorporate Austin’s work into your own immersive experiences, please follow the guidelines of attribution documented at https://creativecommons.org/licenses/by/4.0/.
It might be of interest to you to know that scenarios like the one I have described are, in fact, taking place in the commercial world. In 2017, Cirque du Soleil used the Microsoft HoloLens to help design sets for future performances. In 2018, hip-hop artist Childish Gambino and his team used VR and the HTC Vive to prototype “Pharos,” an immersive concert experience that won its creators a prestigious award from the Visual Effects Society. Although these productions are enormous endeavors with huge teams, the tools used by their creators to grow a vision from the seeds of imagination are available to you and me.
Before we get to work with VRTK’s Axis Action component, however, let’s create a scene the old-fashioned way to get our feet wet with the fundamentals.
Step 1: Project Setup
An easy reference for getting a Unity VR project set up with VRTK is available at https://github.com/ExtendRealityLtd/VRTK#getting-started.
Create a new VR-supported, 3D project in Unity. If you are using Unity 2019, navigate to the Window ➤ Package Manager console and install the XR Legacy Input Helpers package.
Here, we follow the now familiar step of cloning the VRTK GitHub repository into our Unity Project’s Asset folder. Use the following procedure from the VRTK Getting Started documentation to clone the repo.
1. Navigate to the project Assets/ directory.
2. Git clone with required submodules into the Assets/ directory, then change to the newly cloned directory and initialize the submodules:
git clone --recurse-submodules https://github.com/ExtendRealityLtd/VRTK.git
cd VRTK/
git submodule init && git submodule update
3. The Unity software will now import and compile the new files.
Before we drag the UnityXRCameraRig and TrackedAlias objects into our scene from the VRTK prefabs folder in our Project window, let’s create a demo of our scene in simple 3D.
Transform.Position(x,y,z): -0.74, 1, -10
Transform.Rotation(x,y,z): 0, 0, 0
Transform.Scale(x,y,z): 1, 1, 1
Transform.Position(x,y,z): 2.05, -0.02, 2.68
Transform.Rotation(x,y,z): 23, -90, 0
Transform.Scale(x,y,z): 2, 2, 2
Transform.Position(x,y,z): 1.02, -0.9, 1.52
Transform.Rotation(x,y,z): 0, -90, 0
Transform.Scale(x,y,z): 5, 1, 5
Alternatively, you can use whatever model you prefer to set your scene or simply construct your own building facade using object primitives (cubes, planes, cylinders) from within the Unity Editor. The purpose of this exercise is to learn the process of placing lights in a Unity scene and controlling the appearance of lighting through user input. As long as you follow along with the placement of lights and the scripting, the purpose of the project will not be lost on you.
Step 2: Swap the Skybox
Lighting is the primary technique we have at our disposal as Unity developers to manipulate the mood of our scene. Because the model Annie uses is of a decrepit, old inn, it doesn’t feel right to Elgin, her director, that the model sits beneath a blue sky lit by the sun. Instead, Annie wants the user, on entering her scene, to feel an ominous threat of possible danger. Anything could be lurking in the night.
As we did in Exercise 1, let’s open up our Skybox properties and swap out its material. Navigate to Window ➤ Rendering ➤ Lighting Settings. Into the Skybox Material field beneath the Lighting ➤ Environment Property, drag and drop the Black material we created.
Step 3: Change the Value of the Directional Light
Transform.Position: 9, 11, -16
Transform.Rotation: 37, -25, 0
Transform.Scale: 1, 1, 1
Let’s manipulate the settings of our Light component to better simulate moonlight. In the Inspector, confirm that the Light.Type is Directional. Using the color picker, set the Light.Color property to a cool, whitish blue.
Step 4: Add a Doorway Point Light
With the settings of our MoonLight object applied, let’s turn our attention to another kind of light provided by Unity. In the model I am using for this exercise, a doorway sits to the right of the building’s main entrance. The doorway looks as if it could lead to some kind of alley; appropriately spooky, I’d say. I’d like to apply a small light to the eave of the alleyway that the user can dim and lighten at will.
Unity offers a set of built-in lighting objects we can use to create an atmosphere in our scenes. They are a directional light, a spotlight, an area light, and a point light. There are also two options for objects called probes, but we discuss those later. In this exercise we’ve already worked with a directional light to mimic the soft light from our moon. Now, let’s add a point light.
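Lights are usually placed and configured through the Editor, as we do in this exercise, but the same options are exposed in script. The following hypothetical helper creates a point light like the alleyway light described above; the object name, range value, and position are illustrative assumptions, not the book’s exact settings.

```csharp
using UnityEngine;

// A hypothetical sketch showing how the Editor's lighting options map
// to the scripting API. Attach to any game object in the scene.
public class AlleyLightSetup : MonoBehaviour
{
    void Start()
    {
        // Create an empty game object to hold the light.
        var go = new GameObject("AlleyLight");

        // Add a Light component and make it a point light
        // (other options: LightType.Directional, Spot, Area).
        var alleyLight = go.AddComponent<Light>();
        alleyLight.type = LightType.Point;

        // Assumed starting range; tune to taste in the Inspector.
        alleyLight.range = 5f;

        // Position near the doorway eave.
        go.transform.position = new Vector3(2.17f, 0.83f, 2.63f);
    }
}
```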
Transform.Position: 2.17, 0.83, 2.63
Transform.Rotation: 0, 0, 0
Transform.Scale: 0.8, 0.8, 0.8
Step 5: Allow the User to Change the Point Light Through the Keyboard
Of course, as we’ve already discussed, the key to a convincingly immersive experience is—that’s right—feedback! Now, we’re going to create a script to empower our user to control the brightness of our doorway light.
Recall that when we introduce interactivity into our scene through an original script, we need a game object in the Scene Hierarchy on which we can attach our script as a component. By convention, we give this otherwise empty game object a name that ends with either “manager” or “controller.” Let’s create an empty game object in our Scene Hierarchy by right-clicking with our mouse. Name the new, empty game object LightController.
Like the script we wrote in our last exercise, this script stores a component of a game object in a variable, manipulates the value of the variable, and attaches the component back to its object. Two things you might not recognize in our Dimmer.cs script are the += and -= operators. These translated into English are “equals itself plus” and “equals itself minus,” respectively. In the code within the Update() function in Dimmer.cs the expressions within the conditional statements (if/else if) can be expressed in words as “set the point light range value equal to itself plus the value of the rate variable” and “set the point light range value equal to itself minus the value of the rate variable.” The value of the rate variable is a property on our Dimmer.cs script component we make public to allow us to adjust it within the Unity Editor.
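A minimal sketch of the behavior described above might look like the following. This is not the book’s exact listing; the key bindings (Up and Down arrows) are assumptions for illustration, and the script manipulates the point light’s range with the += and -= operators exactly as described.

```csharp
using UnityEngine;

// A hypothetical sketch of Dimmer.cs based on the description above.
public class Dimmer : MonoBehaviour
{
    // Public so the value can be tuned in the Unity Inspector, and so
    // the literal 0.08f (a "magic number") lives in exactly one place.
    public float rate = 0.08f;

    // The alleyway point light, assigned in the Inspector.
    public Light pointLight;

    void Update()
    {
        if (Input.GetKey(KeyCode.UpArrow))
        {
            // "Set the point light range value equal to itself
            // plus the value of the rate variable."
            pointLight.range += rate;
        }
        else if (Input.GetKey(KeyCode.DownArrow))
        {
            // "Set the point light range value equal to itself
            // minus the value of the rate variable."
            pointLight.range -= rate;
        }
    }
}
```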
An Aside on Magic Numbers
Storing known values like 0.08f (where f identifies the value as a data type float) in variables allows us to avoid a bad habit in programming called magic numbers. When developers tell you that you have a magic number in your code they are telling you that you have used a literal value, like 0.08f, instead of a variable. Although our code would still execute if we used the floating-point number 0.08f instead of the variable rate, the practice of avoiding magic numbers pays dividends over time. What if, for example, our Dimmer.cs script was but one component in a complex web of scripts communicating with each other, passing values back and forth? Under such conditions, a request from a project manager to change the rate at which our point light dims would require us to go through all of our code, changing each and every instance of the literal value 0.08f. Using a variable like rate, instead, gives us the freedom in the future, if we choose, to change the value of the point light’s rate of dimming by simply updating the value once at the moment we declare the rate variable in our script.
Step 6: Play-Test
An Aside on Errors
If you get an error in the Console window of the Unity project, click it to read the details. Double-clicking the error might take you to the line of code in the Dimmer.cs script where the error occurred if it is in fact the result of a coding mistake. If you do encounter an error notification connected to the code, refer back to the code printed in this book. Confirm that everything is entered as is with the same spelling, capitalization, and punctuation. Because VRTK 4 only became available for public use in March 2019 and both Unity and Visual Studio release frequent updates, the error might not be a result of anything you did. Such is the nature of developing in such a bleeding-edge space. Fortunately, as an open source project, VRTK allows users to submit errors through its GitHub page. Forums, too, keep developers abreast of the latest developments to a code base.
Step 7: Practice Moving Lights, Cameras, and Yourself Around the Scene
W to display the handles to move an object along an axis.
E to display the handles to rotate an object around an axis.
R to display the handles to change the size of a game object.
In Windows, holding the Alt key and the left mouse button in the Scene View window allows you to rotate your view around the scene. Holding the right mouse button allows you to tilt and pan across the scene. If your mouse has a button between its left and right buttons, then scrolling it allows you to zoom in and out of the scene, and holding it down and moving the mouse allows you to move vertically and horizontally in two dimensions within the scene.
Step 8: Change the Value of the Rate Variable in the Unity Editor
Finally, before we move on to converting our scripted dimmer function into a Unity Axis Action component using the VRTK framework, experiment with changing the value of the Rate variable in the Dimmer.cs script component in the Unity Inspector. Remember, changes made to a game object’s properties while the Editor is in play mode are not saved; Unity discards them when play mode stops. If, however, you change the value of Rate while the Editor is stopped and then press Play, you will notice that the increment by which the alleyway point light brightens and dims changes, too. As you can see, defining malleable properties in our scripts with the public keyword allows us to prototype more quickly and efficiently in our editor.
Remaining in the Unity Editor as much as possible is a convenient and quick way to get ideas for our immersive experiences out of our imaginations and into a visual space. It’s one thing to creatively solve the problem of what code to write to execute an action you envision in your scene, but it’s another to write the code in an elegant, efficient way that allows for easy troubleshooting and smooth playback. Scripting a dimming function by hand to respond not just to one touch controller’s trigger input, but to the trigger input of every brand of touch controller that exists now or will exist, is the fastest, easiest way to lose steam on an immersive project.
In today’s fast-paced world where content ships quickly and can update within moments, getting our experiences out into the world is our goal. Although it’s tempting to curate every nook and cranny of our app to make sure our code doesn’t expose us as frauds, the unseemly secret of interactive applications is that we can’t know how they’ll work until someone interacts with them. To that end, rapid iteration is essential to VR development. The more often we can steamroll our way through the prototype of a vision, the more reps our application will get with real-life users and the more feedback we’ll receive to improve it. It is for this very reason that the VRTK framework can be such a valuable weapon in your creative arsenal. Let’s see how easy VRTK makes it to hook up our touch controller to a function that dims a light in our scene.
Adding a Unity Axis Action
By now, if you’ve completed the exercises from previous chapters and the section prior to this, then you are familiar with both connecting controller events to Unity Button Actions and manipulating float values of game objects through user input. What we have not yet addressed, however, is the process required to link user input to a parameter’s float value through a VR controller. A Unity Button Action will not work because our parameter, the intensity of a point light, requires more than an on/off signal. One way we create a channel through which our VR application captures a user’s controller input as a continuous float value is by implementing the VRTK framework’s Unity Axis Action component.
Step 1: Add a Second Point Light
Remember how earlier I said you could have optional practice with our broken building scene by placing a point light in one of its windows? Well, I lied. It’s not optional. If you’ve already done it, excellent! If you haven’t, please do it now. Refer to the steps we followed when placing the alley light in our scene for reference.
Transform.Position: -5.43, 4.88, 4.21
Rotation and scale set to default.
Step 2: Write a Second C# Script for the New Axis Action
Now that you have a point light set up inside one of the building’s windows, let’s wire it up to a trigger action on our right touch controller. To accomplish this, we must use a Unity Axis Action component. Why a Unity Axis Action component? Well, buttons have two inherent states that developers use to communicate responses to yes/no, true/false questions. Triggers, as an axis action, allow us, as developers, to measure a user’s input on a spectrum, for example, between 0 and 1.
After you’ve set up a second point light in your scene, name it RoomLight. Create a new empty game object in the Scene Hierarchy and name it LightController2. Because this game object is going to be a controller in our scene, let’s add a script to it. The script attached to the controller as a component will provide the instructions for whatever the controller controls. Let’s create a new C# script in our Scripts folder in our Project window and name it, creatively, Dimmer2. Once you’ve confirmed you’ve spelled and capitalized the script’s title as intended, double-click it to open it in Visual Studio.
Like the first statement we typed, the class definition appears by default in the Unity script. This is the name we give our script on first creating it in our Project window. Recall that a class is a blueprint for an object. We won’t be creating a Dimmer2 object in our scene. We still, however, need the script to exist for us to access the functions (verbs, the doers) in our script. It exists to hold properties and functions for us like a container.
Here, after we’ve manipulated the desired property, we attach it back to the game object from which we originally took it. In our analogy about the mechanic, this code equates to the moment we return the car’s carburetor after fixing it, for example, back to the car’s engine.
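Pulling these pieces together, a minimal, hypothetical sketch of Dimmer2 consistent with the description above might look like this. It is not the book’s exact listing; the direct mapping of the trigger value to the light’s intensity is an assumption for illustration. The DimLight function is public so the Unity Axis 1D Action’s Value Changed event can call it with the trigger’s float value.

```csharp
using UnityEngine;

// A hypothetical sketch of Dimmer2.cs. The class exists as a container
// for the DimLight function; we never instantiate a Dimmer2 object in
// the scene ourselves.
public class Dimmer2 : MonoBehaviour
{
    // The window point light (RoomLight), assigned in the Inspector.
    public Light roomLight;

    // Wired to the Value Changed event of the Unity Axis 1D Action.
    // Receives the trigger's float value, typically between 0 and 1.
    public void DimLight(float rate)
    {
        // Take the component's property, change it, and hand it back:
        // the light brightens and dims with trigger pressure.
        roomLight.intensity = rate;
    }
}
```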
Step 3: Add a Unity Axis Action Component to a Game Object
After we allow Unity to compile the changes we’ve made to our Dimmer2 script, we’re ready to attach it as a component to our LightController2 game object. Select the controller object in the Scene Hierarchy, and in the Inspector add the Dimmer2 script as a component. Remember, you can either click Add Component and search for Dimmer2 or drag and drop the script from the Project window onto the controller object.
Step 4: Identify the Controller Button to Connect to the Unity Axis Action Through the Input Menu
When we open our Unity input settings (Main Menu ➤ Edit ➤ Project Settings ➤ Input) we see a drop-down list of all the axes to which Unity has mapped input.
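As an example, an axis mapping for the right trigger under the legacy Input Manager often looks something like the following. The axis name here is an assumption, and the 10th-axis convention is common for the right trigger on Oculus and Vive controllers, but consult your headset’s input documentation to confirm.

```
Name:        RightTrigger
Type:        Joystick Axis
Axis:        10th axis (Joysticks)
Sensitivity: 1
```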
Congratulations! You just mapped your controller game object and its attached Dimmer2 script to your touch controller’s right trigger button. Now, the float variable rate we defined as a parameter on our DimLight function in our Dimmer2 script will read the value communicated by the pressure placed on your touch controller’s right trigger. With our input into our Unity Axis 1D Action established, let’s define our output.
Step 5: Connect the Chain of Events
Step 6: Add a Virtual Camera and a TrackedAlias Prefab to the Scene
Finally, in the Scene Hierarchy, also confirm that only one camera game object is active. If your scene includes a default Main Camera game object, then deactivate it in the Inspector. Further, confirm the two point light objects you’ve added in this chapter are active, too, and connected to their respective LightControllers.
Step 7: Play-Test
That’s it! Save your scene and give it a whirl!
As Annie, you just heard back from Elgin that the theater’s board of directors were impressed with his immersive presentation. With a clear understanding of Elgin’s vision they have no problem cutting the check required to fund the production. Congratulations, Annie! Because of you the show can go on!
The application of the tools you’ve learned in this exercise is, of course, not limited to dimming lights. With the knowledge you now have about VRTK, Unity, and C# scripting, you can create an assortment of actions influenced by the sensitivity of a user’s input. For example, you could create an interior design tool to help a user visualize what different shades of a certain color would look like in a bedroom. You could prototype an installation and test parameters of different ambient features in a space. The possibilities are defined by your imagination.
Summary
In this chapter we covered the creation of an event handler that connected user input through both a keystroke and touch controller trigger action. The event handler, contained within a VRTK Unity Action component, published a notification to our application when the user provided a particular input. We created subscribers to these user inputs by writing functions in C# scripts that performed an operation on a property of a game object in our scene. Finally, we connected both the scripts and the game objects that served as targets for our user’s actions to the VRTK Unity Action component’s event handler, which we stored in our scene as components of an empty game object we called a controller.
Specifically, in the location scouting exercise, the event handler with which we concerned ourselves the most was the Value Changed event on the Unity 1-Dimensional Axis component. The Directional Light has been a consistent game object in the scenes we have created throughout this book, but in this chapter we manipulated its default settings and introduced Point Light game objects into the scene as listeners to our 1D Axis Action event. Connecting control of the Point Lights’ intensities to both a keyboard input and controller input demonstrated two of the ways available to us, as developers, to capture user input. As VR aims to liberate the user from a workstation like a computer, events in our applications tied to controller input increase the immersive sensation of our experience. Further, familiarity with VRTK’s Button and Axis Action components streamlines the trial-and-error methodology of iterating a VR experience. The more reusable components we can use in the design of our VR applications, like VRTK’s Action components, the more quickly we can create and the more code complexity we can avoid.
So far, we’ve imported 3D assets into our projects; we’ve connected our headsets to our scenes through virtual cameras; and we’ve mapped changes to game objects to user input through both a keyboard and touch controller. Although the key to effective immersive experiences is dynamic feedback to a user’s input, the promise of VR lies in much more than interacting with an object at a distance. We want to be able to touch a virtual object and feel as if our bodies can change our environment. That’s the real high of a virtual existence; the moment our minds slip from R to VR. In the next chapter, we’ll delve deeper into that through the use of VRTK’s suite of tools for interaction.