© Rakesh Baruah 2020
R. BaruahVirtual Reality with VRTK4 https://doi.org/10.1007/978-1-4842-5488-2_7

7. Trigger Input Through One-Dimensional Axis Actions

Rakesh Baruah1 
(1)
Brookfield, WI, USA
 

What a mouthful, huh? One-dimensional axis actions. It sounds more complicated than it is. Hopefully, by the end of this chapter you will agree.

So far in this book we’ve introduced virtual cameras, 3D objects, tracking and spatial anchors, interactors, and button events. Together, those elements can make up a complete virtual experience. However, one element integral to VR remains undiscussed, an element so central that it appears in the very name of VR's touch controllers: touch.

In the context of the human mind, the medium through which you and I interact every day, touch is the sensation we feel through our skin. The world outside us, reality, excites nerves and stimulates them, creating the pressure of touch, the weight of touch, the resistance of touch, and the warmth of touch. As of this writing, we are not yet able to re-create the psychophysical sensation of touch through a consumer-grade VR system. If the concept of touch isn’t even within the boundaries of achievable VR, what then do we mean when we discuss touch?

In this chapter you will learn the following:
  • The role touch plays in VR immersive design.

  • The purpose of the VRTK 1D Axis Action component.

  • The value of lighting design to VR scenes.

  • The different kinds of lighting options provided by Unity.

  • How to navigate through a scene in the Unity Editor using the 3D viewport and Unity hot-keys.

  • How to connect VRTK’s Input Mappings to VR controllers and Unity Actions.

The Psychology of Touch

Within the overlap of neuroscience and physical rehabilitation, there exists an exercise that physicians perform with patients who suffer from phantom limb syndrome (Figure 7-1). In a group of people who have undergone amputation of an extremity, a percentage experience pain in the limb they lost. Although the limb is no longer a part of their physical body, the patient feels pain in the limb as if it were still physically there. Because the limb does not exist, physicians conclude the pain the patients feel exists not in the limb itself, but in the mental model the patients have formed of their body. The pain, in other words, occurs in the mind.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig1_HTML.jpg
Figure 7-1

An exercise to treat phantom limb pain in amputees provides lessons applicable to VR design. Source: Golan Levin via Flickr CC BY 2.0. https://www.flickr.com/photos/golanlevin/19290187922

One practice designed to treat phantom limb syndrome leverages the patients’ way of thinking to ease their discomfort. For example, to treat patients experiencing phantom limb pain in a left hand they lost in a motor vehicle accident, the exercise requires them to place both arms inside opaque containers. The container for the left hand has an opening through which the patients can see. What the patients see, however, is not actually there. What they see is a mirror image of their right hand broadcast through a video screen, for example. What they see appears to be the return of their left hand. As the patients move their right hand, their “left” hand moves in concert. In a short period of time, the patients’ mental models of their bodies conform to their perceived reality. Once the patients’ mindset shifts toward the presence of the phantom limb, therapists might lead the patients through rehabilitation exercises to reduce the presence of pain. Although physicians don’t fully understand the mechanisms in the brain responsible for the patients’ shift in perception, there are still lessons from the treatment relevant to our decision making as immersive experience developers.

Body and Mind

The first time I turned on my Oculus Rift in 2018 I experienced a formative mental shift. On starting, the Oculus application mapped a translucent, blue, human-like hand to my own through my VR headset. As I moved my fingers along the touch controller, the Oculus system’s sensors seemed to track their individual movement. I could point with one finger, I could make a fist, and I could give a thumbs up—all in virtual reality (Figure 7-2). I lost the sense of the controller in my hand. I had identified my mind, my sense of self, with the digital representation of my body.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig2_HTML.jpg
Figure 7-2

Through the use of physical avatars, VR developers facilitate the sensation of immersion

Other mixed reality systems, such as the HoloLens 2, forgo representation completely. Inside-out tracking, which moves tracking from external infrared towers to sensors inside the headset itself, makes it possible to track a user’s hands without controllers. It likely won’t be long until both MR and VR headsets interpret the presence of a user’s body without controllers at all. Unconscious association of the self with a virtual avatar might become more seamless and more natural over time.

This is the next level of media to which VR evangelists refer when they speak of this great, new technology. The most powerful, most useful feature of VR is the synthesis it catalyzes between the outer and inner self. The moment of transcendence, the moment that clicks within a virtual experience when your body leaves your mind—that is the wonder of VR. It is also the foundation of what we explore in this chapter.

The Unified Whole

If the core of VR is immersion, then its antithesis is disruption. Feedback determines the quality of an immersive experience. Does what users see conform to their expectations? We can orchestrate these expectations through visual and audio cues in our experiences. For example, a highly stylized environment might allow a user to loosen their expectations of realistic physics. A cartoonish, modularly designed, voxel-based world, however, might not prepare a user for a gritty, violent, dramatic experience. At a more fundamental level, however, we find the success of immersion within the synchrony of the brain.

The identification of wholeness, of bodyness, that we feel as humans when we watch our fingers move fulfills the feedback loop between the sensory, motor, and visual cortices of our brains (Figure 7-3). I will an action, I carry it out, and I validate its occurrence. A stuttering frame rate, an unrealistic response to physics, and inaccurate input feedback are the death knells of VR. The human mind is unforgiving of false reality. Any mismatch between our users’ movements and the visual feedback their brains expect will tear them out of the immersive state we hope to design them into. How, then, do we capture a user’s hand movement to foster the experience of immersion?
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig3_HTML.png
Figure 7-3

A 20th-century interpretation of the process carried out within the mind of a rat at the sound of a bell. Source: Image from “Brains of Rats and Men; a Survey of the Origin and Biological Significance of the Cerebral Cortex” (1926, 71)

Enter the VRTK Axis Action

According to the VRTK documentation:

Unity Axis Actions tie into the Unity Input Manager and emit events when a defined Input axis changes value. There are two kinds of Unity Axis Actions:

  • Unity 1D Axis: listens for changes on a single axis and emits a float value for the axis changes.

  • Unity 2D Axis: listens for changes on two axes and emits a Vector2 value combining both axis changes.

A Unity Axis Action is derived from a Zinnia.Unity Action and therefore can be injected into any VRTK prefab that requires varying float data (e.g., touchpad movement).

VRTK will ask via a popup window on first load of the Unity project whether to attempt to auto create some Unity axis input mappings. If this is accepted then default axis data for left and right controller touchpad/thumbsticks, grip and trigger axes will be created.

The most important lesson from the VRTK documentation on Unity Axis Actions is that the VRTK framework can help us introduce touch controller input into our scenes beyond simply acknowledging when a user presses a button. Triggers and joysticks are common input features on many VR touch controllers. Because these inputs emit continuous float (numeric decimal) values rather than simple on/off states, they are powerful tools in our toolkit for creating immersive VR experiences.

The Unity Input Manager exists in every Unity project in the Edit ➤ Project Settings window from the menu bar. It maps the relationships between user inputs and their impact on our applications. The Input Manager contains a list of axes and buttons available for us to connect events to in our application. The Unity Input Manager can overwhelm developers new to Unity design. Changes to its settings can have far-reaching impact (both intended and unintended) on the development of our experiences. By tying into the Unity Input Manager, the VRTK framework’s Unity Axis Action components streamline the development process for prototyping user interaction within our VR experiences.

As with our Unity Button Action and Unity Boolean Action components, VRTK’s Unity Axis Action component uses the TrackedAlias VRTK game object to accept input from our VR touch controllers and produce a change in our program. The one distinction between a Unity Axis Action and its Button and Boolean Action siblings is that it tracks a gradient of input as numbers, not just an on/off state. For a simple analogy, consider an automobile. The ignition in a car, fundamentally, has two states: on and off. The accelerator, however, lets us change speed across a continuum of values. The function of a car’s accelerator is like the function of a Unity Axis Action component in VRTK.
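To make the analogy concrete, here is a minimal sketch contrasting an on/off button read with a continuous axis read through the Unity Input Manager. "Jump" is one of Unity's default input axes; "TriggerAxis" is a hypothetical axis name standing in for whatever trigger mapping exists in your project's Input Manager.

using UnityEngine;

public class InputContrast : MonoBehaviour
{
    void Update()
    {
        // On/off, like a car's ignition: a bool that is either true or false.
        bool jumpPressed = Input.GetButton("Jump");

        // Continuous, like a car's accelerator: a float, typically between
        // 0 and 1 for a trigger (or -1 and 1 for a thumbstick axis).
        // "TriggerAxis" is a placeholder; use the axis name defined in
        // your own Input Manager.
        float triggerValue = Input.GetAxis("TriggerAxis");

        Debug.Log("Jump: " + jumpPressed + ", Trigger: " + triggerValue);
    }
}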

To see the purpose a Unity Axis Action component can serve in our VR apps, let’s try our hand at a prototype.

Exercise: Location Scouting

Annie Svenstrup has a passion for theater. Rather than in front of the lights, however, her place is behind them. She works as a dental hygienist, but she devotes her free time to the community theater in the town where she lives. One day, Elgin Sommers, the eccentric director of this season’s big production, approaches Annie to help him execute his vision for the play’s stage design. He’d like to build the facade of an abandoned, decrepit inn. Before he commits the theater’s budget to building the set, however, he’d like Annie to help him visualize the set’s lighting to convince the donors to cut the big check.

In this exercise, as Annie, we will import a 3D model of Elgin’s set. We will place virtual lights into the scene to illuminate the set. Finally, we will connect user input to the control of the lights so Elgin can impress the theater’s board of directors with the veracity of his vision.

The model we will be using in this exercise is available from the project GitHub repo labeled by chapter. The model is called “Team America 2004 Miniature (photogrammetry)” by Austin Beaulier. You can find the original and more of Austin’s work from their Sketchfab profile at https://sketchfab.com/Austin.Beaulier.

I’m using the model under the Creative Commons licensing agreement (CC BY 4.0). If you plan to incorporate Austin’s work into your own immersive experiences, please follow the guidelines of attribution documented at https://creativecommons.org/licenses/by/4.0/.

It might be of interest to you to know that scenarios like the one I have described are, in fact, taking place in the commercial world. In 2017, Cirque du Soleil used the Microsoft HoloLens to help design sets for future performances. In 2018, hip-hop artist Childish Gambino and his team used VR and the HTC Vive to prototype “Pharos,” an immersive concert experience that won its creators a prestigious award from the Visual Effects Society. Although these productions are enormous endeavors with huge teams, the tools used by their creators to grow a vision from the seeds of imagination are available to you and me.

Before we get to work with VRTK’s Axis Action component, however, let’s create a scene the old-fashioned way to get our feet wet with the fundamentals.

Step 1: Project Setup

An easy reference for setting up a Unity VR project with VRTK is available at https://github.com/ExtendRealityLtd/VRTK#getting-started.

Create a new VR-supported, 3D project in Unity. If you are using Unity 2019, navigate to the Window ➤ Package Manager console and install the XR Legacy Input Handlers package.

Here, we follow the now familiar step of cloning the VRTK GitHub repository into our Unity Project’s Asset folder. Use the following procedure from the VRTK Getting Started documentation to clone the repo.

  1.

    Navigate to the project Assets/ directory.

  2.

    Git clone the repository, with its required submodules, into the Assets/ directory:

    • git clone https://github.com/ExtendRealityLtd/VRTK.git

    • Change to the newly cloned directory: cd VRTK/

    • git submodule init && git submodule update

  3.

    The Unity software will now import and compile the new files.

If Unity prompts you with a window asking if you’d like to update its input mappings, click Yes. This is the window to which the VRTK documentation refers when it states:

VRTK will ask via a popup window on first load of the Unity project whether to attempt to auto create some Unity axis input mappings. If this is accepted then default axis data for left and right controller touchpad/thumbsticks, grip and trigger axes will be created.

Before we drag the UnityXRCameraRig and TrackedAlias objects into our scene from the VRTK prefabs folder in our Project window, let’s create a demo of our scene in simple 3D.

To begin, we import our 3D model into our Assets folder. If you plan to use the 3D model of the building exterior I’ve provided for this exercise, you can download it from the exercise repo. Unzip the file and drag its contents into your project’s Asset folder. Unity might notify you with a warning about the “normals” on the model’s default mesh (Figure 7-4). If the warning appears yellow in your console, you can ignore it for the purpose of this exercise. Select the model prefab, identified by the name “model” and preceded by a triangle indicating it is a parent of child objects. Drag the model into the Scene Hierarchy (Figure 7-5).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig4_HTML.jpg
Figure 7-4

Warning messages appear in yellow in the Unity Console window. Unlike errors, which appear in red, their presence does not prohibit the execution of a scene

../images/488645_1_En_7_Chapter/488645_1_En_7_Fig5_HTML.jpg
Figure 7-5

After downloading the 3D model, drag it from the Assets/Models folder into the Scene Hierarchy

You might notice the orientation of the model is askew in our Scene view (Figure 7-6). This often occurs when importing 3D assets because the model’s zeroed, default positioning is determined by the artist on export from 3D modeling software such as Blender, Maya, or Cinema 4D. Change the values of the parameters in the model’s Transform component to set the scene to your liking.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig6_HTML.jpg
Figure 7-6

Models might import to Unity askew because of Transform settings baked in by the model’s modeling software

If you’re using the 3D model provided for this exercise, you can also feel free to copy the Transform settings I’ve used for the game objects in my scene, which include a plane object for the ground (Figure 7-7). Listed here are the game objects in my Scene Hierarchy and the properties I’ve set on their transforms.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig7_HTML.jpg
Figure 7-7

A screenshot of my scene after changing the Transform settings of the 3D model and adding a 3D plane object

Main Camera
  • Transform.Position(x,y,z): -0.74, 1, -10

  • Transform.Rotation(x,y,z): 0, 0, 0

  • Transform.Scale(x,y,z): 1, 1, 1

Model
  • Transform.Position(x,y,z): 2.05, -0.02, 2.68

  • Transform.Rotation(x,y,z): 23, -90, 0

  • Transform.Scale(x,y,z): 2, 2, 2

Plane (right-click in Hierarchy ➤ 3D Object ➤ Plane)
  • Transform.Position(x,y,z): 1.02, -0.9, 1.52

  • Transform.Rotation(x,y,z): 0, -90, 0

  • Transform.Scale(x,y,z): 5, 1, 5

Alternatively, you can use whatever model you prefer to set your scene or simply construct your own building facade using object primitives (cubes, planes, cylinders) from within the Unity Editor. The purpose of this exercise is to learn the process of placing lights in a Unity scene and controlling the appearance of lighting through user input. As long as you follow along with the placement of lights and the scripting, the purpose of the project will not be lost on you.

Step 2: Swap the Skybox

Lighting is the primary technique we have at our disposal as Unity developers to manipulate the mood of our scene. Because the model Annie uses is of a decrepit, old inn, it doesn’t feel right to Elgin, her director, that the model sits beneath a blue sky lit by the sun. Instead, Annie wants the user, on entering her scene, to feel an ominous threat of possible danger. Anything could be lurking in the night.

To create this vibe, let’s swap out the default, sunny, blue Skybox in our scene with a black, starless night. To accomplish this, we simply follow the steps from Exercise 1. First, let’s create a new folder in our Assets folder called Materials. In the Project window, right-click inside the Materials folder and select Create ➤ Material. Name the material black. In the Inspector, set the black material shader to Skybox/Cubemap, and set its Tint Color value to black (Figure 7-8).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig8_HTML.jpg
Figure 7-8

Create a black Skybox from a material object

As we did in Exercise 1, let’s open up our Skybox properties and swap out its material. Navigate to Window ➤ Rendering ➤ Lighting Settings. Into the Skybox Material field beneath the Lighting ➤ Environment Property, drag and drop the Black material we created.

Step 3: Change the Value of the Directional Light

Speaking of light, let’s add some to our scene! First, let’s start by adding a little bit of moonlight. We can re-create the ambience of moonlight through the use of a Unity Directional Light. Right-click in the Scene Hierarchy and select Light ➤ Directional Light (Figure 7-9).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig9_HTML.jpg
Figure 7-9

Right-click or Ctrl+Click in the Hierarchy to create a new Light object

Unity places a default Directional Light object in our scene. For convenience’s sake, if you haven’t already, delete the original Directional Light object that comes as a default object with each new Unity scene. Name our new Directional Light object MoonLight. The following are the Transform settings I set for the MoonLight object in my scene (Figure 7-10):
  • Transform.Position: 9, 11, -16

  • Transform.Rotation: 37, -25, 0

  • Transform.Scale 1, 1, 1

../images/488645_1_En_7_Chapter/488645_1_En_7_Fig10_HTML.jpg
Figure 7-10

The Transform settings for my Moonlight object are shown in the Inspector

In the Inspector, with the MoonLight object highlighted in the Scene Hierarchy, notice the component called Light attached to the game object (Figure 7-11). Remember, every object in our scene is an actor, ironically, even the lights. As actors, game objects hold components that distinguish them from one another. A Directional Light game object, for instance, is distinguished from an empty game object holding only a Transform by its Light component.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig11_HTML.jpg
Figure 7-11

A Light component appears by default on a Directional Light object created in the Scene Hierarchy

Let’s manipulate the settings of our Light component to better simulate moonlight. In the Inspector, confirm that the Light.Type is Directional. Using the color picker, set the Light.Color property to a cool, whitish blue.

Let’s wrap up the settings on our MoonLight game object’s Light component by setting its Intensity to 0.3 and its Shadow Type to Soft Shadows. These settings, like most of the others related to the appearance and location of game objects in our scenes, are to taste. Fiddle away at the knobs until you’ve tweaked the appearance of the scene to your liking (Figure 7-12). You’re learning Unity and VRTK to better express your creative vision, after all, so you might as well practice along the way.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig12_HTML.jpg
Figure 7-12

What the scene can look like with the Skybox removed, a directional light tinted blue, and soft shadows elevated on the 3D model
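The same Light settings we applied in the Inspector can also be set from a script, which becomes useful once we start scripting interactivity later in this exercise. The following is a sketch, not part of the exercise project, that could be attached to the MoonLight object; the color values are an approximation of a cool, whitish blue.

using UnityEngine;

public class MoonlightSetup : MonoBehaviour
{
    void Start()
    {
        // Grab the Light component attached to this game object.
        Light moon = GetComponent<Light>();

        moon.type = LightType.Directional;
        moon.color = new Color(0.8f, 0.9f, 1f); // a cool, whitish blue
        moon.intensity = 0.3f;
        moon.shadows = LightShadows.Soft;
    }
}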

Step 4: Add a Doorway Point Light

With the settings of our MoonLight object applied, let’s turn our attention to another kind of light provided by Unity. In the model I am using for this exercise, a doorway sits to the right of the building’s main entrance. The doorway looks as if it could lead to some kind of alley; appropriately spooky, I’d say. I’d like to apply a small light to the eave of the alleyway that the user can dim and lighten at will.

Unity offers a set of built-in lighting objects we can use to create an atmosphere in our scenes. They are a directional light, a spotlight, an area light, and a point light. There are also two options for objects called probes, but we discuss those later. In this exercise we’ve already worked with a directional light to mimic the soft light from our moon. Now, let’s add a point light.

As we did to create the MoonLight object, right-click in the Scene Hierarchy and select Light ➤ Point Light. Name the new object Point Light and in the Scene view place the point light to appear as if it emanates from the overhang of the alley. If you are following along with the exercise using the settings I have applied to my project, you can use the following Transform coordinates to position the Point Light game object.
  • Transform.Position: 2.17, 0.83, 2.63

  • Rotation: 0, 0, 0

  • Scale: 0.8, 0.8, 0.8

In the Light component attached to the Point Light game object, confirm the Light.Type is set to Point. Set the Range to a value that feels appropriate for the mood you’re creating in your scene. Notice that if you hover your mouse over the Range field, the cursor turns into a pointer with two opposite-facing arrows. When the arrows appear, you can click and hold the left mouse button and drag the field to the value you’d like to set. I’ve chosen a float value of 4.45 for my PointLight.Light.Range value. Set the Shadow Type to Soft Shadows and select a warm, yellowish color to simulate the appearance of streetlights or glowing lamps (Figure 7-13).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig13_HTML.jpg
Figure 7-13

A Point Light’s Range, Color, Shadow Type, and other parameters can be manipulated in the Inspector

By now your scene should be looking a lot more interesting than when we started the exercise. By adding three objects to our default scene—a 3D model, a directional light, and a point light—we’ve already created an interesting prototype for an immersive experience (Figure 7-14).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig14_HTML.jpg
Figure 7-14

Simply adjusting the parameters of light objects in Unity can completely alter the tone of a scene

Step 5: Allow the User to Change the Point Light Through the Keyboard

Of course, as we’ve already discussed, the key to a convincingly immersive experience is—that’s right—feedback! Now, we’re going to create a script to empower our user to control the brightness of our doorway light.

Recall that when we introduce interactivity into our scene through an original script, we need a game object in the Scene Hierarchy on which we can attach our script as a component. By convention, we give this otherwise empty game object a name that ends with either “manager” or “controller.” Let’s create an empty game object in our Scene Hierarchy by right-clicking with our mouse. Name the new, empty game object LightController.

Before we create our script, create a new folder in your Assets folder called Scripts. Getting into the habit of organizing your Unity projects with easily identifiable folders will increase the speed of your prototyping over time. In the new folder, right-click and create a new C# script named Dimmer. Double-click the script to open it in your IDE, and add the code shown here. You can find the code for this project, like the others, on the exercise’s GitHub page.
using UnityEngine;
public class Dimmer : MonoBehaviour
{
    // The light this script controls; assigned in the Inspector.
    public Light targetLight;
    // How much the light's range changes per frame while a key is held.
    public float rate = .08f;
    // Update is called once per frame
    void Update()
    {
        if (Input.GetKey(KeyCode.UpArrow))
        {
            targetLight.range += rate;
        }
        else if (Input.GetKey(KeyCode.DownArrow))
        {
            targetLight.range -= rate;
        }
    }
}

Like the script we wrote in our last exercise, this script holds a reference to a game object’s component in a public variable and manipulates a value on that component; because the variable is a reference, the change takes effect on the light directly. Two things you might not recognize in our Dimmer.cs script are the += and -= operators. Translated into English, these mean “equals itself plus” and “equals itself minus,” respectively. Within the conditional statements (if/else if) in the Update() function of Dimmer.cs, the expressions can be read as “set the point light’s range equal to itself plus the value of the rate variable” and “set the point light’s range equal to itself minus the value of the rate variable.” The rate variable is a property on our Dimmer.cs script component that we make public so we can adjust it in the Unity Editor.
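In isolation, the compound assignment operators behave like this (a standalone fragment, not part of the exercise project):

float range = 4.45f;
float rate = 0.08f;

range += rate;  // same as: range = range + rate;  range is now roughly 4.53
range -= rate;  // same as: range = range - rate;  range is back to roughly 4.45

The “roughly” matters: floating-point arithmetic introduces tiny rounding errors, which is another reason to store the rate once in a variable rather than retyping a literal value.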

An Aside on Magic Numbers

Storing known values like 0.08f (where f identifies the value as a data type float) in variables allows us to avoid a bad habit in programming called magic numbers. When developers tell you that you have a magic number in your code they are telling you that you have used a literal value, like 0.08f, instead of a variable. Although our code would still execute if we used the floating-point number 0.08f instead of the variable rate, the practice of avoiding magic numbers pays dividends over time. What if, for example, our Dimmer.cs script was but one component in a complex web of scripts communicating with each other, passing values back and forth? Under such conditions, a request from a project manager to change the rate at which our point light dims would require us to go through all of our code, changing each and every instance of the literal value 0.08f. Using a variable like rate, instead, gives us the freedom in the future, if we choose, to change the value of the point light’s rate of dimming by simply updating the value once at the moment we declare the rate variable in our script.
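As a sketch of the difference, compare these two versions of a dimmer; the class names here are illustrative, not from the exercise project:

using UnityEngine;

public class DimmerWithMagicNumber : MonoBehaviour
{
    public Light targetLight;

    void Update()
    {
        // Magic number: every reader (and every future edit) must hunt
        // down each occurrence of the literal 0.08f.
        if (Input.GetKey(KeyCode.UpArrow)) targetLight.range += 0.08f;
        if (Input.GetKey(KeyCode.DownArrow)) targetLight.range -= 0.08f;
    }
}

public class DimmerWithNamedRate : MonoBehaviour
{
    public Light targetLight;
    public float rate = 0.08f; // declared once; change it here or in the Inspector

    void Update()
    {
        // Named value: the intent is documented once, and a single change
        // to rate propagates everywhere it is used.
        if (Input.GetKey(KeyCode.UpArrow)) targetLight.range += rate;
        if (Input.GetKey(KeyCode.DownArrow)) targetLight.range -= rate;
    }
}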

With the code in our Dimmer.cs script set, we save our changes and return to Unity. After waiting for the code to compile in the Unity Editor, attach the Dimmer.cs script as a component to the LightController game object in the Scene Hierarchy. Once you’ve attached the Dimmer.cs script to the LightController object, you will see two fields beneath Dimmer (Script) in the Inspector. As we did with the cake object in our Interest Calculator exercise, drag and drop the Point Light object from the Scene Hierarchy onto the Dimmer (Script) Target Light field (Figure 7-15). Because we assigned the data type Light to this field in our script, Unity will only allow us to drag and drop a Light object into the Target Light field. Notice, too, that the value of the Rate field is already set to the 0.08 value we defined in our script. This is how we define default values for the public properties we create on objects through our scripts.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig15_HTML.jpg
Figure 7-15

Declaring the Target Light “public” in our C# script allows us to drag and drop the light object we’d like to control through scripting on to the script’s game object property in the Inspector

Step 6: Play-Test

Once you’ve set the properties on the Dimmer.cs script component attached to our LightController, save the scene and give it a play. While you hold the up arrow key, the range of the point light hanging in our alleyway increases by the rate value, 0.08 by default, each frame, brightening the alley. Holding the down arrow key diminishes it at the same rate. This is the work of the conditional statements we wrote in our Update() block (Figure 7-16).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig16_HTML.jpg
Figure 7-16

The conditional statements (if/else) we wrote in the Update() function of the C# script control the intensity of a light object through user input

An Aside on Errors

If you get an error in the Console window of the Unity project, click it to read the details. Double-clicking the error might take you to the line of code in the Dimmer.cs script where the error occurred, if it is in fact the result of a coding mistake. If you do encounter an error notification connected to the code, refer back to the code printed in this book. Confirm that everything is entered exactly as shown, with the same spelling, capitalization, and punctuation. Because VRTK 4 only became available for public use in March 2019, and both Unity and Visual Studio release frequent updates, the error might not be the result of anything you did. Such is the nature of developing in a bleeding-edge space. Fortunately, as an open source project, VRTK allows users to submit issues through its GitHub page. Forums, too, keep developers abreast of the latest developments to a code base.

Step 7: Practice Moving Lights, Cameras, and Yourself Around the Scene

If you’d like more practice placing point lights in your scenes, try placing a point light within the 3D model to create the illusion that a light burns inside one of the rooms of the building. You can use the toolbar above the Scene View window to manipulate the size, location, and rotation of any object in your scene. Convenient shortcut keys for manipulating game objects are as follows:
  • w to display handles to move the object along an axis.

  • e to display the handles to rotate an object around an axis.

  • r to display the handles to change the size of the game object.

In Windows, holding the Alt key and the left mouse button in the Scene View window allows you to rotate your view around the scene. Holding the right mouse button allows you to tilt and pan across the scene. If your mouse has a scroll wheel, scrolling it zooms in and out of the scene, and holding it down while moving the mouse pans vertically and horizontally within the scene.

Changing your view within a scene has no impact on the placement of your scene’s main camera. Camera objects, like other game objects, can be moved through the buttons on the toolbar or the keyboard shortcuts laid out earlier. If, however, you would like to reposition a scene’s camera to match the view you’ve found in your Scene window, then you can use the ever-handy shortcut Ctrl+Shift+F (in Windows) to align any game object highlighted in your Scene Hierarchy to the view currently displayed in your Scene View window. Other convenient functions to help you maneuver objects around your scene can be found on the GameObject menu (Figure 7-17).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig17_HTML.jpg
Figure 7-17

The GameObject menu includes helpful shortcuts for maneuvering around a scene

Step 8: Change the Value of the Rate Variable in the Unity Editor

Finally, before we move on to converting our scripted dimmer function into a Unity Axis Action component using the VRTK framework, experiment with changing the value of the Rate variable on the Dimmer.cs script component in the Unity Inspector. Remember, changes made to a game object’s properties while the Editor is in play mode are discarded when play mode stops. If, however, you change the value of Rate while the Editor is stopped and then press Play, you will notice that the increment by which the alleyway point light brightens and dims changes, too. As you can see, defining malleable properties in our scripts with the public keyword allows us to prototype more quickly and efficiently in the Editor.

Remaining in the Unity Editor as much as possible is a convenient and quick way to get ideas for our immersive experiences out of our imaginations and into a visual space. It’s one thing to creatively solve the problem of what code to write to execute an action you envision in your scene, but it’s another to write the code in an elegant, efficient way that allows for easy troubleshooting and smooth playback. Scripting a dimming function to respond not just to one touch controller’s trigger input, but to the trigger input of every brand of touch controller that exists or is yet to exist, is the fastest, easiest way to lose steam on an immersive project.

In today’s fast-paced world where content ships quickly and can update within moments, getting our experiences out into the world is our goal. Although it’s tempting to curate every nook and cranny of our app to make sure our code doesn’t expose us as frauds, the unseemly secret of interactive applications is that we can’t know how they’ll work until someone interacts with them. To that end, rapid iteration is essential to VR development. The more often we can steamroll our way through the prototype of a vision, the more reps our application will get with real-life users and the more feedback we’ll receive to improve it. It is for this very reason that the VRTK framework can be such a valuable weapon in your creative arsenal. Let’s see how easy VRTK makes it to hook up our touch controller to a function that dims a light in our scene.

Adding a Unity Axis Action

By now, if you’ve completed the exercises from previous chapters and the section prior to this, then you are familiar with both connecting controller events to Unity Button Actions and manipulating float values of game objects through user input. What we have not yet addressed, however, is the process required to link user input to a parameter’s float value through a VR controller. A Unity Button Action will not work because our parameter, the intensity of a point light, requires more than an off/on signal. One way we create a channel through which our VR application captures a user’s controller input as a continuous float value is by implementing the VRTK framework’s Unity Axis Action component.

Step 1: Add a Second Point Light

Remember how earlier I said you could have optional practice with our broken building scene by placing a point light in one of its windows? Well, I lied. It’s not optional. If you’ve already done it, excellent! If you haven’t, please do it now. Refer to the steps we followed when placing the alley light in our scene for reference.

For your convenience, here are the transform settings for the bedroom point light I placed in my scene (Figure 7-18):
  • Transform.position: -5.43, 4.88, 4.21

  • Rotation and scale set to default.

../images/488645_1_En_7_Chapter/488645_1_En_7_Fig18_HTML.jpg
Figure 7-18

Adding a second point light to the scene creates the effect of an occupied bedroom on the model’s second floor

Step 2: Write a Second C# Script for the New Axis Action

Now that you have a point light set up inside one of the building’s windows, let’s wire it up to a trigger action on our right touch controller. To accomplish this, we must use a Unity Axis Action component. Why a Unity Axis Action component? Well, buttons have two inherent states that developers use to communicate responses to yes/no, true/false questions. Triggers, as axis inputs, allow us, as developers, to measure a user’s input on a spectrum, for example, between 0 and 1.

After you’ve set up a second point light in your scene, name it RoomLight. Create a new empty game object in the Scene Hierarchy and name it LightController2. Because this game object is going to be a controller in our scene, let’s add a script to it. The script, attached to the controller as a component, will give the controller instructions for the object it controls. Let’s create a new C# script in our Scripts folder in the Project window and name it, creatively, Dimmer2. Once you’ve confirmed you’ve spelled and capitalized the script’s title as intended, double-click it to open it in Visual Studio.

Now that we’re in Visual Studio, let’s delete everything in this script and write some code from scratch. If you’re feeling palpitations in your chest, don’t worry; I’ll be here to guide you each step of the way.
using UnityEngine;
That’s the first line we write at the top of our script. It tells our script what code library to use. By telling our script to use the UnityEngine namespace, we give our script access to all the classes and functions defined inside it, such as the Input system, the Event Manager system, the Physics system, and so on.
public class Dimmer2 : MonoBehaviour {

Like the first statement we typed, the class definition appears by default in a new Unity script. The name is the one we gave our script when first creating it in the Project window. Recall that a class is a blueprint for an object. We won’t be creating a Dimmer2 object in our scene. We still, however, need the script to exist for us to access the functions (verbs, the doers) in our script. It exists to hold properties and functions for us like a container.

Notice that our class name inherits from the MonoBehaviour class, and that the line concludes with an opening curly brace {. What does it mean when our class inherits from the MonoBehaviour class? It means we can now attach the script of our class to a Unity game object as a component. The opening curly brace tells our compiler (the program that translates our human-readable C# code into an intermediate language that the Unity runtime executes, and that Unity’s IL2CPP option can translate further into C++ and then machine code) that we are beginning the definition of a class. The next line we write is this:
public Light targetLight;
This line declares a public variable called targetLight that is of data type Light, which we gain access to through the using UnityEngine statement at the start of our script. Because it is public, the field will be available to us in our Unity Editor.
public void DimLight(float rate) {
Again, we use an opening curly brace. Here, however, our curly brace does not indicate the creation of a new class. Because we are already within a class called Dimmer2, this opening curly brace marks the opening of the code block for our function DimLight. The parentheses after a function’s name indicate the type of input our function will take. DimLight will take a float data type. The name of our parameter is arbitrary. I’ve chosen the variable name rate because it determines the rate at which our light’s intensity will increase or decrease according to the user’s trigger input. Although the variable name of a parameter is arbitrary, it must remain consistent throughout the function’s body.
Debug.Log("Dimmer2 function called");
A value within quotation marks in a script is of data type string. In this line, I’ve passed a string as an argument into the Log function of the Debug class. Let’s do a quick experiment. Comment out the first line we typed into our script using two forward slashes:
//using UnityEngine;
After a moment your script will recompile. Do you see three red squiggly lines in your code as shown in Figure 7-19? If so, good! In Visual Studio, red squiggly lines tell us we have an error in our code.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig19_HTML.jpg
Figure 7-19

Commenting out the UnityEngine using statement removes the definitions for objects in its namespace, creating errors

At the bottom left of the Visual Studio window is a menu called Error List (Figure 7-20). If you click that menu, a window appears that lists the errors the compiler found in our code.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig20_HTML.jpg
Figure 7-20

The Error List in Visual Studio lists the errors in your code. Double-clicking an error often takes you to the specific line in the code where the error occurs

The third error that appears in my code is this:
Error CS0103 The name 'Debug' does not exist in the current context
Now, let’s uncomment our using statement by removing the two forward slashes. The red squiggly lines disappear and our Error List shows no errors. Further, in our code the words MonoBehaviour, Light, and Debug have returned to the color green (Figure 7-21). We’ve just learned that these three words are classes in the UnityEngine namespace. We did not have to fully qualify these class names in our script because the statement using UnityEngine brings their namespace into scope for us.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig21_HTML.jpg
Figure 7-21

Removing the comment marks returns the UnityEngine library to the scope of the project

I’ve called the Log function on the Unity Debug class and placed it into our function so that I have some kind of visual confirmation that the code is executing, should an error appear elsewhere in my scene. Its value will become apparent shortly.
float intensity = targetLight.intensity;
Here, again, we’re using a new variable to hold the value of a property on our targetLight object, as if our script were a garage accepting a car called targetLight into service. We can’t work directly on the car, so we remove its component of interest to isolate it for manipulation. I define my new variable intensity with the data type float because that is the data type of a Light object’s intensity property. Because Unity measures a Light’s intensity with a float value, the property is a great fit for a Unity 1D Axis Action through VRTK. Recall that the distinguishing feature between an Axis Action and a Button Action is that an Axis Action can communicate values on a spectrum. Whereas the integer data type (int) can only hold whole numbers, the float data type can hold decimal numbers. As a result, we can measure a user’s input beyond just 0 and 1; we can also capture values (to a finite decimal precision, of course) between 0 and 1. Naturally, this makes the float type a perfect fit for our DimLight function, because we are interested in capturing the intensity of a light not only at its maximum and minimum (on and off), but also in the range between.
intensity = rate;
After storing our public targetLight game object’s intensity property in a variable of its own, we set the value of that variable, intensity, to the value of the argument of our function. Remember, the argument of a function is the specific value passed into the function at the time of its call. Because VRTK’s Unity 1D Axis Action component will execute our DimLight function whenever the user presses the right trigger button on the touch controller, the rate will be whatever float value corresponds to the pressure with which the user pulls the trigger. This expression, therefore, sets the intensity of our targetLight according to the input from the user’s finger.
targetLight.intensity = intensity;

Here, after we’ve manipulated the desired property, we assign it back to the game object from which we originally took it. In our mechanic analogy, this code equates to the moment we return the car’s carburetor, now repaired, to the car’s engine.

Finally, to complete our script, we add two closing curly braces } } to mark the end of our DimLight function code block and Dimmer2 class (Figure 7-22). That’s it! Let’s save our script and return to Unity.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig22_HTML.jpg
Figure 7-22

The completed program is shown for the Dimmer2 C# script
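For reference, here is the completed Dimmer2 script, assembled from the statements we wrote above; if the figure is hard to read, you can copy this listing directly into Dimmer2.cs:
```csharp
using UnityEngine;

public class Dimmer2 : MonoBehaviour {

    // Assign the RoomLight point light to this field in the Inspector.
    public Light targetLight;

    // Called by the Unity Axis 1D Action's Value Changed event;
    // rate is the trigger's current axis value, between 0 and 1.
    public void DimLight(float rate) {
        Debug.Log("Dimmer2 function called");
        float intensity = targetLight.intensity;
        intensity = rate;
        targetLight.intensity = intensity;
    }
}
```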

Step 3: Add a Unity Axis Action Component to a Game Object

After we allow Unity to compile the changes we’ve made to our Dimmer2 script, we’re ready to attach it as a component to our LightController2 game object. Select the controller object in the Scene Hierarchy, and in the Inspector add the Dimmer2 script as a component. Remember, you can either click Add Component and search for Dimmer2 or drag and drop the script from the Project window onto the controller object.

With our Dimmer2 script attached to our LightController2 object we can add the VRTK Unity Axis 1D Action component. With the LightController2 game object selected in the Scene Hierarchy, click Add Component in the Inspector and search for the Unity Axis 1D Action script (Figure 7-23). Notice that it has the same four events as the Unity Button Action component. Uniquely, though, the Axis 1D Action component has a field called Axis Name. This will be the axis to which our Value Changed event will listen. What’s the one-dimensional axis to which we’re interested in listening? It’s the right trigger button on the VR touch controller. How do we do that?
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig23_HTML.jpg
Figure 7-23

Add a Unity Axis 1D Action as a component by clicking Add Component in the Inspector and searching for the term

Step 4: Identify the Controller Button to Connect to the Unity Axis Action Through the Input Menu

Recall that when we imported VRTK into our Unity project, a window appeared asking us if we’d like to import the VRTK Input Mappings. If you didn’t click OK, it’s no problem. Simply navigate to Main Menu ➤ Window ➤ VRTK ➤ Manage Input Mappings (Figure 7-24).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig24_HTML.jpg
Figure 7-24

You can access the VRTK Manage Input Mappings setting anytime through the Window menu

When we open our Unity input settings (Main Menu ➤ Edit ➤ Project Settings ➤ Input) we see a drop-down list of all the axes to which Unity has mapped input.

Your list of inputs might look different than mine, but what’s relevant to this exercise is the presence of the VRTK Axis mappings (Figure 7-25). If they do not appear in your Input list, repeat the import process described in the previous paragraph.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig25_HTML.jpg
Figure 7-25

Confirm that VRTK input mapping took place by locating the VRTK axes in the Unity Input Manager

Let’s expand the heading VRTK_Axis10_RightTrigger. You’ll find a collection of fields defining the right trigger axis. Copy the value of the Name property to your computer’s clipboard (Figure 7-26). Close the Input menu and return to your project’s Scene Hierarchy.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig26_HTML.jpg
Figure 7-26

Copy the Name value of the right trigger axis input to connect a Unity Axis Action to the right touch controller

With your LightController2 object selected, turn your attention to the Inspector. In the Axis Name field of the Unity Axis 1D Action component, paste the name of the VRTK axis you just copied (Figure 7-27).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig27_HTML.jpg
Figure 7-27

Pasting the right trigger axis property name into the Unity Axis 1D Action component connects the trigger event to our Dimmer2 script

Congratulations! You just mapped your controller game object and its attached Dimmer2 script to your touch controller’s right trigger button. Now, the float variable rate we defined as a parameter on our DimLight function in our Dimmer2 script will read the value communicated by the pressure placed on your touch controller’s right trigger. With our input into our Unity Axis 1D Action established, let’s define our output.

Step 5: Connect the Chain of Events

Because our right trigger event will control the intensity of our room light, we want to call our DimLight function whenever the user’s pressure on the trigger changes. To accomplish this, we’ll connect our DimLight function as the subscriber, or listener, to the Value Changed event of the Axis Action component (the publisher). Click the + button on the Unity Axis 1D Action component’s Value Changed (Single) event. In the field that presently reads None (Object), drag and drop the Dimmer2 script component from the same LightController2 object (Figure 7-28).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig28_HTML.jpg
Figure 7-28

Drag and drop the Dimmer2 script component from the LightController2 object into the object property of the Unity Axis Action component

From the Value Changed function pull-down menu select Dimmer2 ➤ (Dynamic Float) DimLight (Figure 7-29). Although there is a DimLight(float) function listed beneath the Static Parameter heading, we want to use the DimLight function beneath the heading Dynamic Float. Confirm that the Dimmer2 script component in the LightController2’s Inspector window has a Light object from your scene identified in its Target Light field. In my scene, I placed a point light in the top left window of the building (in the negative x axis direction). I named the light RoomLight in my Scene Hierarchy and dropped it into my Dimmer2’s Target Light field. The Light object you select should be the one you created after placing the alleyway light in the scene. Before we move on from the controller object, confirm that you’ve properly named the Axis Name on the Axis 1D Action component and connected the Dimmer2 script component to the Value Changed event.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig29_HTML.jpg
Figure 7-29

Dragging the Dimmer2 script into the Unity Axis 1D Action component allows us to connect the Axis Action script to the DimLight function on our Dimmer2 script
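If you prefer wiring events in code rather than in the Inspector, the same subscription can be sketched as follows. Treat this as a hedged sketch: it assumes the Unity Axis 1D Action component lives in the Zinnia.Action namespace (VRTK 4 is built on the Zinnia library) and exposes its Value Changed event as a standard UnityEvent that accepts a float listener; names may vary across VRTK and Zinnia versions.
```csharp
using UnityEngine;
using Zinnia.Action; // Assumed namespace for UnityAxis1DAction in VRTK 4

public class DimmerWiring : MonoBehaviour {

    public UnityAxis1DAction rightTriggerAxis; // the Axis 1D Action component
    public Dimmer2 dimmer;                     // the script holding DimLight

    void OnEnable() {
        // Equivalent to adding a Dynamic Float listener in the Inspector.
        rightTriggerAxis.ValueChanged.AddListener(dimmer.DimLight);
    }

    void OnDisable() {
        // Unsubscribe to avoid calls on a disabled object.
        rightTriggerAxis.ValueChanged.RemoveListener(dimmer.DimLight);
    }
}
```
A Dynamic Float listener added this way receives the event’s live float value, which is exactly what the Inspector’s Dynamic Float section accomplishes.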

Step 6: Add a Virtual Camera and a TrackedAlias Prefab to the Scene

Back in the Scene Hierarchy, let’s add a UnityXRCameraRig object (if you plan to play-test with an external HMD) or a SimulatedCameraRig (if you plan to play-test inside the Unity Editor). Because we are listening to a trigger event from our VR touch controller, let’s also drag and drop a TrackedAlias game object into our hierarchy. Increase the size of the Element property on the TrackedAlias’s Tracked Alias Facade component and drag and drop the UnityXRCameraRig into the field Element 0 to complete the synchronization between the TrackedAlias game object and the UnityXRCameraRig (Figure 7-30).
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig30_HTML.jpg
Figure 7-30

Map the UnityXRCameraRig prefab to the TrackedAlias prefab by dragging and dropping the camera object into the Elements property of the TrackedAlias

Finally, in the Scene Hierarchy, also confirm that only one camera game object is active. If your scene includes a default Main Camera game object, then deactivate it in the Inspector. Further, confirm the two point light objects you’ve added in this chapter are active, too, and connected to their respective LightControllers.

Step 7: Play-Test

That’s it! Save your scene and give it a whirl!

If you play-tested your scene, did you find that when you pulled the right trigger button on your touch controller, the point light you placed in one of the building’s windows dimmed on and off? If not, check the Unity console for any error messages. If the Axis 1D Action did work, then your Console window should show the string we passed into the Debug.Log() function in our Dimmer2 script each time you pulled the right trigger on your touch controller (Figure 7-31). You now have the power to create VR apps that can read the sensitivity of a user’s input through touch.
../images/488645_1_En_7_Chapter/488645_1_En_7_Fig31_HTML.jpg
Figure 7-31

The Console window prints the string we entered into our Debug.Log() function to confirm our trigger action connects to the script we wrote

As Annie, you just heard back from Elgin that the theater’s board of directors were impressed with his immersive presentation. With a clear understanding of Elgin’s vision they have no problem cutting the check required to fund the production. Congratulations, Annie! Because of you the show can go on!

The application of the tools you’ve learned in this exercise is, of course, not limited to dimming lights. With the knowledge you now have about VRTK, Unity, and C# scripting, you can create an assortment of actions influenced by the sensitivity of a user’s input. For example, you could create an interior design tool to help a user visualize what different shades of a certain color would look like in a bedroom. You could prototype an installation and test parameters of different ambient features in a space. The possibilities are defined by your imagination.

Summary

In this chapter we covered the creation of an event handler that connected user input through both a keystroke and a touch controller trigger action. The event handler, contained within a VRTK Unity Action component, published a notification to our application when the user provided a particular input. We created subscribers to these user inputs by writing functions in C# scripts that performed an operation on a property of a game object in our scene. Finally, we connected both the scripts and the game objects that served as targets for our user’s actions to the VRTK Unity Action component’s event handler, which we stored in our scene as components of an empty game object we called a controller.

Specifically, in the location scouting exercise, the event handler with which we concerned ourselves the most was the Value Changed event on the Unity 1-Dimensional Axis component. The Directional Light has been a consistent game object in the scenes we have created throughout this book, but in this chapter we manipulated its default settings and introduced Point Light game objects into the scene as listeners to our 1D Axis Action event. Connecting control of the Point Lights’ intensities to both a keyboard input and controller input demonstrated two of the ways available to us, as developers, to capture user input. As VR aims to liberate the user from a workstation like a computer, events in our applications tied to controller input increase the immersive sensation of our experience. Further, familiarity with VRTK’s Button and Axis Action components streamlines the trial-and-error methodology of iterating a VR experience. The more reusable components we can use in the design of our VR applications, like VRTK’s Action components, the more quickly we can create and the more code complexity we can avoid.

So far, we’ve imported 3D assets into our projects; we’ve connected our headsets to our scenes through virtual cameras; and we’ve mapped changes to game objects to user input through both a keyboard and touch controller. Although the key to effective immersive experiences is dynamic feedback to a user’s input, the promise of VR lies in much more than interacting with an object at a distance. We want to be able to touch a virtual object and feel as if our bodies can change our environment. That’s the real high of a virtual existence; the moment our minds slip from R to VR. In the next chapter, we’ll delve deeper into that through the use of VRTK’s suite of tools for interaction.
