© The Author(s), under exclusive license to APress Media, LLC, part of Springer Nature 2022
C. Coutinho, Unity® Virtual Reality Development with VRTK4, https://doi.org/10.1007/978-1-4842-7933-5_6

6. Setting Up Interactors and Virtual Hands

Christopher Coutinho
GameWorks, Mumbai, Maharashtra, India

With your Camera Rig setup complete, you now need to set up your Left and Right controller game objects, which will mimic your real-life hand movements and map to the transform of your VR controllers. In this chapter, we will first go over what an Interactor and an Interactable are. We will then set up the default controller models provided by the VRTK, which are two cuboid objects representing your hands. However, to increase immersion in Virtual Reality, we will replace these cuboid hands with virtual animated hands. After reviewing how to set up the Oculus-provided hands, you’ll go on to set up your very own custom hands that can be used with the Spatial Simulator and the Camera Rigs, Unity XR setup.

Interactors versus Interactables

To get the hands working in VR, you need to understand the concept of an Interactor and an Interactable. An Interactor is something that knows when it is touching another object with which it can interact. If you close your eyes and grab onto a small exercise ball with your hand, your hand will immediately know it has grabbed onto something. In the program, your hand is referred to as an Interactor, and the exercise ball you grab is the Interactable.

Interactors allow users to interact with the virtual world, offering them a way to select and grab objects. On the other hand, an Interactable is a game object within the virtual world with which the user can interact. In the previous example, the Interactable is the exercise ball. The VRTK provides you with two standard ways to interact with an Interactable. Direct Interaction allows the user to directly interact with objects in the VR world, such as by virtually grabbing a cup with your hand. Other examples of Direct Interaction would be flipping a switch and pressing a button.

The VRTK also provides another form of Interaction in the form of a distance grabber. The distance grabber gives you an invisible beam with a visible reticle to identify Interactable objects in your VR world. It allows such Interactable objects to be grabbed from a distance by simply pressing the Grab button set up on the Interactor. In the previous example, the cup could also be grabbed via a distance grab. However, you wouldn’t want to distance grab a switch or a button. We will look at Interactable objects and how to go about grabbing them in a later chapter.
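To make the relationship concrete, the following minimal C# sketch shows how an Interactor might detect an Interactable through Unity's trigger colliders. This is purely illustrative, not VRTK's actual implementation; the Interactable marker component here is a hypothetical stand-in.

using UnityEngine;

// Hypothetical marker component; VRTK's real Interactable is far richer.
public class Interactable : MonoBehaviour { }

// A bare-bones Interactor: it "knows" when it touches an Interactable.
// Attach to a hand object with a trigger Collider; trigger events also
// require a Rigidbody on one of the two objects involved.
[RequireComponent(typeof(Collider))]
public class SimpleInteractor : MonoBehaviour
{
    // The Interactable currently being touched, if any.
    public Interactable Touched { get; private set; }

    private void OnTriggerEnter(Collider other)
    {
        var interactable = other.GetComponent<Interactable>();
        if (interactable != null)
        {
            Touched = interactable; // e.g., the exercise ball
        }
    }

    private void OnTriggerExit(Collider other)
    {
        if (Touched != null && other.GetComponent<Interactable>() == Touched)
        {
            Touched = null;
        }
    }
}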

Setting up Interactors on Controllers

Let’s begin by having your VR controllers work as Interactors, using cuboid avatars to represent your controllers as virtual hands. With your Demo scene loaded, navigate to the Project tab and expand the “Packages” folder. Locate the Tilia Interactions, Interactables Unity package and expand it until you reach its “Runtime” folder. Then, expand its “Interactors” folder and select its “Prefabs” folder.

In the right pane, you’ll notice the Interactions Interactor prefab. This prefab needs to be added as a child to both the Left and Right controller aliases within the Camera Rigs, Tracked Alias game object in the hierarchy.

From within the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, expand its Camera Rigs, Tracked Alias child game object. Then, expand its Aliases game object until you see its Left and Right Controller Alias game objects and expand both of them.

From within the right pane of your Project tab, drag and drop the Interactions Interactor prefab onto the Left Controller Alias game object in the hierarchy, making it a child. Select this Interactions Interactor game object in the hierarchy and rename it “Interactions Interactor Left” to represent your left hand (controller).

Next, from within the right pane of your Project tab, drag and drop the Interactions Interactor prefab onto the Right Controller Alias game object in the hierarchy, making it a child. Select this Interactions Interactor game object in the hierarchy and rename it “Interactions Interactor Right” to represent your right hand (controller). Figure 6-1 shows these Interactors after they have been set up.
Figure 6-1

Interactors set up on the Left and Right controllers

Testing Out Your New Cuboid Avatar Interactors

You can now test your scene within Unity’s editor using the Spatial Simulator to see what your new virtual hands look like. But before playing your scene, ensure that within the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, the Camera Rigs, Spatial Simulator has been activated and the Camera Rigs, Unity XR and Camera Rigs, Oculus Integration have both been deactivated.

With your scene now playing within Unity’s editor, note how your hands look. You’ll see two cubes, with yellow and red pins stuck within each controller. These are your new cuboid virtual hands. You can move them about by moving your mouse around. You can also use the number keys 2 and 3 to control only your left or right hand, as discussed in the last chapter. Your new cuboid virtual hands have now been fitted with the VRTK’s Interactors, which means they are set up to enable you to grab any Interactable object.

Setting Up Realistic Animated Virtual Hands

In this section, you’ll learn how to set up realistic animated virtual hands to replace your cuboid hands. The VRTK doesn’t provide any animated hands; however, the Oculus Integration SDK provides its own animated virtual hands, which you will set up for your Camera Rigs, Oculus Integration. These Oculus-provided hands work exclusively with the Camera Rigs, Oculus Integration setup; they won’t work with either your Camera Rigs, Unity XR or your Camera Rigs, Spatial Simulator setup. You could get them working with those setups, but doing so would require a script that maps button states to different hand animations. As this book is all about using a no-coding approach, we’ll take a different route, for which I’ve provided you a Unity package, namely Unity VR Hands, that you need to download and import into your project. This package provides a custom hand, with some basic animations set up, that you’ll use with your Camera Rigs, Unity XR and Camera Rigs, Spatial Simulator setups.
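For readers curious about the scripting route just mentioned, here is a minimal, optional sketch of such a button-to-animation script. It assumes an Animator on the hand with states named Grab and Release (matching the animations used later in this chapter) and a legacy Input Manager axis named “LeftGrip,” which you would have to define yourself; none of this is required for the no-code workflow.

using UnityEngine;

// Minimal sketch: map a controller button state to hand animations.
// Assumes an Animator with "Grab" and "Release" states and a custom
// Input Manager axis named "LeftGrip" that you define yourself.
public class HandAnimationDriver : MonoBehaviour
{
    [SerializeField] private Animator handAnimator;
    [SerializeField] private string gripAxis = "LeftGrip";

    private bool wasGripped;

    private void Update()
    {
        bool isGripped = Input.GetAxis(gripAxis) > 0.5f;
        if (isGripped != wasGripped)
        {
            // Switch animation states only when the button state changes.
            handAnimator.Play(isGripped ? "Grab" : "Release");
            wasGripped = isGripped;
        }
    }
}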

Animated Hands for Camera Rigs, Oculus Integration

This section will teach you how to enable the hands provided by Oculus to work with the Camera Rigs, Oculus Integration setup. If you’re building for the HTC Vive exclusively, you can skip this part.

From within the Unity editor, select the Project tab, expand the “Assets” folder, and select the “Oculus” folder. In the search box in the right pane of the Project tab, type in “Custom Hand.” You’ll notice that the Custom Hand Left and Custom Hand Right prefabs show up. These prefabs are located in the Assets ➤ Oculus ➤ Sample Framework ➤ Core ➤ Custom Hands folder. Ensure that you use these two prefabs to set up your Oculus hands, not any other Oculus hand prefab.

Now, select the Custom Hand Left prefab within the Project tab. In the Unity hierarchy, expand the Camera Rigs, Tracked Alias game object, and further expand its Aliases game object. Then, expand its Left Controller Alias game object, expand the Interactions Interactor Left game object, and select and expand the Avatar Container game object. Next, drag and drop the Custom Hand Left prefab from the Project pane onto the Avatar Container game object, making it a child. Figure 6-2 shows the Custom Hand Left prefab nested as a child of your Left Controller Alias’s Avatar Container game object.

Within Unity’s hierarchy, expand the Camera Rigs, Tracked Alias, and Aliases game objects. Then, expand the Right Controller Alias game object, expand its Interactions Interactor Right game object, and, finally, select and expand its Avatar Container game object. Now, drag and drop the Custom Hand Right prefab from the Project pane onto the Avatar Container game object. Figure 6-2 shows the Custom Hand Right prefab nested as a child of your Right Controller Alias’s Avatar Container game object. Take a minute to ensure that you have dropped the correct prefabs into the proper locations within the hierarchy.
Figure 6-2

Oculus custom hands set up on the Left and Right controllers

Select both the Custom Hand Left and Custom Hand Right game objects within the hierarchy and look at the Inspector. You’ll see that each hand contains an OVR Grabber component. Disable this component for both hands by unchecking its checkbox within the Inspector. You’ll be using VRTK’s grab interaction system instead; all you need from the Oculus hands are their models and animations.

Playtesting Your Scene Using Your Oculus Headset

From within the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, activate Camera Rigs, Oculus Integration and deactivate both Camera Rigs, Unity XR and Camera Rigs, Spatial Simulator. Your Camera Rigs, Tracked Alias must always be active.

Now you are all set to test your new Oculus-provided hands with your Oculus headset. To do so, hit the Play button within Unity’s editor and mount your Oculus headset. Press your Grab or Trigger button on your Oculus controller to activate your hands within the scene. Now, press the Grab button on your Oculus controller, and you should see your middle, ring, and pinky fingers animate. Then, press the Trigger buttons on your Oculus controller, and you should see your index finger animate. Last, press down on your Thumbstick with your thumb to see your thumbs animate. Notice that your cuboid hands are still visible within your Oculus hands. You’ll learn how to turn off these cuboid hands in a later section of this chapter.
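Internally, the Oculus custom hand prefabs achieve this by reading analog values from the OVRInput API and feeding them into their Animator. The sketch below conveys the idea only; it is not Oculus’ actual code, and the “Flex” and “Pinch” Animator parameter names are assumptions.

using UnityEngine;

// Simplified idea behind the Oculus custom hands: analog controller
// values drive Animator parameters each frame. Not Oculus' actual code;
// the "Flex" and "Pinch" parameter names are assumptions.
public class OculusHandPoseSketch : MonoBehaviour
{
    [SerializeField] private Animator handAnimator;
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.LTouch;

    private void Update()
    {
        // Grip drives the middle, ring, and pinky fingers.
        handAnimator.SetFloat("Flex",
            OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller));

        // Trigger drives the index finger.
        handAnimator.SetFloat("Pinch",
            OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller));
    }
}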

Now it’s time to test out your Oculus hands using the Camera Rigs, Unity XR setup. Hit the Play button again in Unity’s editor to stop your scene from playing. From within the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, deactivate Camera Rigs, Oculus Integration and activate Camera Rigs, Unity XR instead. Tap the Play button within Unity’s editor and mount your Oculus headset. Try pressing the various buttons on your Oculus controller. You’ll notice that your Oculus hands no longer animate when the Camera Rigs, Unity XR or Camera Rigs, Spatial Simulator setup is enabled. Your Oculus-provided hands work exclusively with the Camera Rigs, Oculus Integration setup.

Animated Hands for Unity XR and Spatial Simulator Camera Rigs

As we saw in the previous section, your Oculus hand and finger animations won’t work with the Camera Rigs, Unity XR or the Camera Rigs, Spatial Simulator setups. That doesn’t mean they could never work; it would simply require a script that maps button states to different hand animations. As this is a no-coding book, we’ll take a different approach: the Unity VR Hands package that you’ll download and import into your project provides a Hand Proto custom hand prefab with some basic animations set up, which you’ll use with your Camera Rigs, Unity XR and Camera Rigs, Spatial Simulator setups.

First, download the Unity VR Hands Unity package, provided as part of this book’s downloads, onto your desktop or any other easily accessible folder. Drag and drop this “Unity VR Hands” Unity package file into your open project’s “Assets” folder. An Import Unity Package dialog box will pop up. Import the entire package by clicking the Import button. This may take a few seconds. Within your Assets folder, you’ll notice that a new “VR Hands” folder has been created. Your Hand Proto custom hand prefab is now available within its “Prefabs/Tutorial” folder.

Let’s begin by setting up this Hand prototype. Navigate to the Assets ➤ VR Hands ➤ Prefabs ➤ Tutorial folder, which you’ll notice contains your Hand Proto prefab. You’ll use this prefab to create both Left and Right custom hands for use with the Camera Rigs, Unity XR and Camera Rigs, Spatial Simulator setups.

Select the Hand Proto prefab available in the right pane of your Project tab and look at the Inspector. It contains an Animator component whose Controller property has been set to Hand Controller. Examine the Hand Controller by double-clicking the Controller property value in the Inspector, which launches the Animator window. You’ll see that the controller comprises four animation states: Open, Grab, Release, and Teleporting. These are the basic animations made available to you. You can always create additional animations for each finger if you want to.

Now navigate back to the Assets ➤ VR Hands ➤ Prefabs ➤ Tutorial folder. Your Hand Proto prefab will be available in the right pane of your Project tab. Within the hierarchy, ensure that both the Left and Right Controller Alias game objects are expanded to the point where you can see their Avatar Container game objects. They should already be expanded for you, as shown in Figure 6-2.

Now drag and drop the Hand Proto prefab, available within the right pane of your Project tab, onto the Avatar Container game object of the Left Controller Alias. Select this Hand Proto game object in the hierarchy and rename it “Hand Proto Left” to represent your left hand. Adjust the Inspector’s Transform, Position values so that they appear as follows: X = 0.003; Y = -0.005; and Z = 0. Then adjust the Inspector’s Transform, Scale values so that they are as follows: X = -0.85; Y = 0.85; and Z = 0.85. You may want to tweak these values further so that your left hand is positioned and scaled well for you.

By default, the provided Hand Proto prefab represents a right hand, but you haven’t been provided with a separate Left-Hand prefab. A common trick used is to mirror the right hand to look like a left hand. To enable this mirroring, you need to set the X scale value of your Hand Proto prefab to a negative value. You’ll notice that for the Hand Proto prefab that you previously set up against your Left controller Alias, you explicitly set its Transform X Scale value to -0.85, which essentially mirrored the Right-Hand Proto prefab to look and function like a left hand.
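If you ever need to apply the same mirroring at runtime rather than in the Inspector, it is a one-line use of the standard Unity transform API, as in this illustrative snippet:

using UnityEngine;

public class MirrorHand : MonoBehaviour
{
    private void Start()
    {
        // Negating the X scale mirrors the right-hand model into a left
        // hand, matching the -0.85 value set in the Inspector above.
        transform.localScale = new Vector3(-0.85f, 0.85f, 0.85f);
    }
}

Note that mirroring with a negative scale also flips the mesh’s triangle winding, which is usually harmless for hand models but can occasionally cause lighting or physics oddities worth checking.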

Now let’s begin setting up your right hand. Drag and drop the Hand Proto prefab, available within the right pane of your Project tab, onto the Avatar Container game object of the Right Controller Alias. Select this Hand Proto game object in the hierarchy and rename it “Hand Proto Right” to represent your right hand. Then set the Inspector’s Transform, Position values as follows: X = -0.003; Y = -0.005; and Z = 0. Then adjust its Transform, Scale values as follows: X = 0.85; Y = 0.85; and Z = 0.85. You may want to tweak these values further so that your right hand is positioned and scaled well for you. Note that the X Scale value for your right hand is positive, as it does not need to be mirrored.

Playtesting the Scene Using Your VR Headset

From within the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, deactivate Camera Rigs, Oculus Integration and Camera Rigs, Spatial Simulator and activate Camera Rigs, Unity XR. Your Camera Rigs, Tracked Alias must always be active. As mentioned earlier, you can even use your Oculus headset with the Camera Rigs, Unity XR setup.

In the hierarchy within each of the Avatar Container game objects, deactivate the provided Oculus hands; namely, Custom Hand Left and Custom Hand Right. Ensure that your Hand Proto Left and Hand Proto Right are both active.

Now you are all set to test your custom hand prototypes with your VR headset. To do so, hit the Play button in Unity’s editor and mount your VR headset. Try pressing the various buttons on your VR controller, and you’ll notice that your VR custom hand prototypes do not animate. This is because you still need to capture your controller’s input, based on which you can decide which animation you want to be played. You didn’t face this problem with the provided Oculus hands because capturing controller input using these hands was taken care of by the Oculus components that accompany the Oculus hand prefabs.

You’re provided four animations for your custom hand prototype: Open, Grab, Release, and Teleporting. You’ll capture controller input to function as follows:
  • When you press the Grip button on any of your controllers, the Grab animation will play.

  • When you press the Thumbstick (Trackpad) on any of your controllers, the Teleporting animation will play.

  • When you release the Thumbstick (Trackpad) or Grab button, the Release animation will play.

Animating Custom Prototype Hands

To animate your custom prototype hands so that they can be used with the Camera Rigs, Unity XR and Camera Rigs, Spatial Simulator setups, you first need a way to capture input from your controllers. You may capture input from your VR controllers and from other devices such as an Xbox controller, keyboard, and mouse. The Xbox controller, keyboard, and mouse would ideally be used when playtesting your scene using the Spatial Simulator. As input can come from different devices, you need to consolidate such varied input into a single object, which other objects can later poll.

Capturing the Grip, Mouse, or Bumper Button Input

Now, let’s look at an example of a Grab action that can be initiated via either a VR controller’s Grip buttons or an Xbox controller’s Left or Right Bumper buttons. Using the mouse, a Grab action may also be initiated against the left hand by pressing the left mouse button and against the right hand by pressing the right mouse button. For your Demo scene, you’ll be capturing Grab input using the mouse, Oculus controllers, HTC Vive controllers, and Xbox controllers: that is, input from four separate devices for each hand. Whenever you decide to introduce a new device, you need to wire its Grab signal to every object in your game that wants to listen for a Grab action, which could get messy.

The solution is to wire the Grab signals from all devices into an intermediary object and then have other objects poll this intermediary object. To achieve this level of indirection, the VRTK provides two simple, fundamental action components, namely the Boolean action and the Float action, which you will utilize to capture controller device input.
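Conceptually, a Boolean action is just a small component that many input sources can feed and that exposes events other objects can listen to. The sketch below is an illustrative stand-in, not Tilia’s actual Boolean Action component; it uses a UnityEvent so that listeners could be wired in the Inspector, exactly as you’ll do later in this chapter.

using UnityEngine;
using UnityEngine.Events;

// Illustrative stand-in for a Boolean action: any number of sources
// call Receive(), and listeners subscribe to Activated/Deactivated.
public class SimpleBooleanAction : MonoBehaviour
{
    public UnityEvent Activated = new UnityEvent();   // raised on false -> true
    public UnityEvent Deactivated = new UnityEvent(); // raised on true -> false

    public bool Value { get; private set; }

    // Any device script (Grip, Bumper, mouse button) forwards its state here.
    public void Receive(bool newValue)
    {
        if (newValue == Value) return;
        Value = newValue;
        if (newValue) Activated.Invoke();
        else Deactivated.Invoke();
    }
}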

In the example previously cited, you want to know if your VR controller’s Grip button was pressed, initiating a Grab action. The Grip button on your VR controller emits a Boolean value of either True or False. When the Grip button is pressed, the value True is emitted, and when the Grip button is released, the value False is emitted.

Likewise, if the left or right mouse button is pressed, initiating a Grab action, the value True is emitted, and when the button is released, the value False is emitted. If either of the Bumper buttons on the Xbox controller is pressed, initiating a Grab action, the value True is emitted, and when the button is released, the value False is emitted. Because a Grab action via a Grip, mouse, or Bumper button press only ever produces Boolean values, you will utilize the Boolean action component to capture such input.

The VRTK provides you with its Tilia Input, Unity Input Manager package, which contains controller mappings for buttons across several devices. These mappings allow you to listen for input from any button on the supported controller devices. The VRTK provides controller mappings for Oculus, Open VR, Xbox, and Windows Mixed Reality devices.

Let’s set this up by listening for a Grab action from the Oculus, HTC Vive, and Xbox controllers, as well as the mouse. You’ll recollect that a Grab action occurs when your VR controller’s Grip button is pressed, your Xbox controller’s Bumper buttons are pressed, or either your left or right mouse button is pressed.

Start by selecting the [VRTK_SETUP] game object in the hierarchy, right-click it, and then select Create Empty from the context menu that pops up to create a new empty game object. Rename this new game object “[VRTK_Input_Controllers].” You need to capture input from four devices: the Oculus, the HTC Vive, the Xbox, and the Mouse.

Then, select the Project tab and expand its “Packages” folder. Locate its Tilia Input, Unity Input Manager package. Expand its “Runtime” folder, then its “Prefabs” folder, and select its “Controller Mappings” folder. Within the right pane, you’ll notice many device names for which controller mappings are available; namely, the Oculus, Open VR, Windows Mixed Reality, and Xbox controllers. The controller mappings you’re interested in are the Xbox Controller, the Oculus Touch Left and Right Controllers, and the Open VR Left and Right Controllers. Your HTC Vive will use the Open VR Left and Right Controller mappings. Note that nothing has been provided explicitly for mapping the mouse or keyboard buttons.

From within the right pane of the Project tab, select the prefabs, Xbox Controller, Oculus Touch Left Controller, Oculus Touch Right Controller, Open VR Left Controller, and Open VR Right Controller. Now drag and drop these prefabs onto the [VRTK_Input_Controllers] game object in the hierarchy, making them children. Figure 6-3 shows this setup with the Left Thumbstick and Left Grip game objects expanded.
Figure 6-3

Input Controller mappings for various input controllers

Before hooking up the various controller buttons, let’s briefly explore the Input, Unity Input Manager, Oculus Touch Left Controller game object to see how the controller input buttons have been organized. A similar organizational structure applies to both the Open VR and Xbox controllers, with the button names differing slightly.

From within the [VRTK_Input_Controllers] game object in the hierarchy, expand the Input, Unity Input Manager, Oculus Touch Left Controller game object. Then, expand its Input Actions game object. You’ll immediately notice that all the buttons available on the physical Oculus controller have been mapped here, along with the different inputs you can capture for each button. Now expand the Left Thumbstick game object. You’ll see that for this Left Thumbstick, you can capture input for the following actions that can be performed against it: moving the Thumbstick horizontally or vertically, touching it, pressing down on it, and making a near touch against it. Similarly, you can capture input from the other buttons on your controller via the list of actions that can be performed against them.

Let’s look at one more button on your Oculus Touch Left Controller game object. Expand the Left Grip game object and you’ll notice that for this grip, you can capture input both for your left controller’s Grip button being pressed and for the extent to which it has been pushed in. You could thus ensure that a Grab action occurs and the Grab animation plays only if the Grip button is pushed in over 50 percent of the way. Figure 6-3 shows this setup with the Left Thumbstick and Left Grip game objects expanded. Similar buttons and actions exist for the Input, Unity Input Manager, Oculus Touch Right Controller; the Input, Unity Input Manager, Open VR Left and Right Controllers; and the Input, Unity Input Manager, Xbox Controller game objects.
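As a concrete illustration of that 50 percent idea, the following sketch reads the analog grip value and reports a grab only once it crosses a threshold. It uses the OVRInput API from the Oculus Integration, so it applies to the Oculus setup only, and SimpleBooleanAction is the illustrative class from the earlier sketch, not a Tilia component.

using UnityEngine;

// Report a grab only once the grip is squeezed past a threshold.
// Uses OVRInput from the Oculus Integration; Oculus setup only.
public class GripThresholdSketch : MonoBehaviour
{
    [SerializeField] private SimpleBooleanAction grabAction;
    [Range(0f, 1f)]
    [SerializeField] private float threshold = 0.5f; // 50 percent

    private void Update()
    {
        // Analog grip value between 0 (released) and 1 (fully squeezed).
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger,
                                  OVRInput.Controller.LTouch);
        grabAction.Receive(grip > threshold);
    }
}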

As discussed earlier, the input can come from different input devices, so you need to consolidate such varied input into a single object that other objects can later poll. For a Grab action, you can obtain input from three separate devices for each hand; that is, six unique Grip input signals and mouse input via the left and right mouse buttons, for a total of eight Grab signals.

To deal with the six Grip input signals that can be raised by your controllers and the two input signals produced by clicking your left and right mouse buttons, first create two intermediary game objects named “Left-Hand Grab” and “Right-Hand Grab.” One will listen for a left-hand Grip press or a left mouse click, and the other will listen for a right-hand Grip press or a right mouse click. All button action input associated with your left-hand controller device, as well as a left mouse button click, will be channeled into the Left-Hand Grab game object; all button action input associated with the right-hand device, as well as a right mouse button click, will be directed into the Right-Hand Grab game object. As all these input values are purely Boolean, both game objects will need a Boolean action component that can receive Boolean (True/False) values only.
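If you prefer to think of the Sources list in code terms, the sketch below shows one way an intermediary could combine several Boolean sources by OR-ing them each frame. Tilia’s real Boolean Action is event-driven rather than polled, so treat this purely as a mental model built on the illustrative SimpleBooleanAction from earlier.

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Mental model of the Sources list: the intermediary is "on" whenever
// any one of its sources is on. Tilia's real component is event-driven.
public class AggregatedBooleanAction : SimpleBooleanAction
{
    [SerializeField]
    private List<SimpleBooleanAction> sources = new List<SimpleBooleanAction>();

    private void Update()
    {
        Receive(sources.Any(source => source.Value));
    }
}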

Let’s set this up now. In the hierarchy, right-click the Demo scene game object. From the context menu that pops up, select Game Object ➤ Create Empty. Reset the Transform of this newly created empty game object and rename it “Button Input Actions.” Note that this game object is a child of your Demo (scene) game object.

You now need to configure your Button Input Actions game object. Select this game object in the hierarchy and create two new empty child objects in it. Rename the objects “Left-Hand Grab” and “Right-Hand Grab.” Select both game objects in the hierarchy, and within the Inspector, click the Add Component button. In the search box that shows up below the button, search for “Boolean Action” and add this component to both objects. Within this Boolean Action component, locate and expand its Sources property and set its Size to 4, which makes four element slots available. Each of these slots will reference a controller button input action you want to listen for. Figure 6-4 shows what this setup looks like so far.
Figure 6-4

Left-Hand Grab and Right-Hand Grab intermediary game objects with their Boolean Action component set up

Earlier in this section, we noted that controller mappings hadn’t been provided for the mouse or keyboard buttons. You’ll need to set up these mappings yourself. Let’s deviate a bit by first setting up controller mappings for your left and right mouse buttons, as these will be required inputs for the Sources element slots of both the Left-Hand Grab and Right-Hand Grab game objects. Setting up controller mappings for keyboard buttons will be discussed later, as we are currently concerned with capturing input for when a Grip, mouse, or Bumper button is pressed.

From within the hierarchy, right-click the [VRTK_Input_Controllers] game object, create a new empty child game object, and rename it “Mouse Input.” In the Project tab, with the Tilia Input, Unity Input Manager package folder still expanded, select the “Actions” folder in the “Prefabs” folder. In the right pane, you’ll notice three action prefabs. Select the Input, Unity Input Manager, Button Action prefab and drag and drop it onto the Mouse Input game object in the hierarchy. Rename it “Input, Unity Input Manager, Button Action Mouse Right.” Duplicate this game object so that the copy is also a child of Mouse Input and rename it “Input, Unity Input Manager, Button Action Mouse Left.” Select the Input, Unity Input Manager, Button Action Mouse Right game object in the hierarchy, and within the Inspector, ensure that the Unity Input Manager, Button Action component is expanded. Locate the KeyCode property. Scroll through its drop-down list and select the item Mouse 1. This captures the action of a right mouse button click, in which case a Boolean value of True will be emitted.

Next, select the Input, Unity Input Manager, Button Action Mouse Left game object from within the hierarchy. In the Inspector, ensure that the Unity Input Manager, Button Action component has been expanded. Locate the KeyCode property. Scroll through its drop-down list and select the item Mouse 0. This captures the action of a left mouse button click, in which case a Boolean value of True will be emitted. These controller mappings now enable you to listen for button input actions that happen against any of your listed devices.
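Under the hood, these two mappings correspond to Unity’s KeyCode.Mouse1 (right button) and KeyCode.Mouse0 (left button). A minimal code equivalent, again feeding the illustrative Boolean action from earlier, might look like this:

using UnityEngine;

// Forward a mouse button's state into a Boolean action, mirroring the
// Mouse 0 / Mouse 1 KeyCode mappings configured above.
public class MouseButtonSource : MonoBehaviour
{
    [SerializeField] private SimpleBooleanAction target;
    [SerializeField] private KeyCode button = KeyCode.Mouse0; // Mouse0 = left, Mouse1 = right

    private void Update()
    {
        target.Receive(Input.GetKey(button));
    }
}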

Now, let’s return to hooking up all eight Grab signals among your Left-Hand Grab and Right-Hand Grab game objects. The left-hand Grip press input raised by your left-handed controller, whether an Oculus, Xbox, or HTC Vive device, or a left mouse button click, will be hooked up to the Left-Hand Grab game object. Thus, your Left-Hand Grab intermediary game object receives four Grab input signals that need to be plugged into the four element slots shown in Figure 6-4. The same analogy applies to hooking up the inputs produced by your right-hand controller and right mouse button, which need to be hooked up to your Right-Hand Grab intermediary game object.

Let’s set up these connections for your Left-Hand Grab intermediary game object first. From within the hierarchy, select and expand the [VRTK_Input_Controllers] game object. Expand the Input, Unity Input Manager, Oculus Touch Left Controller game object. Then expand its Input Actions game object and Left Grip game object.

Next, from within the [VRTK_Input_Controllers] game object, expand the Input, Unity Input Manager, Open VR Left Controller game object. Then expand its Input Actions game object and its Left Grip game object.

Next, from within the [VRTK_Input_Controllers] game object, expand the Input, Unity Input Manager, Xbox Controller game object. Then, expand its Input Actions game object.

Finally, expand the Mouse Input game object from within the [VRTK_Input_Controllers] game object. With these input sources expanded, let’s capture input when a left-hand Grip press or a left mouse button click occurs.

From within the hierarchy, select the Left-Hand Grab game object. Note that your Boolean Action Sources property has four element slots in the Inspector, as shown in Figure 6-4.

From within the expanded Input, Unity Input Manager, Oculus Touch Left Controller game object, drag and drop the Left Grip Press game object into the Sources property Element 0 slot, as shown in Figure 6-5. Now, from within the expanded Input, Unity Input Manager, Open VR Left Controller game object, drag and drop the Left Grip Press game object into the Element 1 slot of the Sources property, as shown in Figure 6-5. Next, from within the expanded Input, Unity Input Manager, Xbox Controller game object, drag and drop the Left Bumper Press [4] game object into the Sources property Element 2 slot, as shown in Figure 6-5. Last, from within the expanded Mouse Input game object, drag and drop the Input, Unity Input Manager, Button Action Mouse Left game object into the Element 3 slot of the Sources property, as shown in Figure 6-5.
Figure 6-5

Hooking up the Left Grip, Bumper, and Mouse (press) inputs to the Left-Hand Grab intermediary game object

Now that we’re done capturing input for a Left-Hand Grab, let’s set ourselves up to capture input for a Right-Hand Grab. With the [VRTK_Input_Controllers] game object expanded, collapse the Input, Unity Input Manager, Oculus Touch Left Controller; Input, Unity Input Manager, Open VR Left Controller; and Input, Unity Input Manager, Xbox Controller game objects. Let the Mouse Input game object remain expanded, as you will be referring to its Input, Unity Input Manager, Button Action Mouse Right child game object.

From within the hierarchy, with the [VRTK_Input_Controllers] game object still expanded, expand the Input, Unity Input Manager, Oculus Touch Right Controller game object. Then expand its Input Actions game object, followed by expanding its Right Grip game object.

Next, expand the Input, Unity Input Manager, Open VR Right Controller game object. Then expand its Input Actions game object, followed by expanding its Right Grip game object.

After that, expand the Input, Unity Input Manager, Xbox Controller game object. Then, expand its Input Actions game object. Finally, ensure that the Mouse Input game object has been expanded, too.

Now that these input sources are all expanded, let’s capture input when a right-hand Grip press or a right mouse button click occurs. From within the hierarchy, select the Right-Hand Grab game object. Note that your Boolean Action Sources property has four element slots in the Inspector, as shown in Figure 6-4.

From within the expanded Input, Unity Input Manager, Oculus Touch Right Controller game object, drag and drop the Right Grip Press game object into the Sources property Element 0 slot, as shown in Figure 6-6. Now, from within the expanded Input, Unity Input Manager, Open VR Right Controller game object, drag and drop the Right Grip Press game object into the Element 1 slot of the Sources property, as shown in Figure 6-6. Next, from within the expanded Input, Unity Input Manager, Xbox Controller game object, drag and drop the Right Bumper Press [5] game object into the Sources property Element 2 slot, as shown in Figure 6-6. Last, from within the expanded Mouse Input game object, drag and drop the Input, Unity Input Manager, Button Action Mouse Right game object into the Element 3 slot of the Sources property, as shown in Figure 6-6.
Figure 6-6

Hooking up Right Grip, Bumper, and Mouse (press) inputs to the Right-Hand Grab intermediary game object

You finally have your controller mappings set up to capture input from four devices against two hands for a Grab action. Now you need something to happen when a Grab action takes place: your custom prototype hands should play the available Grab animation. You’ll need to set up this Grab animation on each of your hands.

In the hierarchy, with your Button Input Actions game object expanded, select the Left-Hand Grab game object. In the Inspector, you’ll notice its Activated Boolean and Deactivated Boolean events. The Activated Boolean event will be triggered as soon as a left-hand grab takes place. When this happens, you’ll need to play the Grab animation for your left hand. Expand this Activated Boolean event within the Inspector and click the plus symbol located in its bottom right corner to add an event listener box for this Activated event.

Your left hand is represented by the Hand Proto Left game object. It can be located within the hierarchy by navigating to [VRTK_CAMERA_RIGS_SETUP] ➤ Camera Rigs, Tracked Alias ➤ Aliases ➤ Left Controller Alias ➤ Interactions Interactor Left ➤ Avatar Container.

With the Avatar Container for the Left Controller Alias expanded, drag and drop its Hand Proto Left game object into the box located below the Runtime Only drop-down property of the Activated event, as shown in Figure 6-7. From the drop-down located at the right of the Runtime Only drop-down property, select the option Animator ➤ Play (string). Essentially, this is telling the Animator component on the Hand Proto Left game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played when the Activated event is triggered after a Grab action has occurred. In this case, the name to type in is “Grab.”
Figure 6-7

Setting up the Grab animation to play after a Grab action and the Release animation to play after an un-Grab action (Left-Hand Grab game object)

Now, let’s set up the Deactivated Boolean event on your Left-Hand Grab game object. This Deactivated Boolean event will be triggered when an “un-Grab” occurs—that is, when you release the Grip button on your left controller. When this happens, you need to play the Release animation for your left hand. Expand this Deactivated Boolean event within the Inspector and click the plus symbol located in its bottom right corner to add an event listener box for this Deactivated event.

With the Avatar Container for the Left Controller Alias expanded, drag and drop its Hand Proto Left game object into the box located below the Runtime Only drop-down property of the Deactivated event, as shown in Figure 6-7. From the drop-down located at the right of the Runtime Only drop-down property, select the option Animator ➤ Play (string). Essentially, this tells the Animator component on the Hand Proto Left game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played when the Deactivated event is triggered after an un-Grab has occurred. Here, the name to type in is “Release.”
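In code terms, the two events you just wired are equivalent to the following calls; Animator.Play(string) is the standard Unity API that the Play (string) drop-down option targets. This sketch is only a restatement of the Inspector wiring, not something you need to add.

using UnityEngine;

// Code equivalent of the Inspector wiring above: hook these methods to
// the Boolean action's Activated and Deactivated events.
public class GrabAnimationListener : MonoBehaviour
{
    [SerializeField] private Animator handAnimator; // Animator on Hand Proto Left

    public void OnGrabActivated()
    {
        handAnimator.Play("Grab");    // Animator ➤ Play (string) ➤ "Grab"
    }

    public void OnGrabDeactivated()
    {
        handAnimator.Play("Release"); // Animator ➤ Play (string) ➤ "Release"
    }
}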

Now that you’re done setting up animations against your left hand, you need to set up the same animations against your right hand. You can achieve this by setting up the Activated and Deactivated events for when a right-hand Grab occurs. Let’s set this up now.

Start by selecting the Right-Hand Grab game object in the hierarchy, with your Button Input Actions game object still expanded. In the Inspector, you’ll notice its Activated Boolean and Deactivated Boolean events. The Activated Boolean event will be triggered as soon as a right-hand Grab takes place. When this happens, you need to play the Grab animation for your right hand. Expand this Activated Boolean event within the Inspector and click the plus symbol located at its bottom right corner to add an event listener box for this Activated event.

Your right hand is represented by the Hand Proto Right game object. It can be located within the hierarchy by navigating to [VRTK_CAMERA_RIGS_SETUP] ➤ Camera Rigs, Tracked Alias ➤ Aliases ➤ Right Controller Alias ➤ Interactions Interactor Right ➤ Avatar Container.

With the Avatar Container for the Right Controller Alias expanded, drag and drop its Hand Proto Right game object into the box located below the Runtime Only drop-down property of the Activated event, as shown in Figure 6-8. From the drop-down located at the right of the Runtime Only drop-down property, select the option Animator ➤ Play (string). Essentially, this tells the Animator component on the Hand Proto Right game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played when the Activated event is triggered after a Grab action has occurred. In this case, the name of the animation to type in is “Grab.”

Now let’s set up the Deactivated Boolean event on your Right-Hand Grab game object. This event will be triggered when an un-Grab occurs—that is, when you release the Grip button on your Right controller. When this happens, the Release animation should be played for your right hand. Expand this Deactivated Boolean event within the Inspector and click the plus symbol located in its bottom right corner to add an event listener box for this Deactivated event.

With the Avatar Container for the Right Controller Alias expanded, drag and drop its Hand Proto Right game object into the box located below the Runtime Only drop-down property of the Deactivated event, as shown in Figure 6-8. From the drop-down located at the right of the Runtime Only drop-down property, select the option Animator ➤ Play (string). Essentially, this tells the Animator component on the Hand Proto Right game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played after the Deactivated event has been triggered for an un-Grab. The name to type in here is “Release.”
Figure 6-8

Setting up the Grab animation to play after a Grab action and the Release animation to play after an un-Grab action (Right-Hand Grab game object)

Playtesting the Grab and Release Hand Animations

To playtest the Grab and Release animations, expand the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy, activate Camera Rigs, Spatial Simulator, and deactivate Camera Rigs, Unity XR and Camera Rigs, Oculus Integration. Also, make sure that Camera Rigs, Tracked Alias remains active.

Now, hit the Play button within Unity’s editor and wait for your scene to load. When your Demo scene is playing, click your left and right mouse buttons to see your left and right hands animate, respectively. If you have an Xbox controller connected to your computer, you can press its Left and Right Bumper buttons to see your left and right hands animate. Note that the Xbox controller is only used with the Camera Rigs, Spatial Simulator setup. Finally, hit the Play button again to stop your scene from playing.

Next, test your scene using your VR headset. From within the [VRTK_CAMERA_RIGS_SETUP] game object, activate Camera Rigs, Unity XR and deactivate Camera Rigs, Spatial Simulator and Camera Rigs, Oculus Integration. Hit the Play button within Unity’s editor and wait for your scene to load. When your Demo scene is playing, mount your VR headset and grab your VR controllers. Press the Grip buttons on your left and right controllers to see your hands play the Grab animation.

Capturing Thumbstick and Keyboard Input

Now that you have your Grab animation working with your VR controllers’ Grip buttons, your Xbox controller’s Bumper buttons, and your left and right mouse buttons, it’s time to learn how to capture your controllers’ Thumbstick and keyboard input. In this section, upon receiving input from your controller Thumbsticks or from the Q or P keys on your keyboard, we’ll make the Teleporting animation play. Pressing the Q key will simulate a left-hand Thumbstick press, while pressing the P key will simulate a right-hand Thumbstick press. The procedure is similar to the one you used to set up your Grab animation. The only difference is that you won’t need to set up the Deactivated Boolean event here: when the Teleporting animation ends, the hand reverts to an open hand by default.

From within the hierarchy, right-click the [VRTK_Input_Controllers] game object and create a new empty child game object. Rename this object “Keyboard Input.” In the Project tab, with the Tilia Input, Unity Input Manager package folder still expanded, select the “Actions” folder in the “Prefabs” folder. In the right pane, you’ll notice three action prefabs. Select the Input, Unity Input Manager, Button Action prefab and drag and drop it onto the Keyboard Input game object in the hierarchy. Rename it “Input, Unity Input Manager, Button Action Q.” You’ll use the Q key on the keyboard to represent a left-hand Thumbstick press. Duplicate this game object so that the copy is also a child of Keyboard Input. Rename this duplicated object “Input, Unity Input Manager, Button Action P.” You’ll use the P key on the keyboard to represent a right-hand Thumbstick press. Select the Input, Unity Input Manager, Button Action Q game object in the hierarchy, and within the Inspector, ensure that the Unity Input Manager, Button Action component has been expanded. Locate the KeyCode property. Scroll through its drop-down list and select the letter Q. This captures a Q key press, resulting in a Boolean value of True being emitted.

Next, select the Input, Unity Input Manager, Button Action P game object from within the hierarchy. In the Inspector, ensure that the Unity Input Manager, Button Action component has been expanded. Locate the KeyCode property. Scroll through its drop-down list and select the letter P. This captures a P key press, resulting in a Boolean value of True being emitted. These keyboard input mappings now enable you to listen for either a Q or P key press. You can capture any number of keyboard key presses using this procedure.
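As with the mouse, these keyboard mappings reduce to simple KeyCode polling. For illustration, one script could simulate both Thumbstick presses, again feeding the illustrative Boolean actions from the earlier sketches:

using UnityEngine;

// Simulate Thumbstick presses from the keyboard:
// Q = left-hand Thumbstick, P = right-hand Thumbstick.
public class KeyboardThumbstickSource : MonoBehaviour
{
    [SerializeField] private SimpleBooleanAction leftThumbstickPress;
    [SerializeField] private SimpleBooleanAction rightThumbstickPress;

    private void Update()
    {
        leftThumbstickPress.Receive(Input.GetKey(KeyCode.Q));
        rightThumbstickPress.Receive(Input.GetKey(KeyCode.P));
    }
}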

Select the Button Input Actions game object in the hierarchy and create two new empty child objects within it. Rename these objects “Left-Hand Thumbstick Press” and “Right-Hand Thumbstick Press.” Select both of these game objects in the hierarchy, and within the Inspector, click the Add Component button. In the search box that shows up below this button, search for “Boolean Action” and add this component to both objects. Within this Boolean Action component, locate and expand its Sources property and set its Size to 4, which makes four element slots available. Each of these element slots will reference a controller Thumbstick button input action that you want to listen for. Figure 6-9 shows what this setup looks like so far.
Figure 6-9

Left-Hand Thumbstick press and Right-Hand Thumbstick press intermediary game objects with their Boolean action component set up

Now, let’s connect all eight Thumbstick press signals among your Left-Hand Thumbstick Press and Right-Hand Thumbstick Press game objects. The left-hand Thumbstick input raised by your left-handed controller, whether an Oculus, Xbox, or HTC Vive device, or a Q key press, will be hooked up to the Left-Hand Thumbstick Press game object. Thus, your Left-Hand Thumbstick Press intermediary game object will receive four input signals that need to be plugged into the four element slots shown in Figure 6-9. The same analogy applies to hooking up the inputs produced by your right-handed controller and a P key press, which need to be hooked up to your Right-Hand Thumbstick Press intermediary game object.

Let’s set up these connections for your Left-Hand Thumbstick Press intermediary game object first. From within the hierarchy, select and expand the [VRTK_Input_Controllers] game object. Expand the Input, Unity Input Manager, Oculus Touch Left Controller game object. Then expand its Input Actions game object, followed by expanding its Left Thumbstick game object.

Next, from within the same game object, expand the Input, Unity Input Manager, Open VR Left Controller game object. Then expand its Input Actions game object, followed by expanding its Left Trackpad game object.

After that, from within the same game object, expand the Input, Unity Input Manager, Xbox Controller game object. Then expand its Input Actions game object, followed by expanding its Left Thumbstick game object.

Finally, from within this game object, expand the Keyboard Input game object.

With these input sources expanded, let’s capture input for when a left-hand Thumbstick or the Q key is pressed.

From within the hierarchy, select the Left-Hand Thumbstick Press game object. Note that your Boolean Action Sources property has four element slots in the Inspector, as shown in Figure 6-9.

From within the expanded Input, Unity Input Manager, Oculus Touch Left Controller game object, drag and drop the Left Thumbstick Press [8] game object into the Sources property Element 0 slot, as shown in Figure 6-10. Now, from within the expanded Input, Unity Input Manager, Open VR Left Controller game object, drag and drop the Left Trackpad Press [8] game object into the Element 1 slot of the Sources property, as shown in Figure 6-10.

Next, from within the expanded Input, Unity Input Manager, Xbox Controller game object, drag and drop the Left Thumbstick Press [8] game object into the Element 2 slot of the Sources property, as shown in Figure 6-10.

Last, from within the expanded Keyboard Input game object, drag and drop the Input, Unity Input Manager, Button Action Q game object into the Element 3 slot of the Sources property, as shown in Figure 6-10.
Figure 6-10

Hooking up Left Thumbstick and Keyboard (press) inputs to the Left-Hand Thumbstick Press intermediary game object

Now that we’re done capturing input for a Left-Hand Thumbstick Press, let’s set ourselves up to capture input for a Right-Hand Thumbstick Press. With the [VRTK_Input_Controllers] game object still expanded, collapse the Input, Unity Input Manager, Oculus Touch Left Controller and Input, Unity Input Manager, Open VR Left Controller game objects, as well as the Xbox Controller’s Left Thumbstick game object. Keep the Keyboard Input game object expanded, as you will be referring to its Input, Unity Input Manager, Button Action P child game object.

From within the [VRTK_Input_Controllers] game object, expand the Input, Unity Input Manager, Oculus Touch Right Controller game object. Then expand its Input Actions game object, followed by expanding its Right Thumbstick game object.

Next, expand the Input, Unity Input Manager, Open VR Right Controller game object. Then expand its Input Actions game object, followed by expanding its Right Trackpad game object.

After that, with the Input, Unity Input Manager, Xbox Controller’s Input Actions game object expanded, expand its Right Thumbstick game object. Your Keyboard Input game object should already be expanded.

With these input sources expanded, let’s capture input when a right-hand Thumbstick or the P key on the keyboard is pressed. From within the hierarchy, select the Right-Hand Thumbstick Press game object. Note that your Boolean Action Sources property has four element slots in the Inspector, as shown in Figure 6-9.

From within the expanded Input, Unity Input Manager, Oculus Touch Right Controller game object, drag and drop the Right Thumbstick Press [9] game object into the Element 0 slot of the Sources property, as shown in Figure 6-11.

Now, from within the expanded Input, Unity Input Manager, Open VR Right Controller game object, drag and drop the Right Trackpad Press [9] game object into the Element 1 slot of the Sources property, as shown in Figure 6-11.

Next, from within the expanded Input, Unity Input Manager, Xbox Controller game object, drag and drop the Right Thumbstick Press [9] game object into the Element 2 slot of the Sources property, as shown in Figure 6-11. Last, from within the expanded Keyboard Input game object, drag and drop the Input, Unity Input Manager, Button Action P game object into the Element 3 slot of the Sources property, as shown in Figure 6-11.
Figure 6-11

Hooking up the Right Thumbstick and Keyboard (press) inputs to the Right-Hand Thumbstick Press intermediary game object

You now have your controller mappings set up to capture input from four devices against two hands for when the Thumbstick, Q, or P keys are pressed. Now you need something to happen when any of these actions takes place: your custom prototype hands should play the available Teleporting animation. You will need to set up this Teleporting animation on each of your hands.

In the hierarchy, with your Button Input Actions game object expanded, select the Left-Hand Thumbstick Press game object. In the Inspector, you’ll notice its Activated Boolean event. This event will be triggered as soon as a Left-Hand Thumbstick Press takes place. When this happens, you need to play the Teleporting animation for your left hand. Expand this Activated Boolean event within the Inspector and click the plus symbol located at its bottom right corner to add an event listener box for this Activated event.

Your left hand is represented by the Hand Proto Left game object within the hierarchy. It can be located by navigating to [VRTK_CAMERA_RIGS_SETUP] ➤ Camera Rigs, Tracked Alias ➤ Aliases ➤ Left Controller Alias ➤ Interactions Interactor Left ➤ Avatar Container.

Now, with the Avatar Container of the Left Controller Alias expanded, drag and drop its Hand Proto Left game object into the box located below the Runtime Only drop-down property of the Activated event, as shown in Figure 6-12. From the drop-down located at the right of the Runtime Only property, select the option Animator ➤ Play (string). Essentially, this tells the Animator component on the Hand Proto Left game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played when the Activated event is triggered after the Thumbstick or Q key has been pressed. Type “Teleporting” into this box.
Figure 6-12

Setting up the Teleporting animation to play after the Left Thumbstick or Q key has been pressed (Left-Hand Thumbstick Press)

Now that you’re done setting up the Teleporting animation against your left hand, you need to set up the same animation against your right hand. This is achieved by setting up its Activated event for when the Right-Hand Thumbstick or P key is pressed.

In the hierarchy, with your Button Input Actions game object still expanded, select the Right-Hand Thumbstick Press game object. In the Inspector, you’ll notice its Activated Boolean event. The event will be triggered as soon as the right-hand Thumbstick or P key is pressed. When this happens, you need to play the Teleporting animation for your right hand. Expand the Activated Boolean event within the Inspector and click the plus symbol located at its bottom right corner to add an event listener box for this Activated event.

Your right hand is represented by the Hand Proto Right game object within the hierarchy. It can be located by navigating to [VRTK_CAMERA_RIGS_SETUP] ➤ Camera Rigs, Tracked Alias ➤ Aliases ➤ Right Controller Alias ➤ Interactions Interactor Right ➤ Avatar Container.

With the Avatar Container for the Right Controller Alias expanded, drag and drop its Hand Proto Right game object into the box located below the Runtime Only drop-down property of the Activated event, as shown in Figure 6-13. From the drop-down located at the right of the Runtime Only property, select the option Animator ➤ Play (string). Essentially, this tells the Animator component on the Hand Proto Right game object to play a specific animation. In the box available below the Animator.Play drop-down, type in the name of the animation you’d like played when the Activated event is triggered after the Thumbstick or P key has been pressed. Type “Teleporting” into this box. You may have noticed that no Deactivated Boolean event was set up for your Thumbstick press: when the Teleporting animation ends, the hand reverts to an open hand by default.
Figure 6-13

Setting up the Teleporting animation to play after the Right Thumbstick or P key has been pressed (Right-Hand Thumbstick Press)

Playtesting Teleporting Hand Animation

Expand the [VRTK_CAMERA_RIGS_SETUP] game object in the hierarchy and activate Camera Rigs, Spatial Simulator and deactivate Camera Rigs, Unity XR and Camera Rigs, Oculus Integration. Also, ensure that Camera Rigs, Tracked Alias is permanently active.

Hit the Play button within Unity’s editor and wait for your scene to load. With your Demo scene playing, press the Q key on your keyboard to see your left thumb animate and press the P key to see your right thumb animate. If you have an Xbox controller connected to your computer, you can press its Left and Right Thumbsticks to see your Left and Right thumbs animate. You’ll need to use the Camera Rigs, Spatial Simulator only with your Xbox controller. Finally, hit the Play button again to stop your scene from playing.

Now test your scene using your VR headset. From within the [VRTK_CAMERA_RIGS_SETUP] game object, activate Camera Rigs, Unity XR and deactivate Camera Rigs, Spatial Simulator and Camera Rigs, Oculus Integration. Hit the Play button within Unity’s editor and wait for your scene to load. When your Demo scene is playing, mount your VR headset and grab your VR controllers. Press the Thumbsticks (Trackpads) on your left and right controllers to see your thumbs animate using the Teleporting animation.

Summary

In this chapter, we started by learning what an Interactor and an Interactable are. We then set up Interactors on both our Left and Right controllers and tested out the cuboid avatar hands provided by the VRTK. We then put in place realistic animated hands for our Camera Rigs, Oculus Integration setup and playtested our Demo scene with our new Oculus-provided hands.

Next, we began a long journey toward setting up custom hands for our Spatial Simulator and Camera Rigs, Unity XR. You learned why the Oculus-provided hands wouldn’t work out of the box with the Camera Rigs, Spatial Simulator, and Camera Rigs, Unity XR.

We imported the custom prototype hands provided with this book’s Unity VR Hands package into the project. We went on to set up our Hand Proto prefab, learned how to mirror it, and repositioned and resized it to our liking. We also looked at the animations provided along with the Hand Proto prefab.

Next, we took a deep dive, learning how to animate our prototype hands based on controller input. We began by setting up a Grab animation to obtain input from four devices across two hands for eight input signals to be captured. We used the Boolean action component to capture this input and learned how to set it up.

You learned to capture input from your Oculus, HTC Vive, and Xbox controllers, and input from the mouse when a Grab action occurred by pressing the Grip button on the controllers. You then learned about the Activated and Deactivated events and set them up to play an appropriate animation.

You learned how to use an intermediary game object to obtain a level of indirection by channeling several inputs into one game object and then having interested objects poll this intermediary game object. You also learned how to capture input from your Oculus, HTC Vive, and Xbox controller Thumbsticks, as well as keyboard input, and you used this input to play a Teleporting animation. Finally, we tested your Demo scene to ensure that the controllers, mouse, and keyboard input we captured worked correctly.
