© Wallace Wang 2018
Wallace Wang, Beginning ARKit for iPhone and iPad, https://doi.org/10.1007/978-1-4842-4102-8_10

10. Interacting with Augmented Reality

Wallace Wang
San Diego, CA, USA

Touch gestures let the user control an augmented reality app with fingertips alone, using gestures such as taps, swipes, rotations, and pinches. Once you’ve added touch gestures to an augmented reality app, the next step is to use those gestures to manipulate the virtual objects displayed in the augmented reality view.

In the previous chapter, we learned how to recognize when simple touch gestures, such as a tap, long press, or swipe, occur on a virtual object. In this chapter, we learn how to scale, rotate, and move a virtual object using the pinch, rotation, and pan touch gestures.

For this chapter’s example, we’ll create the different gesture recognizers through the Object Library. Let’s create a new Xcode project by following these steps:
  1. Start Xcode. (Make sure you’re using Xcode 10 or greater.)

  2. Choose File ➤ New ➤ Project. Xcode asks you to choose a template.

  3. Click the iOS category.

  4. Click the Augmented Reality App icon and click the Next button. Xcode asks for a product name, organization name, organization identifier, and content technology.

  5. Click in the Product Name text field and type a descriptive name for your project, such as RotatePinch. (The exact name does not matter.)

  6. Make sure the Content Technology popup menu displays SceneKit.

  7. Click the Next button. Xcode asks where you want to store your project.

  8. Choose a folder and click the Create button. Xcode creates an iOS project.

This creates a simple augmented reality app that displays a cartoon airplane. Now that we have the airplane displayed for us automatically, let’s start by learning to scale or resize the airplane virtual object in the augmented reality view using the pinch gesture.
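For reference, the generated ViewController.swift loads this airplane model in viewDidLoad, roughly like the following (the template code may vary slightly between Xcode versions):

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set the view's delegate and show tracking statistics
        sceneView.delegate = self
        sceneView.showsStatistics = true
        // Load the cartoon airplane scene included with the template
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        sceneView.scene = scene
    }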

The pinch gesture consists of placing two fingertips on the screen and then drawing them apart or together in a pinching motion. To place a pinch gesture recognizer in our app, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Object Library icon to display the Object Library window.

  3. Type pinch in the Object Library. The Object Library displays the pinch gesture recognizer, as shown in Figure 10-1.
Figure 10-1

Finding the pinch gesture recognizer in the Object Library window

  4. Drag the Pinch Gesture Recognizer from the Object Library window and drop it on the ARSCNView on the user interface. Although you dragged and dropped the Pinch Gesture Recognizer onto the user interface, you won’t see any sign of it anywhere except in the Document Outline, as shown in Figure 10-2. If the Document Outline is not visible, click the Show Document Outline icon or choose Editor ➤ Show Document Outline.
Figure 10-2

The Document Outline displays any gesture recognizers you place on the user interface

  1. 5.

    Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

     
  2. 6.

    Move the mouse pointer over the Pinch Gesture Recognizer in the Document Outline, hold down the Control key, and Ctrl-drag above the last curly bracket at the bottom of the ViewController.swift file.

     
  3. 7.

    Release the Control key and the left mouse button. A popup menu appears.

     
  4. 8.

    Make sure the Connection popup menu displays Action.

     
  5. 9.

    Click in the Name text field and type pinchGesture.

     
  6. 10.
    Click in the Type popup menu and choose UIPinchGestureRecognizer . Then click the Connect button. Xcode creates an IBAction method as shown here:
    @IBAction func pinchGesture(_ sender: UIPinchGestureRecognizer) {
    }
     
  7. 11.
    Edit this IBAction method pinchGesture as follows:
    @IBAction func pinchGesture(_ sender: UIPinchGestureRecognizer) {
             print ("Pinch gesture")
    }
     
  8. 12.

    Connect an iOS device to your Macintosh through its USB cable.

     
  9. 13.

    Click the Run button or choose Product ➤ Run. The first time you run this app, it will ask permission to access the camera so give it permission.

     
  10. 14.

    Place two fingertips on the screen and pinch in or out. The Xcode debug area should display Pinch Gesture to let you know that it successfully detected the pinch gesture.

     
  11. 15.

    Click the Stop button or choose Product ➤ Stop.

     

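If you prefer creating gesture recognizers in code instead of through the Object Library, you can attach the same recognizer programmatically. This is an optional sketch, assuming the lines run in the template’s viewDidLoad and target the pinchGesture(_:) action created above; the same pattern works for the rotation and pan gesture recognizers later in this chapter:

    let pinchRecognizer = UIPinchGestureRecognizer(target: self, action: #selector(pinchGesture(_:)))
    sceneView.addGestureRecognizer(pinchRecognizer)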
Scaling with the Pinch Touch Gesture

The pinch gesture is a common touch gesture for zooming in or out of an image displayed on the screen, such as while looking at a digital photograph. Likewise, this same pinch gesture can be used to scale the virtual plane that appears in the augmented reality view.

A touch gesture passes through three main states:
  • .began—Occurs when the app first detects a specific touch gesture

  • .changed—Occurs while the touch gesture is still going on

  • .ended—Occurs when the app detects that the touch gesture has stopped
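As a quick illustration (a sketch with a hypothetical handlePinchStates(_:) method, not code this project needs), you can tell these states apart with a switch statement. Note that UIGestureRecognizer.State also includes cases such as .possible, .cancelled, and .failed, which this chapter ignores:

    func handlePinchStates(_ sender: UIPinchGestureRecognizer) {
        switch sender.state {
        case .began:
            print("Pinch began")                   // fingers first recognized
        case .changed:
            print("Pinch scale: \(sender.scale)")  // fires repeatedly while pinching
        case .ended:
            print("Pinch ended")                   // fingers lifted off the screen
        default:
            break                                  // .possible, .cancelled, .failed
        }
    }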

For the pinch gesture, we only care about the .changed state: while the user pinches in or out, we want to scale the virtual plane in the augmented reality view. Edit the pinchGesture function like this:
    @IBAction func pinchGesture(_ sender: UIPinchGestureRecognizer) {
        if sender.state == .changed {
                   print ("Pinch gesture")
        }
    }

If you run this code and pinch on the screen, you should still see “Pinch gesture” appear in the debug area of Xcode. This verifies that the app still recognizes the pinch gesture.

We only want the pinch gesture to resize the virtual plane when the user pinches directly on the virtual plane and not on any other part of the augmented reality view. To detect what part of the screen the user pinched on, we first need to retrieve the entire augmented reality view like this:
let areaPinched = sender.view as? SCNView
Now we need to get the specific location that the user pinched on the screen:
let location = sender.location(in: areaPinched)
Finally, we need to use the hitTest method to determine if the user touched the virtual plane:
            let hitTestResults = sceneView.hitTest(location, options: nil)
            if let hitTest = hitTestResults.first {
            }
If the hit test returns a result (the virtual plane is the only node in this scene), we can store the touched node under an arbitrary name, such as plane:
            if let hitTest = hitTestResults.first {
                let plane = hitTest.node
            }
The pinch gesture reports a scale factor in sender.scale, which we multiply by the virtual plane’s current scale. If the user pinches out, the factor is greater than 1 and the virtual plane grows; if the user pinches in, the factor is less than 1 and the plane shrinks. The following three lines of code compute the plane’s new size in the x, y, and z directions:
   let scaleX = Float(sender.scale) * plane.scale.x
   let scaleY = Float(sender.scale) * plane.scale.y
   let scaleZ = Float(sender.scale) * plane.scale.z
Once we know how much to scale the virtual plane in the x, y, and z directions, we can apply these values to the virtual plane itself like this:
plane.scale = SCNVector3(scaleX, scaleY, scaleZ)
Finally, we reset the gesture recognizer’s scale factor to 1 so that each .changed event reports only the incremental change since the last event, rather than a cumulative value:
sender.scale = 1
The entire pinchGesture IBAction method should look like this:
    @IBAction func pinchGesture(_ sender: UIPinchGestureRecognizer) {
        if sender.state == .changed {
            let areaPinched = sender.view as? SCNView
            let location = sender.location(in: areaPinched)
            let hitTestResults = sceneView.hitTest(location, options: nil)
            if let hitTest = hitTestResults.first {
                let plane = hitTest.node
                let scaleX = Float(sender.scale) * plane.scale.x
                let scaleY = Float(sender.scale) * plane.scale.y
                let scaleZ = Float(sender.scale) * plane.scale.z
                plane.scale = SCNVector3(scaleX, scaleY, scaleZ)
                sender.scale = 1
            }
        }
    }
Run this app on a connected iOS device and pinch directly on the virtual plane (not on the area around it). You should be able to scale the virtual plane larger and smaller depending on which direction you pinch, as shown in Figure 10-3.
Figure 10-3

Pinching scales the virtual plane bigger or smaller
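One optional refinement, sketched below, is to clamp the computed scale so the user can’t shrink the plane to invisibility or enlarge it to room size. The minScale and maxScale values are hypothetical, and this variant applies one uniform factor instead of separate x, y, and z values:

    if let hitTest = hitTestResults.first {
        let plane = hitTest.node
        let minScale: Float = 0.1   // hypothetical lower bound
        let maxScale: Float = 3.0   // hypothetical upper bound
        let newScale = Float(sender.scale) * plane.scale.x
        let clamped = min(max(newScale, minScale), maxScale)
        plane.scale = SCNVector3(clamped, clamped, clamped)
        sender.scale = 1
    }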

Rotating with the Rotation Touch Gesture

The rotation gesture uses two fingertips much like the pinch gesture. The big difference is that while the pinch gesture involves moving the two fingertips closer or farther apart, the rotation touch gesture involves placing two fingertips on the screen and rotating clockwise or counter-clockwise while keeping the distance between the two fingertips unchanged.

To place a rotation gesture recognizer in our app, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Object Library icon to display the Object Library window.

  3. Type rotation in the Object Library. The Object Library displays the rotation gesture recognizer, as shown in Figure 10-4.
Figure 10-4

Finding the rotation gesture recognizer in the Object Library window

  4. Drag the Rotation Gesture Recognizer from the Object Library window and drop it on the ARSCNView on the user interface. Although you dragged and dropped the Rotation Gesture Recognizer onto the user interface, you won’t see any sign of it anywhere except in the Document Outline.

  5. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

  6. Move the mouse pointer over the Rotation Gesture Recognizer in the Document Outline, hold down the Control key, and Ctrl-drag above the last curly bracket at the bottom of the ViewController.swift file.

  7. Release the Control key and the left mouse button. A popup menu appears.

  8. Make sure the Connection popup menu displays Action.

  9. Click in the Name text field and type rotationGesture.

  10. Click in the Type popup menu and choose UIRotationGestureRecognizer. Then click the Connect button. Xcode creates an IBAction method as shown here:

    @IBAction func rotationGesture(_ sender: UIRotationGestureRecognizer) {
    }

  11. Edit this rotationGesture IBAction method as follows:

    @IBAction func rotationGesture(_ sender: UIRotationGestureRecognizer) {
        print("Rotation gesture")
    }

  12. Connect an iOS device to your Macintosh through its USB cable.

  13. Click the Run button or choose Product ➤ Run.

  14. Place two fingertips on the screen and rotate them clockwise or counter-clockwise. The Xcode debug area should display “Rotation gesture” to let you know that the app successfully detected the rotation gesture.

  15. Click the Stop button or choose Product ➤ Stop.

With the rotation gesture, we need to identify when the rotation is actually taking place and when the rotation finally stops. While the rotation gesture is occurring, we’ll need to rotate the virtual plane. As soon as the rotation ends, we’ll need to store the rotated angle as the virtual plane’s current angle.

First, we’ll need to create two variables underneath the IBOutlet for the ARSCNView like this:
    var newAngleZ : Float = 0.0
    var currentAngleZ : Float = 0.0

In this example, we rotate the virtual plane around its z-axis, so currentAngleZ stores the plane’s current angle. We then calculate a new angle, based on the rotation gesture, and store it in the newAngleZ variable.

Once these two variables are available, we can write code that detects when the rotation is happening (.changed) and when the rotation has stopped (.ended):
    @IBAction func rotationGesture(_ sender: UIRotationGestureRecognizer) {
        if sender.state == .changed {
        } else if sender.state == .ended {
            currentAngleZ = newAngleZ
        }
    }

As soon as the rotation gesture ends, we want to store the new angle of rotation (newAngleZ) in the currentAngleZ variable.

Now as soon as we detect a rotation gesture, we need to verify that this rotation gesture occurs on the virtual plane. To do this, we need to retrieve the view touched, get the location of the user’s fingertips, and use the hitTest method to determine whether the gesture touched any virtual object in the augmented reality view:
    let areaTouched = sender.view as? SCNView
    let location = sender.location(in: areaTouched)
    let hitTestResults = sceneView.hitTest(location, options: nil)
Next, we’ll need to check whether the rotation gesture occurs over the first (and only) virtual object in the augmented reality view:
    if let hitTest = hitTestResults.first {
    }
Then we’ll create a “plane” constant to represent the node that the user touched (the virtual plane) and store the rotation angle of the gesture in the newAngleZ variable:
    let plane = hitTest.node
    newAngleZ = Float(-sender.rotation)

The negative sign is necessary to coordinate the rotation gesture on the screen with the rotation of the virtual plane in the augmented reality view. Without this negative sign, the virtual plane would rotate in the opposite direction from the rotation gesture.

We’ll add this new rotation angle to the virtual plane’s current angle and then assign this new angle to rotate the virtual plane around the z-axis. To do this, we need to use the eulerAngles property that defines a virtual object’s rotation around the x-, y-, and z-axes. Since we’re only rotating the virtual plane around the z-axis, we only need to assign the new angle of rotation to the z-axis like this:
    newAngleZ += currentAngleZ
    plane.eulerAngles.z = newAngleZ
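For context, eulerAngles is an SCNVector3 whose components are angles in radians around the x-, y-, and z-axes, so the same technique extends to the other axes. A quick sketch, using a hypothetical someNode constant:

    someNode.eulerAngles.x = Float.pi / 2  // pitch 90 degrees around the x-axis
    someNode.eulerAngles.y = Float.pi      // yaw 180 degrees around the y-axis
    someNode.eulerAngles.z = 0             // no roll around the z-axis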
The complete rotation gesture IBAction method should look like this:
    @IBAction func rotationGesture(_ sender: UIRotationGestureRecognizer) {
        if sender.state == .changed {
            let areaTouched = sender.view as? SCNView
            let location = sender.location(in: areaTouched)
            let hitTestResults = sceneView.hitTest(location, options: nil)
            if let hitTest = hitTestResults.first {
                let plane = hitTest.node
                newAngleZ = Float(-sender.rotation)
                newAngleZ += currentAngleZ
                plane.eulerAngles.z = newAngleZ
            }
        } else if sender.state == .ended {
            currentAngleZ = newAngleZ
        }
    }
Remember, you must also declare the two Float variables (newAngleZ and currentAngleZ) near the top of the ViewController.swift class, like this:
    var newAngleZ : Float = 0.0
    var currentAngleZ : Float = 0.0
If you run this app, you can place two fingertips on the virtual plane and rotate them; the virtual plane rotates in the same direction, as shown in Figure 10-5.
Figure 10-5

Rotating the virtual plane with the rotation gesture

Moving Virtual Objects with the Pan Gesture

The pan gesture occurs when the user slides one fingertip across the screen in any direction. You can define both the minimum and maximum number of fingertips for a pan gesture, such as at least two fingers but no more than four. By default, the minimum number of fingertips needed to detect a pan gesture is 1.

Xcode offers two types of pan gesture recognizers. The one we’ll be using is simply called Pan Gesture Recognizer, which detects fingertip movement anywhere on the screen. The other pan gesture recognizer is called Screen Edge Pan Gesture Recognizer. If you’ve ever swiped up from the bottom of an iPhone screen to display options such as turning your iPhone into a flashlight, then you’ve used the Screen Edge Pan Gesture Recognizer that detects pans that start at the edge of a screen.
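For reference, a screen edge pan recognizer can also be created in code. This is a sketch only, assuming a hypothetical handleEdgePan(_:) action method; it is not part of this chapter’s project:

    let edgePan = UIScreenEdgePanGestureRecognizer(target: self, action: #selector(handleEdgePan(_:)))
    edgePan.edges = .bottom  // only recognize pans that start at the bottom edge
    sceneView.addGestureRecognizer(edgePan)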

To place a regular pan gesture recognizer in our app, follow these steps:
  1. Click the Main.storyboard file in the Navigator pane.

  2. Click the Object Library icon to display the Object Library window.

  3. Type pan in the Object Library. The Object Library displays the pan gesture recognizer, as shown in Figure 10-6.
Figure 10-6

Finding the pan gesture recognizer in the Object Library window

  4. Drag the Pan Gesture Recognizer from the Object Library window and drop it on the ARSCNView on the user interface. Although you dragged and dropped the Pan Gesture Recognizer onto the user interface, you won’t see any sign of it anywhere except in the Document Outline.

  5. Click the Assistant Editor icon or choose View ➤ Assistant Editor ➤ Show Assistant Editor to display the Main.storyboard and the ViewController.swift file side by side.

  6. Move the mouse pointer over the Pan Gesture Recognizer in the Document Outline, hold down the Control key, and Ctrl-drag above the last curly bracket at the bottom of the ViewController.swift file.

  7. Release the Control key and the left mouse button. A popup menu appears.

  8. Make sure the Connection popup menu displays Action.

  9. Click in the Name text field and type panGesture.

  10. Click in the Type popup menu and choose UIPanGestureRecognizer. Then click the Connect button. Xcode creates an IBAction method as shown here:

    @IBAction func panGesture(_ sender: UIPanGestureRecognizer) {
    }

  11. Edit this panGesture IBAction method as follows:

    @IBAction func panGesture(_ sender: UIPanGestureRecognizer) {
        print("Pan gesture")
    }

  12. Connect an iOS device to your Macintosh through its USB cable.

  13. Click the Run button or choose Product ➤ Run.

  14. Place one fingertip on the screen and slide it around the screen. The Xcode debug area should display “Pan gesture” to let you know that the app successfully detected the pan gesture.

  15. Click the Stop button or choose Product ➤ Stop.
Once we verify that the pan gesture works, the first step is to identify whether the user is panning across the virtual plane in the augmented reality view. We need to retrieve the location the user pans across and use the hitTest method to verify that the user’s fingertip is on a virtual object:
   let areaPanned = sender.view as? SCNView
   let location = sender.location(in: areaPanned)
   let hitTestResults = areaPanned?.hitTest(location, options: nil)
Next, we have to determine if the user touched the first node (the virtual plane):
   if let hitTest = hitTestResults?.first {
   }
Once we know that the user touched the virtual plane, we need to create a “plane” constant that represents the virtual plane’s parent node. A virtual object can consist of multiple nodes daisy-chained together to create the illusion of a single item. To move a virtual object, we need to move the parent node, because this automatically moves any attached nodes. So we need an additional if let statement like this:
   if let hitTest = hitTestResults?.first {
       if let plane = hitTest.node.parent {
       }
   }
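As an aside, here is a sketch of why moving a parent node works. The node names are hypothetical, not taken from this chapter’s project:

    let planeNode = SCNNode()        // parent node for the whole airplane
    let fuselageNode = SCNNode()     // child node holding the fuselage geometry
    let propellerNode = SCNNode()    // child node holding the propeller geometry
    planeNode.addChildNode(fuselageNode)
    planeNode.addChildNode(propellerNode)
    // Repositioning the parent moves every child node along with it
    planeNode.position = SCNVector3(0, 0, -1)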
Finally, we need to detect when the pan gesture is occurring. This happens when the pan gesture’s state is equal to .changed, so we need one final if statement nested inside, like this:
        if let hitTest = hitTestResults?.first {
            if let plane = hitTest.node.parent {
                if sender.state == .changed {
                }
            }
        }
Inside these nested if statements, we need to get the translation property from the pan gesture, which measures how far the user has moved a fingertip across the screen. Because the screen is a flat, two-dimensional surface, we can only track movement along the x- and y-axes.
let translate = sender.translation(in: areaPanned)
After getting the translation movement from the pan gesture, we can finally apply this translation movement to the virtual object itself:
plane.localTranslate(by: SCNVector3(translate.x/10000,-translate.y/10000,0.0))

This code applies the translation from the pan gesture in the x and y directions to the virtual plane. Since we can’t detect any pan gesture along the z-axis, we won’t translate in that direction, so the z value of the SCNVector3 is 0.0. Note the negative sign on translate.y: screen coordinates increase downward, while SceneKit’s y-axis increases upward, so flipping the sign makes the plane follow the fingertip.

Both the translate.x and translate.y values are divided by 10000, an arbitrary damping value. Without dividing by a large number, each pan event moves the virtual plane too far, so its motion appears choppy. Large divisors such as 10000 keep each incremental move small, which makes the motion appear smooth. Experiment with smaller divisors to see how they create choppy movement of the virtual plane as the user slides a fingertip across the screen.
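As a readability tweak (a sketch; translationDamping is a hypothetical name, not part of the chapter’s code), you could pull this divisor into a named constant to make experimenting easier:

    let translationDamping: CGFloat = 10000  // larger values = smaller, smoother steps
    plane.localTranslate(by: SCNVector3(translate.x / translationDamping, -translate.y / translationDamping, 0.0))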

The entire panGesture IBAction method should look like this:
    @IBAction func panGesture(_ sender: UIPanGestureRecognizer) {
        let areaPanned = sender.view as? SCNView
        let location = sender.location(in: areaPanned)
        let hitTestResults = areaPanned?.hitTest(location, options: nil)
        if let hitTest = hitTestResults?.first {
            if let plane = hitTest.node.parent {
                if sender.state == .changed {
                    let translate = sender.translation(in: areaPanned)
                    plane.localTranslate(by: SCNVector3(translate.x/10000,-translate.y/10000,0.0))
                }
            }
        }
    }
When you run this app, press a fingertip on the virtual plane and slide it around the screen. The virtual plane then moves in the x and y directions, as shown in Figure 10-7.
Figure 10-7

Moving the virtual plane through the pan gesture
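For completeness, another common pattern (not used in this chapter) resets the recognizer’s accumulated translation after applying it, much as the pinch handler resets sender.scale. A sketch of the .changed branch written that way; the divisor of 700 is a hypothetical value, smaller than 10000 because each event now reports only a small delta:

    if sender.state == .changed {
        let translate = sender.translation(in: areaPanned)
        plane.localTranslate(by: SCNVector3(translate.x / 700, -translate.y / 700, 0.0))
        // Zero the stored translation so the next event reports only new movement
        sender.setTranslation(.zero, in: areaPanned)
    }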

Summary

Touch gestures can interact with virtual objects and make them move, rotate, or scale. When using touch gestures to interact with virtual objects, you need to use the hitTest method to detect when the user’s touch gestures occur over a virtual object. Then you can modify that virtual object’s position, rotation, or scale.

Touch gestures provide a way for users to manipulate virtual objects within an augmented reality view and turn a static augmented reality view into an interactive one.
