Exploring ARKit

Apple launched the first version of ARKit in 2017, alongside Xcode 9 and iOS 11, to bring AR to iOS devices. The framework, which is included in Xcode, lets developers build AR experiences in their apps and games by combining an iOS device's motion sensors with camera-based tracking, allowing users to place virtual content in the real world. In the months after its official release, Apple added new features such as 2D image detection and face detection. The main features available for iOS 11 and above are as follows:

  • Tracking and visualizing planes (iOS 11.3+), such as a table or the ground, in the physical environment
  • Tracking known 2D images (iOS 11.3+) in the real world and placing AR content over them (image recognition)
  • Tracking faces (iOS 11.0+) in the camera feed and laying virtual content over them (for example, a virtual avatar face) that reacts to facial expressions in real time
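Each of these tracking modes is enabled by configuring and running an AR session. As a minimal sketch (assuming a view controller with an `ARSCNView` outlet named `sceneView`, a common setup when using ARKit with SceneKit), horizontal plane detection can be switched on like this:

```swift
import ARKit

// Inside a view controller that owns an ARSCNView named sceneView
// (this outlet name is an assumption for the sketch).
let configuration = ARWorldTrackingConfiguration()

// Detect horizontal surfaces such as tables or the floor.
// Vertical plane detection ([.horizontal, .vertical]) requires iOS 11.3+.
configuration.planeDetection = [.horizontal]

// Start (or restart) the session with this configuration;
// detected planes arrive as ARPlaneAnchor objects via the
// session/renderer delegate callbacks.
sceneView.session.run(configuration)
```

Image tracking and face tracking follow the same pattern, swapping in `ARImageTrackingConfiguration` or `ARFaceTrackingConfiguration` respectively. Note that ARKit requires a physical device with a camera, so this code cannot run in the simulator.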

Apart from these features, the AR experience can also be enhanced by attaching sound effects to virtual objects, or by integrating other frameworks: Vision, to add computer vision algorithms to the app, or Core ML, to run machine learning models.

In 2018, with the iOS 12 release, ARKit 2 was launched with new features:

  • 3D object tracking, where real-world objects are the ones that trigger the AR elements
  • Multiuser AR experiences, allowing users near each other to share the same AR environment
  • Adding realistic reflections to the AR objects to make the experience more realistic
  • Saving the world-mapping data so that when a user places a virtual element in the real world, the next time the app restarts, the virtual elements will appear in the same place
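The last bullet, world-map persistence, is exposed through the `ARWorldMap` class. A minimal sketch of saving and restoring a session's map (the `mapURL` location is an assumption for illustration) might look like this:

```swift
import ARKit

// Hypothetical location for the archived map; choose your own path.
let mapURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("worldMap")

// Capture the current world map (iOS 12+). The completion handler
// is called asynchronously once ARKit has gathered enough data.
sceneView.session.getCurrentWorldMap { worldMap, error in
    guard let map = worldMap,
          let data = try? NSKeyedArchiver.archivedData(
              withRootObject: map, requiringSecureCoding: true)
    else { return }
    try? data.write(to: mapURL)
}

// Later (for example, on the next launch), load the map and hand it
// to a new configuration so previously placed anchors reappear.
if let data = try? Data(contentsOf: mapURL),
   let savedMap = try? NSKeyedUnarchiver.unarchivedObject(
       ofClass: ARWorldMap.self, from: data) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = savedMap
    sceneView.session.run(configuration,
                          options: [.resetTracking, .removeExistingAnchors])
}
```

The same `ARWorldMap` object is also what multiuser experiences exchange between nearby devices, typically over `MultipeerConnectivity`.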

At the time of writing this book, iOS 13 with ARKit 3 has just been launched, and it promises a major improvement over the current state of the framework. It adds a new way of interacting with virtual elements, such as hiding virtual objects when a person is detected in front of them (people occlusion); it also lets users interact with 3D objects through gestures and captures not only facial expressions but also the motion of a person's body.
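People occlusion is opted into via the session configuration's frame semantics. As a short sketch (iOS 13+, and only on devices whose hardware supports it, which is why the capability check matters):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// People occlusion is not available on all devices, so check support
// before enabling it (iOS 13+).
if ARWorldTrackingConfiguration.supportsFrameSemantics(
    .personSegmentationWithDepth) {
    // Virtual content will be hidden behind people detected in the
    // camera feed, using estimated depth.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

sceneView.session.run(configuration)
```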

Because of the changes introduced with each iOS release, not all the features mentioned here are available on all devices. The developer page at https://developer.apple.com/documentation/arkit lists the current ARKit features and the minimum Xcode and iOS versions required to develop and test them.

For this project, we will be using plane detection, which is a basic feature that can be run on iOS 11 and above. We will look at this in the next section.
