Summary

This chapter covered several of the sensors that are available in iOS devices. Each sensor comes with its own framework that you use to read its data. You used these sensors to build a unique login screen for a fictional app. Using sensor data on a login screen, as we did in this chapter, can boost users' confidence in your app when it's executed well.

You learned about AVFoundation and how you can use this framework to display a live camera feed. You also learned that the UIKit framework provides a ready-to-use view controller, UIImagePickerController, that you can use in your apps to let users pick an image from their camera roll or take a new one with the camera.
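As a refresher, a minimal sketch of displaying a camera feed with AVFoundation might look like the following (the class name is illustrative, and your Info.plist needs an NSCameraUsageDescription entry before the camera can be accessed):

```swift
import AVFoundation
import UIKit

class CameraFeedViewController: UIViewController {
    let captureSession = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wire the default camera into the capture session.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)

        // A preview layer renders the live feed inside this view.
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        captureSession.startRunning()
    }
}
```

If you only need the user to pick or shoot a single image, presenting a UIImagePickerController with its sourceType set to .photoLibrary or .camera avoids this manual session setup entirely.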

Then, you saw how CoreMotion implements many classes that read motion data from the device. You can measure anything from walking distance to device tilt and movement. These sensors can be used for a wide range of useful applications, such as building a level app, or for something less useful and more fun, as we did in this chapter.
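A minimal sketch of reading motion data through CMMotionManager, CoreMotion's central class, could look like this (the update interval and the use of the main queue are illustrative choices):

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Device motion fuses accelerometer and gyroscope readings into a
// single attitude value (pitch, roll, and yaw), which is ideal for
// measuring how the device is tilted.
if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        print("pitch: \(attitude.pitch), roll: \(attitude.roll)")
    }
}
```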

Finally, we covered the CoreLocation framework. This framework provides access to a user's location data. You learned how to access the user's location in several ways: you saw how to get continuous updates about a user's whereabouts, and how to get more privacy-friendly and battery-friendly updates through significant location changes and visit monitoring.
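As a sketch of the battery-friendly approach, significant-change monitoring with CLLocationManager might be set up as follows (the class name is illustrative; these updates also require a location usage description in your Info.plist, and Always authorization for background delivery):

```swift
import CoreLocation

class SignificantChangeReceiver: NSObject, CLLocationManagerDelegate {
    let locationManager = CLLocationManager()

    override init() {
        super.init()
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
        // Instead of continuous updates, the system only wakes the app
        // when the user has moved a significant distance.
        locationManager.startMonitoringSignificantLocationChanges()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("Moved to \(latest.coordinate.latitude), \(latest.coordinate.longitude)")
    }
}
```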

This overview of the sensors doesn't cover all possible combinations or implementations you might come up with for your own apps, but you should have a good feel for the possibilities and opportunities that the hardware sensors in iOS devices provide. The next chapter is about Spotlight and how you can get it to index your app's contents to drive user engagement.
