Chapter 10. Responding to the User

In this chapter, we will cover the following recipes:

  • Responding to simple touches
  • Responding to scroll gestures
  • Responding to manipulation gestures
  • Detecting rotate gestures
  • Responding to custom user gestures
  • Listening to sensor data
  • Listening for sensor triggers
  • Discovering the environment
  • Detecting device shakes

Introduction

Apps would not be useful if there were no way for the user or the system to provide input. The very definition of an app is to fulfil a purpose for the user, and the only way to let the app know what to do is to provide it with input. Almost all apps require some input, ranging from word processors that handle hundreds of keystrokes and mouse clicks to screensavers that close as soon as the mouse moves. All apps take input, process it, and output the result in some form.

For modern mobile devices, the primary form of input is touch, and on some devices, the power button is the only hardware button. Early Android devices were built with keypads, but now almost all devices are built with large touchscreens.

A touchscreen is actually very limited, because it can only respond to input when a user actually touches the screen. This makes processing the touch events the real source of input. Depending on how much pressure the user applies, how many fingers the user uses, and how the user moves those fingers, we can determine what the user is doing.

If the user taps the screen with a single finger for a few milliseconds, it is a simple tap. If the user presses and holds for a few seconds, it is a long press. When the user presses and drags a finger across the screen, it can be a scroll. If the user places multiple fingers on the screen, there is a whole new set of events to process. But that is not all: because mobile devices are so portable, they can actually be moved around in their environment, providing another source of input data. The multitude of sensors built into the device allows the app to detect whether the device is being moved, rotated, or shaken, or even that the user is walking with it. There are also many sensors that detect what is happening around the device, such as the amount of light or the air pressure. This provides yet another source of input data.
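The tap versus long-press distinction described above comes down to how long the finger stays on the screen. The following is a minimal, hypothetical sketch of that idea in plain Java; the threshold value and class names are illustrative assumptions, not the Android framework's actual implementation (on Android itself, `GestureDetector` handles this for you):

```java
// Hypothetical sketch: classifying a touch by how long the finger
// stayed down, mirroring the tap / long-press distinction above.
// LONG_PRESS_THRESHOLD_MS is an assumed cutoff, not Android's value.
public class TouchClassifier {

    static final long LONG_PRESS_THRESHOLD_MS = 500;

    public enum TouchType { TAP, LONG_PRESS }

    // downTimeMs / upTimeMs: timestamps of finger-down and finger-up
    public static TouchType classify(long downTimeMs, long upTimeMs) {
        long duration = upTimeMs - downTimeMs;
        return duration < LONG_PRESS_THRESHOLD_MS
                ? TouchType.TAP
                : TouchType.LONG_PRESS;
    }

    public static void main(String[] args) {
        System.out.println(classify(0, 80));    // a quick touch: TAP
        System.out.println(classify(0, 1200));  // held down: LONG_PRESS
    }
}
```

The same principle extends to scrolls (the finger moves beyond a distance threshold before lifting) and multi-finger gestures, which the recipes in this chapter cover in detail.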

In the same way that touch events are processed to provide the real input, the sensor data also needs to be processed to determine what the user is doing with the device. The touchscreen and the sensors simply provide raw data to the app, which then processes it into useful input.
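As an illustration of turning raw sensor data into a useful event, consider detecting a shake from accelerometer readings. This is a hedged sketch in plain Java, not the approach any particular library mandates: the gravity constant and the shake threshold are assumptions chosen for illustration (on Android, the readings would come from a `SensorEvent` delivered by `SensorManager`):

```java
// Hypothetical sketch: a "shake" is assumed here to be an overall
// acceleration magnitude well above gravity. GRAVITY and
// SHAKE_THRESHOLD are illustrative assumptions.
public class ShakeDetector {

    static final double GRAVITY = 9.81;          // m/s^2
    static final double SHAKE_THRESHOLD = 2.5;   // assumed cutoff, in g

    // x, y, z: raw accelerometer readings in m/s^2
    public static boolean isShake(double x, double y, double z) {
        // Magnitude of the acceleration vector, expressed in g-force
        double gForce = Math.sqrt(x * x + y * y + z * z) / GRAVITY;
        return gForce > SHAKE_THRESHOLD;
    }

    public static void main(String[] args) {
        // Device at rest: only gravity acts on it, about 1 g
        System.out.println(isShake(0, 9.81, 0));  // false
        // Violent movement: magnitude well above the threshold
        System.out.println(isShake(20, 25, 5));   // true
    }
}
```

A real detector would also require several such spikes within a short window to avoid triggering on a single bump; the *Detecting device shakes* recipe later in this chapter shows the full approach.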

With all these forms of input, a mobile device ends up having far more input sources than a typical desktop computer. This allows mobile apps to be more natural and exciting to use, and it allows us, as developers, to create great user experiences when designing apps.
