PART IV. 3D Interaction Techniques

In Part III, we presented information about the input and output device technologies that make 3D interaction possible. However, choosing or designing good interaction devices is not sufficient to produce a 3D UI that enables a good user experience. In Part IV, we discuss interaction techniques for the most common 3D interaction tasks. Remember that interaction techniques are methods used to accomplish a given task via the interface and that they include both hardware and software components. The software components of interaction techniques are also known as control-display mappings or transfer functions, and they are responsible for translating information from the input devices into associated system actions that are then displayed to the user (see the introduction to Part III). Many of the techniques we present can be implemented using a variety of different devices; the interaction concept and the implementation details are what make them unique.

We organize Part IV by user interaction task. Each chapter describes a task and variations on that task. Techniques that can be used to complete that task are discussed, along with guidelines for choosing among the techniques. We also provide implementation details for some important techniques.

The implementation details are described in plain English. We decided not to provide mathematical equations, code, or pseudocode. Today, much of this machinery is directly supported by the functionality of modern game engines and development kits, and any equations that are needed can usually be found in the papers referenced in our discussions. We decided not to provide code or pseudocode for several reasons. Code would have been extremely precise, but we would have had to choose a language and a toolkit or game engine on which the code would be based. Pseudocode would have been more general, but even with pseudocode, we would be assuming that your development environment provides a particular set of functionality and uses a particular programming style. Thus, we decided to use natural language to describe the interaction techniques. This choice keeps the descriptions general while remaining precise enough for each reader to translate the implementation concepts into his or her own development environment.

Chapter 7, “Selection and Manipulation,” covers the closely related tasks of selection and manipulation. We begin with these tasks because they have been widely studied, they are fundamental aspects of 3D interaction, and techniques for these tasks form the basis for many other 3D interaction techniques.

Chapter 8, “Travel,” relates to the task of navigation, which is movement in and around an environment—a fundamental human task. Navigation includes both travel and wayfinding. Travel is the motor component of navigation—the low-level actions that the user takes to control the position and orientation of the viewpoint. In the real world, travel is the more physical navigation task, involving moving the feet, turning a steering wheel, letting out a throttle, and so on. In the virtual world, travel techniques allow the user to translate and/or rotate the viewpoint and to modify the conditions of movement, such as the velocity. Wayfinding is the cognitive component of navigation—high-level thinking, planning, and decision-making related to user movement. It involves spatial understanding and planning tasks, such as determining the current location within the environment, determining a path from the current location to a goal location, and building a mental map of the environment. In virtual worlds, wayfinding can also be crucial—in a large, complex environment, an efficient travel technique is of no use if the traveler has no idea where to go. When we speak of wayfinding techniques, we refer to wayfinding aids included as part of the interface or in the environment. Unlike travel techniques or manipulation techniques, where the computer ultimately performs the action, wayfinding techniques support the performance of the task only in the user’s mind.

System control tasks and symbolic input are the topics of Chapter 9, “System Control.” System control addresses changing the mode or state of the system, often through commands or menus. Symbolic input, the task of entering or editing text, numbers, and other symbols, is often interwoven with system control tasks. These two tasks have not been as heavily researched as manipulation, travel, and wayfinding, but they are nonetheless important for many 3D UIs.
