Chapter 8

Mental Models and Conceptual Design

Objectives

After reading this chapter, you will:

1. Understand designers’ and users’ mental models and the mapping between them

2. Be able to create conceptual designs from ecological, interaction, and emotional perspectives

3. Know what storyboards are and how to produce them

4. Understand the background aspects of embodied, ubiquitous, and situated interactions

8.1 Introduction

8.1.1 You Are Here

We begin each process chapter with a “you are here” picture of the chapter topic in the context of the overall Wheel lifecycle template; see Figure 8-1. This chapter is a continuation of design, which we started in Chapter 7 and will conclude in Chapter 9, for designing the new work practice and the new system.

image

Figure 8-1 You are here; the second of three chapters on creating an interaction design in the context of the overall Wheel lifecycle template.

8.2 Mental models

8.2.1 What Is a Mental Model?

According to Wikipedia.org, “a mental model is an explanation of someone’s thought process about how something works in the real world.” A designer’s mental model is a vision of how a system works as held by the designer. A user’s mental model is a description of how the system works, as held by the user. It is the job of conceptual design (coming up soon) to connect the two.

8.2.2 Designer’s Mental Model

Sometimes called a conceptual model (Johnson & Henderson, 2002, p. 26), the designer’s mental model is the designer’s conceptualization of the envisioned system—what the system is, how it is organized, what it does, and how it works. If anyone should know these things, it is the designer who is creating the system. But it is not uncommon for designers to “design” a system without first forming and articulating a mental model.

The result can be a poorly focused design, not thought through from the start. Often such designs proceed in fits and starts and must be retraced and restarted when missing concepts are discovered along the way. Such a fuzzy start can yield a fuzzy design that causes users to experience vagueness and misconceptions. It is difficult for users to establish a mental model of how the system works if the designer has never done the same.

As shown in Figure 8-2, the designer’s mental model is created from what is learned in contextual inquiry and analysis and is transformed into design by ideation and sketching.

image

Figure 8-2 Mapping the designer’s mental model to the user’s mental model.

Johnson and Henderson (2002, p. 26) say the designer’s mental model includes metaphors, analogies, ontological structure, and mappings between those concepts and the task domain or work practice the design is intended to support. The closer the orientation of the designer’s mental model is to the user’s work domain and work practice, the more likely users are to internalize the model as their own. To paraphrase Johnson and Henderson’s rule for relating the designer’s mental model to the final design: if it is not in the designer’s mental model, the system should not require users to be aware of it.

Metaphor

A metaphor is an analogy used in design to communicate and explain unfamiliar concepts using familiar conventional knowledge. Metaphors control complexity by allowing users to adapt what they already know in learning how to use new system features.

Designer’s mental model in the ecological perspective: Describing what the system is, what it does, and how it works within its ecology

Mental models of a system can be expressed in any of the design perspectives of Chapter 7. In the ecological perspective, a designer’s mental model is about how the system or product fits within its work context, in the flow of activities involving it and other parts of the broader system. In Norman’s famous book, The Design of Everyday Things, he describes the use of thermostats (Norman, 1990, pp. 38–39) and how they work. Let us expand the explanation of thermostats to a description of what the system is and what it does from the perspective of its ecological setting.

Design Ontology

Design ontology is a description of all the objects and their relationships, users, user actions, tasks, everything surrounding the existence of a given aspect of a design.

First, we describe what it is by saying that a thermostat is part of a larger system, a heating (and/or cooling) system consisting of three major parts: a heat source, a heat distribution network, and a control unit, the latter being the thermostat and some other hidden circuitry. The heat source could be gas, electric, or wood burning, for example. The heat distribution network might use fans or blowers to send heated or cooled air through ducts, or a pump to send heated or cooled water through subfloor pipes.

Next, we address what it does by noting that a thermostat is for controlling the temperature in a room or other space. It controls heating and cooling so that the temperature stays near a user-settable value—neither too hot nor too cold—keeping people at a comfortable temperature.

Designer’s mental model in the interaction perspective: Describing how users operate it

In the interaction perspective, a designer’s mental model is a different view of an explanation of how things work; it is about how a user operates the system or product. It is a task-oriented view, including user intentions and sensory, cognitive, and physical user actions, as well as device behavior in response to these user actions.

In the thermostat example, a user can see two numerical temperature displays, either analog or digital. One value is for the current ambient temperature and the other is the setting for the target temperature. There will be a rotatable knob, slider, or other value-setting mechanism to set the desired target temperature. This covers the sensory and physical user actions for operating a thermostat. User cognition and proper formation of intentions with respect to user actions during thermostat operation, however, depend on understanding the usually hidden explanation of the behavior of a thermostat in response to the user’s settings.

Most thermostats, as Norman explains (1990, pp. 38–39), are binary switches that are simply either on or off. When the sensed ambient temperature is below the target value, the thermostat turns the heat on. When the temperature then climbs to the target value, the thermostat turns the heat source off. It is, therefore, a false conceptualization, or false mental model, to believe that you can make a room warm up faster by turning the thermostat up higher.
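Norman’s binary-switch behavior is easy to express as a short control-loop sketch. The following Python fragment is purely illustrative (the function names, heating rate, and cycle counts are our own assumptions, not any real thermostat firmware), but it shows why turning the thermostat up higher cannot warm the room faster:

```python
def heater_on(ambient, target):
    """Bang-bang (on/off) thermostat: the heater is simply on
    below the target temperature and off at or above it."""
    return ambient < target

def cycles_to_reach(ambient, goal, target, heat_rate=0.5):
    """Count control cycles until the room warms from `ambient`
    to `goal`; the heater adds `heat_rate` degrees per cycle
    whenever it is on."""
    cycles = 0
    while ambient < goal:
        if heater_on(ambient, target):
            ambient += heat_rate
        cycles += 1
    return cycles

# Cranking the target higher does not warm the room any faster;
# it only changes when the heater finally shuts off.
print(cycles_to_reach(15, 20, target=20))  # 10 cycles
print(cycles_to_reach(15, 20, target=30))  # still 10 cycles
```

Because the heater’s output is constant while it runs, a higher setting changes only the shutoff point, never the rate of warming.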

The operator’s manual for a particular furnace unit would probably say something to the effect that you turn it up and down to make it warmer or cooler, but would probably fall short of the full explanation of how a thermostat works. But the user is in the best position to form effective usage strategies, connecting user actions with expected outcomes, if in possession of this knowledge of thermostat behavior.

There are at least two possible design approaches to thermostats, then. The first is the common design containing a display of the current temperature plus a knob to set the target temperature. A second design, which reveals the designer’s mental model, might have a display unit that provides feedback messages such as “checking ambient temperature,” “temperature lower than target; turning heat on,” and “temperature at desired level; shutting off.” This latter design might suffer from being more complex to produce and the added display might be a distraction to experienced users. However, this design approach does help project the designer’s mental model through the system design to the user.

Designer’s mental model in the emotional perspective: Describing intended emotional impact

In the emotional perspective, the mental model of a design is about the expected overarching emotional response. Regarding the thermostat example, it is difficult to get excited about the emotional aspects of thermostats, but perhaps the visual design, the physical design, how it fits in with the house décor, or the craftsmanship of its construction might offer a slight amount of passing pleasure.

8.2.3 User’s Mental Model

A user’s mental model is a conceptualization or internal explanation each user has built about how a particular system works. As Norman says (1990), it is a natural human response to an unfamiliar situation to begin building an explanatory model a piece at a time. We look for cause-and-effect relationships and form theories to explain what we observe and why, which then helps guide our behavior and actions in task performance.

As shown in Figure 8-2, each user’s mental model is a product of many different inputs including, as Norman has often said, knowledge in the head and knowledge in the world. Knowledge in the head comes from mental models of other systems, user expertise, and previous experience. Knowledge in the world comes from other users, work context, shared cultural conventions, documentation, and the conceptual design of the system itself. This latter source of user knowledge is the responsibility of the system designer.

Few, if any, thermostat designs themselves carry any knowledge in the world, such as a cognitive affordance that conveys anything like Norman’s explanation of a thermostat as a binary switch. As a result, thermostat users depend on knowledge in the head, mostly from previous experience and shared conventions. Once you have used a thermostat and understand how it works, you pretty much understand all thermostats.

But sometimes mental models adapted from previous encounters with similar systems can work against learning to use a new system with a different conceptual design. Norman’s binary switch explanation is accurate for almost every thermostat on the planet, but not for one in the heater of a mid-1960s Cadillac. In a fascinating departure from the norm, you could, in fact, speed up the heating system in this car, both the amount of heat and the fan speed, by setting the thermostat to a temperature higher than what you wanted in steady state.

Since cars were beginning to have more sophisticated (in this case, read more failure prone) electronics, why not put them to use? And they did. The output heat and fan speed were proportional to the difference between the ambient temperature and the thermostat setting. So, on a cold day, the heater would run wide open to produce as much heat as possible, but it would taper off its output as it approached the desired setting.
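The contrast with the ordinary binary thermostat can be made concrete in a companion sketch. Again, this is a hypothetical illustration (the gain and cap values are invented, not the actual 1960s Cadillac circuitry): output here is proportional to the gap between the setting and the ambient temperature, so overshooting the setting really does deliver more heat while the cabin is cold.

```python
def heater_output(ambient, setting, max_output=1.0, gain=0.1):
    """Proportional controller: heat and fan output scale with
    the difference between the thermostat setting and the
    ambient temperature, capped at the heater's maximum."""
    error = setting - ambient
    if error <= 0:
        return 0.0                       # at or above the setting: off
    return min(max_output, gain * error)

# On a cold day, a higher setting drives the heater harder,
# unlike the bang-bang design:
print(heater_output(5, setting=20))    # 1.0 (wide open, capped)
print(heater_output(18, setting=20))   # 0.2 (tapering off near the target)
print(heater_output(18, setting=30))   # 1.0 (the overshoot trick still works)
```

Under this model, a user’s “turn it way up to heat faster” strategy is a correct mental model for the Cadillac and a false one for nearly every other thermostat.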

Lack of a correct user mental model can be the stuff of comedy curve balls, too. An example is the scene in the 1992 movie, My Cousin Vinny, where Marisa Tomei—as Vinny’s fiancée, Mona Lisa Vito—tries to make a simple phone call. This fish-out-of-water scene pits a brash young woman from New York against a rotary dial telephone. You cannot help but reflect on the mismatch in the mapping between her mental model of touch-tone operation and the reality of old-fashioned rotary dials as she pokes vigorously at the numbers through the finger holes.

But, lest you dismiss her as a ditzy blond, we remind you that it was she who solved the case with her esoteric knowledge in the head, proving that the boys’ 1964 Buick Skylark could not have left the two tire tracks found outside the convenience store because it did not have a limited-slip differential.

8.2.4 Mapping and the Role of Conceptual Design

The mapping in Figure 8-2 is an abstract and objective ideal transformation of the designer’s mental model into the user’s mental model (Norman, 1990, p. 23). As such, the mapping is a yardstick against which to measure how closely the user’s mental model matches the reality of the designer’s mental model.

The conceptual design as it is manifest in the system is an implementation of this mapping and can be flawed or incomplete. A flawed conceptual design leads to a mismatch in the user’s mental model. In reality, each user is likely to have a different mental model of the same system, and mental models can be incomplete and even incorrect in places.

8.3 Conceptual design

8.3.1 What Is a Conceptual Design?

A conceptual design is the part of an interaction design containing a theme, notion, or idea with the purpose of communicating a design vision about a system or product. A conceptual design is the manifestation of the designer’s mental model within the system, as indicated in Figure 8-2. It is the part of the system design that brings the designer’s mental model to life within the system. A conceptual design corresponds to what Norman calls the “system image” of the designer’s mental model (Norman, 1990, pp. 16, 189–190), about which he makes the important point: this is the only way the designer and user can communicate.

Conceptual design is where you innovate and brainstorm to plant and first nurture the user experience seed. You can never iterate the design later to yield a good user experience if you do not get the conceptual part right up front. Conceptual design is where you establish the metaphor or the theme of the product—in a word, the concept.

8.3.2 Start with a Conceptual Design

Now that you have done your contextual inquiry and analysis, requirements, and modeling, as well as your ideation and sketching, how do you get started on design? Many designers start sketching out pretty screens, menu structures, and clever widgets.

But Johnson and Henderson (2002) will tell you to start with conceptual design before sketching any screen or user interface objects. As they put it, screen sketches are designs of “how the system presents itself to users. It is better to start by designing what the system is to them.” Screen designs and widgets will come, but time and effort spent on interaction details can be wasted without a well-defined underlying conceptual structure. Norman (2008) puts it this way: “What people want is usable devices, which translates into understandable ones” (final emphasis ours).

To get started on conceptual design, gather the same team that did the ideation and sketching and synthesize all your ideation and sketching results into a high-level conceptualization of what the system or product is, how it fits within its ecology, and how it operates with users.

For most systems or products, especially domain-complex systems, the best way to start conceptual design is in the ecological perspective because that captures the system in its context. For product concepts where emotional impact is paramount, starting with that perspective is the obvious choice. At other times, the “invention” of an interaction technique, such as the iPod Classic scroll wheel, might be the starting point, a solution looking for a problem, and is best visualized in the interaction perspective.

8.3.3 Leverage Metaphors in Conceptual Design

One way to start formulating a conceptual design is by way of metaphors—analogies for communication and explanations of the unfamiliar using familiar conventional knowledge. This familiarity becomes the foundation underlying and pervading the rest of the interaction design.

What users already know about an existing system or existing phenomena can be adapted in learning how to use a new system (Carroll & Thomas, 1982). Use metaphors to control complexity of an interaction design, making it easier to learn and easier to use instead of trying to reduce the overall complexity (Carroll, Mack, & Kellogg, 1988).

One of the simplest and oldest examples is the use of a typewriter metaphor in a word processing system. New users who are familiar with the typewriter domain, including concepts such as margin setting and tab setting, will already know much of what they need to know to use these features in the word processing domain.

Metaphors in the ecological perspective

Find a metaphor that can be used to describe the broader system structure. An example of a metaphor from the ecological perspective could be the description of iTunes as a mother ship for iPods, iPhones, and iPads. The intention is that all operations for adding, removing, or organizing media content, such as applications, music, or videos, are ultimately managed in iTunes and the results are synced to all devices through an umbilical connection.

Metaphors in the interaction perspective

An example of a metaphor in the interaction perspective is a calendar application in which user actions look and behave like writing on a real calendar. A more modern example is the metaphor of reading a book on an iPad. As the user moves a finger across the display to push the page aside, the display takes on the appearance of a real paper page turning. Most users find it comfortingly familiar.

Another great example of a metaphor in the interaction perspective can be found in the Time Machine feature on the Macintosh operating system. It is a backup feature where the user can take a “time machine” to go back to older backups—by flying through time as guided by the user interface—to retrieve lost or accidentally deleted files.

One other example is the now-pervasive desktop metaphor. When graphical user interfaces for personal computers became economically feasible, the designers at Xerox PARC faced an interesting interaction design challenge: how do you communicate how the interaction design works to users, most of whom would be seeing this kind of computer for the first time?

In response, they created the powerful “desktop” metaphor. The design leveraged the familiarity people had with how a desktop works: it has files, folders, a space where current work documents are placed, and a “trash can” where documents can be discarded (and later recovered, until the trash can itself is emptied). This analogy of a simple everyday desk was brilliant in its simplicity and made it possible to communicate the complexity of a brand new technology.

As critical components of a conceptual design, metaphors set the theme of how the design works, establishing an agreement between the designer’s vision and the user’s expectations. But metaphors, like any analogy, can break down when the existing knowledge and the new design do not match.

When a metaphor breaks down, it is a violation of this agreement. The famous criticism of the Macintosh platform’s design for ejecting an external disk by dragging its icon onto the trashcan is a well-known illustration of how a metaphor breakdown attracts attention. If the Apple designers had been faithful to the desktop metaphor, the system would probably discard an external disk, or at least delete its contents, when it is dragged and dropped onto the trashcan, instead of ejecting it.

Metaphors in the emotional perspective

An example of a metaphor from the emotional perspective is seen in advertising in Backpacker magazine of the Garmin handheld GPS as a hiking companion. In a play on words that ties the human value of self-identity with orienteering, Garmin uses the metaphor of companionship: “Find yourself, then get back.” It highlights emotional qualities such as comfort, cozy familiarity, and companionship: “Like an old pair of boots and your favorite fleece, GPSMAP 62ST is the ideal hiking companion.”

8.3.4 Conceptual Design from the Design Perspectives

Just as any other kind of design can be viewed from the three design perspectives of Chapter 7, so can conceptual design.

Conceptual design in the ecological perspective

The purpose of conceptual design from the ecological perspective is to communicate a design vision of how the system works as a black box within its environment. The ecological conceptual design perspective places your system or product in the role of interacting with other subsystems within a larger infrastructure.

As an example, Norman (2009) cites the Amazon Kindle™—a good example of a product designed to operate within an infrastructure. The product is for reading books, magazines, or any textual material. You do not need a computer to download or use it; the device can live as its own independent ecology. Browsing, buying, and downloading books and more is a pleasurable flow of activity. The Kindle is mobile, self-sufficient, and works synergistically with an existing Amazon account to keep track of the books you have bought through Amazon.com. It connects to its ecology through the Internet for downloading and sharing books and other documents. Each Kindle has its own email address so that you and others can send lots of materials in lots of formats to it for later reading.

As discussed previously, the way that iPods and iTunes work together is another example of conceptual design in the ecological perspective. Norman calls this designing an infrastructure rather than designing just an application. Within this ecosystem, iTunes manages all your data. iTunes is the overall organizer through which you buy and download all content. It is also where you create all your playlists, categories, photo albums, and so on. Furthermore, it is in iTunes that you decide what parts of your data you want on your “peripherals,” such as an iPod, iPad, or iPhone. When you connect your iDevice to the computer and synchronize it, iTunes will bring it up to date, including an installation of the latest version of the software as needed.

Usability of an Ecology of Devices: A Personal Information Ecosystem

Manuel A. Pérez-Quiñones, Department of Computer Science, Virginia Tech

The world of ubiquitous computing imagined by Mark Weiser (1991) is upon us. The computational power of small devices is enabling new uses of computing away from the desktop or office. Networking and communication abilities of devices make it possible to use computing in mobile settings. Storage and display improvements make tasks that were once difficult now possible on small devices. For example, one can do photo and video editing on an iPhone. The “cloud” is tying all of these together and providing access to computing and information anytime, anywhere.

In this new environment, the biggest challenge for usability engineers is that all of these devices are used together to meet users’ information needs and goals. Whereas before we had tools dedicated to particular tasks (e.g., email programs), now we have a set of devices, each with a set of tools to support the same tasks. The usability of these tasks must be evaluated across the collection of devices working together, not as the sum of the usability of individual tools. Some tasks, on the surface, can be done on any of our many devices. Take email, for example. You can read, reply to, forward, and delete emails on your phone, tablet device, laptop, desktop, game console, or even TV or entertainment center. However, managing email sometimes entails more than that. Once you get to filing and refinding previous email messages, the task gets very complicated on some of these devices. And opening some attachments might not be possible on other devices. Also, even though we have the connectivity to talk to anyone in the world, we do not quite have enough connectivity to print an email remotely at home or at the office. The result is that no single device supports all the tasks required to accomplish our work, but the collection of devices together does, while allowing mobility and 24/7 access to information.

The challenge is how to evaluate a system of coordinated device usage that spans multiple manufacturers, multiple communication capabilities, and multiple types of activities. The experience of using (and configuring and managing) multiple devices together is very different from using only one device. As a matter of fact, the usability of just one device is barely a minimum requirement for it to work within the rest of the devices used in our day-to-day information management. Furthermore, the plethora of devices creates a combinatorial explosion of device choices that makes assessing the usability of the devices together practically impossible.

Part of the problem is that we lack a way to understand and study this collection of devices. To address this need, we have proposed a framework, called a personal information ecosystem (PIE) (Pérez-Quiñones et al., 2008), that at least helps us characterize the different ecologies that emerge for information management. The idea of ecosystems in information technology is not new, but our approach is most similar to Spinuzzi’s (2001) ecologies of genre. Spinuzzi argues that usability is not an attribute of a single product or artifact but is best studied across the entire ecosystem used in an activity. His approach borrows ideas from distributed cognition and activity theory.

At the heart of the ecology of devices is an information flow that is at its optimum point (i.e., equilibrium) when the user is exerting no extra effort to accomplish his/her tasks. At equilibrium, the user rarely needs to think of the devices, the data format, or the commands to move information to and from devices. This equilibrium, however, is disrupted easily by many situations: introduction of a new device, disruption in service (wifi out of range), changes in infrastructure, incompatibility between programs, etc. It is often quite a challenge to have all of your devices working together to reach this equilibrium. The usability of the ecosystem depends more on the equilibrium and ease of information flow than on the individual usability of each device.

However, having a terminology and understanding the relationships between devices are only the beginning. I would claim that designing and assessing user experience within an ecology of devices is what Rittel (1972) calls a “wicked problem.” A wicked problem, according to Rittel, is a problem that by its complexity and nature cannot have a definitive formulation. He even states that a formulation of the problem itself corresponds to a particular solution of the problem. Often, wicked problems have no definitive solution; instead, we judge solutions as good or bad. We often cannot even test a solution to a wicked problem; we can only indicate the degree to which a given solution is good. Finally, in wicked problems, according to Rittel, there are many explanations for the same discrepancy and there is no way to test which of these explanations is the best one. In general, every wicked problem can be considered a symptom of another problem.

Why is designing and assessing usability of an ecology a wicked problem? First, different devices are often designed by different companies. We do not really know which particular combination of devices a given user will own. Evaluating all combinations is prohibitively expensive, and expecting one company to provide all the devices is not ideal either, as monopolies tend to stifle innovation. As a result, the user is stuck in an environment that can at best provide a local optimum—“if you use this device with this other device, then your email will work ok.”

Second, while some problems are addressed easily by careful design of system architecture, eventually new uses emerge that were not anticipated by the designers. For example, if a user uses IMAP as the server protocol for email, then all devices are “current” with each other because the information about the email is stored in a central location. But even this careful design of network protocols and systems architecture cannot account for all the uses that evolve over time. The email address autocompletion and the signature that appears at the bottom of your email are both attributes of the clients and are not in the IMAP protocol. Thus, a solution based on standards can only support agreed-upon common tasks from the past; it does not support emergent behavior.

Third, the adoption of a new device into the ecology often breaks other parts that were already working effectively. As a result, whatever effort has gone into solving a workflow problem is lost when a different combination of devices is present. For example, I use an Apple MacBook Pro as my main computer, an iPad for most of my home use, and an Android phone for my communication needs. At times, finding a good workflow for these three devices is a challenge. I have settled on using GMail and Google Calendar in all three devices because there is excellent support for all three. But other genres are not as well supported. Task management, for example, is one where I currently do not have a good solution that works in my phone, the most recent addition to my PIE. New devices upset the equilibrium of the ecosystem; the problem that I am addressing (task management) is a symptom of another problem I introduced.

Fourth, the impact of changes in an ecosystem is highly personalized. I know users whose email management and information practices improved when they obtained a smartphone. For them, most of their email traffic was short and for the purpose of coordinating meetings or upcoming events. Introduction of a smartphone allowed them to be more effective in their email communication. For me, the impact was the opposite. Like most knowledge workers, I do a lot of work over email, with discussions and document exchanges. The result is that I tag my email and file messages extensively. But because my phone and tablet device provide poor support for filing messages, I now leave more messages in my inbox to be processed when I am back on my laptop. Before I added my smartphone to my ecosystem, my inbox regularly contained 20 messages. Now, my inbox has pending tasks from when I was mobile, and I regularly have 50 to 60 messages in it. Returning to my laptop now requires that I “catch up” on work that I did while mobile. The impact of adding a smartphone has been negative for me, in some respects, whereas for other users it had a positive effect.

Finally, a suitable solution to a personal ecosystem is one that depends on the user doing some work as a designer of his or her own information flow. Users have to be able to observe their use, identify their own inefficiencies, propose solutions, and design workflows that implement those solutions. Needless to say, not every user has the skills to be a designer or even to self-assess where their information flow is disrupted. Spinuzzi (2001) discusses this point using the Bødker (1991) concept of breakdowns. Paraphrasing Spinuzzi, breakdowns are points at which a person realizes that his or her information flow is not working as expected and thus the person must devote attention to his or her tools/ecosystem instead of his or her work. Typically this is what a usability engineer would consider a usability problem, but in the context of a PIE, this problem is so deeply embedded in the particular combination of devices, user tasks, and user information flows that it is practically impossible for a usability engineer to identify this breakdown. We are left with the user as a designer as the only option for improving the usability of a PIE.

As usability engineers, we face a big challenge in how to study, design, and evaluate the user experience of the personal information ecosystems that have emerged in today’s ubiquitous environments.

References

1. Bødker S. Through the Interface: A Human Activity Approach to User Interface Design. Hillsdale, New Jersey: Erlbaum; 1991.

2. Pérez-Quiñones MA, Tungare M, Pyla PS, Harrison S. Personal Information Ecosystems: Design Concerns for Net-Enabled Devices. Vila Velha, Espírito Santo, Brazil; October 28–30, 2008:3–11.

3. Rittel H. On the Planning Crisis: Systems Analysis of the ‘First and Second Generations’. Bedriftsøkonomen. 1972;8:390–396.

4. Spinuzzi C. Grappling with Distributed Usability: A Cultural-Historical Examination of Documentation Genres over Four Decades. Journal of Technical Writing and Communication. 2001;31(1):41–59.

5. Weiser M. The Computer for the 21st Century. Scientific American. 1991 September:94–100.

Conceptual design in the interaction perspective

The conceptual design from the interaction perspective is used to communicate a design vision of how the user operates the system. A good example of conceptual design from an interaction perspective is the Mac Time Machine backup feature discussed previously. Once that metaphor is established, the interaction design can be fleshed out to leverage it.

The designers of this feature use smooth animation through space to represent traveling through the different points in time where the user made backups. When the user selects a backup copy from a particular time in the past, the system lets the user browse through the files from that date. Any files from that backup can be selected and they “travel through time” to the present, thereby recovering the lost files.

As an example of designers leveraging the familiarity of conceptual designs from known applications to new ones, consider a well-known application such as Microsoft Outlook. People are familiar with the navigation bar on the left-hand side, list view at the top right-hand side, and a preview of the selected item below the list. When designers use that same idea in the conceptual design of a new application, the familiarity carries over.

Conceptual design in the emotional perspective

Conceptual design from the emotional perspective is used to communicate a vision of how the design elements will evoke emotional impact in users. Returning to the car example, the design concept could be about jaw-dropping performance and how your heart skips a beat when you see the car’s aerodynamic form, or it could be about fun and being independent from the crowd. Ask any MINI driver what their MINI means to them.

In Figure 8-3 we summarize conceptual design in the three perspectives.

image

Figure 8-3 Designer workflow and connections among the three conceptual design perspectives.

Example: Conceptual Design for the Ticket Kiosk System

There is a strong, commonly held perception of a ticket kiosk: a box on a pedestal with a touchscreen showing colorful displays of event choices. If you give a team of students, even most HCI students, 30 minutes to come up with a conceptual design of a ticket kiosk, nine times out of ten you will get something like this.

But if you teach them to approach it with design thinking and ideation, they can come up with amazingly creative and varied results.

In our ideation about the Ticket Kiosk System, someone mentioned making it an immersive experience. That triggered more ideas and sketches on how to make it immersive, until we came up with a three-panel overall design. In Figure 8-4 we show this part of a conceptual design for the Ticket Kiosk System showing immersion in the emotional perspective.

image

Figure 8-4 Part of a conceptual design showing immersion in the emotional perspective (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

Here is a brief description of the concept, in outline form.

ent The center screen is the interaction area, where immersion and ticket-buying action occur.

ent The left-hand screen contains available options or possible next steps; for example, this screen might list all the steps required to complete a transaction, letting the user access these steps out of sequence.

ent The right-hand screen contains contextual support, such as interaction history and related actions; for example, this screen might provide a summary of the current transaction so far and related information such as reviews and ratings.

ent The way that the three panels lay out context as a memory support and for consistent use is a kind of human-as-information-processor concept.

ent Using the sequence of panels to represent the task flow is a kind of engineering concept.

ent Each next step selection from the left-hand panel puts the user in a new kind of immersion in the center screen, and the previous immersion situation becomes part of the interaction history on the right-hand panel.

ent Addressing privacy and enhancing the impression of immersion: When the ticket buyer steps in, rounded shields made of classy materials gently wrap around the buyer. An “Occupied” sign glows on the outside. The insides of the two rounded half-shells of the shield become the left-hand and right-hand interaction panels.

In Figure 8-5 we show ideas from an early conceptual design for the Ticket Kiosk System from the ecological perspective.

image

Figure 8-5 Early conceptual design ideas from the ecological perspective (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

In Figure 8-6 we show ideas from an ecological conceptual design for the Ticket Kiosk System focusing on a feature for a smart ticket to guide users to seating.

image

Figure 8-6 Ecological conceptual design ideas focusing on a feature for a smart ticket to guide users to seating (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

In Figure 8-7 we show ecological conceptual design ideas for the Ticket Kiosk System focusing on a feature showing communication connection with a smartphone. You can have a virtual ticket sent from a kiosk to your mobile device and use that to enter the event.

image

Figure 8-7 Ecological conceptual design ideas focusing on a feature showing communication connection with a smartphone (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

In Figure 8-8 we show ecological conceptual design ideas for the Ticket Kiosk System focusing on the features for communicating and social networking.

image

Figure 8-8 Ecological conceptual design ideas focusing on the features for communicating and social networking (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

In Figure 8-9 we show part of a conceptual design for the Ticket Kiosk System in the interaction perspective.

Exercise

See Exercise 8-1, Conceptual Design for Your System

image

Figure 8-9 Part of a conceptual design in the interaction perspective (sketch courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

8.4 Storyboards

8.4.1 What Are Storyboards?

A storyboard is a sequence of visual “frames” illustrating the interplay between a user and an envisioned system. Storyboards bring the design to life in graphical “clips,” freeze-frame sketches of stories of how people will work with the system. This narrative description can come in many forms and at different levels.

Storyboards for representing interaction sequence designs are like visual scenario sketches, envisioned interaction design solutions. A storyboard might be thought of as a “comic-book” style illustration of a scenario, with actors, screens, interaction, and dialogue showing sequences of flow from frame to frame.

8.4.2 Making Storyboards to Cover All Design Perspectives

From your ideation and sketches, select the most promising ideas for each of the three perspectives. Create illustrated sequences that show each of these ideas in a narrative style.

Include things like these in your storyboards:

ent Hand-sketched pictures annotated with a few words

ent All the work practice that is part of the task, not just interaction with the system, for example, include telephone conversations with agents or roles outside the system

ent Sketches of devices and screens

ent Any connections with system internals, for example, flow to and from a database

ent Physical user actions

ent Cognitive user actions in “thought balloons”

ent Extra-system activities, such as talking with a friend about what ticket to buy

For the ecological perspective, illustrate high-level interplay among human users, the system as a whole, and the surrounding context. Look at the envisioned flow model for how usage activities fit into the overall flow. Look in the envisioned social model for concerns and issues associated with the usage in context and show them as user “thought bubbles.”

As always in the ecological perspective, view the system as a black box to illustrate the potential of the system in a context where it solves particular problems. To do this, you might show a device in the hands of a user and connect its usage to the context. As an example, you might show how a handheld device could be used while waiting for a flight in an airport.

In the interaction perspective, show screens, user actions, transitions, and user reactions. You might still show the user, but now it is in the context of user thoughts, intentions, and actions upon user interface objects in operating the device. Here is where you get down to concrete task details. Select key tasks from the HTI, design scenarios, and task-related models to feature in your interaction perspective storyboards.

Use storyboards in the emotional perspective to illustrate deeper user experience phenomena such as fun, joy, and aesthetics. Find ways to show the experience itself—remember the excitement of the mountain bike example from Buxton (Chapter 1).

Example: Ticket Kiosk System Storyboard Sketches in the Ecological Perspective

See Figure 8-10 for an example of a sequence of sketches as a storyboard depicting a sequence using a design in the ecological perspective.

image

image

image

image

image

image

Figure 8-10 Example of a sequence of sketches as a storyboard in the ecological perspective (sketches courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

Example: More Ticket Kiosk System Storyboard Sketches in the Ecological Perspective

In Figure 8-11 we show part of a different Ticket Kiosk System storyboard in the ecological perspective.

image

image

Figure 8-11 Part of a different Ticket Kiosk System storyboard in the ecological perspective (sketches courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

Example: Ticket Kiosk System Storyboard Sketches in the Interaction Perspective

The following is one possible scenario that came out of an ideation session for an interaction sequence for a town resident buying a concert ticket from the Ticket Kiosk System. This example is a good illustration of the breadth we intend for the scope of the term “interaction,” including a person walking with respect to the kiosk, radio-frequency identification at a distance, and audio sounds being made and heard. This scenario uses the three-screen kiosk design, where LS = left-hand screen, CS = center screen, RS = right-hand screen, and SS = surround sound.

ent Ticket buyer walks up to the kiosk

ent Sensor detects and starts the immersive protocol

ent Provides “Occupied” sign on the wrap-around case

ent Detects people with MU passports

ent Greets buyer and asks for PIN

ent [CS] Shows recommendations and most popular current offering based on buyer’s category

ent [RS] Shows buyer’s profile if one exists on MU system

ent [LS] Lists options such as browse events, buy tickets, and search

ent [CS] Buyer selects “Boston Symphony at Burruss Hall” from the recommendations

ent [RS] “Boston Symphony at Burruss Hall” title and information and images

ent [SS] Plays music from that symphony

ent [CS] Plays simulated/animated/video of Boston Symphony in a venue that looks like Burruss Hall. Shows “pick date and time”

ent [LS] Choices: pick date and time, go back, exit.

ent [CS] Buyer selects “pick date and time” option

ent [CS] A calendar appears with “Boston Symphony at Burruss Hall” highlighted, along with other known events and activities on clickable dates.

ent [CS] Buyer selects date from the month view of calendar (can be changed to week)

ent [RS] The entire context selected so far, including date

ent [CS] A day view with times, such as matinee or evening. The rest of the slots in the day show related events such as wine tastings or special dinner events.

ent [LS] Options for making reservations at these special events

ent [CS] Buyer selects a time

ent [RS] Selected time

ent [CS] Available seating chart with names for sections/categories and the aggregate number of available seats per section

ent [LS] Categories of tickets and prices

ent [CS] Buyer selects category/section

ent [RS] Updates context

ent [CS] Immerses user from a perspective of that section. Expands that section to show individual available seats. Has a call to action “Click on open seats to select” and an option to specify number of seats.

ent [LS] Options to go back to see all sections or exit

ent [CS] Buyer selects one or more seats by touching on available slots. A message appears “Touch another seat to add to selection or touch selected seat to unselect.”

ent [CS] Clicks on “Seat selection completed”

ent [RS] Updates context

ent [CS] Shows payment options and a virtual representation of selected tickets

ent [LS] Provides options with discounts, coupons, sign up for mailing lists, etc.

ent [CS] Buyer selects a payment option

ent [CS] Provided with a prompt to put credit card in slot

ent [CS] Animates to show a representation of the card on screen

ent [CS] Buyer completes payment

ent [LS] Options for related events, happy-hour dinner reservations, etc. These are contextualized to the event for which they just bought tickets.

ent [CS] Animates with tickets and CC coming back out of their respective slots
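
The multi-panel pattern in the scenario above—the center screen carrying the immersive action, the left panel offering next steps, and the right panel accumulating context—can be sketched as a toy state holder. Everything here (class name, methods, strings) is hypothetical and purely illustrative of the update pattern, not part of the actual design:

```python
# Sketch of the three-screen kiosk update pattern (hypothetical names;
# LS = options, CS = interaction, RS = running context, as in the scenario).

class KioskScreens:
    def __init__(self):
        self.ls, self.cs, self.rs = [], None, []

    def select_event(self, title):
        self.cs = f"Immersive preview: {title}"              # center: immersion
        self.rs.append(title)                                 # right: context so far
        self.ls = ["pick date and time", "go back", "exit"]   # left: next steps

    def pick_date(self, date):
        self.cs = f"Day view for {date}"
        self.rs.append(date)
        self.ls = ["make reservations", "go back", "exit"]

kiosk = KioskScreens()
kiosk.select_event("Boston Symphony at Burruss Hall")
kiosk.pick_date("2010-10-30")
# The right panel now holds the entire context selected so far:
assert kiosk.rs == ["Boston Symphony at Burruss Hall", "2010-10-30"]
```

The point of the sketch is the invariant, not the code: every selection on the center screen pushes the previous situation into the right-hand context panel and refreshes the left-hand options.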

In Figure 8-12 we show sample sketches for a similar storyboard.

image

image

image

image

image

image

image

image

image

image

image

Figure 8-12 Sample sketches for a similar concert ticket purchase storyboard in the interaction perspective (sketches courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

8.4.3 Importance of Between-Frame Transitions

Storyboard frames show individual states as static screenshots. Through a series of such snapshots, storyboards are used to show the progression of interaction over time. However, the important part of cartoons (and, by the same token, storyboards) is the space between the frames (Buxton, 2007b). The frames do not reveal how the transitions are made.

For cartoons, it is part of the appeal that this is left to the imagination, but in storyboards for design, the dynamics of interaction in these transitions are where the user experience lives and the actions between frames should be part of what is sketched. The transitions are where the cognitive affordances in your design earn their keep, where most problems for users exist, and where the challenges lie for designers.

Cognitive Affordance

A cognitive affordance is a design feature that helps users with their cognitive actions: thinking, deciding, learning, remembering, and knowing about things.

We can augment the value of our storyboards greatly to inform design by showing the circumstances that lead to and cause the transitions and the context, situation, or location of those actions. These include user thoughts, phrasing, gestures, reactions, expressions, and other experiential aspects of interaction. Is the screen difficult to see? Is the user too busy with other things to pay attention to the screen? Does a phone call lead to a different interaction sequence?

In Figure 8-13 we show a transition frame with a user thought bubble explaining the change between the two adjacent state frames.

Exercise

See Exercise 8-2, Storyboard for Your System

image

image

image

Figure 8-13 Storyboard transition frame with thought bubble explaining state change (sketches courtesy of Akshay Sharma, Virginia Tech Department of Industrial Design).

8.5 Design influencing user behavior

Beale (2007) introduces the interesting concept of slanty design. “Slanty design is an approach that extends user-centered design by focusing on the things people should (and should not) be able to do with the product(s) behind the design.” Design is a conversation between designers and users about both desired and undesired usage outcomes. But user-centered design, for example, using contextual inquiry and analysis, is grounded in the user’s current behavior, which is not always optimal. Sometimes, it is desirable to change, or even control, the user’s behavior.

The idea is to make a design that works best for all users taken together and for the enterprise at large within the ecological perspective. This can work against what an individual user wants. In essence, it is about controlling user behavior through designs that attenuate usability from the individual user’s interaction perspective, making it difficult to do things not in the interest of other users or the enterprise in the ecological perspective, but still allowing the individual users to accomplish the necessary basic functionality and tasks.

One example is sloped reading desks in a library, which still allow reading but make it difficult to place food or drink on the desk or, worse, on the documents. Beale’s similar example in the domain of airport baggage claim is marvelously simple and effective. People stand next to the baggage conveyor belt, and many even bring their carts with them. This behavior increases the usability of the system for them because the best ease of use occurs when you can pluck the baggage from the belt directly onto the cart.

However, crowds of people and carts cause congestion, reducing accessibility and usability for other users with similar needs. Signs politely requesting users to stay away from the belt except at the moment of luggage retrieval are regrettably ineffective. A slanty design for the baggage carousel, however, solves the problem nicely. In this case, it involves something that is physically slanty: the surrounding floor slopes down, away from the baggage carousel.

This interferes with bringing carts close to the belt and significantly reduces the comfort of people standing near it, thus reducing individual usability by forcing people to stay away from the carousel and then make a dash for their bags when they arrive within grasping distance. But it works best overall for everyone in the ecological perspective. Slanty design also includes evaluation to eliminate unforeseen and unwanted side effects.

There are other ways that interaction design can influence user behavior. For example, a particular device might change reading habits. The Amazon Kindle device, because of its mobility and connectedness, makes it possible for users to access and read their favorite books in many different environments. As another example, interaction design can influence users to be “green” in their everyday activities. Imagine devices that detect the proximity of the user, shutting themselves down when the user is no longer there, to conserve power.
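
Such a proximity-based power saver could be sketched as follows; the timeout value, function name, and event format are all assumptions for illustration, not a real device API:

```python
# Hypothetical sketch of a proximity-aware power saver: if no user presence
# has been sensed for `timeout` seconds, the device powers itself down.

IDLE_TIMEOUT = 30  # assumed: seconds without a detected user before sleeping

def power_state(presence_events, now, timeout=IDLE_TIMEOUT):
    """presence_events: timestamps (seconds) when a user was last sensed."""
    last_seen = max(presence_events, default=float("-inf"))
    return "awake" if now - last_seen < timeout else "asleep"

assert power_state([100.0], now=110.0) == "awake"   # user seen 10 s ago
assert power_state([100.0], now=200.0) == "asleep"  # idle for 100 s
```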

The Green Machine User-Experience Design: An Innovative Approach to Persuading People to Save Energy with a Mobile Device That Combines Smart Grid Information Design Plus Persuasion Design

Aaron Marcus, President, and Principal Designer/Analyst, Aaron Marcus and Associates, Inc. (AM+A)

In past decades, electric meters in homes and businesses were humble devices viewed primarily by utility company service technicians. Smart Grid developments to conserve energy catapult energy data into the forefront of high-technology innovation through information visualization, social media, education, search engines, and even games and entertainment. Many new techniques of social media are transforming society and might incorporate Smart Grid data. These techniques include the following:

ent Communication: blogs, microblogging, social networking, social network aggregation, event logs/tracking

ent Collaboration: wikis, social bookmarking (social tagging), social news, and opinion sites such as Yelp

ent Multimedia: photo/video sharing, livecasting, audio/music sharing

ent Reviews and opinions: product/business reviews, community Q+As

ent Entertainment: platforms, virtual worlds, game sharing

Prototypes of what might arise are to be found in many places around the Internet. As good as these developments are, they do not go far enough. Just showing people information is good, but not sufficient. What seems to be missing is persuasion.

We believe that one of the most effective ways in which to reach people is to consider mobile devices, in use by more than three billion people worldwide. Our Green Machine mobile application prototype seeks to persuade people to save energy.

Research has shown that, with feedback, people can achieve a 10% energy-consumption reduction without a significant lifestyle change. In the United States, this amount is significant, equal to the total energy provided by wind and solar resources, about 113.9 billion kWh/year. President Obama allocated more than $4 billion in 2010 Smart Grid funding to help change the context of energy monitoring and usage. Most of the Smart Grid software development has focused on desktop personal computer applications. Relatively few have taken the approach of exploring the use of mobile devices, although an increasing number are being deployed.
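
As a rough check of the arithmetic, using only the figures quoted in the text, a 10% reduction equal to 113.9 billion kWh/year implies a baseline of about 1,139 billion kWh/year:

```python
# Back-of-envelope from the figures in the text: if a 10% reduction saves
# 113.9 billion kWh/year, the implied baseline is ten times that amount.
savings_kwh = 113.9e9          # kWh/year saved (figure from the text)
reduction = 0.10               # 10% reduction achievable via feedback
baseline_kwh = savings_kwh / reduction
assert abs(baseline_kwh - 1.139e12) < 1e3  # ~1,139 billion kWh/year
```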

For our Green Machine project, we selected a home-consumer context to demonstrate in an easy-to-understand example how information design could be merged with persuasion design to change users’ behavior. The same principles can be reapplied to the business context, to electric vehicle usage, and to many other contexts. For our use scenario, we assumed typical personas, or user profiles: mom, dad, and the children, who might wish to see their home energy use status and engage with the social and information options available on their mobile devices.

We incorporated five steps of a behavior-changing process: increasing frequency of use of sustainability tools, motivating people to reduce energy consumption, teaching them how to reduce energy consumption, persuading them to make short-term changes, and persuading them to make long-term changes in their behavior. This process included, for example, the following techniques: rewards, using user-centered design, motivating people via views into the future, motivating them through games, providing tips to help people get started and to learn new behaviors, providing visual feedback, and providing social interaction.

We tested the initial designs with about 20 people of varying ages (16–65): men and women, students, professionals, and general consumers. We found most were quite positive that the Green Machine would be effective in motivating them and changing their behavior in both the short and the long term. A somewhat surprising 35% felt a future view of the world in 100 years was effective, even though the news was gloomy based on current trends. We made improvements in icon design, layout, and terminology based on user feedback.

The accompanying two figures show revised screen designs for comparison of energy use and tips for purchasing green products. The first image shows how the user compares energy use with a friend or colleague. Data charts can appear, sometimes with multiple tracks, to show recent time frames, all of which can be customized, for example, a longer term can show performance over a month’s time, or longer. The second image shows data about a product purchase that might lead the user to choose one product/company over another because of their “green” attributes. A consumption meter at the top of each screen is a constant reminder of the user’s performance. Other screens offer a view into the future 100 years from now to show an estimate of what the earth will be like if people behave as the user now does. Still other screens show social networking and other product evaluation screens to show how a user might use social networks and product/service data to make smarter choices about green behavior.

image

image

The Green Machine concept design proved sturdy in tests with potential users. The revised version stands ready for further testing with multicultural users. The mental model and navigation can be built out further to account for shopping, travel, and other energy-consuming activities outside the home. The Green Machine is ready to turn over to companies or governmental sponsors of commercial products and services based on near-term Smart Grid technology developments, including smart-home management and electric/hybrid vehicle management. Even more important, the philosophy, principles, and techniques are readily adapted to other use contexts, namely that of business, both enterprise and small-medium companies, and with contexts beyond ecological data, for example, healthcare. Our company has already developed a follow-on concept design modeled on the Green Machine called the Health Machine.

Coupled with business databases, business use contexts, and business users, the Green Machine for Business might provide another example of how to combine Smart Grid technology with information design and persuasion design for desktop, Web, and mobile applications that can more effectively lead people to changes in business, home, vehicle, and social behavior in conserving energy and using the full potential of the information that the Smart Grid can deliver.

Acknowledgment

This article is based on previous publications (Jean and Marcus, 2009, 2010; Marcus 2010a,b); it includes additional/newer text and newer, revised images.

References

1. Jean J, Marcus A. The Green Machine: Going Green at Home. User Experience (UX). 2009;8(4):20–22ff.

2. Marcus A. Green Machine Project. DesignNet. June 2010;153(6):114–115 (in Korean).

3. Marcus A. The Green Machine. Metering International (South Africa). July 2010;(2):90–91.

4. Marcus A, Jean J. Going Green at Home: The Green Machine. Information Design Journal. 2010;17(3):233–243.

8.6 Design for embodied interaction

Embodied interaction refers to the ability to involve one’s physical body in interaction with technology in a natural way, such as by gestures. Antle (2009) defines embodiment as “how the nature of a living entity’s cognition is shaped by the form of its physical manifestation in the world.” As she points out, in contrast to the human as information processor view of cognition, humans are primarily active agents, not just “disembodied symbol processors.” This means bringing interaction into the human’s physical world to involve the human’s own physical being in the world.

Embodied interaction, articulated by Paul Dourish in Where the Action Is (2001) and further developed by Malcolm McCullough in Digital Ground (2004), is central to the idea of phenomenological interaction. Dourish says that embodied interaction is about “how we understand the world, ourselves, and interaction comes from our location in a physical and social world of embodied factors.” It has been described as moving the interaction off the screen and into the real world. Embodied interaction is action situated in the world.

To make it a bit less abstract, think of a person who has just purchased something with “some assembly required.” To sit with the instruction manual and just think about it pales in comparison to supplementing that thinking with physical actions in the working environment—holding the pieces and moving them around, trying to fit them this way and that, seeing and feeling the spatial relations and associations among the pieces, seeing the assembly take form, and feeling how each new piece fits.

This is just the reason that physical mockups give such a boost to invention and ideation. The involvement of the physical body, motor movements, visual connections, and potentiation of hand–eye–mind collaboration lead to an embodied cognition far more effective than just sitting and thinking.

Simply stated, embodiment means having a body. So, taken literally, embodied interaction occurs between one’s physical body and surrounding technology. But, as Dourish (2001) explains, embodiment does not simply refer to physical reality but to “the way that physical and social phenomena unfold in real time and real space as a part of the world in which we are situated, right alongside and around us.”

As a result, embodiment is not about people or systems per se. As Dourish puts it, “embodiment is not a property of systems, technologies, or artifacts; it is a property of interaction. Cartesian approaches separate mind, body, and thought from action, but embodied interaction emphasizes their duality.”

Although tangible interaction (Ishii & Ullmer, 1997) seems to have a following of its own, it is very closely related to embodied interaction. You could say that they are complements to each other. Tangible design is about interactions between human users and physical objects. Industrial designers have been dealing with it for years, designing objects and products to be held, felt, and manipulated by humans. The difference now is that the object involves some kind of computation. Also, there is a strong emphasis on physicality, form, and tactile interaction (Baskinger & Gross, 2010).

More than ever before, tangible and embodied interaction calls for physical prototypes as sketches to inspire the ideation and design process. GUI interfaces emphasized seeing, hearing, and motor skills as separate, single-user, single-computer activities. The phenomenological paradigm emphasizes other senses, action-centered skills, and motor memory. Now we collaborate and communicate and make meaning through physically shared objects in the real world.

In designing for embodied interaction (Tungare et al., 2006), you must think about how to involve hands, eyes, and other physical aspects of the human body in the interaction. Supplement the pure cognitive actions that designers have considered in the past and take advantage of the user’s mind and body as they potentiate each other in problem solving.

Design for embodied interaction by finding ways to shape and augment human cognition with the physical manifestations of motor movements, coupled with visual and other senses. Start by including the environment in the interaction design and understand how it can be structured and physically manipulated to support construction of meaning within interaction.

Embodied interaction takes advantage of several things. One is that it leverages our innate human traits of being able to manipulate with our hands. It also takes advantage of humans’ advanced spatial cognition abilities—laying things on the ground and using the relationships of things within the space to support design visually and tangibly.

If we were to try to make a digital version of a game such as Scrabble (example shown later), one way to do it is by creating a desktop application where people operate in their own window to type in letters or words. This makes it an interactive game but not embodied.

Another way to make Scrabble digital is the way Hasbro did it in Scrabble Flash Cubes (see later). They made the game pieces into real physical objects with built-in technology. Because you can hold these objects in your hands, the interaction is natural and tangible, which contributes to emotional impact; there is something fundamentally natural about that.

Example: Embodied and Tangible Interaction in a Parlor Game

Hasbro Games, Inc. has used embedded technology in producing an electronic version of the old parlor game Scrabble. The simple but fun new Scrabble Flash Cubes game is shown in Figure 8-14. The fact that players hold the cubes, SmartLink letter tiles, in their hands and manipulate and arrange them with their fingers makes this a good example of embodied and tangible interaction.

image

Figure 8-14 The Scrabble Flash Cube game.

At the start of a player’s turn, the tiles each generate their own letter for that turn. The tiles read each other’s letters when they touch as the player physically shuffles them around. When a string of two to five letters makes up a word, the tiles light up and beep, and the player can try for another word with the same tiles until time is up.

The tiles also work together to time each player’s turn, flag duplicate words, and display scores. And, of course, the game has a built-in dictionary as an authority (however arbitrary it may be) on what constitutes a real word.
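
A minimal sketch of the word-checking logic such tiles might implement; the dictionary contents, function name, and rules here are assumptions for illustration, not the actual product logic:

```python
# Hypothetical sketch: as tiles touch in left-to-right order, check whether
# their letters form a dictionary word of two to five letters.

DICTIONARY = {"CAT", "ACT", "CART"}  # stand-in for the built-in dictionary

def check_word(tiles):
    """tiles: single letters in their current physical order."""
    word = "".join(tiles).upper()
    return 2 <= len(word) <= 5 and word in DICTIONARY

assert check_word(["c", "a", "t"]) is True    # tiles light up and beep
assert check_word(["t", "c", "a"]) is False   # not a word in this order
```

The interesting part, of course, is not the lookup but that the physical arrangement of the tiles *is* the input: rearranging objects in your hands replaces typing.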

8.7 Ubiquitous and situated interaction

8.7.1 Ubiquitous, Embedded, and Ambient Computing

The phenomenological paradigm is about ubiquitous computing (Weiser, 1991). Because the term “computing” can conjure a mental image of desktop computers or laptops, perhaps a better term would be ubiquitous interaction with technology, which is more about interaction with ambient computer-like technology worn by people and embedded within appliances, homes, offices, stereos and entertainment systems, vehicles, and roads.

Ubiquitous Interaction

Ubiquitous interaction is interaction occurring not just on computers and laptops but potentially everywhere in our environment. Interactive devices are being worn by people; embedded within appliances, homes, offices, stereos and entertainment systems, vehicles, and roads; and finding their way into walls, furniture, and objects that we carry.

Kuniavsky (2003) concludes that ubiquitous computing requires extra careful attention to design for the user experience. He believes ubiquitous computing devices should be narrow and specifically targeted rather than multipurpose, general-purpose devices that look like underpowered laptops. And he emphasizes the need to design complete systems and infrastructures instead of just devices.

The concept of embedded computing leans less toward computing in the living environment and more toward computing within objects in the environment. For example, you can attach or embed radio-frequency identification chips and possibly limited GPS capabilities in almost any physical object and connect it wirelessly to the Internet. An object can be queried about what it is and where it is. You can ask your lost possessions where they are (Churchill, 2009).
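The idea of querying a tagged object for what and where it is can be illustrated with a toy registry. The registry, tag identifiers, and query function here are invented for illustration; a real system would talk to RFID readers or GPS-enabled tags over a network.

```python
# Toy illustration: embedded tags let each object answer "what are you?"
# and "where are you?" All data below is made up.

tagged_objects = {
    "tag-042": {"what": "house keys", "where": "kitchen counter"},
    "tag-107": {"what": "umbrella", "where": "office closet"},
}

def query(tag_id):
    """Ask a tagged object to identify and locate itself."""
    obj = tagged_objects.get(tag_id)
    if obj is None:
        return "unknown tag"
    return f"I am your {obj['what']}, currently at the {obj['where']}."

print(query("tag-042"))  # I am your house keys, currently at the kitchen counter.
print(query("tag-999"))  # unknown tag
```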

There are obvious applications to products on store or warehouse shelves and to inventory management. More intelligence can be built into objects, such as household appliances, giving them capabilities beyond self-identification: sensing their own environmental conditions and taking the initiative to communicate with humans and with other objects and devices. An example is ambient computing as manifested in the idea of an aware and proactive home.

8.7.2 Situated Awareness and Situated Action

The phenomenological paradigm is also about situated awareness in which the technology and, by the same token, the user are aware of their context. This includes awareness of the presence of others in one’s own activity space and their awareness of your virtual presence in their activity spaces. In a social interaction setting, this can help find other people and can help cultivate a feeling of community and belonging (Sellen et al., 2006).

Being situated is all about a sense of “place,” the place of interaction within the broader usage context. An example of situated awareness (credit not ours) is a cellphone that “knows” it is in a movie theater or that the owner is in a nonphone conversation; that is, the device or product encompasses knowledge of the rules of human social politeness.
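The politeness-aware cellphone can be expressed as a simple context-to-behavior rule. This is a minimal sketch under assumed contexts and rules; real context sensing is far harder, and no actual phone API is implied.

```python
# Minimal sketch of a situated-awareness rule: the device chooses a
# ringer mode from its sensed context (rules are illustrative only).

def ringer_mode(context):
    """Map a sensed context to a socially polite ringer behavior."""
    if context.get("location") == "movie theater":
        return "silent"
    if context.get("owner_in_conversation"):
        return "vibrate"
    return "ring"

print(ringer_mode({"location": "movie theater"}))    # silent
print(ringer_mode({"owner_in_conversation": True}))  # vibrate
print(ringer_mode({"location": "home"}))             # ring
```

The design point is that the knowledge of social rules lives in the device, so the user never has to remember to silence it.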
