Chapter 12. Beyond and Behind the Screen

In the preceding chapters, we explored the world of designing screen-based interfaces for the web, software, and mobile devices. These patterns and best practices cover the world of people-facing digital product design, characterized by screen-based experiences that people can click and tap. Behind the scenes, however, changes are happening in the complex systems that power these interfaces and experiences, and increasingly these changes are reshaping how users interact with systems and how systems interact (or no longer need to interact) with users.

The majority of systems that are visible to the user involve the user contributing information or transactions, and an interface that shows the results back to the user. The patterns these systems use vary according to each system's intended use.

Social media–oriented experiences such as YouTube, Facebook, or Twitter follow similar patterns, because at their core, the things they need to do are very similar. At a very basic level, the interactions involve a user posting content to a system that other people can view and comment on. The owner of the content can edit the content, delete it, or modify who can view it. Other users can like or share the content with others, or even tell the system if the content is offensive or if they choose not to see it. In this way, although the system mediates and shows content based on a complex set of algorithms, from the user’s perspective, the interactions are fairly straightforward.

Likewise, news sites such as the New York Times, The Atlantic, or your local online newspaper follow a similar pattern from a systems perspective. Behind the scenes there is a content management system that reporters, editors, and photographers use to put content into the system and review it before it goes out to the public. Users can view articles, share them, and in some systems, post and edit comments.

Ecommerce sites can also be broken down into a simple pattern. Behind the scenes, there is a system in which an employee inputs a photo, a description, and all the options available for a particular item; there is also a system that keeps track of how many of these items the company has available and where they are located. A user can view items by categories or collections, and eventually navigate to a product detail screen where they can select a particular size and quantity, add items to a cart, and make the purchase.

These are the patterns we see all over the web and mobile apps. But as technology becomes capable of handling more complex operations, the complexity of input and output likewise increases. Behind the scenes, algorithms (sets of rules for processing data) power the information and content that is displayed to users. These algorithms become more sophisticated as they evolve through machine learning: a subset of artificial intelligence in which systems look for patterns in order to infer the identification and classification of data.

Ubiquitous computing (also referred to as the Internet of Things or the industrial internet) refers to the capability of objects or spaces to be embedded with internet-connected hardware, such as sensors, that can read information about the environment and communicate this information back. These systems can be profoundly complex, and most of the time are completely invisible to people.

These complex systems “interface” with users differently from the screen-based interactions we have focused on in this book. As interactions become more automated and invisible, the user won’t even need a keyboard; the interactions will become simpler, with the user confirming and approving actions a system takes on their behalf. These types of interactions will become more and more pervasive in the future.

The Ingredients: Smart Systems

The world of technology is undergoing a massive change in its infrastructure. Systems are moving away from relying on users to actively input data, and toward “reading” activity and location to infer meaning from these indirect inputs.

Connected Devices

Connected devices are those that are connected to the internet. Your smartphone, TV, car, thermostat, lightbulbs, and even the dog’s food bowl can be connected and monitored via the internet.

Anticipatory Systems

Anticipatory systems quietly observe what the user is doing and serve up data and suggestions, or even proactively place orders on the user’s behalf. An example of this might be a connected refrigerator that knows what food it contains and reorders the milk when the supply is low.

Assistive Systems

Assistive systems augment and enhance a user’s human capabilities through technology.

Natural User Interfaces

Natural user interface refers to “interfaces” that involve using motions, gestures, touch, or other tactile and sensory means to provide input and output. The touchscreen on your smartphone or tablet is an early example of a natural user interface. Other examples include Amazon’s Alexa and Microsoft’s Kinect system, which reads gestures, and there will be many more interfaces of this type that you can tap, squeeze, hold, wave at, or talk to.

Conclusion

The third edition of Designing Interfaces covers a vast territory of design patterns and practices for screen-based design, from desktop software to websites to mobile. As these systems become more complex, we hope the experience for the people who use them will become simpler and easier to understand. The role of a designer is to understand these patterns and apply them to their particular context. In this way, storytelling and narrative—imagining the scenarios and the “rules” that will apply to those scenarios—will be among the primary responsibilities of design.

Regardless of how interfaces find their way in front of a user, the patterns and principles of this book are evergreen. Understanding the foundational patterns of a sound UI architecture, creating a visual presentation with a clear information hierarchy that follows the Gestalt principles, supporting the user with appropriate help—all of these will bring clarity and understanding to an interface’s users.

We leave you with this thought. As designers and people who work in technology create the experiences that touch people’s lives in ways great and small, we are presented with the opportunity to move into a future with an ethical, human-centered mindset that makes life better for people and our planet. We have this amazing palette of technology and tools to build the future. The question we must now ask ourselves is this: what type of future will we create with it?
