RUMMAGING AROUND IN THE clutter of handouts that comes home from school, I stumbled on an alarming article in my youngest daughter’s welcome packet, “It’s Digital Heroin: How Screens Turn Kids into Psychotic Junkies.” Although I certainly struggle with parenting in the digital age, I was taken aback by this high-panic pathologization of technology. Naturally, I tried FaceTiming my middle school daughter, just in her room upstairs. No answer. Then, I texted my oldest daughter, also at home somewhere. At first three dots, and then nothing.
Eventually, I found the same blurred and crooked photocopy in each packet—elementary school, middle school, and high school. When I sampled the opinion of other parents, no one gave it a second thought. When we think about the most vulnerable among us, emotions are laid bare. Smartphones make kids feel stressed, envious, depressed, and inadequate. Everyone knows that technology makes you feel horrible, I was told. But I wasn’t so sure.
This was a turning point for me. And, as a researcher, I approached it in the best way I knew how. Thousands of diary entries and hundreds of interviews later, I noticed how difficult it is to find a “good” experience that isn’t emotionally complex. A smooth process, once appreciated, was soon taken for granted. Rarely did I see mention of the little blips of delight we diligently design. When people described their highs and lows with technology, it was with mixed emotions. Often the most satisfying tech use was tied to something bigger—a better version of themselves, an authentic connection, an engaging challenge, contribution to community. The emotional texture was not simply happy or sad, overwhelmed or calm. It was all of those simultaneously plus many, many others.
Yvonne Rogers, director of the University College London Interaction Centre, calls this technology for engaged living, or “engaging experiences that provoke us to learn, understand, and reflect more upon our interactions with technologies and each other.”1 As a complement to calm technology doing things for us in the background, it’s technology that no longer plays a purely functional role but also social and emotional ones.
In this chapter, we consider emotional intelligence and its implications for technology. Rather than a new framework, what you’ll find in this chapter is a mashup of emotional intelligence and design thinking. Rather than designing for task completion or moving people toward a goal or even building habitual behaviors, it’s designing with emotion. But that doesn’t mean we’ll always design to make people feel a certain way. Here, we look at ways to build emotionally sustainable relationships with technology and one another.
Emotional intelligence is a gateway to a balanced life. People with high EQ are more likeable, empathetic, and more successful in their careers and personal lives. People who are a bit deficient when it comes to EQ are often unable to make key decisions in their lives, keep jobs, or build relationships. Even setting broad claims and myriad studies aside, it’s common sense. If we can understand and work with, rather than against, our own emotions and those of others, the outcome is positive.
The idea that emotion is integral to intelligence is age-old. From Plato to Proust, Spinoza to Sartre, Confucius to Chekhov, emotion is a way to make sense of the world and ourselves. The concept of emotional intelligence, as we know it, has been in circulation since the 1960s. Daniel Goleman’s bestselling 1995 book, Emotional Intelligence: Why It Can Matter More Than IQ (Bantam Books) popularized the idea at a time when IQ was the prevailing standard of excellence. Emotional intelligence is not just being emotional or feeling feelings, though. Emotional intelligence is recognizing emotion in yourself and others and managing those emotions in meaningful ways. It’s usually summarized like so:
Self-awareness, or how accurately you can recognize, understand, label, and express emotion. Most often associated with self-compassion, self-esteem, and confidence.
Self-management, or how well you can regulate your own emotion, which usually includes discipline, optimism, and resilience.
Social awareness, or the ability to recognize and attempt to understand the emotions of others. Also known as empathy but encompassing tolerance and rapport.
Social skills, or how well you can respond to the emotions of other people. This translates to vision, motivation, and conflict resolution.
Although Goleman’s model is the most widely known, it’s really a mix of two approaches. The ability model, advanced by Peter Salovey and John Mayer in the 1980s, emphasizes how people perceive, understand, and manage emotions.2 The trait model, based on the work of K. V. Petrides, says that emotional intelligence develops out of personality traits. Despite the variation between different models, each approach is more about emotional competencies than moral qualities.
To effectively reason, plan, and perform tasks (all things we think of as cognitive), human beings need to have emotional intelligence. It means managing feelings so that they are expressed appropriately and effectively. It means handling interpersonal relationships fairly and with empathy. It affects how we manage behavior, navigate social situations, make personal decisions, and work with others toward common goals. And emotional intelligence, whether you consider it more about personality traits or more about ability, can be cultivated.
That’s just what many organizations are trying to do. In schools, social and emotional learning (SEL) is core to many curricula, teaching ways to cope with emotional distress and practice prosocial behavior. Leading the way is the Yale Center for Emotional Intelligence with its RULER model for emotional intelligence: Recognizing, Understanding, Labeling, Expressing, and Regulating emotion.
Companies take it seriously, too. Emotional intelligence has been identified as a core skill for the workforce in the next 50 years by the World Economic Forum. Companies from Zappos to FedEx provide leadership training with emotional intelligence as a core component. Organizations rely on tools like Culture Amp and Officevibe to foster emotionally intelligent workplaces. Starbucks employees learn the LATTE method to respond to negative emotions in positive ways: Listen to the customer, Acknowledge their complaint, Take action to solve it, Thank them, and Explain why it occurred.3
So, what would happen if we applied the principles of emotional intelligence to design?
First, it would need to start with the organization. Right now, few organizations recognize how much emotion matters; many don’t factor it in at all. Those that do tend to make the same mistakes. Some rely on weak substitutes, like Net Promoter Score or behavioral metrics. Others go straight for a desired emotion without trying to understand the context, or focus on evoking emotion in the moment without thinking about the long term. Often organizations fixate on a single emotion, like delight, without thinking about the other emotions that create value and meaning.
Then, it would need to be supported with a method that encompasses new ways to recognize, understand, express, evoke, and sometimes cope with emotion. It could certainly bring in new technology, too, employing emotion AI to help sort it out. It would mean expanding our repertoire to account for a more diverse range of emotional experience. Let’s begin with a new way of approaching emotion.
Emotionally intelligent design starts from a mindset that considers emotion as intrinsic to the experience, not a nice-to-have extra. Whether we intend it or not, we already design emotion and build relationships. Where emotional design strives to create products to elicit an emotion, emotionally intelligent design builds emotional capacity. Designing products and services through the lens of emotional experience can make the experience better for everyone. First, a few guiding principles:
Empathy is already a critical aspect of design, but that hasn’t always translated to emotion. Learning about our emotional life with technology and other products means spending time understanding the full scope of emotion. When done well, it’s more than evoking emotion. Emotions are clues to what we value. Fear tells us that something important is threatened. Sadness might remind us of what’s been lost. Shame might indicate that we haven’t been living up to our own goals. Values, in turn, guide behavior, motivate toward action, and prompt judgment.
It’s not a one-to-one mapping, and it’s not always the same. But it’s important that we pay attention, whether or not we are in it for the long term. Value-Sensitive Design (VSD), developed by Batya Friedman and Peter Kahn at the University of Washington, intersects with emotionally intelligent design. A core principle of VSD looks at who will benefit and who will be harmed, considering tensions between trust and privacy, environmental sustainability and economic development, and control and democratization. If we are paying attention to emotion and translating what it means, it can work in tandem with VSD to identify values, explore tensions, and consider trade-offs.
With something as deeply personal and wonderfully nuanced as our emotional lives, it’s clear that we need to take care. Prioritizing emotional experience is not just a different mindset, it affects how we practice design.
When we focus narrowly on someone in the moment of interacting with a technology—a user—our current approach works well enough. Even looking at a customer’s journey, hopping from one business touchpoint to another, we might feel comfortable with current methods. But when we expand our view to consider a broader range of human experience, it becomes trickier.
If we look at anthropology, the tension between being a participant and an observer is acknowledged and discussed. That’s a healthy conversation for our field to have, too. As an observer, you change the power dynamic. Rather than people framing the experience in their own way and telling their own story, the observer ultimately gets to be the storyteller. As a participant, you lose a bit of that outsider perspective. Either way, you change reality a bit. Let’s set some new ground rules.
Lead by example
Develop a shared understanding
What this really comes down to is a shift in perspective. We’ve created methods to understand behavior, but we’ve neglected emotion. Behavior can be observed much more readily than emotion. Either way, we see only the tiniest slice of real life. As technology insinuates itself into more facets of our everyday life, we need to facilitate more ways to understand how it affects our inner world as much as our outward behavior.
Now that we have some new principles for emotional intelligence in design, let’s put them into action. Next, we need a method. Design thinking is not perfect, but it’s widely practiced and easy to follow for designers and nondesigners alike. And it’s readily extensible to new contexts. For instance, Microsoft’s Inclusive Design process, under the leadership of Kat Holmes, follows five phases similar to a design thinking process: get oriented, frame, ideate, iterate, and optimize. Likewise, IDEO’s Circular Design Guide to sustainable design merges Kate Raworth’s thinking in Doughnut Economics: Seven Ways to Think Like a 21st Century Economist (Chelsea Green, 2017) with design thinking in four phases: understand, define, make, and release.
Here, we merge emotional intelligence and design thinking. Should we call it design feeling? Or simply emotionally intelligent design? I’ll leave it to you to decide what feels most comfortable. Let’s use the following model for design feeling, with the easy-to-remember acronym FEEL.
Find, understand emotion in multiple dimensions using mixed methods.
Envision, map emotional experience and generate concepts.
Evolve, model and build relationships.
Live, develop ways to sustain the relationships.
You can follow this as a step-by-step process. Or, you can supplement your current design practices with some of the ideas and activities. The core idea is the same—to design with greater emotional intelligence.
Empathy is essential to emotional intelligence. The concept stands in for a wide range of experiences, but usually it means both the ability to sense other people’s emotions and the ability to imagine what someone else might be thinking or feeling. The split between the two types of empathy, affective and cognitive, is contentious, though.
In our field, the emphasis has been on cognitive empathy. When we talk about empathy, we usually mean curiosity and perspective taking. In Practical Empathy: For Collaboration and Creativity in Your Work (Rosenfeld Media, 2015), Indi Young makes a point of differentiating between the two, finding cognitive empathy more useful for design.
Cognitive empathy gets us only so far, though. All we have to do is look at some of the current tech products on the market to test the limits. For example, the team that proposed a smartphone-enabled vending machine called Bodega, designed to replace actual bodegas, almost certainly ticked off “empathy” in their design sprint, interviewing potential customers and possibly looking at behavioral data. And yet the proposed concept lacked emotional empathy.
In other areas of life, cognitive empathy is not enough. If you try to understand another person’s point of view without internalizing their emotions, you’re still detached. This can manifest in all kinds of ways. It might mean that you can’t understand another’s perspective. It might mean that you simply aren’t motivated to help. It might mean that you don’t fully realize the impact of your own behaviors on others. Take it a little further, and you’ll find narcissists and sociopaths who use cognitive empathy for their own gain, whether to manipulate opinion or inflict pain.
When you begin to allow yourself to feel what other people feel, that’s emotional empathy. It attunes us to another person’s inner world. And it even has a physical force, as emotional contagion activates mirror neurons in our brains, creating a kind of emotional echo. Emotional empathy has serious downsides, of course. It can lead to distress and physical exhaustion, so much so that certain professionals experience emotional burnout. Social, medical, and rescue workers can’t afford to let emotional empathy overwhelm them, but they’ll provide poor care without it. It’s not much of a stretch to see how that applies to designers, too.
In truth, we need both kinds of empathy. We need to understand what people are going through and to feel their emotions (to a degree). In some circles, this is called compassionate empathy. Whatever the case, it means expanding our repertoire.
As the first step in a design thinking process, empathy doesn’t always home in on emotion. Designers have developed keen observation skills but train their sights on individual behaviors. Emotions are more of an afterthought.
It’s not that observational methods can’t get at emotion. After all, we observe emotion in others all the time. High EQ is associated with how well you can notice subtle facial expressions and changes in tone and then interpret those signals. Emotion AI is trained to work in the same way, although in much broader categories. But there are limits to observation when it comes to emotion, too.
Observation can reveal truths but sometimes misses subtleties. Technology creates barely detectable shifts in behavior that can’t always be easily observed. If you are observing only a few people, you might not pick up on workarounds or adaptations. Because we train our sights almost exclusively on people acting alone, we might miss social dynamics. New gestures or expressions, even given the wonders of AI, go undetected. Emerging contexts of use are not always evident.
Emotion AI is no different in that respect; it’s observational, too. Already touted as a “lie detector,” it threatens to reveal your emotions, like it or not. Just as polygraphs tracked blood pressure and breathing to gauge stress levels associated with lying, so too does emotion AI. The emotion AI company Human promises real-time lie detection by analyzing faces from smartphone videos and security cameras (Figure 3-4). Coverus claims the same from eye tracking. Usually, the claim is not overt, but the assumptions are the same—smart observation reveals human truth.
Emotion AI is subject to many of the same pitfalls as any observational approach when it comes to the emotional side of experience. It captures physical signals that can be interpreted in many ways. It privileges the social performance of emotion. And it works within a limited context.
A vast expanse of human experience—arguably the most important part—is simply not considered by relying on what can be observed. Observation stays at the surface, so we miss out on how people perceive and interpret and feel. It omits how people make sense of their own experience. It skims over how people make their own meaning.
The implications go beyond that. By privileging observation to such an extent, we privilege our own voice as designers and developers. We frame the story by choosing the context for observation. We get to tell the story, rather than giving people ways to tell their own story. We then shape the story going forward, based on our interpretation of what we can see.
So, we need to let emotion into research. And we need to work with mixed methods to understand all aspects of emotion. Design has already embraced mixed methods for research. Adding emotion just takes it a bit further.
Start by considering how to understand different dimensions of emotion. Emotions have a physical dimension. Maybe your face heats up or you feel a tightness in your jaw. Maybe you get butterflies in your stomach. Or, you know, maybe you just smile. Then, there’s a perceptual dimension: how we recognize and interpret a feeling, what we call it, how we describe it, and what we compare it to. Our perceptions are grounded in memories and possibly in future projections, too. There’s a behavioral dimension. You might yell in anger or frustration—an external behavior. But you could also suppress it or internalize it in another way. Emotional responses also change depending on whether you are alone or with others. That’s the social dimension; some of our social response seems automatic, but most of it is learned and highly context-dependent. And the unspoken norms, conventions, rules, and even stereotypes add a cultural dimension.
With so much to consider, we need to push mixed methods a little further. For now, some of the methods listed in Table 3-1 are fringe. Not every team has access to emotion-sensing devices and platforms, but you don’t really need them. Consider this collection of methods as a frame to build on as affective computing becomes more common.
AFFECTIVE LAYER | SIGNALS | QUANTITATIVE METHOD | QUALITATIVE METHOD |
---|---|---|---|
Neurophysical | Pulse and temperature, gaze, brainwaves | Wearables, eye tracking, brainwave trackers | Observational research |
Perceptual | Core affect, personal meaning | Data over time, aggregate data | In-depth interviews, diaries, metaphor elicitation, therapeutic research |
Behavioral | Interactions | Behavioral analytics, satisfaction ratings | Observational research |
Social | Facial expression, intonation, body language, language | Facial analysis, voice analysis, gesture analysis, sentiment analysis | Group conversations, paired interviews, co-design activities |
Cultural | Norms, attitudes, laws, institutions | Location tracking, behavioral analytics, aggregate data, literature scans | Contextual inquiry, narrative study, co-design activities |
That’s big picture. Now let’s go step by step, starting with the most basic emotions. Even in broad strokes, even with the latest emotion AI, it’s not easy to do.
Perhaps you’ve seen Pixar’s Inside Out? The movie is about five basic emotions: joy, anger, fear, disgust, and sadness. Although the filmmakers considered including a full array of emotions, they kept it to five to simplify the story. Whether or not you agree that these five are universal, they’re simply big categories to use as starting points. Think of each as a continent. Within each there are many states, cities, disputed territories, shifting boundaries. Those we’ll fill in later through qualitative research. Most designers aren’t paying much attention to even these big categories in research yet, but there are three main ways to get started.
First, you can add some emotion awareness to what you already do. We already gather some information about how an individual frames an experience, how they interpret what they see, and how they take action in the context of usability tests. We already observe personal context and daily ritual or routine in ethnographic interviews. An easy place to start is to simply make a point to notice verbal cues, facial expressions, pauses or hesitations, and body language and gesture. In Tragic Design: The Impact of Bad Product Design and How to Fix It (O’Reilly, 2017), Jonathan Shariat and Cynthia Savard-Saucier share a list of cues to notice, including sighing, laughing, shifting in a chair, nervous tapping, and forceful typing. Going back to our new rules of engagement, you’ll need to be aware of your own biases and expand your emotional vocabulary to make this work.
It’s easy enough to add emotion categories to your data collection sheets or logs. If you are running research sessions alone, you can take notes on a simple cheat sheet during the conversation if it’s not too intrusive or after the session when you review the recording. If you have a partner, they should record their notes, too. The more people you can factor in to record and interpret, the better.
A simple sort of happiness, sadness, anger, fear, disgust, and surprise is a good place to start, even though it will be imperfect. You can take it a step further, analyzing the words in your written transcript using sentiment analysis or the tone in a voice recording using a tool like Beyond Verbal’s Moodies app. In workshops, I’ve had people try using Affdex Me and Moodies in combination to get a general read on emotion.
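To make the first pass concrete: a simple sort into basic categories can start as nothing more than a keyword lexicon run over your transcript. The sketch below is a toy illustration under stated assumptions—the keyword lists are invented for the example, not a validated emotion lexicon, and real sentiment or tone analysis is far more nuanced:

```python
# Toy lexicon-based tagger for a first pass over transcript lines.
# The keywords per category are illustrative assumptions, not a validated lexicon.
EMOTION_LEXICON = {
    "happiness": {"love", "great", "fun", "delighted", "glad"},
    "sadness": {"miss", "lost", "down", "unhappy"},
    "anger": {"hate", "annoyed", "furious", "frustrating"},
    "fear": {"worried", "scared", "nervous", "anxious"},
    "disgust": {"gross", "awful", "creepy"},
    "surprise": {"wow", "unexpected", "surprised"},
}

def tag_line(line: str) -> list[str]:
    """Return every basic-emotion category whose keywords appear in the line."""
    words = {w.strip(".,!?").lower() for w in line.split()}
    return sorted(cat for cat, kws in EMOTION_LEXICON.items() if words & kws)

transcript = [
    "I love how fast it syncs, it's great",
    "Honestly I was worried it would lose my data",
]
for line in transcript:
    print(line, "->", tag_line(line))
```

Even a crude tagger like this makes the limits obvious—which is the point of the next caveat: words, like pauses, rarely map one-to-one onto feelings.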
Obviously, there isn’t a one-to-one mapping between observed actions and emotion. A pause might indicate hesitation, confusion, or interest. Context might give you a clue, or it might not. Quickly swiping through might mean someone is having fun, but it could signal boredom or even anxiety.
Second, whenever possible, give people a chance to comment on their emotional experience. Add a way for people to self-report their emotion whether in-person or online. Even if you consider yourself keenly emotionally intelligent, you’ll miss a lot of emotional cues and you’ll misinterpret. Outward expression of emotion varies with social context, and research is an unusual one. People vary in their range of expression and intensity of emotion in ways you won’t be able to observe.
You can ask directly. Net Promoter Score or satisfaction surveys don’t tell us much about emotion. Instead, include questions that give people a chance to express emotion. A simple selection of smile or frown, thumbs up or thumbs down, can lend a basic read to positive or negative emotion. That’s a level one emotion signal.
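A level-one signal like this is also trivial to aggregate over time. A minimal sketch (the response labels are assumptions for the example, not any survey tool’s schema):

```python
# Aggregate binary thumbs-up/down self-reports into a net positive share.
def net_positive(responses: list[str]) -> float:
    """Share of positive minus share of negative responses, from -1.0 to 1.0."""
    ups = responses.count("up")
    downs = responses.count("down")
    total = ups + downs
    return 0.0 if total == 0 else (ups - downs) / total

print(net_positive(["up", "up", "down", "up"]))  # 0.5
```

A single number like this tells you direction, not meaning—which is why a richer range of emotion, described next, matters.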
Better still is to get a simple range of emotion. YouX (Figure 3-5) is an experimental tool inspired by Plutchik’s wheel of emotion.5 Tools like PrEmo, which rely on images rather than words, work around language barriers. PrEmo can get at multiple emotions at the same time, and it translates across cultures better, but it still assumes that everyone will be able to interpret facial expressions. Follow either approach with a narrative question, though, and you’ll get context to understand the emotion as well as more granularity. Layer in sentiment analysis on open-ended responses, and you’ll get a general read on emotion.
In a follow-up questionnaire or online survey, the most well-rounded approach is to give people a short list of emotions along with a narrative prompt to accompany it. In person, direct questions might not be the best approach. People are not apt to honestly reveal emotion to strangers in a research context. So indirect is best, whether you are prompting conversations between participants or engaging them in design activities.
Third, cautiously consider a tech layer. Emotion AI can detect broad categories, at least 5 to 10 of them. In the near future, it might be embedded in a product you’re developing, and you’ll be tasked with interpreting those signals to further evolve it. For now, it’s probably not. So, one way to get familiar is to begin trying some of the tools. You can certainly capture these same signals using existing research tools. A multimodal platform like iMotions layers together a few different biometrics tools to record facial expressions and tone of voice or heart rate (Figure 3-6). Getting participants in a lab, wearing headgear and sensor bracelets, is not the kind of research we typically do on design teams. Even so, trying it out can lend some understanding to how emotion detection embedded in products might work.
Platforms that detect emotion don’t automatically interpret the results or reveal what it all means. Emotion AI embedded in a product won’t either. Instead, it will begin mapping broad categories to anticipated behaviors and types of content. Think back to one of our core principles: don’t assume too much.
Now that you’ve started to tune in to emotion signals, you’ve got an initial read on in-the-moment reactions to a product or service. That’s still limited. Next, move beyond broad strokes to understand emotion with greater nuance, and over time.
When you heard about Cambridge Analytica, were you mad? Or were you morally outraged, bitterly disappointed, filled with dread? Was the feeling intense? Did it linger? Did you comment online using words like “angry” or “bad,” or did you employ more nuanced words like “flagrant violation”? The more finely tuned your feelings, the more adept you’ll be at navigating your emotions.
The greater your sense of granularity and complexity, the richer your experience of the world, too. You’ll see analogies to wine or perfume. Think of it like fonts. Designers perceive subtle variations in the curvature of a’s and j’s, the slant of a terminal, or the spines. People who have less experience might not see these differences but can still distinguish between an oblique and an italic. A novice might be less capable of making these distinctions, perhaps picking out only differences between a serif and a sans serif. Then, there are those who have little sensitivity to fonts at all, just seeing letters in a string on a screen. Those novices won’t be equipped to decipher the subtle messaging behind a font choice or to select a font that perfectly conveys a feeling. Novices and experts alike can continue to learn and develop that intelligence. Ultimately, the payoff for discerning nuances in fonts is not as great as for emotion, but you get the idea.
If you’re able to make fine distinctions between many emotions, you’ll be better able to tailor them to your needs. You’ll adapt to new situations. You’ll be better at anticipating emotion in others. You’ll be able to read the emotion of a group. You’ll be able to construct more meaning from other people’s actions, too. A finer sensitivity to nuance translates to higher emotional intelligence. So, let’s see if we can bring that into design.
Emotion classification can be more art than science, with myriad possibilities. A few basic models can build toward greater nuance:
The circumplex model suggests that there are two main dimensions.
The PANA model, or positive and negative activation model, develops granularity along lines of positive and negative affect.
The mixed model, also known as Plutchik’s model, is a hybrid of circumplex for intensity (or arousal) and valence (positive or negative) (Figure 3-7).
Of course, it doesn’t end there. The PAD model adds a dominance-submission dimension to the circumplex model’s two. The Lövheim cube of emotion is a three-dimensional model combining dopamine, noradrenaline, and serotonin with eight basic emotions.
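To make the circumplex idea concrete: it places any emotion on two axes, valence (negative to positive) and arousal (calm to activated), so a coarse classifier is just a quadrant lookup. A minimal sketch—the quadrant labels and coordinate values below are illustrative assumptions, not taken from any published instrument:

```python
# Classify a (valence, arousal) point into a circumplex quadrant.
# Both axes run from -1.0 to 1.0; the sample coordinates are invented.
def quadrant(valence: float, arousal: float) -> str:
    if valence >= 0:
        return "excited/elated" if arousal >= 0 else "calm/content"
    return "tense/angry" if arousal >= 0 else "sad/depressed"

samples = {"delight": (0.8, 0.6), "serenity": (0.6, -0.5),
           "outrage": (-0.7, 0.8), "gloom": (-0.6, -0.4)}
for name, (v, a) in samples.items():
    print(name, "->", quadrant(v, a))
```

Four quadrants are obviously still crude; the point is only that two continuous dimensions already carry more nuance than a happy/sad binary.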
Rather than stress over which model to choose, the key takeaway is to understand a range of emotion. If the most emotionally resonant and sustainable relationships are the most complex, we need to lean into that complexity. Here are a few activities to add to your repertoire that will tease out more detail.
Made in cooperation with cultural institutions around the world, the object interview series, imagining objects as if they had separate lives, is quite whimsical. What if a vase were teaching French or a bench were playing hide and seek? As our tech products develop more personality and agency, this technique seems more and more relevant. If we consider people and product to be in a relationship, it means lending a voice to both.
Start with a narrative approach in which people tell stories about the product or an analogous product. If possible, start with one that elicits strong feelings (or even use that as a recruiting factor). Have them tell a story about it, show pictures, and draw it to describe that emotion. Then, flip it. Narrate from the object’s point of view. You’ll get a sense of how emotion builds or dissipates. You’ll begin to understand how conversations or interactions support or challenge.
Kansei engineering, the Japanese technique to translate feelings into product design, has always seemed intriguing and a bit mysterious. Developed in the 1970s to understand the emotions associated with a product domain, the approach centers on an analysis of the “semantic space” gathered from ads, articles, reviews, manuals, and customer stories. The analysis can be large-scale and statistical, but it doesn’t need to be.
In my work, I begin with interview transcripts, diaries, customer stories, and social media posts, but I find just culling emotion words is not very helpful. Love, hate, like, distracted, and angry don’t mean much without the context. So, instead, bring it into a follow-up interview or a participatory design activity to fill in that missing piece.
Sentence completion is a way to understand emotions and, in turn, values. Suppose that we are looking at a fitness app. You might include some sentences to get at emotions related to fitness, exercise, and healthy lifestyle. Here are just a few examples:
I feel ___________ when I exercise/eat healthy.
When something gets in the way of my routine, I feel ___________.
The best fitness experience is ___________.
When I think of my fitness/health, I dream about ___________.
You might also include exercises that ask about the app directly but emotion indirectly.
Using [app] is ___________.
To me [app] means ___________.
The [app] makes me think of ___________.
When I use [app], I think of myself as ___________.
When I use [app], other people think ___________.
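If you reuse these prompts across products or studies, it’s worth templating them. A quick sketch, using the prompt wording from the examples above (the app name “StepTrack” is a hypothetical stand-in):

```python
# Fill the [app] placeholder in sentence-completion prompts for a given product.
PROMPTS = [
    "Using [app] is ___________.",
    "To me [app] means ___________.",
    "The [app] makes me think of ___________.",
    "When I use [app], I think of myself as ___________.",
    "When I use [app], other people think ___________.",
]

def prompts_for(app_name: str) -> list[str]:
    """Return the prompt list with the placeholder swapped for the product name."""
    return [p.replace("[app]", app_name) for p in PROMPTS]

for line in prompts_for("StepTrack"):  # "StepTrack" is a made-up app name
    print(line)
```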
Moving beyond basic emotion doesn’t need to be awkward or arduous. We can understand emotion by building on some of the techniques we already use or adding new activities and exercises to our repertoire. At this point, all of this emotional data is still abstract, though. Next, we need to visualize.
The finale to the find phase is to materialize emotion. Using technology, where appropriate, or low-tech prototyping methods, the aim is to find new ways to give substance to emotion uncovered in research. Creating a material vision surfaces the meaningful aspects of the experience.
Think of it as a remix of art therapy and participatory design research. Working within the comfort level of individuals and the team, the goal is to make the experience palpable. Making the data physical facilitates further discussion, and ultimately informs the design.
Organizations are already trying this approach in the mental health space. Mindapples provides kits for groups to share their “five-a-day” prescription for mental health. Aloe Bud is a self-care app that visualizes mental health as a garden. Stanford’s Ritual Design Lab installed a Zen Door in downtown San Francisco for the April 2015 Market Street Prototyping Festival, encouraging people to contribute their wishes as a kind of data sculpture.
Individuals are turning to it as a way of understanding physical and mental illness. For instance, Kaki King and Giorgia Lupi created a data visualization to record a history of strange symptoms—mysterious bruises on Kaki’s daughter. The idea was not only to help understand the patterns but also to see with fresh eyes something that’s difficult to assess. At the same time, it became a coping strategy for processing the illness of a loved one. Laurie Frick’s Stress Inventory is another vivid example, transforming data she tracked about herself into tangible form. It’s a way to interpret the data and a way to process its emotional impact.
When I run these sessions for clients, we distribute cards with a high-level activity related to building emotional capacity: discover, understand, cope, process, manage, enhance, remember, and anticipate. Depending on the project, there might also be cards that include scenarios or people. Most crucially, we create cards that summarize aspects of the data, such as a distribution of emotion words, a core emotion and related emotions, data that connects emotion with values, and so on. Each data card includes the topic, the source, a description, and a visual. Finally, I select materials that cover a range of properties. Materials that lend color, like food coloring or paint. Materials, like wood blocks, that don’t have any give and, by way of contrast, materials that are flexible, like moldable soap or erasers. Materials that bind together, like elastic bands, or attract, like magnets, or fit together, like LEGOs. The idea is to represent a wide-open field of possibilities.
After an introduction to the activity and the data, individuals or groups choose and discuss the cards. Then, I have participants select a maximum of three materials before moving on to create a material representation of the research data and document their process. In a recent session about e-sports, the emotion research materialized as a paper chain of people holding hands, framed by color-coded translucent windows. It demonstrated how people were coming together for a certain amount of time to share an experience that would color their view of reality afterward. This illuminated some of the emotional goals for the project while giving us a touchstone for further discussion.
The outcome is to find ways of characterizing and framing emotion, propose new ways of doing things or approaching issues, and gauge people’s reactions and responses. Rather than finding a universal color for joy or the default texture for security, which is likely a futile effort anyway, this instead lets us begin connecting physical qualities, features, and functionality with emotional experience. These emotional objects provide a bridge to the next phase.
Now that we’ve collected more information about emotion, by documenting the emotions and values at play, we can try to shape this into a strategic direction. Because our emotional life is subject to so much variation, we need to begin by bringing more people into the process.
When we start to treat people as collaborators, we need to make sure that they have a way to contribute in a meaningful way. Our current repertoire of participatory techniques favors extroverts and, like any group activity, tends to privilege some voices over others. It’s on us to acknowledge and amplify unique voices. People need time to reflect and to respond critically. We should do everything possible to draw out the imagination of the broader community.
Ideation sessions, hackathons, design sprints, and co-designs aren’t always conducive to open conversation and thoughtful reflection on our inner lives. So perhaps our practices are due for a refresh. One way we can do this is to give participants the right prompts, constraints, and opportunities to speak to their unique strengths and capabilities. Here are a few new ways to stretch our practice with a mix of social and less-social activities.
It might sound like we are bringing empathy back into this phase. Yes. Empathy belongs in all the phases. Besides refreshing how we approach ideation, it has the potential to win the support of disparate stakeholders. So, what do we do after we have a new frame for ideation? Create an emotional imprint.
After you’ve developed a conducive co-design space, use the materialized research as a bridge to design. Begin with the mix of emotions associated with the experience. What is your product’s core emotion? What else is associated with that emotion? What emotion do you want to evoke? What emotions are people expressing? What emotions are unexpressed? Which are the most intense feelings people identify? Which are the least? When beginning, sinking into, and finally leaving your experience, what states are you evoking and in what order? Your research might have answered some of these questions; some might remain. Either way, we can use these answers to begin.
First, you’ll create a map of the emotions associated with the experience. Empathy maps connect feelings with thoughts and actions, but the tendency is still to “solve” negative emotions. Instead, let’s try to connect emotion with motivation.
Motivation has many models. There’s self-determination theory, focusing on competence, relatedness, and autonomy. There’s ERG theory, comprising existence, relatedness, and growth. There are intrinsic and extrinsic theories of motivation. There are goal-setting theories. But let’s start with the familiar.
In a co-design setting, start with Maslow’s hierarchy of human needs. Besides near-universal recognition, it lends itself to easily mapping emotions to motivations, values, needs, and bigger goals. You can begin by drawing the pyramid with the five levels of needs: physiological, safety, love and belonging, esteem, and self-actualization. When I do this exercise, I use Maslow’s later model, which includes knowledge, beauty, and transcendence. To break out of hierarchical thinking, I don’t always use a pyramid (Maslow didn’t originally, either). It might be more difficult to consider those higher needs when basic physical and safety needs aren’t met, but every human life will still be shaped by higher needs, too. At scale, it’s dehumanizing to suggest otherwise. From there, you can sort the emotional knowledge you gathered in research according to needs, to see where strengths and weaknesses lie.
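As a rough sketch of that sorting step, you can group the emotion words from research under needs levels. The assignments below are illustrative judgment calls for a hypothetical project, not a fixed taxonomy; in a real session, participants would debate where each word belongs.

```python
from collections import defaultdict

# Illustrative, debatable mapping of researched emotion words to needs.
NEED_FOR_EMOTION = {
    "secure": "safety", "anxious": "safety",
    "connected": "love and belonging", "lonely": "love and belonging",
    "proud": "esteem", "inadequate": "esteem",
    "curious": "self-actualization", "inspired": "self-actualization",
}

def sort_by_need(emotion_words):
    """Cluster emotion words under needs to see where strengths and gaps lie."""
    by_need = defaultdict(list)
    for word in emotion_words:
        by_need[NEED_FOR_EMOTION.get(word, "unsorted")].append(word)
    return dict(by_need)

research_words = ["anxious", "proud", "lonely", "curious", "inadequate"]
print(sort_by_need(research_words))
```

A cluster of words under esteem and few under safety, say, tells you at a glance where the experience is emotionally strong and where it is thin.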
Sometimes, I use a four-world-style model, based on a pared-back model of emotional well-being (Figure 3-10). The matrix blends Daniel Goleman’s model of emotional intelligence with Patrick Jordan’s concept of the four pleasures. Emotional experience can be situated on a spectrum from self-directed to socially directed and from pleasure-based to purpose-based, resulting in four different kinds of experience.
Transformative, experience that facilitates personal growth
Compassionate, altruistic and prosocial experience
Perceptive, sensory-rich experience
Convivial, experience that brings people together socially
Transformative experiences create a context for an individual to grow and find personal significance. These are experiences promising to help you make progress toward a goal, whether it’s getting fit, saving money, or becoming more productive. Experiences that help you understand your psyche or your health go in this category, too.
Love of learning, achievement, wisdom, judgment, accomplishment, independence, capability, self-control, intellect, perseverance, prudence, self-respect
Examples: Duolingo, Fitbit, Lynda, Headspace, Mint
Compassionate experiences are those that center on shared purpose, mutual growth, and a common cause. Compassionate experiences facilitate giving, helping, and fostering empathetic community, from charitable giving sites to games for good to civic action.
Fairness, perspective, community, equality, forgiveness, helpfulness, tolerance, citizenship, open-mindedness, integrity, mercy
Examples: Re-Mission, GoFundMe, Resistbot, Be My Eyes, WeFarm
Convivial experiences are social in the way that we most often think about social. These are experiences that emphasize bonding, reputation, shared activities, and conversation. Successful convivial experiences support layered communication, social experiences that engage the senses, mixed reality, shared rituals, and storytelling tools.
Friendship, social recognition, harmony, humor, intimacy, trust, nurturing, vulnerability, fairness
Examples: Snapchat, Kickstarter, Pokemon Go, Google Photos, Twitch
Perceptive experiences are sensory-rich with opportunities to play. They can be pure in-the-moment fun, like games or music, but can also help us to savor or wonder.
Humor, creativity, zest, curiosity, imagination, cheer, appreciation of beauty, comfort
Examples: Spotify, Monument Valley, Pinterest, Dark Sky, Keezy
Most products are not just one type, of course. An app for good like Charity Miles is both transformative and compassionate. Prompt, a visual diary for those who have memory loss, is both transformative and convivial. Wayfindr might be considered transformative but also perceptive. Skype could be convivial or compassionate, depending on how you use it. The point is not to fit an experience into a tidy box. Instead, it’s simply a way to analyze insights and understand strengths.
Let’s use Spotify to demonstrate how this works. Listening to music seems to sit squarely in the perceptive quadrant. If you think about making and sharing playlists, well, that is convivial. Perhaps you use Spotify Running to motivate you toward fitness goals. That’s transformative. We could easily imagine a Charity Channel or games that work with Spotify to raise awareness of social issues. That would be compassionate.
Ideally, you might try to boost all four quadrants. In practice, this is not always practical or even possible. But we can use the matrix to think through emotionally resonant experience in new ways and determine where to build capacity.
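As a sketch, the matrix can be treated as a simple lookup on the two axes. The Spotify scenarios below simply restate the examples above; nothing here is an official taxonomy, just one way to tag insights by quadrant.

```python
# The four quadrants, keyed by the self<->social and pleasure<->purpose axes.
QUADRANTS = {
    ("self", "purpose"): "transformative",   # personal growth
    ("social", "purpose"): "compassionate",  # shared purpose, common cause
    ("self", "pleasure"): "perceptive",      # sensory-rich, in-the-moment
    ("social", "pleasure"): "convivial",     # bonding, shared activity
}

def classify(direction, basis):
    """Place an experience on the matrix by its two axis values."""
    return QUADRANTS[(direction, basis)]

# Tagging the Spotify scenarios from the text:
scenarios = {
    "listening alone": classify("self", "pleasure"),      # perceptive
    "sharing playlists": classify("social", "pleasure"),  # convivial
    "Spotify Running": classify("self", "purpose"),       # transformative
}
for scenario, quadrant in scenarios.items():
    print(scenario, "->", quadrant)
```

Counting how many research insights land in each quadrant is one quick way to see which of the four you have capacity to build.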
A part of mapping the emotional landscape means considering the emotional role the technology will play in people’s lives. Does it enhance an emotion that’s already there? Does it activate new emotion? Does it help people process their emotions about the product itself? Or relate to something else entirely? Does the whole experience stand in as a coping mechanism? Has it come to represent an emotional moment, or experience, or even just a feeling on its own? Recall Don Norman’s levels of cognitive processing: visceral, behavioral, reflective. Or, you can think about it in the following terms:
Source, the product itself elicits or inspires emotion.
Support, it helps people understand, process, cope, or otherwise handle emotions.
Symbol, the product or experience stands in for a feeling.
As a way to categorize qualitative research or as a way to define features and functionality, these three roles can serve as a guide. More often than not, a product will engage more than one of these core emotional roles. But even then, the emotional signature of each won’t necessarily be the same.
You might say you feel empty to convey a lingering loneliness. Another day you may tell a friend that your outlook is sunny to communicate optimism for the future. Or maybe you feel like monkey mind is a good way to describe a persistent state of distraction. After a year of working through social anxiety, you might feel like a turtle poking its head out of a shell. Emotional states inspire little blips of poetry, in an otherwise prosaic existence.
When we try to articulate how we feel, emotion words—even an impressive vocabulary of emotion words—are not nearly sufficient. Instead, we fall back on metaphor. A metaphor is a pattern that connects two concepts. When we are considering emotions, it serves a double purpose: it articulates emotion and evokes experience. Metaphors bring the emotional imprint to life, giving us a rich set of concepts to work with as we design.
Analogy has long had an influence on design. Henry Ford’s assembly line was inspired by grain warehouses. Hospital emergency rooms draw from Formula 1 pit stop crews. Design thinking already relies on analogy to develop products. Yes, there is a difference between analogy and metaphor: a metaphor makes a comparison, while an analogy demonstrates shared characteristics. A metaphor sparks instant understanding, while an analogy often requires elaboration. For our purposes, let’s not get too far down in the weeds.
So, here we’ll develop emotional analogies. This activity works best with a collection of emotion and value words. It’s fine to mix them because they will already be jumbled together. Shame surfaced because people felt they weren’t able to live up to expectations. Anxiety kept people coming back, increasing each time. Values like presence or generosity, for example, will likely be somehow connected with serenity and admiration. You’ve probably already stumbled across these connections in your research.
When you are working with metaphor, there are a few combinations that are most useful to inform design:
Emotion + attribute, for connecting emotion with physical aspects of experience like color, texture, scale, size, material, weight, temperature, luster, age, and depth
Motivation + interactions, for connecting social and emotional goals like belonging, transcendence, safety, flow, recognition, love, autonomy, and so on with how people will engage with the system
Value + natural world, for connecting values (or emotions) with the natural world like shadows, changing leaves, a flock of birds, roots, and so on
Behavior + relationship, for connecting an action or behavior with a relevant relationship metaphor like a friend, parent, physician, or pet
A metaphor-based approach bridges the abstract concepts around emotions, values, and motivations with concrete aspects of design like what the object might look like or what behaviors it supports. It nudges us toward new aesthetics and experiences and away from clichés, too.
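One way to run the pairings at workshop scale is to cross the word lists mechanically and let the combinations become prompts. This is a hypothetical sketch; all the word lists are placeholders standing in for terms from your own research.

```python
from itertools import product

# Placeholder word lists; substitute words surfaced in your research.
emotions = ["serenity", "anxiety", "admiration"]
attributes = ["texture", "weight", "temperature"]
natural_world = ["changing leaves", "a flock of birds", "roots"]

# Emotion + attribute prompts, one per combination.
emotion_attribute = [f"What {a} does {e} have?"
                     for e, a in product(emotions, attributes)]

# Value/emotion + natural-world prompts.
value_nature = [f"If {e} were {n}, how would it behave?"
                for e, n in product(emotions, natural_world)]

print(len(emotion_attribute), "emotion + attribute prompts")
print(emotion_attribute[0])
print(value_nature[0])
```

The generated questions are only conversation starters; the design value comes from how participants answer them.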
Emotions rarely fall on a neat timeline. When we develop customer journeys, though, they seem tidy. The narrative arc feels familiar: it begins with a negative emotion and builds to a positive moment. In reality, that’s rarely the case. A journey is alive with all kinds of emotion.
Most contemporary models of emotion include a few components (see Figure 3-11). They go something like this:
Cognitive appraisal (evaluation of an event or object; let’s say a repeated misunderstanding by a voice assistant)
Bodily symptoms (rapid heartbeat)
Action tendencies (speaking more slowly or pounding your fist)
Expression (your face is twisted in anger)
Feelings (subjective experience of an emotional state, say terror)
Some would add your emotional state beforehand as a factor. Maybe you are already upset or you’re in a hurry, which could give some context to your response. Other experts would add your personality traits, and say you tend to be quick to anger. Maybe your mental health history is an issue; perhaps you have PTSD or have suffered some trauma. Most likely, you sift through a personal repertoire of memories as you process the emotion. You’ve had terrible experiences before, or you just read an upsetting story about a voice-assistant fail. Some of this new experience gets added to the mix; some doesn’t make the cut. To make matters more complicated, the next day, even though the voice assistant remains the same, you could have a totally different emotional experience. And then there are the other people in your household…
Well, that got complicated. And it might well be something that machines are better able to map in the far-off future. For now, let’s dial it back a bit. Let’s think less about the event, or even a single experience. Instead, let’s look at the relationship.
Framing emotional design as a relationship hasn’t really taken hold, but it certainly isn’t new. In Design for Emotion (Morgan Kaufmann, 2012), Trevor von Gorp and Edie Adams drew connections between their ACT model (attract, converse, transact) and psychologist Robert Sternberg’s relationship phases: passion, intimacy, and commitment. Stephen Anderson, in Seductive Interaction Design (New Riders, 2011), modeled emotional experience on falling in love. We’ll revisit the relationship model again in Chapter 4 in the context of social bots. For now, let’s apply it to the overall experience.
Start with milestones. Every relationship has milestones, symbolic markers that form a kind of ongoing timeline. Perhaps you might think of big milestones, like moving in together or buying a home. It’s also the small moments, like taking care of a partner when they’re sick, sharing a bathroom for the first time, or sending a text that only the two of you understood. Your relationship with a brand, product, or experience has these milestones, too. We already focus on firsts, like the first encounter, the first purchase, the first return. Consider other relationship milestones too, whether it’s introductions to friends or a deepening commitment.
Then, move on to meaningful moments. It seems obvious to start with emotional peaks and endings. Other moments may be even more important, though. Think about where there is a change in emotion or moments of strong emotion, good or bad. Those are the moments that reveal bigger values, build capacity, create support, form a memory. For example, Garren Engstrom of Intuit speaks about the moment of clicking “Transmit” using TurboTax software. Before that simple click happens, there are hours of mindless drudgery, intense effort, and a fair measure of anxiety. After you’ve successfully sent your return, you are likely to be flooded with emotion.
Finally, look at the emotional arc. After you have identified milestones and moments, you can begin to develop a narrative arc for each that draws on emotional experience. The first time you felt understood by a voice assistant might move from curiosity (“Can I ask this?”) to frustration (“How many times do I need to rephrase?”) to surprise (“Wait, that worked!”) to relief (“It feels like I can rely on it”) to bonding (“Wow, it really does understand me a bit better”).
As you move through these exercises, weight, texture, color, light, scale, and other aspects associated with the emotional experience will emerge. Some features will rise to the top; others fade away. Content and tone begin to align. For now, we have an initial plan for a more emotionally resonant experience.
Developing emotionally intelligent design is grounded in deep human understanding. Rather than looking at experience as a snapshot, or even a progression, we need to shift toward considering how it evolves. It needs to be elastic enough to grow and adapt and change over time.
After you have a prototype, you should continue to do research. You should iterate, as one does. The most successful experiences find ways to evolve the relationship further, though. If you’ve made it this far, you’ll have people who love your product or service. Paul Graham of Y Combinator once advised Airbnb to cultivate that crowd, saying, “It’s better to have 100 people that love you than a million people that just sort of like you.”6 Airbnb’s 100-lovers strategy meant engaging the most ardent fans to shape the community. Likewise, Strava evolved the experience with a small group of avid cyclists who helped it formulate an emotional profile for friendly competition. Strava’s team was able to translate the feeling of accomplishment and camaraderie to keep the community motivated.
The 100-lovers approach is one way to develop a bond. But it shouldn’t be the only way. The same dangers you might encounter with a panel or a small group of beta testers still apply. It can become an insiders’ club. It can be prone to tunnel vision. It can get tapped out. So, you’ll want to continually seek out new people. Consider adding people with mixed emotions or those who have overcome negatives. Bright-spot analysis is a way to accomplish this.
In Switch: How to Change Things When Change Is Hard (Crown Business, 2010), Chip and Dan Heath outline their process for finding bright spots. In every community or organization, there are people whose exceptional practices enable them to do better and feel more. These people might be considered the bright spots. For our purposes, it might mean studying how people adapt a negative experience in a positive way. It also might mean that they’ve developed a community, embraced a subculture, or adopted a set of behaviors that have shifted the experience.
As you’re evolving, you’ll be tempted to measure success, too. John and Julie Gottman have studied couples in their Love Lab at the University of Washington. Among other methods, the two rely on affective-computing biometrics to monitor couples’ facial expressions, blood pressure, heart rate, and skin temperature, all while asking questions about how they met, positive memories, and moments of conflict. Micro-expressions, the Gottmans claim, reveal which marriages will thrive and which will fail. Based on this high-tech approach to relationships, the Gottmans came up with a formula for a successful relationship: five positive interactions for every negative one.
Almost in parallel, Barbara Fredrickson came up with a 3:1 ratio for flourishing. That is, three positive emotions for every negative one. Every so often, you’ll hear the idea of a magic ratio surface again.
A magic ratio turns out to be difficult to replicate. It’s easy to see why. Imagine a person who experiences three moments of joy in a day, another who experiences one moment of joy and two of contentment, and still another who experiences two of joy and one of anxiety. If we subtract negatives from positives, it would seem the first two people are happier than the third. But emotions aren’t quite that mathematically predictable: the broader our emotional range, the more resilient we become.
As appealing as the promise of a simple mathematical equation seems, our emotional life is more complicated than that. But here’s what seems to stick. Building capacity to grow, to change, to adapt, to make meaning matters more than tallies of positives and negatives. All that takes time, so as much as we try to actively evolve the experience, we also need to consider how people will live with it.
Much of our emotional lives can’t be understood at a sprint. We will get better at understanding emotion, creating a framework, and creating and testing designs to support it. But emotional experience is not static. Without a long view, we’ll lose the texture.
One way to keep growing is to look at ways to sustain the relationship. Usually, this means getting a fuller rendering of how people are making products a part of their lives. So, if we are looking for emotional resonance, we need to shift from the center to the edges. At the risk of overdoing the acronyms, I use DECIPHER as a shorthand. Here’s what it means:
The DECIPHER model shifts attention toward the aspects of human experience that we miss, discount, or have yet to understand. Consider these the signals that will help us understand ongoing relationships, emotions, and values.
Emotional relationships with products in our lives change and grow in value over time. Maybe it’s a hand mirror, passed down for years, that once belonged to a great-great-aunt. Perhaps it’s an old Beetle that you restore with VW Heritage parts. Maybe it’s a coffee cup with an interior pattern that develops with use (Figure 3-13).
For me, it’s the rocking chair gifted to me by my mother-in-law before my first daughter was born. A chair where I spent hours upon hours with each of my three daughters. Gazing down at their miraculous tiny fingers and breathing in that delicious baby head smell, crying from lack of sleep, pleading with my little darlings to go to bed. Later snuggling up to read Dragons Love Tacos and stabilizing the rocker with blocks for hideaways and tending to smushed toes. Laughing as my dogs tried to jump up, only to be deposited right back on the ground. Years later, the chair looks a mess, but there is no way I’ll be getting rid of it.
Our emotional relationships with technology, products, and our designed environment become more profound the more steeped they are in complex emotion. Delight enlivens, emotional extremes engage, emotional depth and complexity endures.
In the near future, it seems a given that emotion will be designed into the experience. The text your phone sends to say that your purchase won’t make you happy, the app that lets you tune in to your partner’s mood before they get home, the robot companion that senses your irritation and adjusts its tone: none of that design can simply be automated. We will be called upon to design for a mess of human emotion and a range of outcomes. And that future requires developing a greater sensitivity to our emotional lives with technology.
Emotionally intelligent design is a set of perspectives and practices that champion our emotional life. Technology that’s designed with emotional intelligence has a transformative power to contribute to our mental and physical health, forge our identities, build strong relationships, and help us create meaning in our lives. The impact of emotionally intelligent design is more than just a change in the products and services we develop: it’s a shift in mindset and methods.
Detecting, understanding, and responding to emotion is one of the first things we learn, and one thing we’ll never fully master. The practices here are part of a dialogue with past methods and a start at new methods to prioritize our emotional lives.
1 Yvonne Rogers, “Moving On from Weiser’s Vision of Calm Computing: Engaging UbiComp Experiences,” in P. Dourish and A. Friday (eds.), UbiComp 2006, Lecture Notes in Computer Science, Vol. 4206.
2 Peter Salovey and John D. Mayer, “Emotional Intelligence,” Imagination, Cognition, and Personality, March 1990.
3 Charles Duhigg, The Power of Habit: Why We Do What We Do in Life and Business, Random House, 2012.
4 Joseph Henrich, Steven J. Heine, and Ara Norenzayan, “The Weirdest People in the World?” Behavioral and Brain Sciences, June 2010.
5 Sarah Garcia, “Measuring Emotions: Self-Report as an Alternative to Biometrics,” UXPA Magazine, July 2016.
6 Simon Schneider, Armin Senoner, and Danielle Gratch, “What Airbnb and Strava Know About Building Emotional Connections with Customers,” Harvard Business Review, May 2018.