10
Biases and Traps in Decision Making

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

—Daniel Kahneman1

The human mind simply isn't wired to achieve decision quality (DQ) in a natural, intuitive way. Because of how our minds work, mental traps and biases frequently get between our best intentions and true decision quality. Some originate from within; others creep in as we interact with those around us. This chapter presents an overview of biases that affect our decision making, and the mental mechanisms behind them. The chapter will go beyond description to offer guidance on how to avoid the resulting decision traps.

Mechanisms of the Mind

Mental biases have been fertile areas of study for psychologists and other behavioral scientists, and they've been the source of many books and articles over the last five decades.2 In a recent count, over 200 specifically defined biases were catalogued, and a few more are identified through academic studies each year. Although much research has been devoted to identifying these biases, little has been done to organize them. This chapter focuses on a subset of biases that directly affect decision making. These biases are organized into six categories, according to the mental behaviors that cause them (Figure 10.1).3 To address these biases, we must first understand the mental mechanisms that can both cause and mitigate them.

The figure depicts the structure for biases in decision making. These biases are organized into six categories: social influences, protection of mindset, personality and habits, faulty reasoning (complexity, uncertainty), automatic associations, and relative thinking.

Figure 10.1 A Structure for Biases in Decision Making

At the center of the structure for biases is how the human brain makes judgments and decisions. Daniel Kahneman points out that we have two significantly different mental processes.4 The first, which he calls System 1, is extremely fast and hot (emotional), and takes many shortcuts. System 1 is mostly unconscious and works according to the principle of “What You See Is All There Is” (WYSIATI), the assumption that whatever is accessible is all that matters. System 1 is amazingly fast. It can rapidly recognize complex patterns, allowing us to carry out sophisticated repetitive tasks such as driving a car or making operational decisions in a manufacturing plant. However, System 1 cannot be trained to reason correctly for deliberate decision making, and without intervention, it can lead us into traps and biases.

System 2 is comparatively slow; it requires attention and effort. System 2 is both rational and social-emotional. It is considered cool instead of hot. It is an extremely powerful mechanism, and can be trained to do basic decision tasks by installing “mindware,”5 the knowledge and procedures that our minds use to accomplish tasks like multiplication. However, System 2 is still susceptible to biases, especially in complex decision situations that feature uncertainty or interaction among many factors. Even when we engage System 2, we still can't draw effective decision trees in our heads.

It is a powerful System 1 habit to engage the deliberation of System 2 for important decisions, making the best of what we have in our heads. However, reaching DQ in complex and important decisions requires more. We may need to employ a decision process, develop computer models to predict outcomes, use expert advice to assign probabilities, or even use algebra to solve four equations with four unknowns. We can't just do these things in our heads. We need support from external sources, in the form of tools, processes, data, and/or experts. Augmenting our mental process with external support is an important type of activity which the authors label as System 3. System 3 hasn't been a focus of behavioral decision science research, but it is a critical addition to Systems 1 and 2 when making complex decisions. Figure 10.2 highlights characteristics of Systems 1 and 2, and illustrates how System 3 draws on external resources.

The figure depicts three mental processes: System 1 (automatic, fast, WYSIATI); System 2 (deliberative, slow); and System 3 (reaching for tools, processes, data, and experts).

Figure 10.2 Three Mental Processes That Can Create and Reduce Biases
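To make System 3 concrete, consider the “four equations with four unknowns” task mentioned above. A brief sketch in Python with NumPy, one possible external tool (the coefficients below are invented purely for illustration), shows how little mental arithmetic remains once the problem is handed to the right aid:

# A minimal System 3 illustration: hand the arithmetic to an external tool
# rather than attempting it in our heads. The coefficients are hypothetical.
import numpy as np

# Four linear equations in four unknowns, written in matrix form as A @ x = b.
A = np.array([
    [2.0, 1.0, -1.0, 3.0],
    [1.0, -2.0, 4.0, 1.0],
    [3.0, 2.0, 1.0, -1.0],
    [1.0, 1.0, 1.0, 1.0],
])
b = np.array([10.0, 4.0, 7.0, 6.0])

x = np.linalg.solve(A, b)  # exact solution of the 4-by-4 system
print(x)

The point is not the particular tool but the habit: when a decision hinges on calculations like these, we reach outside our heads rather than trusting unaided intuition.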

With awareness and training, all three systems can be used to help prevent biases. Each of this chapter's discussions, beginning with the protection of mindset category and continuing through social influences, includes a description of the key biases and ideas on how they can be mitigated by leveraging Systems 1, 2, and 3.

Protection of Mindset

The banner depicts the protection of mindset category, which includes avoiding dissonance, confirmation bias, overconfidence, hindsight bias, self-serving bias, status quo, and sunk cost.

It's no accident that this discussion of biases starts with the category about protection of mindset. Our mindset and the biases that arise from it are among the most significant factors affecting decision making. Mindset is all the stuff in our heads: beliefs, mental models of reality, lessons learned, memories, preferences, prejudices, and unconscious assumptions. We use these to make sense of the world and to make judgments and decisions. Whenever we encounter something that conflicts with our mindset, the first impulse is to reject or attack it, as an antibody would attack an alien organism.

Consider the European mindset prior to the early 1500s. As they had for centuries, people saw the sun moving from east to west by day, and a fixed pattern of stars doing the same by night. The science of the time described earth as the center of the universe, with the heavenly bodies rotating around it. Given that mindset, what people saw made perfect sense. When Copernicus proposed a very different explanation, many were upset. His sun-centered model disrupted their cosmic mindset and triggered much negative reaction and intellectual discomfort.

This example illustrates one of the biases caused by protection of mindset: avoiding dissonance. A viewpoint that is inconsistent with our existing mindset creates discomfort because the mind cannot readily hold conflicting ideas simultaneously. Psychologists refer to this discomfort as cognitive dissonance. The result is an urge to discredit or ignore information that doesn't fit the current mindset. Such efforts to avoid dissonance can impact the quality of decisions. Also, rebuilding a mindset is difficult because people are wired to reject evidence that conflicts with existing beliefs, and to retain evidence that confirms those same beliefs: the so-called confirmation bias.

Overconfidence is a related mindset affliction where we think we know more than we do, and we're too sure of it. Imagine an expert who has been asked to forecast a range for next year's sales of an important product. That range, defined in terms of the 10th and 90th percentiles for low and high values, should contain 80% of all possible outcomes. However, when an untrained individual defines such a range without guidance, the result typically ends up much too narrow, containing only 50% of actual outcomes. This underestimation of uncertainty is a real challenge to a quality decision.
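To see how damaging a too-narrow range can be, the following hypothetical simulation (a Python sketch that assumes, purely for illustration, that the true outcomes follow a normal distribution) compares a well-calibrated 10th-to-90th-percentile range with one about half as wide:

# Hypothetical calibration check: how often do outcomes fall inside a stated
# 10th-90th percentile range? All numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
outcomes = rng.normal(loc=0.0, scale=1.0, size=100_000)  # assumed true uncertainty

calibrated_half_width = 1.28     # a proper P10-P90 range for N(0, 1): about +/-1.28 sd
overconfident_half_width = 0.64  # a stated range only about half as wide

for label, hw in [("calibrated", calibrated_half_width),
                  ("overconfident", overconfident_half_width)]:
    coverage = np.mean(np.abs(outcomes) <= hw)
    print(f"{label:>13} range captures {coverage:.0%} of outcomes")

Under these assumptions, the calibrated range captures roughly 80% of outcomes, while the overconfident range captures only about half of them, which is the pattern described above.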

To make matters worse, when looking back at past mistakes or surprises, it's easy to rationalize that we knew the right answer all along, thanks to the hindsight bias. In the same vein, we exhibit a self-serving bias by overestimating our own positive qualities, attributing successes to our own efforts while writing off failures to bad luck or situational factors.

We also protect our existing mindset with the status quo bias, whereby we cling to the current position, technology, or business strategy too strongly and for too long—and even escalate our commitment to it despite evidence that it's not working, in the hopes that things will improve. This behavior is particularly apparent with the related sunk cost decision trap, which is common in business organizations. It's hard to let go of a failing endeavor in which sizable sums have already been invested, even when objective analysis says: “It's not working out. Write it off and move on.” Someone affected by the sunk cost trap will respond, “But we've put $6 million into developing this technology! We must stick with it.” That thinking can lead people to throw good money after bad, as the saying goes.

Personality and Habits

The banner depicts the personality and habits category, which includes preference-based habits, habitual frames, content selectivity, and decision styles.

Another critical source of decision bias is our collection of habits and the personality characteristics that create them. The most popular personality indicator used in the business world is the Myers-Briggs Type Indicator (MBTI). Most readers have been exposed to it at one time or another. The MBTI6 differentiates preferences within four dimensions:

  • Extroversion versus Introversion: how we relate to the world around us
  • Sensing versus iNtuition: the source of our input to making judgments and decisions
  • Thinking versus Feeling: the way we reach conclusions and decisions
  • Judging versus Perceiving: whether we prefer to decide or stay open to possibilities

A person's preferences in regard to each dimension are usually indicated by a four-letter type description. For example, coauthor Carl is an ENTP. By knowing the personality types of ourselves and those around us, we can recognize preference-based habits. For example, an Extrovert is energized by engaging in discussion with others, whereas an Introvert comes away from such discussions feeling drained. When information is needed for a decision, people with a preference for Sensing seek out specific, factual information. They are skeptical of uncertain possibilities and scenarios put forth by coworkers with an iNtuition preference.

Everyone has a preferred approach. Unfortunately, consistent with the WYSIATI principle, that preference colors our judgments about what is required to address a particular decision. In fact, we need to focus on the nature of the decision itself, not on the preferences and habits of those considering it. It's important not to let our preference-based habits get in the way of solving the problem that needs to be solved.

Personality preferences lead to some specific habits of mind that affect decision making. An example is the use of a habitual frame. For someone who prefers iNtuition-based thinking, it's natural to expand the frame of a problem to encompass many different decisions, whereas a Sensing type will naturally narrow the frame to keep it focused on as few specific decisions as possible. Similarly, a content selectivity bias encourages focus on the information that fits our customary way of viewing the world: a Feeling type emphasizes the people factors in decisions, while a Thinking type prioritizes a technical, systems view.

Also, we choose decision processes that match our natural decision style. Extroverts prefer a decision process where they can talk things out in a group, while Introverts favor an approach where they can write things down on their own. Judging types look for rapid closure, but Perceiving types like to keep options open.

Personality preferences and habits of mind are not problematic in themselves. The negative impact comes when these biases lead us to approach a decision as we see it rather than as it is.

Faulty Reasoning

The banner depicts the faulty reasoning category, which includes complexity (selective attention, inability to combine many cues reliably, substitution heuristic, order effects) and uncertainty (confusion about uncertainty).

The human mind struggles when forced to deal with uncertainty or the complexity associated with many interrelated factors. Even when we are in a careful thinking mode, we don't naturally draw good decision trees or solve four equations with four unknowns in our heads. Complicated decisions require the use of System 3, the externally augmented version of System 2. If we do not use System 3, we fall victim to predictable biases from faulty reasoning, due to both complexity and uncertainty.

Faulty Reasoning Due to Complexity

The human mind is confused by multi-dimensional problems and loads of data. In response, we often oversimplify. We apply selective attention to the variables that seem most important while ignoring the rest. In situations where many value dimensions are important (such as the location, cost, size, floorplan, finishes, and state of repair of a possible new house), we still end up focusing on just a few key attributes because of our inability to combine many cues reliably. We use a substitution heuristic to shift attention from a tough question (“How much effort should we spend on this decision?”) to an easier one (“How much time do we have before the next executive committee meeting?”), even though the answer to the easier question may have very little to do with the question that we really need to answer. When faced with many different pieces of information, another trap, based on order effects, leads us to remember those ideas that are either first or last. In general, when things get complicated, we oversimplify, whether we realize it or not.
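One simple external aid for combining many cues, offered only as an illustrative sketch rather than a prescribed method, is an explicit weighted score. In the house example above, writing the attributes down with weights (the weights and ratings here are hypothetical) keeps every cue in play instead of just the two or three that happen to come to mind:

# Illustrative weighted-sum comparison of two candidate houses.
# All weights and 0-10 ratings below are invented for the example.
weights = {
    "location": 0.30, "cost": 0.25, "size": 0.15,
    "floorplan": 0.12, "finishes": 0.08, "state_of_repair": 0.10,
}

houses = {
    "house_a": {"location": 9, "cost": 4, "size": 7,
                "floorplan": 6, "finishes": 8, "state_of_repair": 5},
    "house_b": {"location": 6, "cost": 8, "size": 6,
                "floorplan": 7, "finishes": 5, "state_of_repair": 9},
}

for name, ratings in houses.items():
    score = sum(weights[attr] * ratings[attr] for attr in weights)
    print(f"{name}: weighted score = {score:.2f}")

Even a rough model like this forces each attribute to be considered explicitly, which is exactly what selective attention tends to prevent.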

Simplification is not a bad thing as long as the problem's essential character is addressed—but we mustn't go too far. We should only simplify to the point where our framing of the problem, or its proposed solution, remains sufficiently robust to capture what is important to the decision situation. Any further simplification will leave us wrestling with the wrong problem.

Faulty Reasoning about Uncertainty

Uncertainty—always an element in big, difficult decisions—confounds the mind's reasoning capacity. Even highly trained professionals make mistakes when they have to reason through uncertain situations.

For example, in a classic study by David Eddy,7 four out of five physicians made grave misjudgments about the likelihood of breast cancer in patients with concerning mammogram results. The physicians were given a situation in which there was a 1% chance of any lump being malignant. They were also told that, in this situation, a mammogram would accurately classify 80% of malignant tumors and 90% of benign tumors. Given those parameters, they were asked to judge the likelihood of a lump actually being malignant after the mammogram indicated malignancy. Out of 125 physicians given this question, 95 (80% of the doctors) said that the chance of malignancy was 75%. The actual answer is just 7.5%.8 So the vast majority of physicians in the study were off by a factor of ten!
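The correct figure follows directly from Bayes' rule applied to the numbers the physicians were given; a short calculation sketch (shown here in Python) makes the arithmetic explicit:

# Bayes' rule with the numbers from the study described above.
p_malignant = 0.01             # 1% of lumps are malignant (prior)
p_pos_given_malignant = 0.80   # mammogram correctly flags 80% of malignant tumors
p_neg_given_benign = 0.90      # mammogram correctly clears 90% of benign tumors
p_pos_given_benign = 1.0 - p_neg_given_benign  # 10% false-positive rate

# Overall probability of a positive mammogram, from either source
p_positive = (p_malignant * p_pos_given_malignant
              + (1.0 - p_malignant) * p_pos_given_benign)

# Probability the lump is malignant given a positive mammogram
p_malignant_given_pos = p_malignant * p_pos_given_malignant / p_positive
print(f"P(malignant | positive) = {p_malignant_given_pos:.3f}")  # about 0.075

A positive result is far more often a false alarm on a benign lump (about 9.9% of all cases) than a true detection of a malignant one (0.8% of all cases), which is why the correct answer is so much lower than intuition suggests.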

The literature is full of examples like this one. Of course, not only physicians are affected by this bias of confusion about uncertainty. We can expect similar problems when asking a pharmaceutical expert how likely it is that a new compound will successfully navigate every stage of regulatory approval, or when asking a marketing specialist about the projected sales of a new product that's dependent on a new technology in a new market. Regardless of expertise, people generally don't think well about uncertain events and their outcomes.

System 3 is the key to reaching a quality decision in the face of complexity or uncertainty. With enough deliberate System 2 practice, reaching out for tools and processes that aid reasoning can become a new habit in System 1.

Automatic Associations and Relative Thinking

The banner depicts the automatic associations category, which includes ease of recall, availability effects, vividness, narrative fallacy, halo effects, and anchoring effects.
The banner depicts the relative thinking category, which includes framing effects, reference point effects, and context effects.

Judgments are often made through comparisons, connections, or associations with things that are within easy reach and largely automatic—along the lines of WYSIATI. This can create biases caused by automatic associations we don't even recognize, and by inappropriate relative thinking. The effects of automatic associations and relative thinking often come together.

In one example of automatic associations, we use our ability to remember or imagine an event as an indicator of its importance or likelihood. So if a future event is easily imagined, we assume it is more likely. This is the ease of recall bias. If we have heard about something recently, we believe it's more important than things we heard about some time ago; that's the availability bias. Recent events have greater influence on our judgments than do analogous events that occurred in the past; the former are top-of-mind, while the latter are dim memories.

The vividness bias is related to these. The more vivid our impressions or memories, the more likely it is that we will be influenced by them. Following the disaster at Japan's Fukushima nuclear power plant, people were bombarded with daily news reports and videos of that shocking event. The vividness of those impressions will color people's memories and judgments about nuclear energy for years to come, even though nuclear energy production has resulted in far fewer fatalities per kilowatt hour than any other source of electricity production. In another incarnation of the vividness bias, we find we are deeply affected by stories of individuals in distress, but we become numb to the suffering of large numbers of people.9

Another important bias is the narrative fallacy. If we can create a good story in our heads about something, then we start believing that it's true. In one common narrative, a teenager who fails to clean up his room after a second reminder must be intentionally acting out of disrespect. In another example, a coworker in the office who hasn't responded to an important project email must be trying to block the project's success. Those stories may be compelling, but that doesn't make them true. Other interpretations are possible. Nonetheless, we become easily convinced as soon as we formulate a story, even if it's based on very limited information. As Arnold Glasow put it, “The fewer the facts, the stronger the opinion.”

Association biases like these can also lead to significant distortions in our judgment. Of course, they are used widely by professionals in the fields of marketing, news media, and politics to influence and manipulate our judgments. Marketers use repeated showings of vivid television commercials to create an availability bias for their product. Politicians use compelling stories to sway their voters, even if the information behind the stories is questionable. Another favorite of the news media and politics is the halo effect. By standing next to someone rich and famous, a politician may be perceived as being more powerful. Similarly, in times of increasing sales and profitability, the leadership of an organization may be perceived as having a great strategy, even if their success is primarily due to random market fluctuations.

Anchoring effects are another form of automatic association that can undermine good decisions. An anchor is a number that someone tosses out and others latch onto. Anchors are most powerful when there is uncertainty as to what the right number might be. They act as reference points, even when they are irrelevant. For example, when a homeowner lists her house for sale at $450,000, she has tossed an anchor to prospective buyers. That figure might be the result of serious market analysis or it might simply be off the top of the seller's head, and thus irrelevant. For anyone who latches onto that anchor, however, $450,000 will be the point around which negotiations will take place. “Well,” a buyer might say, “that seems high. Will you take $425,000?”

Anchors can be particularly problematic for an expert estimating a range for future outcomes of an uncertain factor, such as operating costs for a plant in the coming year. If the expert starts by looking up last year's total costs, that number can make it hard to think of anything else. The result is likely to be a range that is much too narrow, with a low and high value anchored too closely to that initial number. An expert who wants to avoid the danger of too-narrow ranges can start with backcasting to create salient stories for how a low or high number might have happened, complete with a list of reasons for each. This uses the narrative fallacy to leverage a new reference point and break away from a central anchor.

* * *

The anchoring effect is caused by automatic associations, and it also relates to problems with relative thinking. Once a homeowner creates the anchor of $450,000, subsequent discussions are made relative to that number. Other similar biases are based even more heavily on relative thinking. In these biases, judgment is affected by comparisons, whether conscious or unconscious. One of the most common of these is the framing effect. The way that a question is presented can have a big influence on how the matter is framed in our minds. When the Alpha product manager asks, “How quickly can we get the Alpha product to market?,” we might not consider whether Alpha should have priority or whether it is the product that will bring the most value. People tend to accept a thrown frame, which in this example implies moving ahead with Alpha, even though the Beta or Gamma products may be of much higher value. It is important to consider the frame of a decision carefully rather than unconsciously accepting the first frame thrown our way.

A trip to the grocery store will highlight other biases of relative thinking, such as the reference point effect. A bright yellow sign advertising a savings of $0.20 off the full price for a roll of paper towels may look like a bargain, even though the listed full price of $2.00 is a reference point without much meaning. Context effects are also important: That yellow sale sign may seem all the more attractive when it marks the only sale on a shelf loaded with full-price items.

Social Influences

The banner depicts the social influences category, which includes conformity, suggestibility, cascades, and groupthink.

People are social animals. From cradle to grave, we are socialized in the beliefs and behaviors of the group, which explains why people who grow up and live in a particular society generally adopt a similar mode of dress, eat their meals at roughly the same time of day, share the same notions of right and wrong behavior, and so on.

Our social nature contributes to stability and collaboration. However, it has negative features that every decision maker must recognize and resist. The first is conformity. Although organizations frequently tout the virtues of individuality and innovative thinking, ideas that conflict with those of the group are not always welcomed. Contrary views may be ridiculed or dismissed, and the people who hold those views may experience rejection or hostility. Even when we believe that we're right, presenting viewpoints that conflict with those of the group is uncomfortable—either at work, or among friends and acquaintances. The great Charles Darwin, for example, was so distressed by the prospect of upsetting his many religious friends (and his wife) that he delayed publication of his groundbreaking work on evolution for many years. The human need for conformity and acceptance is very strong.

Peer pressure, in many cases, creates unconscious and subtle encouragement of like-minded thinking. Socialization likewise has the power to converge disparate views into agreement. Experiments by social psychologists have demonstrated how individuals will alter their views to conform with the group, and through the effect of suggestibility they will accept and act on the suggestions of others. Sometimes, the effects of conformity and suggestibility can launch a sort of domino effect in a group, creating a cascade. For example, upon learning that two other members are voting against a proposal, a third group member may disregard the information that led her to strongly favor the proposal. She may assume that the other members had solid reasons for their dissenting votes. In truth, those first two members may have had very limited information, making their decisions rather arbitrarily. Perhaps if the third group member had shared her information, she might have influenced their votes as well. But in a cascade, conformity and suggestibility prevent that from happening.

Another such bias is groupthink, a term often used to describe the general tendency of groups to discourage diverse views. Groupthink can generate dangerous overconfidence in teams that exhibit self-reinforcing cohesiveness and unanimity of perspective. Convinced that they are right, these teams close their minds to contrary views. Messengers bearing contradictory evidence are not welcomed. The impacts of groupthink and other negative social pressures are a real hazard on the path to DQ.

Summing Up

Many biases have the potential to impact human behavior. Getting our minds around the bias problem is easier if we focus on those most important in decision making, and sort them into categories related to their source. Figure 10.3 lists the most important biases in each category, as discussed in the preceding sections. We must engage Systems 1, 2, and 3 to avoid the negative impacts of these biases.

The figure depicts the summary of biases. The most important biases in each category are as follows: social influences (conformity, suggestibility, cascades, groupthink); protection of mindset (avoiding dissonance, confirmation bias, overconfidence, hindsight bias, self-serving bias, status quo, sunk cost); personality and habits (preference-based habits, habitual frames, content selectivity, decision styles); complexity (selective attention, inability to combine many cues reliably, substitution heuristic, order effects); uncertainty (confusion about uncertainty); automatic associations (ease of recall, availability effects, vividness, narrative fallacy, halo effects, anchoring effects); and relative thinking (framing effects, reference point effects, context effects).

Figure 10.3 Summary of Biases

Taken one at a time, these biases have the potential to dramatically impact the quality of our decisions. In learning more about these biases, we can identify which of them are the most relevant to us as individuals, heighten our awareness of them in daily life, and work to minimize their impact.

When the biases act in concert they create larger effects, or megabiases, that impact the decision-making processes of individuals and the cultures of organizations. The next chapter introduces the most detrimental megabiases, providing motivation for the additional System 3 tools and processes that will be explored in the chapters that follow.

Endnotes
