Chapter 9. Reliable Errors (or Reliably Error-Prone)

“You ought to get it framed,” said Ollie, examining the jaywalking ticket that Bill put on his desk. “It’s like a two-dollar bill—lucky and rare.”

“Coffee?” Bill asked, smiling. “Downstairs?”

“And talk about trade-offs again?” Ollie replied, getting up from his seat. “I thought you’d never ask! After you, Sir!”

As they walked toward the elevators, Ollie said, “E.T.T.O. stands for ‘efficiency-thoroughness trade-off.’ You know, Professor Erik Hollnagel’s idea.”

“Right. I was just thinking about that,” Bill said. “Hollnagel noticed that you can’t maximize both efficiency and thoroughness at the same time, and that we have to continually balance between the two. For instance, you could say that, in some sense, before 9/11 the transportation safety folks favored efficiency. You could whisk through baggage check in five minutes, without having to take your shoes off!”

Ollie cut in. “But after 9/11, we swung deep into thoroughness territory, maybe too far. So now we might be moving back in the direction of efficiency again.”

“I was thinking about Mike,” Bill said. “And how he was making these trade-offs during the outage.”

“Definitely,” Ollie said. “We’ve all been there: there’s a fire—literal or figurative—and we have to make a split-second decision about whether to put it out the fast way, or”—Ollie made air quotes—“‘the right way.’ The dirty little secret here is that we make decisions in real time, but evaluate them with the benefit of hindsight. Whether a decision (like a particular trade-off) was correct can be determined only in retrospect.”

“That doesn’t feel right,” said Bill. “There are rules and laws. There are procedures. There are even best practices, in some cases. I think I know most of the time if I’m making the right decision.”

“We’d all like to believe that we know. How about the jaywalking incident? You were clearly breaking the law. It was only a matter of time before your wrong decisions caught up with you. I’m just glad you didn’t get hit by a bus.”

Bill looked confused. Ollie smiled and said, “Look, Bill, you did what made sense at the time. Until today, no one ever ran you over, and you never got caught. Your decision to jaywalk seemed totally reasonable before you got a ticket. And now that you’ve got to pay a fine, how does your decision look?”

“Kind of stupid, actually. I feel like I should have known better.”

“Good news, Bill!” said Ollie. “We ran some tests, and found that you’ve contracted a case of the hindsight bias!”

“I didn’t know it was contagious!”

“That’s right,” Ollie said in the voice of a 1950s television announcer. “Colloquially known as ‘20/20 hindsight,’ or the ‘Monday morning quarterback syndrome,’ it affects everyone. There is no known cure, so we would all do well to practice mental hygiene to reduce the risk of spreading this serious disease. When you decided to jaywalk, did you know that you would get fined?”

“No, but I knew I could have been.”

“I doubt it was top of mind. That’s just the hindsight bias talking, and it’s tricking you into thinking that something that is obvious now was also obvious then. But it wasn’t.”

“What?”

“Look, Bill,” Ollie continued. “Do you think that Mike knew that what he was about to do in order to troubleshoot the network would take it down?”

“Probably not.”

“Probably? Are you saying that Mike might have purposefully taken down the network?”

“No, of course not. It was an accident. I’m just saying that he should have been more careful.”

“So, again, that’s hindsight bias. Mike didn’t know that something bad was about to happen. He didn’t have the information we now have, and it wasn’t obvious to him at the time—and he was being appropriately careful.”

“OK, Ollie, that makes sense. So how do you tell if your thinking is affected by hindsight bias—or other biases?”

“Daniel Kahneman—one of the people who’s been studying these cognitive biases since the seventies—says that it’s easier to spot biases in others than in ourselves. Since you’ve asked, I’m happy to share what I know, and remind you when you might be affected. I do fully expect you to return the favor in the future.”

Ollie went on, “Anytime you hear ‘didn’t,’ ‘could have,’ ‘if only,’ or ‘should have,’ you can be pretty sure that whoever is saying them is under the influence of hindsight bias. These phrases are called ‘counterfactuals’—they describe, literally, what didn’t happen. The future is uncertain, but we can’t change the past. ‘If only Mike didn’t troubleshoot the router,’ for example, is not describing what actually happened; instead of learning from the past, we’re engaging in a kind of lazy (but very comforting) wishful thinking. ‘Mike could have asked for help,’ or ‘Mike should have done more testing in the lab,’ or ‘Mike didn’t do the right thing,’ are all counterfactuals, and all evidence of hindsight bias. When we think in this way, we forget what Mike did, in fact, do. More important, if we get stuck there when investigating past events, we never fully understand what really happened, and we can’t learn why what Mike did made sense to him at the time.”

“And without learning,” Bill said, “we have no chance of making any improvements.”

Now in the cafeteria, the two men filled their Styrofoam cups with a hot, murky liquid the cafeteria called “coffee.”

“In hindsight,” Bill said, “I’ll probably regret drinking this, and wonder why I didn’t just go across the street to get a better cup of coffee.”

“And yet,” said Ollie, “it makes perfect sense right now, doesn’t it? This coffee is closer and cheaper, and will do the job.”

“OK,” said Bill, “so that’s hindsight bias. It sounds like, to counteract it, we have to listen for counterfactuals, and then mentally transport ourselves back in time to see what was actually known or obvious then?”

“Much easier said than done. But if we are to learn from the past—from both failures and successes—it’s precisely what we have to do.”

Bill and Ollie paid for their coffees and walked in silence toward the elevator.

“How many other biases are there?” Bill asked.

“Many,” said Ollie. “Kahneman’s book, Thinking, Fast and Slow—which is mostly about biases—is 500 pages long, and doesn’t even cover half of them. There are close to 200 listed on Wikipedia. Cognitive biases, in general, are mistakes that we make pretty reliably, and most of the time we’re unaware that we’re making them. And the biases travel in packs.

“Take, for instance, outcome bias, which is almost always there when we experience hindsight bias. With outcome bias, we judge the quality of decisions made in the past given the outcome, which, of course, is unknown at decision time. Imagine there was no outage last week. Would we think differently about the quality of Mike’s decisions?”

“Well, I’m not sure Mike would have gotten promoted,” Bill said, “but he surely wouldn’t have gotten fired.”

“The flip side of outcome bias is that it can lead us to hero worship—celebrating people who made pretty bad or risky decisions that somehow still turned out all right.”

“Well, there’s plenty of that going on around Wall Street! The ends, after all, justify the means—unless you get caught.”

“The outcomes reliably color our perception. And, as with any bias, we mostly don’t know we’re being influenced by outcome bias, which means that we’ll have a pretty hard time sorting out the good decisions from the bad.”

“Or good people from the bad?”

“Ah, yes, that brings me to my personal favorite bias—fundamental attribution error—which results in exactly the leap that you just took from the individuals’ actions and decisions to the individuals themselves. For instance, when I talk to you, especially before you’ve had your first cup of coffee, I might get the distinct impression that you’re a pretty grumpy guy.”

“Who, me?” Bill said, smiling.

“In this situation,” Ollie continued, “if I were under the influence of fundamental attribution error, I would think of this grumpiness as part of your personality, instead of attributing it to a lack of caffeine in your system.”

“Anyone who works with networks—or complex systems in general—has the right to be at least a little bit grumpy, don’t you think?”

Ollie laughed. “Indeed! And as long as we attribute their behavior to their conditions, we’re good. The moment we attribute it to their personality, we’re making the fundamental attribution error. That’s precisely how Mike becomes ‘careless’—we forget the circumstances, the stress, and the decision fatigue that he was experiencing when he was working through the outage. Oddly, when we think of ourselves, we have no problem seeing our actions and moods as a result of our surroundings, and we don’t tie them to ourselves too tightly.”

“I have to say, Ollie, you’re blowing my mind right now. I am guilty of all these biases!”

“We all are,” Ollie said. “And there’s more of them—a lot more. Kahneman, for one, is not optimistic about our ability to spot them in ourselves. Which is why it’s important for us to learn to spot them in one another, as well as give each other permission to name the biases when we see them.”

“Well, for what it’s worth, I not only give you permission, but request that you call me out when you see me under the influence of any bias.”

“I’ll try. And please do the same for me. It’ll take some practice for all of us. When trying to learn from the things that happened in the past, we have to mentally transport ourselves to the past, and continually ask ourselves, ‘What was known at the time? How did the decision make sense? What circumstances were influencing my decisions? What conditions were present that enabled me to act in a particular way? How did I know what I knew, and how did I do what I did?’ Maybe then we can learn something useful.”
