4. THE LIMITS OF SPECIALIZATION

“If you can’t explain it simply, you don’t understand it well enough.”

Albert Einstein

In their constant search for competitive advantage, organizations seek people who already have specialized knowledge. This encourages people to go deeper rather than wider in their formal and informal learning. In our work we have observed a tendency for people to keep improving the skills they have already developed rather than invest in gaining new ones. People are only too aware of the cost of training for years in one area, only to abandon it and “start from scratch” again.

Specialization has benefits, but it also comes with a risk: the more competent we are, the more prone we are to falling prey to “the curse of knowledge.” The curse of knowledge means that the more you know, the harder it is to think and talk about your area of expertise in simple terms.17 We tend to communicate at too high a level, misjudging other people’s ability to understand us, causing confusion and hindering their learning. Where the task is to communicate knowledge, this curse can neutralize the benefits of that knowledge, because it is never received by the intended audience.

Complex or jargon-laden language can also mimic genuine knowledge – an amateur who has learned the relevant buzzwords can use them to create the impression of expertise. The audience is left worse off either way, whether confused by a true specialist or misled by someone who lacks expertise but artfully uses jargon as a cover for ignorance.

Specialist knowledge can also impair fresh thinking about complex problems. As Chip and Dan Heath note in their article “The Curse of Knowledge”: “When we are given knowledge, it is impossible to imagine what it’s like to LACK that knowledge.”18

The more expertise we have on a particular topic, the harder it is for us to frame the problem in a neutral way that everyone will understand. Built into our definition of the problem is our own perspective on it. Our knowledge and expertise narrow our perspective and our exploration of possible solutions, and make it difficult for us to think laterally or “outside the box.” Behavioural economists call this “anchoring bias” – the nature of the problem is already fixed, or “anchored,” by existing knowledge.19

The International AIDS Vaccine Initiative is a research organization set up to help find fresh ways of combating HIV. The Initiative sponsored an open challenge to the scientific community to come up with an effective inoculation against the virus. Unfortunately, defining the Request for Proposal as a vaccine challenge did not yield many high-quality responses. The specialists had inadvertently “anchored” the problem to vaccines, so the thinking was inevitably limited to solutions that happened to be vaccines. In a way, this is a form of prejudice – the solution is prejudged to be a vaccine, when non-vaccine solutions may be superior. As the saying goes: “If all you have is a hammer, everything looks like a nail.”

Innovation consultant Andy Zynga suggested framing the problem as a protein-stabilization challenge rather than a vaccine-development challenge.20 By reframing the challenge around the underlying problem (protein stabilization) rather than the type of solution (a vaccine), it was opened up to a far wider range of thinkers and experts. With this reframing, 34 proposals were received from highly qualified scientists in 14 countries, reflecting broader and more innovative thinking than the earlier submissions. Three of these were selected to receive research grants. When organizations or people have high levels of expertise, the boundaries of that expertise can limit their ability to think about a problem from fresh angles.

In his book Expert Political Judgment,21 Philip Tetlock, Professor of Psychology and Management at the University of Pennsylvania, analyzed more than 2,500 expert forecasts and compared them with what actually happened. Borrowing an analogy from the writing of the Greek poet Archilochus, he divides people into two categories: “foxes” and “hedgehogs.” The fox knows many things, but the hedgehog knows one big thing. Tetlock discovered that the foxes were generally more accurate in their predictions than the expert hedgehogs, partly because the experts’ narrow focus prevented them from seeing the bigger picture of factors beyond their specialization. According to Tetlock, then, knowing a lot can make someone a less reliable forecaster. He also found that experts were often overconfident, which made them blind to contradictory opinions: a case of hubris. “Experts in demand were more overconfident than their colleagues who eked out existences far from the limelight,” says Tetlock.22
