Chapter 2

Peering into the Mind: How People Think

In This Chapter

arrow Testing humans' logical thinking

arrow Staring into the brain while it works

arrow Challenging the notion of rational scientific thinking

We think so because other people all think so . . . or because we were told so, and think we must think so. . . .

Henry Sidgwick

Some mysteries are best tackled by digging out and looking at ‘the known facts’, but not the issue of ‘how people think’. This one is best tackled (as philosophers have done for centuries) by asking questions.

For example, when you read something — like this paragraph — whose voice do you hear in your head? Is it your own voice, as the reader, or is it an echo of the voice of the author reappearing through the words — or perhaps both? The neurologist Paul Broks identifies a peculiar thing about writing: it seems to allow other people to access and ‘take over the language centres of your brain’. Part of this chapter, the section ‘Thinking Logically or Instinctively: Evolution and Consciousness’, explains how and why that may happen. Being aware of this is useful both when you're trying to understand your reaction to other people's ideas and when you're critically evaluating some of your own theories.

One of the key skills, not only of Critical Thinking but in life generally, is the ability to reflect on your own practices. This chapter is your diagnostic manual for checking what's going on inside your head.

In debates about how people think, a gulf in philosophy has long existed between conservatives, who uphold traditional distinctions and assume the brain is a machine (and therefore logical and rational), and radicals, who critique that whole approach (and admire the complexity and illogicality of human thinking). This chapter takes a look at these debates — ones that shape all subject areas — so that you can move towards an effective analysis of your own and other people's reasoning. It's important to realise that even scientists aren't immune to making mistakes in this area.

I also examine a more specific question: to what extent do logical rules and the methods of rational argument underlie people's beliefs and the judgements and decisions they make? Or, on the contrary, are individuals more influenced by what other people think? An understanding of this tendency to groupthink provides you with a key defence against being misled by the opinions of those around you or those in authority, and also a more sophisticated way of interpreting events, debates and decisions.

Read on — but also have a think about what you think about how you think — and then perhaps try not thinking about anything — maybe have a quiet lie down!

Thinking Logically or Instinctively: Evolution and Consciousness

Personally, I don't usually think of myself as having a brain like a lizard or crocodile (unless I've had a particularly bad night's sleep), but in evolutionary terms it seems that I sure do. So if anyone wants to claim that ‘the way that we think is what makes us human’, they'd better try to work out precisely what humans do differently from animals. As I discuss in this section and throughout this chapter, the debate is as much a philosophical one as a biological one.

In the first part of this section I look at how mysterious the inner world of our thoughts still remains, even as scientists discover more and more about the external world. I first of all look at the different tasks human minds and animal minds are asked to do, and then in ‘Jumping to conclusions: The cost of fast thinking’ I'll illustrate how sometimes the two kinds of thinking — human and animal — get muddled up and lead people to make rash judgements and silly mistakes.

Buying beans and composing sonnets: Contrasting views of consciousness

Do monkeys think? Do plants? No, or at least not like humans anyway. They just appear to be thinking, because they may be following pre-programmed evolutionary strategies, a bit like computers (or Big Brother contestants). But, unlike computers, they're ‘undoubtedly’ conscious of something. For if nowadays scientists agree that the body, indeed the whole universe, is a machine, still no one is quite able to say that a ghost isn't riding along in the centre of it.

One of the most famous philosophers of them all, Descartes, once wrote ‘I think, therefore I am’, or at least, many people think he wrote that. Of course, Critical Readers will check such quotes very carefully and find that actually he said something a little bit different. But as I say, everyone ‘thinks’ he said that, so in a sense he did. He was suggesting that awareness of the brute fact of existing was the only thing he could be sure of, and he used this nugget not only to get himself up in the morning but also to make sense of and rediscover the world.

acloserlook The French philosopher was onto something big — and that thing is consciousness, perhaps the central mystery of philosophy. Science can explain many things, but it often just dismisses this strange sense of self-awareness as an illusion.

Humans do many things that animals don't, and they do them for complex, socially defined or aesthetic reasons. As the contemporary philosopher-scientist Raymond Tallis challenges his readers, just consider what's going on under the surface with something as commonplace and seemingly simple as buying a can of beans in a supermarket. Why are people buying them? It may be because they've just seen an advert, or because beans remind them of some happy times when they were kids. It might be because they think beans are cheap. Surely animals don't have to worry about things like this when they eat grass or gobble up rabbits.

Yet the fact remains that many of the differences between humans and other animals are marginal. The lives of humans and chimpanzees probably looked very similar a few hundred thousand years ago — no tins of beans or supermarkets then, let alone those sonnets and symphonies that philosophers love to cite as proof that humans are something special. Plus humans didn't develop their mysterious minds in an evolutionary blink: the brain evolved over long periods of time, and so Stone Age people must have had pretty much the same kind of consciousness as people do today.

tip Professor Tallis is near the mark when he says that what's distinctive about humanity is its social environment, bound together by language and tool use. It's utterly different from the world within which animals exist. ‘Artefacts, institutions, mores, laws, norms, expectations, narratives, education, training.’ And although humans share 98 per cent of their genes with chimpanzees, they share precisely zero per cent of their chromosomes — and chromosomes are what actually do things.

Jumping to conclusions: The cost of fast thinking

In this section I look at the theory that, actually, people are basically illogical, and because of this they often get muddled up, make faulty judgements and commit silly mistakes. Understanding how people arrive at their opinions and conclusions gives insights into what people say and think — and can even help you anticipate people's behaviour and responses.

The US professor Daniel Kahneman has written about the psychological basis for judgements, reactions, choices, conclusions and much more. His writings (such as Thinking, Fast and Slow) give a significant push to the already pretty widespread view of people as, basically, irrational animals. He was even given a Nobel prize for his research!

remember Kahneman's thesis is that the human animal is systematically illogical. Not only do people mis-assess situations, but they do so following fairly predictable patterns. Moreover, those patterns are grounded in their ancient origins as simple animals. Survival depended on it. Much thinking is instinctive — and hardwired.

He says that people have two ways of thinking:

  • A logical mode (which he thinks is good, of course).
  • An earlier, instinctual mode (which he says is the root of most ‘wrong decisions’).

The human brain doesn't like information gaps, and so people tend to jump at the first answer/solution that looks good rather than take the time to examine all the data, especially in a world where they receive more information every day than they have time to assimilate. Plus, the human brain loves to see patterns and make connections. Although such traits serve people well in many ways, sometimes they mislead people too.

For example, thinking is a complex biological process and requires a lot of energy: the human brain uses up 20 per cent of an adult's total energy, and for children it gobbles up almost half their body's energy! (Try multiplying two two-digit numbers in your head while running: 23 × 47 anyone? You're sure to slow down both in your running and your calculating.) So, because thinking gobbles up precious mental resources, the body is programmed to avoid it. Instead, human beings have developed, over many thousands of years, a range of built-in, ‘off the peg’ methods for reaching decisions.

acloserlook You might want to say that the example of the multiplication sum ‘slowing down your running’ is a bit dodgy — maybe that it's the distraction rather than the mental energy that causes any slowing down. Certainly, don't accept anything just because an expert says so! However, the notion of being distracted itself indicates a sort of limit in human thinking powers. That's partly why we admire people who can, say, balance on a unicycle on a rope while juggling!

The problem with fast thinking, however, is that it often means people don't solve the right problem — they solve the easy problem. A celebrated example is the ‘bat and ball’ quiz.

trythis Test yourself! A bat and a ball together cost £1.10. The bat costs £1 more than the ball. How much does the ball cost? (Answers at the end of the chapter.)

Encountering human illogicality: The Linda Problem

The Linda Problem, one of the most celebrated quizzes in psychological research, is an experiment in unintended bias. It's used to illustrate how everyday judgements are riddled with fallacies anchored in evolutionary history. The original experiment, by two psychology professors, Amos Tversky and Daniel Kahneman, was elegantly simple. At the outset, participants were given this information:

Linda is thirty-one years old, single, outspoken and very bright. She majored in philosophy. As a student she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

On the basis of this character sketch, the researchers asked student volunteers to rank the likelihood (probability) of Linda having one of a list of possible jobs, ranging from ‘teacher in an elementary school’ to ‘insurance salesperson’ by way of ‘works in a bookstore and takes yoga classes’. They had provided a stereotype, and waited to see if the research participants would be influenced by it.

jargonbuster The context seemed to be that of matching a psychological type (as per the short description) to a career choice. By implication, the research was asking the students questions like: Would you be surprised to find a bright philosophy student working in a bookshop and doing yoga? Certainly, for me, the answer to that is ‘no’, and the students were no different. This process, in which people use stereotypes to arrive at conclusions, has a fancy name in psychology: the ‘representativeness heuristic’. That's an off-putting term but it basically just means ‘basing judgements on typical things’. People make a lot of decisions more-or-less subconsciously by applying preconceived stereotypes.

Being a psychology experiment, however, the researchers tucked away a sneaky trick. One of the jobs in the list, ‘bank teller’ (the American term for bank cashier) was entered twice: the first time high up the list just as ‘Linda is a bank teller’ and the second time at the bottom of the list as ‘Linda is a bank teller and active in the feminist movement’.

acloserlook In essence, therefore, the question being asked of the participants, and that you can ask yourself now, is: drawing on the earlier description of Linda's character, which of these two statements do you think is more likely?

  • Linda is a bank teller.
  • Linda is a bank teller and is active in the feminist movement.

Tversky and Kahneman wrote their description of Linda to make it seem highly likely that Linda was active in the feminist movement, but unlikely that she'd have taken a job in a bank. Thus, nearly all the students considered the first option, of Linda becoming a bank teller, to be improbable. But by linking the unlikely element of the description of Linda to the likely one, the researchers found that a full 89 per cent of students were persuaded that the description ‘Linda is a bank teller and is active in the feminist movement’ was plausible, and certainly much more so than the simpler claim.

Yet here's the catch — how can Linda being a bank teller of one particular kind be more likely than her being a bank teller of all possible kinds? Oops! That's illogical.

In fact, as the logicians say: the probability of a conjunction is never greater than the probability of its conjuncts. In other words, the likelihood of two particular things both happening can never be greater than the likelihood of just one of them happening. Your being hit on the head by a flying pig tomorrow is very unlikely; your being hit on the head by a flying pig tomorrow and getting rained on is a bit less likely again, no matter how rainy a bit of the world you live in. It's kind of an iron rule, like 2 + 2 = 4. Don't argue with it! (A conjunction is two or more things joined together in some sense, and a conjunct is just one or other of the things.)
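If you like seeing rules like this in action, here's a minimal sketch of the arithmetic in Python. The numbers are purely illustrative assumptions (they aren't figures from the experiment); the point is simply that multiplying by a second probability can only ever shrink the first one:

  # The conjunction rule, with made-up illustrative probabilities
  p_teller = 0.05                   # assumed chance that Linda is a bank teller
  p_feminist_given_teller = 0.95    # assumed chance she's a feminist, if she is a teller

  p_teller_and_feminist = p_teller * p_feminist_given_teller
  print(p_teller_and_feminist)      # 0.0475, smaller than 0.05, however feminist Linda seems

However high you push that second number, it can never exceed 1, so the combined probability can never overtake the simple one.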

To cut a long story short, simple logic seems to dictate that Linda being a bank teller tout court is more likely than her being a bank teller and a feminist. But Tversky and Kahneman drew much more general conclusions than just that people don't understand formal logic. They declared that the result was solid evidence of the illogicality of human thinking. The research has been cited many times since to wag a cautionary finger at those who see human beings as rational creatures who use the lessons of experience to learn and improve their navigation of life, and to make most people look dumb instead.

However, you don't have to rush to agree. On the contrary, many possible arguments can be made as to why the second description is more likely than the first one, and why that 89 per cent were quite entitled to say so. It all depends on the way words work, which is rather more complex than Tversky and Kahneman, let alone subsequent logicians, seem to have allowed. (For more on this famous and revealing experiment, read the nearby sidebar ‘Another possible way to argue for intuition against logicians’.)

Considering the power of group thinking

Consider for a moment, if you'd be so kind, why you're reading this book: perhaps it's a project of your own devising to become more logical. Or, on the contrary, perhaps this book is part of a kind of extreme groupthink — brainwashing even — an effort by society to make you think a certain way! Outlandish idea perhaps, but that's one possible implication Professor Deanna Kuhn draws from her research into psychology and education. She is concerned that as a society people spend much of their time and effort determining what they believe but seem to care little about how they come to believe what they do.

Questioning your beliefs

The question as to the extent to which people are in control of their decisions, and the extent to which they simply follow other people, is important to Deanna Kuhn.

She believes that Critical Thinkers should see thinking as a form of argument, because individuals’ beliefs are chosen from among alternatives on the basis of the evidence for them. However, her research caused her increasingly to question the extent to which individuals actually do hold their beliefs on the basis of evidence, instead of as a result of social pressures.

warning Deanna Kuhn's rather alarming conclusion is that many people don't or can't give adequate evidence for the beliefs they hold. Worse! People are unwilling or unable to consider revising their beliefs when presented with evidence against them. Kuhn holds that reasoned argument requires, at the very least, this ability to distinguish between the theoretical framework and the physical evidence.

Cascading information

jargonbuster Cascade theory is the idea that information cascades down the side of an informational pyramid — like a waterfall. If people don't have the ability or the interest to discover something for themselves, they find that adopting the views of others is easier. This is without doubt a useful social instinct, and an individual relying on information passed on by others is often acting quite rationally. (After all, thinking is difficult and energy-sapping, as I explain in the earlier section ‘Jumping to conclusions: The cost of fast thinking’.)

warning Unfortunately, following wrong information is less rational, and that's what often happens. People cascade uselessly in everyday ways, like so many wildebeest fleeing a non-existent lion. A lot of economic activity and business behaviour, including management fads, the adoption of new technologies and innovations, not to mention the vexed issues of health-and-safety regulation, reflect exactly this tendency of the herd to follow poor information.

There are two possible, but conflicting, strategies for coping with the tendency of people to unthinkingly absorb and follow duff information:

  • Some people suggest that society needs to encourage a range of views to be heard, even when they're annoying to the ‘majority’ — for instance, allowing people to deny global warming or letting teachers decide what they're going to teach.
  • Other people say that society needs stricter control of information to stop the spread of ‘wrong views’. This view is the one currently cascading down the pyramid.

For a great example of cascade theory, check out the nearby sidebar ‘Don't snack on chips while reading this!’

Watching How the Brain Thinks

Wouldn't you love to be able to see a great brain — such as Einstein's, Copernicus's or Robbie Williams's — thinking? To watch as the neurons spark into action and solve another mystery of the universe, like: ‘How are space and time related?’, ‘Maybe the Earth goes round the Sun!’ or ‘Why don't I have hit records anymore?’

It's important for Critical Thinkers to know whether they really can think freely — or are only churning through data, more or less efficiently, in the manner of a very complicated computer. In this section I look at some of the arguments for thinking that the human mind is actually more complicated than that — and by implication capable of achieving more things.

‘My nerves are playing up’: The brain at work

Francis Crick, the 20th-century British biochemist who played a key role in the discovery of the structure of DNA, imagined he'd solved the mystery of how human beings think. He put it all down to nerve cells and molecules. Many academics take what seems a small step from this conclusion to assuming, as Steven Pinker puts it, that the ‘mind is a system of organs of computation designed by natural selection to solve the problems faced by our evolutionary ancestors’.

warning Raymond Tallis, however, is appalled by this ‘Darwinization of our understanding of humanity’ as well as by neuromania more generally, which he defines as the almost ubiquitous use of what's offered as the latest brain science to (supposedly) reveal how the human mind works. The stakes are high too, he warns, muttering about the awful lessons of history, when societies adopted policies based on pseudo-science and applied them with great cruelty against millions of individuals. (I say more about this in Chapter 3.)

jargonbuster Neuromaniacs see the mind as being nothing more (or less) than the human brain, and the brain itself as a machine. They even assume the presence of a central controller, a little person inside the big person — something akin to the program that runs in a digital computer.

But perhaps the brain and the mind aren't the same thing. Raymond Tallis refreshingly puts the contrary argument that in fact human thinking is incredibly, unbelievably complicated. At a biological level, the brain reacts in unpredictable, even chaotic, ways, and is being constantly altered by individual experiences.

remember The consequence of this alternative view for Critical Thinking is that issues are seen as being open and multi-dimensional, rather than settled and black-and-white. Truth is seen as coming in shades of grey, from various sources rather than being delivered ‘once and for all’ by an expert.

‘I don't wish to know that’: Preferring stereotypes to statistics

A point that Daniel Kahneman, the contemporary American psychologist, makes is that all people have a tendency to let stereotypes trump statistics. For example, if two people in your street got burgled last year, then whatever the official figures about such things may say, your assessment of the level of crime is probably too high.

Newspaper headlines provide a similar kind of distorting perspective on the world: if your paper runs a series on ‘Women attacked at night walking home’, while my paper runs a series on ‘Why walking is good for the health’, we end up with two quite different views on the same matter, based on a partial and misleading kind of ‘evidence’ (what we've read in the paper).

The power of the mass media to distort, if not quite all human thinking, then certainly people's assessments of risk, is shown in public anxiety over things such as children being attacked by strangers on the way to buy sweeties, or train stations being blown up in terrorist attacks.

trythis But that's partly to do with the statistical nature of risk assessment. Humans just don't get stats! Try this problem out. Here's the data:

  • A census classifies 85 per cent of men in a city as ‘European’ and 15 per cent as ‘indigenous’.
  • A witness to a street robbery identifies the assailant as ‘indigenous’.
  • The court tests the reliability of the witness, and he's able to correctly identify people as being either ‘European’ or ‘indigenous’ 80 per cent of the time, but he mistakes people's origins the other 20 per cent of the time.

Without being prejudiced one way or the other (of course), but having limited resources, in which community should the police prioritise their search for the street robber?

Perhaps this tendency of humans to prioritise prejudice over facts has something to do with many of the world's problems today; just a thought!

Getting Inside Scientists’ Heads

I realise that this heading may produce for some people a scary image, of a miniaturised Raquel Welch and Donald Pleasence being injected into a scientist's ear. So you'll be glad to know that this section actually is about getting a handle on the kind of material that you need to analyse and evaluate in many areas of life. Critical Thinking requires you not only to handle information effectively, but also to put it into a wider context and even, when necessary, to treat it sceptically.

The conventional view of science is of a steady progression from crude guesses to sophisticated knowledge, propelled by ever-more ingenious techniques and machinery. Science, like a majestic river, heads in only one direction, and if foolish humans attempt to erect barriers to its progress, at some point in time their obstructions are swept aside and the great wave of discovery flows on.

I take a look at this traditional approach, which involves conjecture and refutation, and also at a view that challenges it: paradigm shifts. Uncritical thinkers just want to take whatever is said or written by a scientist as the plain facts of the matter — but more sophisticated thinkers recognise that across the whole sweep of human knowledge, facts keep changing! A textbook that was a pretty good guide thirty years ago is likely to be substantially flawed by today's standards. This section explains why.

Engaging with scientific convention

Much of the history of Western philosophy assumes a steady, comforting process: knowledge exists and just needs to be identified rationally. When firm foundations have been established, the rest of the edifice can be constructed without needing to worry about one or other bit of it later being shown to be wrong.

warning But this picture ignores the complexities and inconsistencies of ‘real life’. Whatever people may like to think, in science, experiments don't lead to new theories, because all historically significant theories (and quite a few insignificant ones too) agree with the facts. As every politician and spin doctor knows, lots of facts exist and if you want to you can choose them to bolster your theory. Scientists are no different. But they think they are.

Trusting conjecture and refutation

Conventionally speaking, people suppose that when experiments are conducted to test theories in reality, and the results don't accord with those anticipated, the theory is disproven.

But some philosophers (called critical rationalists) reject this view that thinking coolly and logically about the world is the route to true knowledge. Thinkers such as Karl Popper argued that no ‘theory-free’, infallible observations exist, but instead that all observation is theory-laden and involves seeing the world through the distorting glass (and filter) of a pre-existing conceptual scheme.

Popper writes:

If we are uncritical we shall always find what we want: we shall look for, and find, confirmations, and we shall look away from, and not see, whatever might be dangerous to our pet theories. In this way it is only too easy to obtain what appears to be overwhelming evidence in favor of a theory which, if approached critically, would have been refuted.

remember Yet, in a way, the 18th-century philosopher David Hume was even more radical than Popper. He concluded that science and philosophy alike rested less upon the rock of logic and human reason than upon the shifting sands of scientific fashion and aesthetic preferences.

Hume's approach is exactly that of the Critical Thinker — taking nothing as given but insisting on the full application of reason in all areas — ignoring and standing free from conventional opinion. Certainly, this made him unpopular in many circles, but it also gave him some great insights into many issues.

Thinking in fits and starts: Paradigm shifts

None of us can function without a set of assumptions. You can't make sense of this book unless you assume I use words the same way that you do, or that the book is read from front to back, rather than, say, from the bottom of the last page backwards — and it would seem pretty silly to start any other way (unless you live in China!). Scientists start off by making a whole load of assumptions when they try to do anything too — they have to do this. But many of these background assumptions are just guesses — and many of them get abandoned later. Worst of all, starting assumptions tend to block off other alternatives and can stymie progress.

jargonbuster Although a somewhat woolly term these days, a paradigm is a kind of picture or way of picturing something. In its simplest form the theory of paradigm shifts claims that scientific knowledge proceeds in fits and starts, with theories fighting to the death, as it were, against each other, instead of as the smooth process of accumulation and refinement that people like to imagine.

Instead of being logical or rational, scientists find that the old theory has become too complicated and cumbersome to modify, and so they collectively abandon it. Or else a split emerges between followers of one theory and another, which is eventually decided in favour of the new theory for any number of reasons, none of them particularly scientific. This abandonment of a longstanding way of seeing an issue towards a new way is the paradigm shift.

trythis When Copernicus first cautiously suggested that perhaps a better way to understand the workings of the universe was to suppose that the Earth and the rest of the planets went around the Sun, rather than the Earth staying put and everything (including the stars) rotating round it, the maths went against him. The fact was, the movements of the Moon, the Sun and the planets could be better calculated and predicted using the old system, even if it was quite complicated. The Church authorities insisted that the facts (the mathematics) should decide the issue — but a few scientific radicals, like the famous astronomer and physicist Galileo, preferred the simplicity and elegance of Copernicus's new theory and campaigned in public for its acceptance.

Which side would you support — the traditional view of the universe, well supported by the ‘facts and figures’ — or a trendy new one which clearly needed a lot more work before it could be considered even a competitor?

Answers to Chapter 2’s Exercises

Here are the answers to this chapter's exercises.

Pricing bats and balls

Fast, instinctive thinking jumps out with an answer: 10 pence! Alas, the answer is wrong. Check the maths — the bat costs £1 more than the ball, which means that the ball must be a real bargain at just 5 pence (bat = £1.05, ball = 5 pence, to total £1.10). Slow thinking is required to come up with the right answer — as well as distrusting your intuition.
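If you want to see why 10 pence can't be right, here's a minimal sketch of the sums in Python; the only assumption is the pricing rule stated in the quiz:

  # Checking the bat-and-ball arithmetic (working in pence to keep the sums exact)
  ball = 5                    # try the 'slow thinking' answer: 5 pence
  bat = ball + 100            # the bat costs exactly £1 (100 pence) more than the ball
  print(bat + ball)           # 110 pence, i.e. £1.10, as required
  # With ball = 10, the bat would be 110 pence and the pair would total 120 pence (£1.20).

In algebra terms: if the ball costs x, the bat costs x + £1, so 2x + £1 = £1.10 and x = 5 pence.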

Looking for the robber

Given the numbers in the scenario, most people assume that the smart place to start looking for the attacker in the incident is among the indigenous community, because of the witness testimony, even though a ‘possibility’ clearly exists that the witness may have made a misidentification.

But suppose that 10,000 ‘European’ people live in the city and just 1 ‘indigenous’ one. The best strategy is then a slam-dunk for the police, isn't it? But then remember that the witness sees ‘indigenous people’ 20 per cent of the time. The evidence isn't so persuasive now, is it, because the witness will often say someone is indigenous when they aren't. The ‘mathematically’ correct answer to the original scenario is that a substantially higher probability exists that the villain involved in the street robbery was European rather than indigenous — so the police should be looking for a European robber. Mathematicians use a technique called Bayesian analysis to get an exact figure, but the important thing is to be aware of the general issue. The reliability of the identification of the robber as indigenous is, in this case, 41 per cent, only about half the 80 per cent reliability people comfortably opted for ‘without thinking’.
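If you're curious where that 41 per cent comes from, here's a minimal sketch of the Bayesian sum in Python. The 85/15 population split and the 80 per cent witness reliability are the figures from the scenario; everything else is just arithmetic:

  # Bayes' theorem applied to the street-robbery scenario
  p_indigenous = 0.15          # 15 per cent of men in the city are 'indigenous'
  p_european = 0.85            # 85 per cent are 'European'
  p_right = 0.80               # the witness identifies origins correctly 80 per cent of the time
  p_wrong = 0.20               # ... and gets them wrong 20 per cent of the time

  # the two ways the witness can end up saying 'indigenous'
  true_alarm = p_indigenous * p_right     # robber really is indigenous and witness is right: 0.12
  false_alarm = p_european * p_wrong      # robber is European but witness is wrong: 0.17

  reliability = true_alarm / (true_alarm + false_alarm)
  print(round(reliability, 2))            # 0.41, in other words about a 41 per cent chance

So even with an 80-per-cent-reliable witness, the robber is more likely (59 per cent) to be European, simply because Europeans vastly outnumber indigenous men in the city.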

Astronomical wrangles

You can be forgiven if you think this is a bit of a no-brainer — of course the Earth goes around the Sun, and so you support the new-fangled theory. But when you do that, you have to accept that you're throwing out any pretence that you think scientific matters should be settled on the basis of facts, figures and evidence.

This is what the radical philosopher Paul Feyerabend meant when he argued in his books that in science, the only rule is that there are no rules, and what's more, that it's only by breaking rules that scientists have been able to make the progress for which they are — later on — praised.
