4
THE ACADEMIC ILLUSION

“All truth passes through three stages: First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident.”

Arthur Schopenhauer (1788–1860)

HOW INTELLIGENT ARE YOU? This is not an easy question to answer. Intelligence is one of those qualities that we think we can recognize in people, but when we try to define it, it can slip from our grasp. If it’s any consolation, there’s no agreed definition of intelligence among the many specialists in psychology, neurology, education or other professional fields who devote a good deal of their own intelligence to thinking about it. There may be no agreed definition, but there are two ideas that dominate popular conceptions of intelligence. The first is IQ (Intelligence Quotient); the second is a memory for factual information.

JOIN THE CLUB

Mensa is an international association that promotes itself as one of the most exclusive clubs in the world. Membership is on the basis of “high intelligence” and Mensa claims to admit only the top 2% of the population to its ranks. The decision is based on applicants’ performance in various “intelligence tests” that ask questions like these:

  1. What letter should come next?

    M Y V S E H M S J R S N U S N E P?

  2. Details of items bought at a stationer are shown below.

    78 – Pencils

    152 – Paint Brushes

    51 – Files

    142 – Felt Tip Pens

    ? – Writing Pads

    How many writing pads should there be?

  3. In which direction should the missing arrow point?
     ∨   >   ∧   ∨   < 
     ∨   <   >   ∨   > 
     ∧   >   ?   >   ∨ 
     >   <   ∨   ∧   > 
     ∨   >   <   ∨   ∧ 

Questions of this sort test the ability to analyze the logical principles that govern a sequence of ideas.1 Philosophers call this logico-deductive reasoning. Thinking “logically” is the first element of the popular view of intelligence; the second is having a good memory for facts.

AS A MATTER OF FACT

Mastermind is one of the best-known quiz shows on British television. Four contestants take it in turn to sit under a spotlight in a darkened studio and face the quizmaster. There are two 2-minute rounds of questions: the first is on a specialist topic chosen by the contestant; the second on “general knowledge.” The Mastermind of the Year emerges at the end of each series from an all-winners final and is fêted as one of the cleverest people in Britain. A long-running radio program called Brain of Britain has a similar format. Contestants in the phenomenally successful Who Wants to be a Millionaire? can win a fortune by giving correct answers to just 12 factual questions. Quiz programs like these draw on the ability to memorize factual information, including names, dates, events and statistics. Philosophers call this propositional knowledge: knowledge that something is the case. Academic ability draws on these two capacities for logico-deductive reason and for propositional knowledge.

The term “academic” derives from the name of a grove near ancient Athens called Academeia. It was there, 400 years before the birth of Christ, that the Greek philosopher Plato established a community of scholars. Plato’s teachings drew on the philosophical methods that had been developed by his teacher, Socrates. Plato’s most famous student was Aristotle (who, in turn, was tutor to Alexander the Great). Aristotle further developed these ideas in his own work and teachings, from which there have evolved systems of thought, of mathematics and of science that have shaped the intellectual character of the Western world. But apart from presumably having high ones themselves, Plato, Socrates and Aristotle had never heard of IQ or Mensa. So what is the link between IQ tests, quiz shows, mass education and the groves of Academeia?

MEASURING YOUR MIND

Like the motor car, television, the micro-processor and the Coca-Cola bottle, IQ is one of the most compelling inventions of the modern world. It is an idea in four parts.

  1. Each of us is born with a fixed intellectual capacity or quotient: just as we may have brown eyes or red hair, we have a set amount of intelligence.

  2. How much intelligence we have can be calculated by a series of “pencil-and-paper” tests of the sort illustrated above. The results can be compared against a general scale and given as a number from 0 to 200. That number is your IQ. On this scale, average performance is between 80 and 100; above average is between 100 and 120; and anything above 130 gets you into Mensa’s Christmas party.

  3. IQ tests can be used to predict children’s performance at school and in later life. For this reason, various versions of IQ tests are widely used for school selection and for educational planning.

  4. IQ is an index of general intelligence: that is, the scores on these tests point to a person’s overall intellectual capacities. For this reason, some people think that it is enough to announce their IQ score for everyone to grasp how bright they are, or not.

Since the idea of IQ emerged about 100 years ago, it has had explosive consequences for social policy and especially for education. Where did the idea come from in the first place; how did it come to dominate the popular conception of intelligence; and is it a fair and accurate measure across all cultures?

Brave old world

The foundations of the modern intelligence test were laid in the mid-nineteenth century by Sir Francis Galton, a cousin of Charles Darwin. After reading The Origin of Species in 1859, he wondered if human life followed the principles of natural selection that Darwin had described in the rest of nature. Galton concluded that if heredity played a decisive role in human development, it should be possible to improve the human race through selective breeding procedures. (He first used the word eugenics, with the meaning “good” or “well” born, in 1883.2) With that end in mind, he turned to developing scientific ways of isolating and measuring “general intelligence” and of comparing it between individuals. The modern intelligence test builds on Galton’s work and more especially on that of Alfred Binet.

At the beginning of the twentieth century, Binet was working with children in elementary schools in Paris and wanted to identify those who might need special educational support. He began to develop short tests for children of different ages that could be easily administered. His aim was practical and his method “was pragmatic rather than scientific.”3 By 1905, he had produced his first scale of intelligence based on a test of 30 items designed for children aged from 3 to 12 years. The tester worked through the items with each child until the child could do no more. Performance was compared with the average for the age group to which the child belonged. If a child could pass the test expected of a 6-year-old, say, the child was said to have a mental age of 6. Binet used the difference between the mental age and the chronological age as an index of “retardation.”

In 1912, German psychologist William Stern proposed using the ratio of mental age to chronological age to yield the now familiar intelligence quotient:

IQ = (mental age ÷ chronological age) × 100
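
For example, a 5-year-old with a mental age of 6 would score (6 ÷ 5) × 100 = 120, while a child whose mental age matched their chronological age would score exactly 100.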

Within a few years, translations of Binet’s work were starting to appear in other parts of the world. The restricted uses for which it was originally designed were soon forgotten and it was applied in every sort of setting, especially in the United States. One hundred years later, IQ remains the principal basis of selection for different forms of education, for many different types of employment and for roles in the military.

IQ has been used to support and to attack theories of racial, ethnic and social difference. Early IQ tests in the UK and the USA suggested that poor people and their children have low IQs and that the rich and their offspring have high IQs. Researchers wondered if IQ somehow determined levels of affluence and of material success. An important variable, of course, is that poor people could not afford to educate themselves and rich people could; ignoring it was something of an oversight from a methodological point of view.

For a time, these findings provided a powerful rationale for political initiatives based on eugenics to “improve” the human stock by selective breeding and population control. In the early twentieth century, leading intellectuals including Winston Churchill and George Bernard Shaw supported the eugenics movement, arguing that the breeding of the poor should be carefully controlled. Some states in the USA legislated to sterilize people classified as “idiots” or of low intelligence. With different motives, the Third Reich embraced eugenics as a key element in the Final Solution.

A major controversy about IQ tests flared up in 1994 with the publication in America of The Bell Curve by Charles Murray and Richard Herrnstein.4 The Bell Curve argued that IQ tests point to significant racial differences in human intelligence. It argued that low IQ is linked to low moral behavior and that there is a connection with the cultures of some ethnic groups, especially Black and Hispanic communities. The Bell Curve was widely condemned as a racist tract and generated an inferno of debate, which is still smoldering.

From the outset, IQ has been a powerful and provocative idea, and it remains so, even though there is no general agreement on exactly what IQ tests measure, or on how, whatever it is they do measure, this relates to general intelligence. Nonetheless these ideas of academic ability and of IQ have come to be taken for granted as the natural order of things, rather than as the product of particular scientific enquiries and cultural perspectives. How has this happened? The answer lies in the triumph of science in the last 400 years and in its roots in the groves of Academeia.

THE TRIUMPH OF SCIENCE

Historians conventionally think of Western history in three main periods: ancient, medieval and modern. These periods are not separated by sharp boundaries or exact dates, but they are recognizable phases in the cultural evolution of humanity. They are marked by different ways of seeing the world and by the different worlds that were created as a result.

The philosopher Susanne Langer5 argues that the intellectual horizons of a society, or of an historical period, are not set simply by events or human desires. They are set by the basic ideas that people use to analyze and describe their lives. Theories develop in response to questions, and a question, as Langer notes, can only be answered in a certain number of ways. For this reason the most important characteristic of an intellectual age is the questions it asks – the problems it identifies. It is this, rather than the answers it provides, that reveals its underlying view of the world. In any intellectual age there will be some fundamental assumptions that advocates of all the different ways of thinking unconsciously take for granted. These deep-seated attitudes constitute our ideology, and they set the boundaries of theory by inclining us to this or that set of issues and explanations. If our explanations are theoretical, our questions are ideological.

“Copernicus, Galileo and Kepler did not solve an old problem: they asked a new question.”

The term “paradigm” was popularized in the 1970s by the American philosopher of science, Thomas Kuhn (1922–1996).6 A paradigm is an accepted framework of rules and assumptions that define established ways of doing things. In the history of science, a paradigm is not a single theory or scientific discovery, but the underlying approach to science itself, within which theories are framed and discoveries are verified.

Kuhn describes science as a puzzle-solving activity in which problems are tackled using procedures and rules that are agreed within the community of scientists. He was interested in periods in history when there was a shift either in the problems or in the rules of science or both. He saw a difference between periods of “normal science,” when there is general agreement among scientists about problems and rules, and periods of “extraordinary science,” when normal science begins to generate results that the accepted rules and assumptions cannot explain. If these anomalies accumulate, there can be a loss of confidence in the accepted methods and a professional crisis in science, which can unleash periods of enormous creativity and invention. Periods of “extraordinary science” create opportunities for new questions and theories about the nature and limits of science itself. These are times of scientific revolution.

A new paradigm may emerge when new ideas or methods – what Susanne Langer calls generative ideas – run with tumultuous force through existing ways of thinking and transform them. Truly generative ideas excite intellectual passions in many different fields because they open up whole new ways of seeing and thinking. As Susanne Langer said, “a new idea is a light that illuminates things that simply had no form for us before the light fell on them and gave them meaning. We turn the light here, there and everywhere and the limits of thought recede before it.”7

Paradigm shifts tend to run a characteristic course. They are triggered by new ideas that reconfigure basic ways of thinking. Initially, there is a period of intellectual uncertainty and excitement as the new ideas are applied, stretched and tested in different areas of inquiry. Eventually, the revolutionary ways of thinking begin to settle down and their potential becomes clearer and more established. They become part of the new way of thinking: the new paradigm. Eventually, the ideas become drained of their excitement, leaving a residue of established ideas and new certainties. They enter the culture as taken-for-granted ideas about the way things are and provide the framework for a new period of normal science.

The transition from one intellectual age to another can be traumatic and protracted. Some people never make the transition and remain resident in the old worldview: their ideological comfort zone. New ways of thinking do not simply replace the old at clear points in history. They may coexist with old ways of thinking for a long time, creating many tensions and unresolved problems along the way. Each major period of intellectual growth has been characterized by revolutionary ideas that have driven forward the sensibilities of the times.

In the ancient and medieval periods it was taken for granted that Ptolemy (c. AD 90–c. AD 168) was right: the sun orbited round the earth. There were two reasons for this belief. To begin with, that is exactly what it seemed to do: the sun came up in the morning, passed through the sky and went down again at night. It was obvious to everyone that the sun was moving, not the earth. People were not being flung off the planet on the way to work; there was not a network of ropes to cling to on the way to the shops. It was plain common sense that the earth was motionless. There were religious reasons too for this assumption.

In the medieval worldview, the earth was the center of creation and human beings were God’s last word: the jewel in the cosmic crown. Theologians assumed a perfect symmetry in the universe. The planets, it was thought, revolved around the earth in perfect circular orbits. Poets expressed this harmony in the rhythms of verse; mathematicians from the early Greeks developed elegant formulae to describe these motions; and astronomers based elaborate theories upon them. The problem was that there were worrying variations in these movements. The planets would not behave.

Astronomers made increasingly intricate calculations to account for these variations. As perplexed as everyone else, Nicolaus Copernicus (1473–1543) made a radical proposal. What, he asked, if the sun was not going round the earth? What if the earth was going round the sun? This startling idea solved, at a stroke, many of the old problems that had plagued astronomers. Heliocentrism had arrived. Later, Johannes Kepler (1571–1630) showed that the planets did not move in circles but in elliptical orbits, a phenomenon that Isaac Newton (1643–1727) would eventually explain by the effects of gravitational attraction.

The Copernican theory caused few ripples until Galileo Galilei (1564–1642) took an interest in Heliocentrism. His telescope enabled scientists to see the truth of Copernican theories and the ideas began to take hold. Unfortunately, they were heretical: an affront to God’s design and to humanity’s view of itself. Galileo was persecuted and put on trial twice for his views. Nonetheless, over time more people came to accept that these theories were correct. Copernicus, Galileo and Kepler did not solve an old problem: they asked new questions and in doing so they changed the paradigm within which the old questions had been framed. The old theories were shown to be wrong because the assumptions on which they had been based were mistaken. They were built on a false ideology. As this realization spread, it ushered in a new intellectual age: a new paradigm.

The Renaissance of the fourteenth and fifteenth centuries marked a shift away from the medieval worldview and from the ideology in which it had been conceived. The insights of Copernicus and Galileo proved to be the dawn of a new age. The transition from Ptolemy’s second-century view of the universe with the earth at its center, to the universe of Copernicus, repositioned not only the earth in space but also humanity’s place in history.

The shock waves did not stop with astronomy: they rolled through all areas of cultural life including philosophy, politics and religion. Though they both denied being atheists, the arguments of Copernicus and Galileo raised serious doubts about many aspects of religious teaching. Three hundred years later, Darwin’s theory of evolution was to issue a more profound challenge to religious belief: a theory that was framed in the paradigm of objective science that had been heralded by the Copernican revolution. The medieval worldview that had been held together by dogma and faith was eventually shaken apart by a new one, based on logic, reason and evidence.

BORN AGAIN

The period which we now think of as the Renaissance was so called because it marked a rebirth of interest in the methods and achievements of the ancient world in philosophy, literature and mathematics. In little more than 200 years, the Renaissance gave rise to some of humanity’s most luminous figures and most enduring works: lives and achievements that have shaped the world that we now live in. Between 1450 and 1650, Europe saw the birth of Leonardo da Vinci, Michelangelo, Raphael, Galileo, Copernicus, Shakespeare and Isaac Newton. They produced works in art and literature of unsurpassed beauty and depth, and created the foundations of modern science, technology and philosophy.

The modern idea of a Renaissance person is someone who is learned in a range of disciplines including the arts and sciences. The quintessential Renaissance figure is Leonardo da Vinci, who was gifted in painting, sculpture, mathematics and science. When Michelangelo was painting the Sistine Chapel, he was also working on scientific theorems and on designs for new technological devices. The Renaissance and the cultural movements that flowed from it were impelled not only by surpassing accomplishments in the arts and letters, but by a succession of startling innovations in technology. Four in particular were to prove decisive: the printing press, the magnetic compass, the telescope and the mechanical clock.

Spreading the word

Before the printing press, only a small, literate elite, largely confined to the Church, had access to books, ideas and learning. The power of the Church was rooted in its exclusive access to scriptures and through them to the word of God. This gave the clergy unrivalled control over the people’s minds. The printing and distribution of books unleashed a voracious appetite for literacy and disseminated ideas across national and cultural boundaries on a scale that was previously unimaginable. As the flow of ideas increased, the iron grip of the Church began to loosen.

The invention of the portable book by Aldus Manutius (1450–1515) of Venice facilitated the personal library and revolutionized the control of knowledge. The venerable institution of the medieval library began to be replaced by an independently published and commercial product. As Juan F. Rada puts it: “A new technological and intellectual transition started, which reinforced the conditions of the scientific revolution and accompanied the great period of the discoveries. The portable book had a subversive impact, created the conditions for the Reformation, for the use of vernacular language, for the diversification of publishing and allowed for a great individual expression of authors and readers.” The portable library created the instruments that were necessary for the development of complex bureaucracies and large organized states: “Publishing became the vehicle for the transmission of ideas and debate, for proselytism and for scholarly recognition. The seeds for the Enlightenment were sown and with them the belief in education and, in this century, the belief in universal education and literacy.”8

Getting our bearings

The Renaissance was an era of eye-opening discoveries. The fleets of the European powers journeyed across the oceans on speculative expeditions of exploration and colonization. These forays into the unknown were made possible by new navigational tools, including the magnetic compass (developed originally by the Chinese some 200 years earlier and only recently surpassed by GPS), which revolutionized orientation, especially at sea, by measuring points of direction with great precision.

As some explorers were mapping the earth, others took their lead from Galileo and were surveying the heavens with a new sense of scientific precision. The telescope made possible more accurate observations of the movements of the planets and of the place of the earth in the heavens. These innovations interacted with the evolution of new theories in science and mathematics. So too did a more complex invention, which may also have its roots in Chinese technology.

During the sixteenth and seventeenth centuries, people started to think differently about time. Until then the most reliable ways of keeping track of time were sundials and water clocks. Versions of these devices had been in use around the world since ancient times and they varied both in sophistication and in how they framed the passing of time. In 1656, Christiaan Huygens, a Dutch philosopher, mathematician and scientist, perfected his design for a mechanical clock, regulated by a pendulum. Huygens’ clock was complex, precise and reliable, and it helped to revolutionize humanity’s sense of time. The idea that the day is divided into 24 equal hours of 60 minutes each is an indissoluble part of how we see the world. Mechanical clocks released people from organizing their time by the natural rhythms of day and night: a change that had profound significance for patterns of work and industry. The clock also suggested new ways of thinking about the universe.

In 1687, Isaac Newton published his Principia in which he set out his monumental theories about the workings of nature and of our place in the cosmos. In doing so, he conceived of the universe as a great clock-like mechanism, an idea that had a deep impact on subsequent developments in science and philosophy. Implied in this image were ideas about cause and effect and about the importance of external as against internal stimuli: ideas that now shape everyday thinking and behavior. As Alvin Toffler notes, the invention of the clock came before Newton published his theories and had a profound effect on how he framed them, though Newton himself warned against using his theories to view the universe as akin to a great clock. He said, “Gravity explains the motions of the planets, but it cannot explain who set the planets in motion. God governs all things and knows all that is or can be done.”

The rise of the individual

In the medieval period, Church and State were locked in a close embrace. One force that began to erode the power of the Church was the spread of literacy. Another, which literacy fomented, was a growing unrest with the spiritual and political corruption of the Church. In the early sixteenth century, the German cleric Martin Luther sparked a revolt against Rome that spread rapidly throughout Europe and split Christianity in two. Luther argued that no third party, and least of all a corrupt and self-serving Church, should stand between individuals and their relationship with their creator. The Reformation emphasized the need for individuals to reach their own understanding of the scriptures and to deal directly with God. The emphasis on empowering the critical judgment and knowledge of the individual underpinned the growth of scientific method in the seventeenth and eighteenth centuries in Europe; the period now known as the Enlightenment.

Being reasonable

As the old certainties of the Church were shaken, the intellectuals of the Enlightenment began to ask fundamental questions about the nature of things. Specifically, what is knowledge and how do we know? They tried to take nothing for granted. The aim was to see the world as it is; stripped of superstition, myth and fantasy. Knowledge had to conform to the strict dictates of deductive logic or be supported by the evidence of observation. The French philosopher René Descartes (1596–1650) argued that nothing should be taken on trust. If a new edifice of knowledge was to be constructed, it must be built brick by brick with each element fully tested. He set out a logical program of analysis where nothing would be taken for granted, not even his own existence. His starting point was that the only thing he could know for certain was that he was actually thinking about the problems: cogito ergo sum. “I think, therefore I am.” I must be alive because I am thinking.

The rationalist method moves through logical sequences, building one idea on another in a mutually dependent framework. The empirical method similarly looks for patterns in events, suggesting movements from known causes to known effects. Rationalism and empiricism were the driving forces of the Enlightenment and they ran with irresistible force through science, philosophy and politics, bowling over traditional methods of thought and opening up vast new fields of adventure in science, technology and philosophy. They led, in due course, to the Industrial Revolution of the eighteenth and nineteenth centuries and to the dominance of science in our own times. Along the way a fissure opened up between two modes of understanding that had previously been almost indistinguishable: the arts and the sciences.

The shaping of the modern world

The achievements of the rationalist scientific worldview have been incalculable. They include unparalleled leaps in medicine and pharmaceuticals and in the length and quality of human life; the explosive growth in industrial technologies; sophisticated systems of communication and travel; and an unprecedented understanding of the physical universe. There is clearly much more to come as the catalogue of achievements in science and technology continues to accumulate. There has been a heavy price too, not least in the schism of the arts and sciences and the domination of the rationalist attitude, especially in the forms of education to which it has given rise.

The union of the arts and sciences, which seemed so natural in the Renaissance, gradually dissolved during the Enlightenment. In the late eighteenth and early nineteenth century, there was a powerful cultural reaction to cold logic and its mission to demystify the world. In music, art, dance, drama, poetry and prose, the disparate band of the Romantics reasserted the power and validity of human experience, of feelings, emotions and transcendence. Whereas the Enlightenment was represented by the great rationalist philosophers and scientists, including Hume, Locke and Descartes, Romanticism was carried forward in the powerful works of Beethoven, Schiller, Wordsworth, Coleridge, Byron, Goethe and many more.

In contrast to the rationalists, the Romantics celebrated the nature of human experience and existence. The divisions are alive and well in contemporary attitudes to the arts and sciences. Typically, the sciences are associated with fact and truth. The image of the scientist is a white-coated clinician moving through impersonal calculations to an objective understanding of the way the world works. In contrast, the arts are associated with feelings, imagination and self-expression. The artist is pictured as a free spirit giving vent to a turmoil of creative ideas. In education, the impact of these assumptions has been far reaching.

THE RISE OF EDUCATION

Before the Industrial Revolution, relatively few people had any formal education. In the Middle Ages in Europe, education was provided largely by the Church in what were known as grammar schools. Originally, a grammar school was literally one that taught grammar and especially Latin grammar.9 The King’s School, Canterbury, claims to be the oldest grammar school in England. It traces its origins to the coming of St Augustine in AD 597, though such institutions may have originated more than 1,000 years earlier. Grammar schools of various sorts can be traced back to the ancient Greeks.

Many of the grammar schools in Europe were founded by religious bodies. Some were attached to the larger or collegiate parish churches, others were maintained by monasteries. The primary purpose of these grammar schools was to educate boys for the Church, but medieval clerics followed careers in many fields. The Church was the gateway to all professions including law, the civil service, diplomacy, politics and medicine. In the ancient and medieval schools, the principal focus was on learning Greek and Latin literature and the aim was to be fluent enough in these to gain a foothold in professional life. Latin was the international language of the Church and fluency was a requirement. Given their specialist functions, grammar schools have always been selective through some form of entrance test.

By the close of the fifteenth century, there were 300 or more grammar schools in England: the Church was involved in most of them. As the fifteenth and sixteenth centuries unfolded, and skepticism of religious doctrine deepened, non-religious organizations began to establish their own schools for their own purposes. Many of these were related to trade.10 The growing influence of grammar schools was accompanied by gradual changes in what they actually taught.

The curriculum of the medieval grammar schools was specifically classical. Classical education was based on the seven liberal arts or sciences: grammar, the formal structures of language; rhetoric, composition and presentation of argument; dialectic, formal logic; arithmetic; geometry; music; astronomy.11 For centuries, the classics dominated the very idea of being educated and attempts at reform were resisted. As James Hemming notes, schools were in thrall to the “classical illusion,” the idea that “only those are educated who can read Homer in the original.”

During the Renaissance some pioneering head teachers tried to loosen the grip of the classics on the grammar school curriculum by introducing other subjects and more practical approaches to teaching them. Richard Mulcaster was the first headmaster of the Merchant Taylors’ School from 1561 to 1586. He made valiant efforts to have English taught at grammar schools, arguing that it was essential to regulate its grammar and spelling. He pressed the case for drama in schools and his boys performed before Elizabeth I on a number of occasions. The curriculum of the Merchant Taylors’ School came to include music and drama, dancing, drawing and sport of all sorts – wrestling, fencing, shooting, handball and football.

Francis Bacon (1561–1626) argued for the inclusion of other subjects in the school curriculum, including history and modern languages and especially science. The headmaster of Tonbridge School published a book in 1787 arguing for the curriculum to include history, geography, mathematics, French, artistic training and physical training. In Britain, attempts to broaden the curriculum beyond the classics made little progress until the mid-nineteenth century. Charles Darwin (1809–82) went to school at Shrewsbury. Reflecting on the experience, he said: “Nothing could have been worse for my mind than this school, as it was strictly classical; nothing else being taught except a little ancient geography and history. The school as a means of education was to me a complete blank. During my whole life I have been singularly incapable of mastering any language … The sole pleasure I ever received from such [classical] studies was from some of the odes from Horace which I admired greatly.”12

The pressure for change came from elsewhere. Three developments in particular were to reshape public opinion about schooling and to reform the grammar school curriculum. The first was the growing impact of science and technology, and the changing intellectual climate of which they were part. Second, the rampant growth of industrialism was changing the international economic landscape. The Exhibitions of 1851 and 1862 illustrated the rapid industrial progress of other European countries, a movement that had begun in Britain but was now threatening to outrun it. Third, new theories were developing about the nature of intelligence and learning. The new science of psychology was proposing new explanations about the nature of intelligence and how it should be cultivated. These theories challenged the benefits of a strictly classical education rooted in learning grammar and formal logic.

During the nineteenth and twentieth centuries the classics became almost extinct in secondary education. In their place, the school curriculum settled into the now-familiar hierarchy: languages, mathematics, science and technology at the top, the humanities and the arts at the bottom. As James Hemming put it, it was then that the classical illusion was replaced by the academic illusion.

THE IRRESISTIBLE RISE OF IQ

In 1870, the British Government passed an Act of Parliament to develop provision for primary schools. In 1902, it turned its attention to secondary education and began to establish county grammar schools. Forty-two years later, at the height of World War II, the government passed the 1944 Education Act, which provided free secondary education for all. The Act was designed to produce a workforce that met the needs of the post-war industrial economy.13 It established three types of school: grammar, secondary modern and technical. The grammar schools were to educate the “top 20%”: the prospective doctors, teachers, lawyers, accountants, civil servants and managers of post-war Britain. It was assumed that they would need a rigorous academic education and that is what the grammar schools were intended to give them. Those who went to the secondary modern schools were destined for blue collar and manual work. They were given a more basic education that was effectively a watered-down version of the grammar school curriculum. Many European countries made similar sorts of provision.14

It was during the massive expansion of education in the post-war years in the United Kingdom that IQ took such a firm hold on the whole system. The large numbers of young people who were streaming into compulsory education had to be channeled into the various types of schools that were available. IQ testing was a quick and convenient method of decision-making. Like the SATs (Scholastic Assessment Tests) in the United States these tests took no account of social background or previous educational opportunity.15 They were also very limited. High scores relied on standard verbal and logical operations. Success relied as much on knowing the techniques involved as on natural aptitude. Despite their obvious shortcomings, challenging the authority of these tests was never an easy matter. Then, as now, they had the backing of government and of the scientific establishment. They were assumed to be “above reproach or beyond social influence, conceived in the rarefied atmosphere of purely scientific inquiry by some process of immaculate conception.”16

“Success relied as much on knowing the techniques involved as on natural aptitude.”

For all their sway over education, IQ tests and SATs do not assess the whole range of a student’s intellectual abilities. They look for particular sorts of ability. The same is true of the forms of education that they support. For generations, students have spent most of their time writing essays, doing comprehension exercises, taking tests of factual information and learning mathematics: activities that involve propositional knowledge and forms of logico-deductive reasoning. The most common form of assessment is still the timed written examination, in which success mainly depends on a good short-term memory. Some students labor for months to pass these tests – others glide through them with relatively little effort. This pattern continues into higher education, and especially in universities. Some disciplines promote other sorts of ability, especially those with a practical element. Most schools have art lessons and some music – perhaps playing an instrument or being in a choir – and sport. They are at the margins of formal education, and seen as dispensable when the academic chips are down. The arts are an important test case.

THE NARROWING OF INTELLIGENCE

Some years ago, I was a member of a university promotions committee, a group of 20 professors from the arts, sciences and social studies. A university lecturer is expected to do teaching, administration and research. A case for promotion has to include evidence of achievement in all three. One of my roles as head of department was to make recommendations about members of my own faculty. I had recommended an English lecturer, who I thought was a sure case. Committee members had to leave the room when their own recommendations were being discussed. I thought this was a routine matter so I slipped out of the room and was back within a few minutes ready to rejoin the meeting. I was kept waiting outside for nearly half an hour: clearly there was an issue.

Eventually I was called back into the room and sat down expectantly. The vice-chancellor said, “We’ve had a few problems with this one. We’re going to hold him back for a year.” Members of the committee are not meant to question decisions that concern their own recommendations, but I was taken aback. I asked why, and was told there was a problem with his research. I wasn’t prepared for this, and asked what was wrong with it. I was told there was so little of it.

They were talking about an English lecturer who, in the period under review, had published three novels, two of which had won national literary awards; who had written two television series, both of which had been broadcast nationally and one of which had won a national award. He had published two papers in conventional research journals on nineteenth-century popular fiction. “But there is all of this,” I said, pointing to the novels and the plays. “We’re sure it’s very interesting,” said one of the committee, “but it’s his research we’re worried about,” pointing to the journal papers. “But this is his research too,” I said, pointing to the novels and plays. This led to a good bit of shuffling of papers.

By research output, most universities mean papers in academic journals or scholarly books. Apparently, the idea that novels and plays could count as research had not occurred to them. A good deal hangs on this. The issue was not whether these novels or these plays were any good, but whether novels and plays as such could count as research in the first place. The common-sense reaction was that they could not. But what is research? In universities, research is defined as a systematic enquiry for new knowledge. I asked the committee whether they thought that novels and plays, as original works of art, could be a source of new knowledge. If so, does the same apply to music, to art and to poetry? Were they saying that knowledge is only to be found in research journals and in academic papers? This question is important, for a number of reasons. It relates particularly to the status of the arts and sciences in universities and in education more generally.

There is an intriguing difference in research in arts and science departments in universities. If you work in a physics or chemistry department, you work in a laboratory and you “do” science. You do not spend your professional life analyzing the lives and times of physicists. If you are a mathematician, you don’t scrutinize the mood swings of Euclid or his relationships with his in-laws when he developed his theories. You do mathematics. This is not what goes on in many arts departments. Professors of English are not employed to produce literature: they are employed to write about it. They spend much of their time analyzing the lives and drives of writers and the work they produce. They may write poetry in their own time, but they are not normally thanked for doing it in university time even though it may raise the profile of the university. They are expected to produce analytical papers about poetry. Producing works of art doesn’t often count as appropriate intellectual work in an arts department, yet the equivalent in a science department, doing physics or chemistry, does. So why is it that in universities writing about novels is thought to be a higher intellectual calling than writing novels; or, rather, if writing novels is not thought to be intellectually valid, why is writing about them?

I mentioned earlier that a distinction is commonly made between academic and non-academic subjects. Science, mathematics and the social sciences are seen as academic subjects; and art, music and drama as non-academic. There are several problems in this approach. The first is the idea of subjects, which suggests that different areas of the curriculum are defined by their subject matter: that science is different from art because it deals with different content. Mathematics is not defined only by propositional knowledge. It is a combination of concepts, methods and processes and of propositional knowledge. The student is not only learning about mathematics but also how to do mathematics. The same is true of music, art, geography, physics, theater, dance and the rest. For these reasons alone, I much prefer the idea of disciplines to subjects.

The idea of disciplines also opens up the dynamics of interdisciplinary work. It is because of these dynamics that disciplines keep shifting and evolving. This is what the linguist James Britton17 had in mind when he said, “we classify at our peril. Experiments have shown that even the lightest touch of the classifier’s hand is likely to induce us to see members of a class as more alike than they actually are and items from different classes as less alike than they actually are. And when our business is to do more than merely look, these errors may develop during the course of our dealings into something quite substantial.”

Disciplines are constantly merging, reforming, cross-fertilizing each other and producing new offspring. When I first arrived at my last university, I went to a meeting of the professorial board, a committee of all professors in the university across all disciplines. There was a proposal from the professor of chemistry that the university should establish a new professorship in chemical biology. The board nodded sagely and was about to move on, when the professor of biological sciences argued that what the university really needed was a chair in biological chemistry. As Britton implies, our categories of knowledge should be provisional at best.

The second assumption is that some subjects are academic and some are not. This is not true. All issues and questions can be considered from an academic point of view and from other viewpoints too. Universities are devoted to propositional knowledge and to logico-deductive reasoning. Academics can look at anything through the frame of academic enquiry: plants, books, weather systems, particles, chemical reactions or poems. It is the mode of work that distinguishes academic work, not the topic. The assumed superiority of academic intelligence is obvious in the structure of qualifications. Traditionally, universities have rewarded academic achievement with degrees. Other institutions give diplomas or other sub-degree qualifications. If you wanted to do art, to paint, draw or make sculptures, you went to an art college and received a diploma for your efforts. If you wanted a degree in art, you had to go to university and study the history of art. You didn’t create art at university; you wrote about it. Similarly, if you wanted to play music and be a musician you went to a conservatoire and took a diploma; if you wanted a music degree you went to a university and wrote about music. These distinctions are beginning to break down. Arts colleges do now offer degrees and some university arts departments do offer practical courses. Even so, in some cultures and institutions there is still resistance to giving degrees for practical work in the arts.

CHANGING OUR MINDS

The modern worldview is still dominated by the ideology that came to replace medievalism: the ideology of rationalism, objectivity and propositional knowledge. These ideas frame our theories just as much as myth and superstition underpinned the painstaking calculations of the medieval astronomers. Just as their ideology created the framework for their questions, so does ours. We ask how we can measure intelligence. The assumption is that intelligence is quantifiable. We ask how we can raise academic standards but not whether they will provide what we need to survive in the future. We ask where we can find talented people but ignore the talents of people that surround us. We look but we do not see, because our traditional common-sense assessment of abilities distracts us from what is actually there. We ask how to promote creativity and innovation but stifle the processes and conditions that are most likely to bring them about. Caught in an old worldview, we continue to lean on the twin pillars of mass education, despite the evidence that the system is faltering for so many people within it.

The popular idea of intelligence has become dangerously narrow and other intellectual abilities are either ignored or underestimated. Despite all the attempts to promote parity between academic and vocational courses, the attitude persists that academic programs have much higher status.18 Yet intelligence is much richer than we have been led to believe by industrial/academic education. Appreciating the full range and potential of human intelligence is vital for understanding the real nature of creativity. To educate people for the future, we must see through the academic illusion to their real abilities, and to how these different elements of human capacity enhance rather than detract from each other. What are these capacities and what should be done to release them?

NOTES
