10. Urbanization and democracy

Cities as population sinks

The growth of urban civilization conceals a major paradox. Clearly, the formation of the city-states of the Middle East and ancient Greece is good evidence for increasing population. The urbanization of Western Europe was also accompanied by an overall increase in population. What is rarely appreciated is that, until relatively late in the industrial era, the death rate in the cities was higher than the birth rate. Consequently, the populations of medieval cities were replenished by migration from the countryside. During most major periods of city growth, the population increases were actually produced in rural areas; the cities themselves had a net negative impact on the population.
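The “population sink” logic can be written as a simple balance. With annual births B, deaths D, and net in-migration M (symbols introduced here purely for illustration), a city’s population change is

\[ \Delta P = B - D + M . \]

Where deaths exceeded births, as in medieval cities, the population could hold steady or grow only if migration made up the shortfall, which is exactly the rural influx described above.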

The death toll in the towns was largely the result of infectious disease. Infant mortality was often greater than 50%, and many of those who survived infancy failed to reach adulthood. In medieval Europe, overall life expectancy ranged from the mid-twenties to the upper-thirties. As industrialization proceeded, life expectancy rose and infant mortality fell. However, even in England, it wasn’t until the eighteenth century that most towns no longer needed rural immigrants to maintain their numbers. Rural areas in medieval times were healthy only in the sense that they generated a net increase in population. Infant mortality and life expectancy were not as abysmal as in the towns, but they still were horrific by modern standards. Many—perhaps most—of the peasants who survived infancy died of fungal infections of the lungs, caught from spores infecting their crops. In contrast, the towns were home to the bacterial and viral diseases with which our modern city-oriented culture is more familiar.

Viral diseases in the city

The growth of dense populations and their crowding into towns and cities allowed the emergence of highly contagious viral diseases that circulate only among humans, such as smallpox, measles, mumps, and rubella (or “German measles”). By industrial times, most of these had become childhood diseases, causing very few deaths. Even smallpox, though still a significant cause of death, had become much less dangerous. In 735–737, smallpox killed around 75% of those infected and wiped out perhaps a third of the population of Japan. By the late nineteenth century, the death rate from smallpox itself (Variola major) was around 20% of those infected, and a variant known as alastrim, or Variola minor, with a death rate of only 1% to 2%, was spreading. Infection with V. minor creates immunity to the more dangerous type of smallpox, and it seems likely that if nature had been left to take its course, V. minor would have displaced V. major within a century or two, relegating smallpox to a childhood disease comparable to measles. What actually happened was that human intervention led to the eradication of smallpox by vaccination. The last natural case of V. major occurred in Bangladesh in 1975; the last natural case of V. minor, and of smallpox altogether, occurred in Somalia in 1977.

Bacterial diseases in the city

Bacteria do not evolve as fast as viruses, so most new bacterial diseases of humans are still shared with other animals. We can divide the bacterial diseases of the growing urban populations into three main groups: those spread directly from person to person, those spread by dirty water, and those spread by insects. Bacterial diseases spread from person to person, usually by droplets coughed or sneezed into the air, were common in towns and cities until recent times. Examples include scarlet fever, diphtheria, whooping cough, and tuberculosis. These diseases spread in a manner similar to that of measles and smallpox, although less efficiently. Vaccination, together with antibiotics, has largely eliminated most of these diseases in advanced nations.

We saw in Chapter 4, “Waters, Sewers, and Empires,” how the collapse of the Indus Valley civilization was probably linked to cholera or a similar waterborne infection. Typhoid, dysentery, and a host of lesser diarrhea-causing diseases took a steady toll throughout the period following the Roman Empire, especially in crowded communities whose water supplies were at risk of contamination. As technology advanced, these diseases gradually faded away in industrialized nations. Today various diarrhea-causing strains of Salmonella and E. coli have re-emerged as public health hazards, less in the water supply than in batches of contaminated food, especially processed meat.

The Black Death

The outstanding example of a bacterial disease spread by insects is the Black Death, or bubonic plague. In the middle of the fourteenth century, it wiped out around half the population of Europe. Very likely it did as much damage in the more densely populated parts of Asia, the Middle East, and North Africa, although detailed records are available only for Europe. Lesser outbreaks of bubonic plague reverberated around Europe until the seventeenth century.

Bubonic plague is caused by the bacterium Yersinia pestis, which infects many animals, especially rodents. From these, it can be transmitted to humans (and their cats and dogs) by fleas. Why would fleas leave the shelter of the densely packed hairs on furry animals such as rats to venture onto the exposed surface of relatively hairless creatures such as humans? Not through choice. Fleas come in distinct varieties and tend to stay with the animal they are adapted to. But if the animal dies, the flea can no longer obtain its required diet of fresh blood and must find a new host. So when rats or mice die of plague, their fleas leave and look for new animals to infest. Rat fleas cannot actually survive for long on humans—our blood doesn’t supply the correct balance of nutrients. But one bite is enough to transmit plague. That the flea will eventually die due to improper nutrition is small consolation.

In nature, the plague bacterium infects wild rodents such as the marmots and susliks of central Asia, or their relatives, the ground squirrels and chipmunks of North America. It causes only mild symptoms, often no worse than a bad cold would be to humans. When displaced from their normal environment, the wild rodents can transfer bubonic plague to the rats and mice who live in close contact with humans. Unlike their wild relatives, domestic rats and mice fall seriously ill and are killed by plague. Their resident fleas then look for new animals to live on. This is what sets epidemics of plague in motion. The great Black Death epidemic of the Middle Ages was probably the result of climatic fluctuations in the northern Asian steppes. A few good years—for rodents—followed by a couple of bad years resulted in a large rodent population with no food. So, several rodent species extended their ranges southward. This brought them into contact with other, more southern, rodents that, in turn, made contact with the human societies of Asia. The fleas and Yersinia pestis bacteria were passed along, too.

Climatic changes: the “Little Ice Age”

Before plague struck Europe in 1347, things had been going downhill for the better part of a century. Although fourteenth-century Europe had 10% or less of today’s population, it was overpopulated in the sense that the cultivated area and level of technology in use produced barely enough food. From 750 to 1200, things went reasonably well: Crop yields increased and the population grew. Between 1200 and 1350, Europe got colder and wetter. Pastures high in the Swiss Alps were buried under advancing glaciers and did not re-emerge until centuries later. The Thames River in England froze over a dozen times in the 1400s. During this “Little Ice Age,” the weather in Europe was probably the worst since the Great Ice Age of prehistoric fame.

From about 1250 onward, Europe began to spiral downward into poverty. Crop yields decreased. An expanding population forced greater reliance on a single staple crop: wheat in Europe, just as rice was in Asia. Malnutrition spread as diets became less varied. Crop rotation was often abandoned, and fallow land was planted in cereals in an effort to feed more people with a minimal diet. This exhausted the land, and yields dropped further. From about 1290 until the Black Death arrived in 1347, crop failures of increasing severity caused frequent local famines over much of Western Europe. Starvation, accompanied by intestinal infections such as typhoid and dysentery, reduced Europe’s population by 10% to 20% during this period. And then from the steppes of Asia came the solution to the European population problem: the Black Death.

The Black Death frees labor in Europe

Human populations are remarkably resilient, and if the epidemic of 1347 had been the only outbreak of plague, human numbers would have soon recovered, even from a 50% death toll. The real horror of the Black Death was that epidemics recurred constantly for the next 300 years. The first century was by far the worst, with every generation hit by plague. Depopulation continued for more than a century. Gradually, the epidemics faded, and population recovery began a century or so before the Black Death disappeared. In the short term, Europe was totally devastated. However, we must distinguish between short-term catastrophe and beneficial long-term effects. Much has been written on the effects of the Black Death, so we need not go over the evidence in obsessive detail. Several major long-term benefits are generally recognized.

The population collapse of the fourteenth century halted Europe’s spiraling descent into famine and poverty. Simply put, fewer people meant more farmland and, therefore, more food per person. It is true that, early on, the disruption caused by the Black Death resulted in local famines. But these were short-term effects of dislocation. After society adjusted, the benefits of a decreased population became evident. The resulting scarcity of labor meant that laborers, especially skilled craftsmen, became more valuable. Although governments attempted to control wages and prices on behalf of landowners, they failed. Peasants who were ill-treated, however legally, simply moved. Labor was in such demand that employers and landlords who benefited from new arrivals could be relied on to turn a blind eye to regulations prohibiting the movement of labor. Mobility of labor and higher wages led not only to a higher standard of living for most ordinary folk, but also to a more market-oriented economy overall.

The same labor shortage also led to greater interest in ways to increase output with less human labor. More mechanization, better technology, and the emergence of the experimental approach, which underlies modern Western science, all trace their origins to this era. This is not to say that no technical advances occurred before the Black Death. Some did, but surplus manpower made labor-saving devices of little importance. Depopulation merely shifted the balance in favor of technology.

Death rates and freedom in Europe

The Tatars brought plague from the Asian steppes to the Crimea. From there, plague was carried around the Mediterranean to Europe, the Middle East, and North Africa. Plague landed in Europe in 1347. Death rates throughout Europe ranged from a mere 25% in luckier areas to 60% or more in the worst-hit regions.

One vivid quotation from Florence, one of the hardest-hit of the Italian cities, shows the horror:

“At every church, or at most of them, they dug deep trenches, down to the waterline, wide and deep, depending on how large the parish was. And those who were responsible for the dead carried them on their backs in the night in which they died and threw them into the ditch, or else they paid a high price to those who would do it for them. The next morning, if there were many [bodies] in the trench, they covered them over with dirt. And then more bodies were put on top of them, with a little more dirt over those; they put layer on layer just like one puts layers of cheese in a lasagna.”—Stefani, Marchione di Coppo (written late 1370s/early 1380s)

According to Boccaccio, more than 100,000 died in Florence in 1347–1348. Modern scholars usually accuse Boccaccio of exaggerating, because Florence is estimated to have had a population of about 80,000 at this time. Modern estimates of the death rate in Florence range from 45% to 75%. Although writers of earlier eras often exaggerated, modern commentators suffer from the opposite bias. In our own vaccinated and disinfected era, it is difficult to imagine that so many people could possibly succumb so fast to infectious disease.
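A quick back-of-the-envelope check shows why Boccaccio’s figure draws suspicion. Applying the modern death-rate estimates to the estimated population of about 80,000 gives

\[ 0.45 \times 80{,}000 = 36{,}000 \quad\text{to}\quad 0.75 \times 80{,}000 = 60{,}000 \]

deaths, both well short of 100,000, a figure that exceeds the estimated population of the entire city.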

A key unanswered question is the accuracy of historical estimates of population. In the most recent American census, a substantial number of inner-city slum-dwellers went unrecorded. By what percentage the U.S. population was underestimated is still debated. Perhaps more pertinent is the situation in the overpopulated cities of the Third World. Does Manila or Mexico City have an accurate roll of the slum-dwellers who live in the shantytowns around their perimeters? If Ebolavirus swept Calcutta next year, the body count might well be greater than the official population.

Although mortality was usually higher in cities, the plague also tended to shift from the bubonic form to the deadlier pneumonic form in colder, damper regions. Thus, Scandinavia was very hard hit, despite its relatively low population density. Eastern Europe was last to be attacked, in 1350–1351, and, for reasons unknown, had the lowest death rates. It is interesting to note that the regions hit the hardest by plague were those where freedom and Western technology eventually developed the fastest. Remember that, up to this point in history, the peak of European culture had been centered around the declining Byzantine Empire in the East. In contrast, Western Europe was relatively backward. After the Black Death, this reversed.

Eastern Europe, then, was luckier in the short term but not so lucky in the long term. Its death toll from the Black Death was about half that of Western Europe. Consequently, the disruption and manpower shortages were less overwhelming. Aristocratic landlords and the Church were able to maintain control of most of Eastern Europe, with the result that it remained impoverished and relatively backward until the twentieth century. The long-term benefits of the fourteenth-century population collapse are perhaps most evident when we compare the prosperous, technologically advanced societies of Western Europe with the backward, poverty-stricken countries of Eastern Europe. Technology, prosperity, and religious freedom are all inextricably intertwined in the formation of modern industrial democracy.

The Black Death and religion

The effect on religion was paradoxical. It was inconceivable to the medieval mind that the death of so many people in just a few years could be anything but a sign from heaven. If anything, belief in God was strengthened, yet clearly something was wrong. The high mortality among ordinary parish priests indicates that most remained loyal to their flocks during the Black Death. However, the Church as a whole lost authority and respect during this period. Before the Black Death, some 25% of all bequests in wills had been to the Church. After the plague, instead of donating money to the Church, pious merchants and nobles founded private charities, thus removing large sums from the control of the Church. In addition, much of the Church’s wealth was in the form of land. Higher wages and lower food prices made land owning far less profitable than it had been in the days before the population collapse. Thus, loss of respect was followed by loss of income, and the Church’s influence dwindled.

The ever-increasing tendency to cut out the religious establishment and deal with God on a direct personal basis led to demands for Bibles in local languages and to revolts against the papal religious monopoly. To compound matters, the papacy was in disarray during this period. Indeed, from 1378 to 1415, there were two competing Popes, one in Italy and the other in France. This did little to improve the image of the Church. In England, John Wycliffe (1324–1384) and the Lollards rejected the authority of the Church and instead emphasized the Bible. Wycliffe, a professor at Oxford University, translated the Bible into English and also preached against the Catholic establishment, especially the Dominican and Franciscan monastic orders. According to Wycliffe, “Friars draw children from Christ’s religion into their private Order by hypocrisy, lies, and stealing.” Canon Knyghton of Leicester replied by claiming that Wycliffe was casting the Gospel pearl under the feet of swine by making the Scriptures available in English “to the laity and to women who could read.” The eventual outcome of the ever more widespread religious dissent was the Reformation. Protestantism spread throughout northwestern Europe, and this led eventually to religious freedom.

The White Plague: tuberculosis

The last epidemic of bubonic plague in northwestern Europe was the Great Plague of London in 1665. The last outbreak in the western Mediterranean region was in Marseilles and the surrounding area of France in 1720–1721. These late outbreaks were not only restricted in area, but also had lower mortality rates than outbreaks earlier in the pandemic. The Black Death was over, and the burden of controlling Europe’s growing population fell briefly to smallpox and then to a resurgent bacterial disease, tuberculosis. Victims of bubonic plague developed purple-black patches on the skin. In contrast, the victims of tuberculosis were deathly pale and wasted away slowly. So tuberculosis was sometimes called the White Plague, in contrast with the Black Death. More often it was called consumption because of the slow wasting away.

Malnutrition, especially a shortage of protein, lowers resistance to tuberculosis. This was a major factor in the rapid spread of tuberculosis among working-class children during the eighteenth and nineteenth centuries. The healthy body attacks tuberculosis bacteria invading the lungs in two ways: it generates toxic nitric oxide in the lung tissue, and it walls off any surviving bacteria behind layers of immune cells. Both responses are weakened by a low-protein diet. The history of tuberculosis in Africa illustrates this. In the 1930s, the Kikuyu tribe of East Africa suffered major losses from tuberculosis, whereas the neighboring Maasai suffered much less. The Kikuyu were largely vegetarian farmers, but the Maasai cattle herders lived on meat, milk, and blood.

Tuberculosis was responsible for about 20% of all deaths in England in 1650. Its share of deaths declined during the next century, probably due to the prevalence of smallpox in this period. Tuberculosis was back as the leading killer by the early 1800s. From the mid-1800s, the death rate from tuberculosis declined steadily until the 1950s. At the start of the twentieth century, essentially all city dwellers in Europe and America tested positive for tuberculin, indicating that they had all been infected. Yet few had active tuberculosis, indicating that most of the population of Europe was by then resistant. In 1930, a tragic mix-up in Lübeck, Germany, resulted in 249 babies receiving virulent tuberculosis bacteria instead of the vaccine strain. Only 76 died, indicating roughly 70% resistance, even among infants whose immune systems were not yet fully operational. From the late 1940s onward, antibiotics all but eliminated tuberculosis from the industrial nations.
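The resistance figure follows directly from the numbers quoted:

\[ 1 - \frac{76}{249} \approx 1 - 0.31 = 0.69 \approx 70\% . \]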

The rise of modern hygiene

As the nineteenth century progressed, technology and industrialization ushered in modern hygiene. Clean water, flush toilets, sewers, and soap united with better nutrition and improved housing to vastly reduce the incidence of infectious disease. Mass production of cheap cotton underwear that was easy to clean helped things along. The knowledge that germs caused disease led to changes in acceptable human behavior. Samuel Pepys’s diary, written in the 1660s, tells us this:

“I was sitting behind in a dark place, a lady spit backward on me by mistake, not seeing me, but after seeing her to be a very pretty lady, I was not troubled at it at all.”

Today spitting is no longer polite—even by attractive women, even on the floor.

During the twentieth century, vaccinations and antibiotics finished the job civil engineering had begun. Infant mortality shrank to a vanishing point. Today the inhabitants of industrial nations expect to die of heart failure or cancer only after surviving the threescore years and ten of traditional wishful thinking. Smoking has joined spitting on the list of proscribed behaviors. Yet overeating, a far bigger threat to health, has received remarkably little bad press in comparison.

Although it is certainly better, on average, to be scrubbed, shampooed, disinfected, and vaccinated, there is a downside to the overexuberant use of hygiene. Perhaps the first example to be noticed was the curious case of polio. Starting in Sweden in the 1880s, poliomyelitis increased in frequency in countries with rising standards of hygiene. It remained virtually absent in countries with preindustrial standards of sanitation. Most cases of infection with poliovirus show no noticeable symptoms. A minority suffer mild fever and diarrhea. In a tiny proportion, the virus penetrates beyond the intestines and attacks the nervous system, resulting in paralysis of the lower limbs. These unlucky few attracted attention to the disease.

Polio is, at root, an intestinal disease. Poliovirus is a member of a large family of closely related viruses that are all passed on by contamination of water with human waste and that infect the lining of the intestines. These viruses cause mild intestinal upsets, usually so mild there are no visible symptoms. In unsanitary societies, all children are infected with members of this virus group early in infancy and so become immune. Because these viruses are closely related, immunity to one gives partial or total immunity to others. Consequently, children raised in unhygienic conditions are almost always at least partially immune to polio before they ever encounter poliovirus itself. As hygiene improves, infection by viruses of this family decreases sharply. Children who do then contract polio are unlikely to have been preimmunized by one of its less virulent relatives. The result is a more aggressive infection, occasionally ending in the tragedy of paralysis. Fortunately, artificial immunization has now largely eliminated polio from advanced nations, and major progress has been made toward worldwide eradication.

Despite such relatively minor setbacks, by the mid–twentieth century, the combined effects of antibiotics and immunization, superimposed on earlier advances in civil engineering and hygiene, had all but eliminated infectious disease from the advanced nations. Although there have been some reversals since, this is undeniably one of the greater triumphs of scientific man.

The collapse of the European empires

Today’s industrial nations got modern hygiene and health care after industrialization. Consequently, infectious disease kept their populations in check until the industrial revolution was underway. In contrast, most Third World nations received the basics of modern hygiene and health care from the advanced nations before industrialization. Hence, their population growth was checked much less by infectious disease.

The resulting massive population growth undermined the prosperity of the European colonial empires. The British Empire thus had the opposite problem from the Roman Empire: the Romans ran out of manpower, whereas the British were overwhelmed by surplus population. Today’s major divide between the advanced nations and the Third World is, to some extent, a legacy of the effects of overcrowding and infection.

Resistant people?

From the 1300s until the 1900s, the European population was brutally culled by bubonic plague, smallpox, and tuberculosis, to name just the three main culprits. As noted earlier for TB, many of the survivors apparently had genetic alterations that made them resistant. But exactly what has been selected?

We know of some genetic alterations that confer resistance to a specific disease or group of related infections. As already discussed in Chapter 4, alterations in the cystic fibrosis gene protect against infections that cause dehydration due to diarrhea. A variety of mutations are known to protect against malaria (see Chapter 2, “Where Did Our Diseases Come From?”), although these would not affect urban dwellers in the temperate zone. The CCR5Δ32 mutation is found in about 10% of Europeans and confers resistance to infection by HIV, the virus that causes AIDS. It has been suggested that this mutation was selected by plague or smallpox. However, recent data show that it was present at the same frequency in Bronze Age Europeans some 3,000 years ago, well before plague or smallpox were prevalent. Thus, the selection force behind the CCR5Δ32 mutation is still a mystery.
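As an illustrative calculation only, assuming that the 10% figure is the allele frequency and that the mutation is distributed randomly through the population (Hardy-Weinberg equilibrium), the fraction of Europeans carrying two copies of Δ32, the genotype associated with the strongest resistance, would be

\[ p^2 = (0.10)^2 = 0.01 , \]

or about 1%, with heterozygous carriers at \( 2p(1-p) = 0.18 \), or 18%.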

Two major possibilities that might protect against multiple infections are altered behavior and changes to the immune system. Being smart enough—or perhaps just cowardly enough—to run away when plague threatens increases the likelihood of survival considerably. Even during the Black Death, some isolated villages escaped almost unscathed. Intelligence correlates with prosperity, and in medieval times those who were richer tended to sleep with fewer companions—both human and rodent. Records confirm that the urban poor, who often slept a dozen to a room on a bed of straw, suffered more casualties than the prosperous.

Modern genetic data suggests that at least a thousand or so of our genes are expressed at especially high levels in the brain. Deciphering which genes affect which aspect of brain function or behavior and whether they show signs of recent evolutionary selection is a daunting task. Currently, we can do little more than speculate. Personally, however, I find it difficult not to believe that massive epidemics have selected at least to some extent for increased intelligence, as well as behavior that reduces the likelihood of infection. Elevated caution (or cowardice) and increased preference for a solitary existence should also have been favored.

How clean is too clean?

Another possibility for broad resistance to several infections is a more aggressive immune system. Our immune system must be carefully balanced. If the immune system is too cautious in reacting, infections may win; if it is too trigger-happy, we can damage our own tissues. Clearly, the optimal setting depends on the likelihood of encountering dangerous infectious diseases. Thus, urban plagues might have favored an overenthusiastic immune system. Although this was beneficial at the time, its legacy could be an increased level of autoimmune problems. These range from allergies and asthma to arthritis and multiple sclerosis. Autoimmune problems are most prevalent in industrial nations where overcrowding was worst; they are much rarer in Third World populations.

Another viewpoint on the higher frequency of autoimmune problems in advanced nations is the level of hygiene. Eating dirt is not generally recommended in manuals on the care of babies and small children, but mounting evidence suggests it might not be such a bad idea. When children are fed sterilized food, given too many antibiotics, and vaccinated instead of gaining immunity from natural infections, the immune system develops in a lopsided manner. This correlates with the increasing frequency of immune system disorders in industrialized nations. Children who develop immunity by natural exposure rarely suffer from these afflictions. Moreover, the culprits favored by tradition—pollution, dust mites, toxic chemicals, and colds—have now largely been exonerated. Although it would be silly to expose infants to sources of dangerous infection, some sort of exposure to “clean” dirt might not be a bad thing.

Where are we now?

For better or worse, we are living under artificial conditions far different from those of our hunter-gatherer ancestors. We have also been genetically modified in ways that we are just beginning to glimpse. So, what of the future? We consider our future conflict with infection in the final chapter.
