3. Transmission, overcrowding, and virulence

Virulence and the spread of disease

How a disease spreads greatly affects its impact on human society. Diseases that spread efficiently will clearly infect more people. Less obvious, but no less important, is that the mode of transmission determines how virulent a disease may become.

We must tackle two widespread misconceptions. Both generalizations are half true, and scientific investigations have only recently uncovered their flaws. The first is the idea that because diseases adapt to their hosts, they will inevitably become milder if we just wait long enough. For example, syphilis was extremely virulent when first introduced into Europe but nowadays is much milder. Similarly, childhood diseases such as measles and mumps rarely do much real damage, although they were once much nastier. But this trend is not inevitable. Recent findings indicate that, under some circumstances, diseases change little in their virulence or even get worse. Moreover, some, like bubonic plague, appear to oscillate in virulence.

The second misconception concerns the prevalence of infectious disease throughout history. We tend to think that the farther back we go in history, the dirtier and less hygienic people were, and so the higher the level of infectious disease. This is broadly true if we restrict ourselves to the last 1,000 years of Western civilization. However, if we consider the broader sweep of human history, the prevalence of infectious disease has fluctuated wildly. For example, only in the nineteenth century did Western civilization regain the level of hygiene that existed during the prime of the Roman Empire. And in very early times, before urbanization began, when humans were still few and far between, infectious disease was probably much less frequent.

Infectious and noninfectious disease

To understand how disease has affected our history, we must understand how infections are spread. Until recently, infectious diseases were lumped together with a variety of other ailments. Historical societies were often confused about their causes and, consequently, about what precautions to take to avoid them.

We can classify diseases according to how they are acquired. Wounds, bruises, and broken limbs are the result of accidents or deliberate violence. Ancient societies were well aware of the effects of a sword-thrust or a fall off a cliff. As with violence, poisoning can be deliberate or accidental. Early cultures certainly understood the idea of deliberate poisoning, although the victims were often misdiagnosed. For example, the symptoms of arsenic poisoning include vomiting and diarrhea, which superficially resemble the effects of certain intestinal infections. Accidental poisoning, especially on a large scale, was sometimes confused with infectious disease and other times blamed on witchcraft.

Hereditary diseases are the result of genetic defects passed on by one’s parents. People born crippled were frequently viewed as victims of divine displeasure (usually directed against their parents), and those who exhibited strange behavior, such as epileptics, were often seen as possessed by spirits (evil or good, depending on their society’s outlook). Nonetheless, genetic defects are rare compared to other causes of disease and probably caused little confusion, even though they were not understood until recently.

Cancers tend to occur later in life. They happen because of a build-up of genetic damage over the years in nonreproductive cells. These genetic defects are thus not passed on to the children, but instead are confined to the multiplying cancer cells within a single person. Cancer cells grow out of control and destroy the body to which they belong. Toxic chemicals, both natural and artificial, and ultraviolet radiation from the sun are responsible for many cancers. Other cancers happen because the body’s own genetic machinery makes occasional mistakes. Cancers draw notice only when people live long enough for genetic damage to accumulate. Until recently, cancers were responsible for an insignificant fraction of human deaths. Death by heart disease, stroke, or old age is largely a modern luxury. Historical populations rarely lived long enough or ate well enough for their arteries to clog with fatty deposits.

Infectious diseases from microorganisms have caused most deaths by far throughout recorded human history. In this respect, our own age is peculiar. Thanks to modern technology, we mostly live long enough to worry about heart disease and cancer. But for most societies throughout history, most people met their end from infections caused by microorganisms of some kind. This is still true for some Third World countries. Despite this, scientists have understood the nature of infectious disease only since the late 1800s.

Infectious disease is caused by invisible microorganisms

The cause of infectious disease—and even whether diseases were actually contagious and could be passed from person to person—has been hotly disputed over the ages. Only during the late 19th century could science begin to investigate the microorganisms that cause disease. In early times, infectious disease was often seen as punishment from the gods. Later, disease was blamed on such things as night air, marsh air, or other vapors. This attitude is well illustrated by the phrase “You’ll catch your death of cold.” As we now know, “colds” are caused by viruses, not exposure to low temperature. Nonetheless, poor nutrition, poor housing, and exposure to extremes of heat or cold weaken potential victims. Dirt may not literally breed disease as once thought, but lack of hygiene allows germs to survive and spread.

An intriguing aspect of historical beliefs about infectious disease is that the common folk were proved to be right in the long run, and the educated were mostly wrong. The priesthood pushed the idea that disease came from the gods. People were told to stop sinning and to pray for forgiveness, not waste time attempting to understand disease. Rationalist intellectuals put forward a range of theories based on factors such as diet, personality, climate, dirt, decay, and offensive odors of various sorts. Until the last century or two, most intellectuals rejected the idea that disease was contagious.

However, the behavior of the population-at-large suggests that ordinary people were aware that disease was often contagious. Avoiding contact with those infected by typhoid, plague, smallpox, and malaria was a sensible precaution. During the 1600s, the wealthier inhabitants of London kept an eye on the weekly “Bills of Mortality,” much as we tune in to the weather report nowadays. These “bills” were lists of recent deaths and their causes. When the number of cases of something especially nasty, like plague or smallpox, rose higher than normal, the wealthy fled London for their country estates and left the poor to take their chances.

Why did the scientific establishment take so long to realize that diseases are transmitted from one victim to another? I believe two factors are at work. First, many diseases are not directly contagious. Thus, although malaria is spread from person to person, it is carried by mosquitoes, and a person cannot catch it through direct contact with a human sufferer. Bubonic plague is even more confusing. It can be spread from person to person, but it is usually transmitted by fleas. From a practical viewpoint, avoiding those infected is still a good strategy—you would be less likely to be bitten by the same flea or mosquito. From an intellectual viewpoint, the observed lack of direct transmission favored the various environmental theories. Second, the technology to actually see microorganisms is of relatively recent origin. Speculation about tiny invisible germs goes back to the Roman author Varro (116–26 B.C.), but demonstrating their existence requires more than mere words: It requires a microscope.

How infectious disease spreads

Different contagious diseases spread in different ways. We can subdivide these into three major mechanisms. Some diseases spread by direct person-to-person contact. Others spread indirectly via inanimate objects. Yet a third strategy is for insects or other intermediaries to carry the infectious agent. The way an infection spreads greatly affects whether it becomes milder over the ages, stays much the same, or gets more virulent.

Certain diseases require prolonged contact of an intimate nature to move from one person to another. These diseases are relatively hard to catch and can often be avoided by changing personal behavior. The sexually transmitted diseases (STDs) such as syphilis, AIDS, and gonorrhea illustrate this scenario. Strictly speaking, the transfer of body fluids is involved here. This is important from a practical viewpoint, because such infections can also be spread by improperly sterilized hypodermic needles. This occurs both among the intravenous drug users of the industrial world and in the clinics of Third World nations that lack money for disposable syringes.

Other diseases are spread by direct personal contact, but with less intimacy than for STDs. Many venereal diseases probably evolved from ancestors who infested the skin and body surface in a more general way. For example, the chlamydia that infect the genitalia are closely related to those causing the eye infection trachoma. The specialized sexual versions likely arose in historical times only as human populations became denser (see Chapter 7, “Venereal Disease and Sexual Behavior”).

Some germs are transmitted by bodily contact or via nonliving objects such as doorknobs, paper money, clothes, and bed linen. Highly contagious virus diseases such as colds, influenza, measles, and smallpox are typical of this group, although most of these can also be transferred through the air. Many infections are transmitted from person to person through the air by coughing or sneezing. This is known as droplet transmission, because the germs are carried in microscopic droplets of saliva, phlegm, or mucus. Many of these germs fail to survive if they dry out completely. Tuberculosis, influenza, and colds are familiar examples. As the nursery rhyme says:

I sneezed a sneeze into the air,
It came to ground I know not where.
But hard and cold were the looks of those,
In whose vicinity I snoze.

Infectious agents can also be taken in with food or drink. Poor hygiene may result in food or drinking water being contaminated with human or animal waste. Typically, such infections affect the gastrointestinal tract and include the many types of protozoa, bacteria, and viruses that number diarrhea among their symptoms. The purpose of diarrhea, from the germ’s viewpoint, is to provide an exit mechanism from the body and to recontaminate the water supply. Examples of waterborne diseases include Cryptosporidium (a protozoan), cholera (a bacterium), and polio (a virus). Infections caught from food are often referred to as “food poisoning,” despite resulting from bacteria or viruses instead of poisonous chemicals.

Diseases are often carried by insects such as mosquitoes and flies or by animals such as rats and mice. These are referred to as vectors. Sometimes multiple vectors are involved, such as in the spread of the Black Death by fleas carried by rats or typhus fever by ticks carried by rodents. Controlling vectors usually limits the spread of a disease far more effectively than treating infected humans. Insects and their relatives, the ticks and mites, are the most common vectors. However, other animals may act as vectors, as in the spread of rabies by bats and squirrels, or of West Nile virus by migrating birds. Plague and typhus normally rely on fleas and ticks to distribute them, although, under some circumstances, they can spread from person to person. Other diseases are obliged to spend part of their life cycles in a second host. Thus, malaria must pass from human to mosquito and back again to complete its developmental cycle.

Many diseases become milder with time

Let’s consider the spread of a virulent virus like Ebola from the viewpoint of the virus. After infection, the victim will most likely die in a few days. Before the first victim dies, the virus must find another victim to infect. Clearly, the longer the first victim moves around, the greater the chances are of the virus making contact with someone else. If the virus incapacitates the first victim too quickly, it will undermine its own transmission. Consider, too, the spread of the virus from village to village. As long as the virus stays in the same village, where plenty of potential victims live close together, it can get away with killing fast. But what happens when the village has been wiped out? The virus must now find another population center. This requires an infected person who is still fit enough to travel. Over the long term, movement between population centers may matter more than how a disease spreads locally within a group of people.

Now consider two slightly different Ebolaviruses. One kills in a day or two. The second takes a whole week. Virus 1 may wipe out a whole village, but it will find it very difficult to transfer itself to the next village. Even if a dying victim staggers within sight of the next village, its people will probably not allow him in. During plague epidemics in medieval Europe, many villages and small towns stationed archers to intercept travelers. Anyone showing symptoms of plague was warned away and shot if they ventured too close. While lacking in sensitivity, such quarantine measures were effective, and many small villages escaped entirely from epidemics that decimated nearby towns.

By comparison, a less virulent Ebolavirus will spread much more effectively. Infected refugees fleeing an infected village may reach another center of population before symptoms appear. Thus, if we have a mixture of viruses, the milder forms will spread more effectively and, over time, will predominate. Many diseases appear to have done just this and have evolved to become milder. Examples include gonorrhea and syphilis (caused by bacteria), and measles, mumps, and influenza (caused by viruses). What unites these diseases is that all are transmitted directly from person to person.

Ebolavirus infects humans now and then after emerging from some animal host, probably bats. It wipes out a few people in close contact, and then the mini epidemic burns itself out. Much the same is true of Lassa fever and other highly virulent diseases that burst out of the jungle every so often. Although they give the press an opportunity for apocalyptic hand-wringing, they are unlikely to spread far without getting milder.

Crowding and virulence

Earlier thinking held that, given time, all diseases would adapt to become no worse than measles and mumps. Virulent diseases were newcomers, not yet adapted to a state of biological détente with their human hosts. This viewpoint sees man and his infections in a perpetual cold war, with casualties due only to occasional misunderstandings. This wishful thinking has obvious marketing appeal and still frequently appears in books and articles that popularize biology.

This scenario ignores the ugly side of both evolution and human history. The inhabitants of our history books did not merely suffer from childhood diseases while their mothers read them stories about rabbits and mice dressed in human clothes. Until our own privileged age, most people died of infectious disease, much of it spread by small rodents. The purpose of evolution is not to make life better for humans, nor even to produce a balanced ecosystem. Indeed, the very idea that evolution has some underlying moral purpose is basically religious. Evolution is simply a mechanism by which different living things compete using various genetic strategies. Those that propagate their own kind more effectively increase in numbers, and the less efficient go extinct. Mother Nature has no maternal instincts.

No absolute reason exists for why a disease should not remain virulent, nor why it should not get more virulent. Some do. Indeed, the same disease may fluctuate in virulence as conditions change. The critical issue is which factors promote decreased virulence and which promote increased virulence. The two main factors are overcrowding and transmission mode. Consider again two variants of the same disease, one mild and the other virulent. If humans are closely crowded, the virulent version has the advantage: There is no need for the patient to linger for several days to pass on the germs. As long as plenty of new victims are available nearby, the best strategy is for the disease to grow as fast as possible inside the original victims, generating more germs to infect more people. The slower, milder version of the disease will be left behind. Diseases tend to grow in virulence when their hosts are plentiful and crowded closely together. Conversely, diseases evolve with lesser virulence when their hosts are few and far between.
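
The logic can be made concrete with a back-of-the-envelope calculation. The sketch below is purely illustrative: the contact rates, transmission chances, and infectious periods are invented numbers, not measurements of any real disease. It simply compares how fast a mild strain and a virulent strain of the same infection multiply when hosts are scarce versus crowded.

```python
# A toy calculation, not a model of any real disease; every number is invented.
# R      = expected new victims per case
#        = contacts per day x transmission chance per contact x days infectious
# growth = how fast cases multiply per day (R compounded over one infectious period)

strains = {
    "mild":     {"p_transmit": 0.1, "days_infectious": 7},  # lingers, sheds fewer germs
    "virulent": {"p_transmit": 0.3, "days_infectious": 2},  # kills fast, sheds many germs
}

for setting, contacts_per_day in [("scattered hosts", 1.5), ("crowded city", 20)]:
    print(f"\n{setting}: {contacts_per_day} contacts per day")
    for name, s in strains.items():
        R = contacts_per_day * s["p_transmit"] * s["days_infectious"]
        daily_growth = R ** (1 / s["days_infectious"])
        verdict = "spreads" if R > 1 else "dies out"
        print(f"  {name:8s}: R = {R:5.2f} ({verdict}), cases multiply x{daily_growth:.2f} per day")
```

With these made-up figures, only the lingering mild strain keeps its chain of infection going among scattered hosts, while in the crowded setting the fast-killing strain multiplies far more quickly per day and outruns its milder rival, which is the pattern argued above.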

A highly virulent epidemic may wipe out a substantial portion of the human population. This decreases crowding, which, in turn, selects for a decrease in virulence. Ultimately, you might think, a balance will be struck and both the population density of the host and the virulence of the infectious agent will settle down to a gentlemanly compromise. This is the microbiological version of the famous “balance of nature” myth. But instead of reaching a state of stable equilibrium, periods of population growth generally alternate with devastating epidemics. Chinese records illustrate this effect. Between 37 A.D. and 1718 A.D., 234 outbreaks were severe enough to count as plagues—that’s one every seven years. Although not every epidemic covered all of China, the frequency is impressive.

Bubonic plague provides a nice example of a disease whose virulence oscillated. Beginning in the mid-1300s, repeated epidemics of bubonic plague swept across Europe until the 1600s (later in some places). When plague first reached a town or city, the first few cases were usually mild and the victims recovered. Once within the crowded confines of a town, the plague became more virulent, often switching to its pneumonic form, which is spread through the air by coughing. Anyone who caught pneumonic plague could be dead within a day. From the germ’s viewpoint, this is no problem, provided its victim coughed germs over, and so infected, another person within that time. In a crowded medieval city, this was normally the case. Toward the end of an outbreak, most of the population either was dead or had recovered and become immune. Hence, the plague became milder again as the number of available victims became fewer and farther between. The mild forms then spread to the next city, and the cycle repeated. After a couple of generations, the population recovered to where it could provide a sufficient supply of fresh victims, and the plague might revisit the original city.

Note the time scale. Microorganisms evolve so fast that they can change their minds—or, rather, their genes—during the course of an epidemic lasting less than a year. As illustrated by the Black Death in a single city, mutants with increased virulence may appear and spread in only a few weeks, and the reverse occurs toward the end of the outbreak. Thus, the virulence of a disease such as plague neither decreases nor increases; it oscillates. A major problem for the historian is that if a disease can change significantly in a year, how did it behave a hundred years ago? A thousand? Ten thousand?

Today the human population is exploding. In many Third World countries, this is exacerbated by poor hygiene. Consequently, we can expect diseases that are efficiently transmitted from person to person to become more virulent. In addition, more people need more food. The tendency is to plant larger areas with the same crop, to improve efficiency. However, such crowding makes crops more susceptible to epidemics, just as with humans. The best-known crop disaster was the Irish potato famine, which resulted from over-reliance on a single crop. When a virulent strain of blight fungus wiped out the potatoes, the Irish had little left to eat. Infectious disease then followed in the footsteps of malnutrition. Starvation itself killed relatively few—most victims died of cholera, dysentery, or typhus fever. Thus, crop failures and malnutrition amplify the effects of infectious disease.

Vectors and virulence

Virulence may increase when a vector carries a disease. If a germ hitches a ride from one victim to another via mosquito, it matters little that the first victim is too sick to move. Indeed, this may even work to the germ’s advantage. Mosquitoes will be able to land and suck blood without the victim swatting them. Diseases that are carried from person to person by some other agency have little motivation to evolve mildness toward humans. Rather, they must avoid disabling their carriers. What happens to the human victims is less important. Malaria, sleeping sickness, typhus fever, yellow fever, and many other diseases are spread by insects, ticks, or lice. These diseases are dangerous and show few signs of getting milder. Indeed, the more virulent form of malaria, Plasmodium falciparum, is today spreading throughout the tropics and subtropics from its original focus in Africa.

The best way to control these diseases is to kill the vectors, thus interrupting transmission. Spraying insecticides such as DDT greatly reduced the incidence of malaria in many areas. Sadly, malaria is making a comeback in many parts of the Third World, due partly to insecticide-resistant mosquitoes and partly to complacency and political disintegration. Water projects such as dams, reservoirs, and irrigation canals often work well in temperate climates. However, in tropical regions, they may backfire. They create large bodies of stationary water that are ideal breeding grounds for the mosquitoes that carry malaria, yellow fever, and other diseases. The slowly moving water of canals also provides a suitable habitat for the water snails that carry the parasitic worms causing schistosomiasis (bilharzia). An example was the spread of schistosomiasis during the Senegal River Basin development in West Africa.

Waterborne diseases use the water itself as a vector. Such diseases can also increase in virulence. The disease relies on contaminated water instead of an insect to carry it from person to person. But the principle is much the same: The disease does not rely on human victims for dispersal. Contaminated water supply normally spreads dysentery, cholera, and many other infections that cause diarrhea. Rivers can carry germs in untreated sewage downstream and infect towns and villages hundreds of miles away.

Reservoirs and carriers of disease

A disease reservoir is a source of infection outside the human species. Reservoirs are usually animals in whom the infection is mild or even causes no disease. For example, bats are a reservoir for rabies and probably also for Ebolavirus.

A carrier is a human who is infected but does not become ill. Although carriers show no symptoms, they may transmit the disease to others. Even if all the susceptible human victims are dead or incapacitated due to a virulent infection, a few carriers may keep the infectious agent in circulation. Carriers may travel from one town to another, or they may stay where they are and keep the disease alive to emerge at some future time. Clearly, a disease that can rely on symptomless carriers or an animal reservoir is under less pressure to become milder.

Many diarrheal diseases cause symptoms in only a fraction of their human hosts. The proportion of symptomless carriers varies immensely. It may be more than half, as in Cryptosporidium or amebic dysentery, or very rare (about 2%–3%), as in typhoid. In most cases, the germs simply live in the intestines without causing disease. Intestinal diseases in which a large fraction of the population shows no symptoms are, by their nature, relatively mild, at least in most adults. The casualties from such diseases are mostly infants in poor countries. Malnutrition and lack of medical care make infantile diarrhea a major killer under such conditions.

A few special cases are known of germs that have adapted specifically to inhabit some tissue other than the intestines. In such cases, the disease may remain much more virulent. In typhoid carriers, the bacteria inhabit the gall bladder, emerging now and then into the intestine. From there, they can reemerge into human society. Salmonella typhi, the agent of typhoid fever, is one of the most virulent infections spread by the contamination of food or water with human waste. It is also a specifically human disease, unlike many other varieties of Salmonella, which are shared with assorted animals. These less dangerous relatives have no special hiding place and must therefore refrain from killing their multiple hosts to stay in circulation.

Some viruses also lie low in specific tissues, biding their time. The best known are chickenpox and herpes. In fact, chickenpox (Varicella) is a member of the Herpesvirus family and is unrelated to the true Poxviruses (smallpox, cowpox, and so on). Several related variants of herpesvirus cause cold sores and genital herpes. Although the symptoms may be suppressed by treatment or vanish spontaneously, herpes never disappears completely. A few virus particles remain hidden in a quiescent state. Symptoms may re-emerge under certain circumstances—if, say, the victim undergoes a period of stress. Chickenpox may also lie latent in nerve cells, re-emerging later in life as shingles, a painful skin rash. After reemerging, the virus may be passed on to others.

Development of genetic resistance to disease

“What does not kill me makes me stronger.”

—Friedrich Nietzsche

On average, the healthier, faster zebras escape being eaten by the lion and survive to carry on the species. Disease, like large predators, preferentially carries off the young, the old, the weak and crippled, and the feeble-minded, together with those who have no friends, family, or allies to help them. Vulnerability is not merely a physical matter. Declining mental alertness may increase vulnerability to disease due to lack of appropriate behavior. From a Darwinian perspective, both predation and disease improve the species, often in a rather nonspecific manner, by selecting for healthy and vigorous individuals. In addition, more specific effects occur.

When a virulent epidemic rages through human populations, some survive and some die. In the days before vaccination, antibiotics, and modern medical technology, what decided who was fortunate and who was not? In addition to sheer luck, both social and biological factors affect the chances of catching a disease, as well as the likelihood of surviving if infected. We start with the strictly biological factors.

First, we must distinguish immunity from resistance. Both protect against infection, although in quite different ways. Immunity occurs within a single lifetime. It results from previous infection by the same disease, or one closely related. The immune system remembers, and when exposed again, it rapidly extinguishes the invader. This assumes that a person survived the first encounter with the disease. Vaccination is based on deliberately exposing people to mild or crippled variants of a disease. This prepares the immune system for meeting the real-life, dangerous version of the disease. Immunity may be full or partial. It may last a lifetime or just a few years. Immunity is not inherited and cannot be passed on to children.

Resistance is genetic. A person is born with it or not, and resistance operates the first time you are exposed to a disease. After a lethal epidemic has passed, the humans who are resistant will have survived. Some who are sensitive will also have survived, for assorted other reasons. Nonetheless, if the death rate is significant, the proportion of people carrying genes for resistance will increase. The survivors pass these resistance genes on to their children, and the next generation starts out with a higher proportion of resistant individuals. If the same disease returns, it will kill a substantial number of the sensitive individuals in the new generation, and the proportion of resistant individuals will go up again. After several recurrences, the majority of the human population will be resistant.
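
The ratchet effect of repeated epidemics can be sketched with simple arithmetic. The starting figures below (a single resistance gene, 10% of people initially resistant, half of the susceptible dying in each epidemic) are invented for illustration and deliberately ignore Mendelian detail.

```python
# Toy illustration of how repeated epidemics enrich a resistance gene.
# One gene, invented starting frequency and death rate; Mendelian detail ignored.

resistant_fraction = 0.10     # assumed share of people born resistant
death_rate = 0.50             # assumed share of susceptible people killed per epidemic

for epidemic in range(1, 6):
    resistant_survivors = resistant_fraction                       # the resistant all survive
    susceptible_survivors = (1 - resistant_fraction) * (1 - death_rate)
    # The next generation inherits the survivors' make-up.
    resistant_fraction = resistant_survivors / (resistant_survivors + susceptible_survivors)
    print(f"After epidemic {epidemic}: {resistant_fraction:.0%} of the population is resistant")
```

Run as written, the resistant share climbs from 10% to a clear majority within about five visitations, which is the pattern of recurring epidemics described above.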

Both smallpox and bubonic plague illustrate the emergence of resistance among humans. The earliest smallpox epidemics recorded in Japan had a 70%–90% death rate. By the mid–twentieth century, although smallpox was still dangerous, the death rate had fallen to around 10%–20%. Bubonic plague shows a similar history. The first European outbreaks in the mid-1300s were highly lethal, and several successive epidemics over the following century reduced the population of Europe by two-thirds. The same disease returned in the 1660s. Despite being called the Great Plague of London, both the infection rates and the probability of death among those infected were much lower. London lost scarcely 10% of its population.

Some diseases go extinct

If a particular infection returns periodically, it will find fewer susceptible victims each time. Eventually, sensitive humans may become so rare that the disease cannot find enough victims to continue transmitting itself, and it may go extinct. This has happened to several diseases. Although many ancient records are ambiguous or lack medical detail, others describe outbreaks of pestilence whose symptoms are no longer familiar. Many of the early plagues of Rome and China are either no longer with us or have evolved out of recognition. The Great Plague of Athens, described so graphically by Thucydides, is the classic example.

The mysterious English sweating sickness caused quite a stir in historical times but failed to survive. The sweating sickness appeared in London in 1485. It was probably brought by mercenary troops from France who helped Henry VII seize the throne from Richard III. Symptoms included the sudden onset of fever, headaches, and “great swetyng and stynkyng with rednesse of the face and all the body.” Most victims recovered, although a significant minority died within a day or two. Fatalities were oddly erratic; in some communities, 30%–50% were killed, while in other towns, almost none of those taken sick actually died.

The English and Germans were susceptible, but the Scots, Welsh, and Irish (that is, those of Celtic lineage) were mostly not affected. Neither were the French, who, if truly guilty of bringing it to England, must have suffered from only a mild form of the disease. The worst epidemic, that of 1528–1529, spread to Germany, Holland, Scandinavia, Switzerland, Lithuania, Russia, and Poland, but ignored France and the rest of Southern Europe. This suggests a strong genetic element in susceptibility. Curiously, the English upper classes were hit harder than the common people. Two successive Lord Mayors of London died in the first epidemic of sweating sickness, and in 1517, Cardinal Wolsey fell seriously ill but survived. In all, five outbreaks of sweating sickness occurred over about half a century, and then the disease faded away. A similar but milder disease, the “Picardy sweats,” appeared in France during the 18th century, supporting the idea of a French connection. No known disease today has these symptoms.

Milder germs or mutant people?

When a disease gets milder, what has really happened? Did the disease change, or did the humans? Germs may mutate to avoid killing their victims too quickly, in order to spread themselves around. Humans may become resistant because sensitive individuals die off. Both processes occur in real life. Syphilis became milder. Humans became resistant to measles. In many cases, such as with malaria or leprosy, both processes have occurred. Because we are embarrassed talking about death and dislike thinking of the millions of humans weeded out by influenza, measles, and smallpox, we tend to talk of a disease getting milder even when humans became resistant.

Consider two alternative approaches for a disease to avoid killing its victims too fast. One is for the disease to become genuinely milder and nonlethal. Alternatively, the disease may remain lethal but kill only slowly. This is probably what happened to leprosy. Historical accounts suggest that leprosy was once highly contagious and far more virulent. Today leprosy is difficult to catch and will still kill if untreated, but this takes many years. Both victims and disease have changed genetically over time. Many Europeans carry genes for resistance to infection by leprosy.

Today we have direct genetic evidence for human resistance to schistosome worms, malaria, tuberculosis, leprosy, typhoid/cholera, HIV (AIDS), hepatitis B, and hepatitis C. The great sensitivity of indigenous Americans, both North and South, to influenza, measles, smallpox, and other Old World diseases implies that, here, too, genetic resistance has evolved in the Old World populations.

Group survival involves more than individual resistance

When a human society shows an altered response to a disease, and that response is passed from parent to child, several possibilities exist apart from genetics. Behavioral avoidance is any cultural change that leads to protection. People who use mosquito nets and wear insect repellent become “culturally resistant.” No genetic changes have occurred, but cultural resistance can be “inherited,” in the sense that customs are passed from one generation to another. Some social effects also have an underlying biological basis.

People often use the term herd immunity to refer to two distinct protection mechanisms. Here we use the terms indirect immunity and group resistance. Indirect immunity occurs when an immune or resistant majority shields a minority of sensitive individuals from infection. Let’s suppose that 90% of a human population is either immune or resistant to some particular infection. The other 10% will be protected because the disease will find it very difficult to transmit itself through the population. The minority of sensitive people are hiding in the biological shadow of those who cannot catch—or, therefore, transmit—the infection. In practice, immunizing 75% or so of a population often breaks the chain of infection well enough to protect the unvaccinated minority. The exact numbers depend on the nature of the disease and its transmission mechanism.
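
The "roughly 75%" figure follows from a standard piece of epidemiological arithmetic: if each case would otherwise infect R0 new people, chains of infection break once more than 1 - 1/R0 of the population cannot pass the disease on. The snippet below simply evaluates that textbook formula for a few illustrative values of R0.

```python
# Standard herd-protection threshold: the fraction that must be immune or resistant
# so that each case infects, on average, fewer than one new person.
# threshold = 1 - 1/R0, where R0 = new cases per case in a fully susceptible crowd.

for r0 in [2, 4, 10, 15]:     # illustrative values of R0
    threshold = 1 - 1 / r0
    print(f"R0 = {r0:2d}: the sensitive minority is shielded once ~{threshold:.0%} cannot transmit")
```

An R0 of 4 gives the 75% quoted above; a highly contagious disease such as measles, with an R0 often quoted in the teens, needs well over 90% coverage before the sensitive minority is safe.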

Group resistance is quite different and results from having a large population with plenty of genetic diversity. The population has many alternative versions of the genes that protect against infection. Some versions work better against one disease; other versions of the same genes work poorly against the first disease but act well against other infections. Different individuals carry different versions of these protective genes. Even if a totally new and highly virulent disease appears, a large, genetically diverse population will contain some individuals who are inherently resistant. Even during the worst outbreaks of Ebolavirus, around 10% of those infected survived. Even if most individuals die, the species will survive.

Thus, the species, viewed as a unit, may be resistant despite the fact that most individuals are sensitive. Life goes on. Note that we are not talking about “diversity” in the sense of artificially mixing individuals of different races to produce a politically correct community. Most local human populations have considerable internal genetic diversity, especially in the immune system. Despite hitting previously unexposed populations, the Black Death in Europe, smallpox in the New World, and Ebolavirus in Africa all had mortality rates of 60%–80% against “racially pure” local peoples. Contrast the introduction of myxomatosis to Australia. Myxomatosis is a lethal virus disease of rabbits that was released in Australia to control the rabbit population. The initial epidemic killed over 98% of the rabbits. These rabbits were the inbred descendants of just a handful of European rabbits that had colonized the Australian continent. The rabbit population had little genetic diversity, so the die-off was colossal.

The implications of resistance to infection

Over the ages, humans have developed resistance to many infections. Some of these diseases have gone extinct; others have evolved to survive. When a relatively resistant human population and its diseases meet a previously isolated and sensitive population, there are major repercussions. The devastation of American Indians, both North and South, by measles and smallpox introduced by Europeans is a classic example. To Europeans, measles is a mild childhood disease, and smallpox, though not mild, has a death rate of only 10%–20%. American Indians had never been exposed to either disease and, therefore, had no chance to evolve resistance. Consequently, they died in droves.

Many other examples are known in which disease has devastated one side (sometimes both) in human conflicts. Sometimes disease fights for the home team. The colonial takeover of Africa was hindered more by malaria, sleeping sickness, yellow fever, and other gruesome tropical diseases than by military resistance from stone-age warriors, despite the world-renowned bravery of such peoples as the Zulu. However, dense urban populations, who have been previously ravaged by some pestilence and have developed resistance, generally have the advantage. When they come into contact with less dense populations, on the fringes of civilization, the barbarians usually sicken and die. Unfair as it may seem, pestilence usually fights on the side of the Empire, evil or otherwise.

Although it is clearly beneficial to be resistant to disease, sometimes there is an unexpected price to pay. We are beginning to realize that certain hereditary diseases are the dark side of resistance to infectious disease. Sickle cell anemia is a by-product of resistance to malaria, and cystic fibrosis of resistance to diarrheal diseases, especially typhoid. To understand this, we must review the mechanism of inheritance. Humans, like all higher animals, have two copies of each gene, one inherited from their mother and the other from their father. Thus, if one copy is damaged by mutation, a back-up is present.

Resistance to disease often results from having one mutant copy of a gene that is defective in its original function. Children who inherit two defective copies, one from each parent, may suffer from a hereditary defect. Children with one good and one mutant (that is, resistant) copy will be resistant to the disease in question. Children who get two good copies of the gene will be healthy but still be susceptible to the disease. In practice, a balance is struck, depending on how common and how dangerous the disease is and how crippling the hereditary defect is.

Resistance to malaria via the sickle cell gene reduces the oxygen-carrying capacity of the blood. One good copy of this gene allows the blood to carry enough oxygen. Those with one good gene and one mutant gene are resistant to malaria. Those with two defective copies might, in theory, be even more resistant to malaria. Unfortunately, they do not live long enough to find out, because they suffer from sickle cell disease and their blood cannot carry enough oxygen.
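
The balance struck here can be expressed with a classic piece of population-genetics arithmetic. If carrying two sickle cell copies imposes a survival penalty s (close to 1, since sickle cell disease was usually fatal) and carrying two normal copies imposes a penalty t wherever malaria is rife, the sickle cell gene settles at a frequency of roughly t / (s + t). The penalties used below are illustrative guesses, not measured values.

```python
# Classic balanced-polymorphism arithmetic for a gene that protects carriers
# (one mutant copy) but harms those born with two mutant copies.
#   s = survival penalty for two sickle cell copies (sickle cell disease)
#   t = survival penalty for two normal copies (fully sensitive to malaria)
# Equilibrium frequency of the sickle cell gene is roughly t / (s + t).

s = 1.0    # assumed: sickle cell disease almost always fatal before adulthood
t = 0.15   # assumed: penalty for unprotected people in a heavily malarial region

p = t / (s + t)              # equilibrium frequency of the sickle cell gene
carriers = 2 * p * (1 - p)   # fraction with one copy (Hardy-Weinberg proportions)
affected = p ** 2            # fraction born with sickle cell disease

print(f"Sickle cell gene settles near {p:.0%} of all gene copies")
print(f"About {carriers:.0%} of people are malaria-resistant carriers")
print(f"About {affected:.1%} of children are born with sickle cell disease")
```

With these guesses, roughly a quarter of the population are protected carriers while a percent or two of children pay the price, figures of the same order as those seen in heavily malarial regions.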

We are the survivors of the frequent epidemics that have emerged in the relatively short time since humans began huddling together in overcrowded towns and cities. Consequently, unlike most wild animals, modern-day humans carry many dubious genetic alterations that have allowed us to muddle through the short-term crises of successive plagues. How has this affected our overall health? Have these changes affected our behavior, intelligence, or other mental abilities? The precise effects are mostly unknown, although we are beginning to see a few glimmers, thanks to modern genetic analysis. One fascinating recent link is between the prion protein, whose malformed version is responsible for mad cow disease, and certain receptors in the brain. The healthy form of the prion protein appears to protect the NMDA receptors from overstimulation. In genetically engineered mice, extra NMDA receptors produce higher intelligence. Would changes in susceptibility to mad cow disease change our intelligence? Perhaps. We are the children of pestilence, held together by genetic jury-rigging.

Hunting and gathering

Early humans were hunter-gatherers. Men hunted game; women gathered roots, nuts, and fruit. Our ancestors roamed in small bands, rarely meeting other tribes. Early hunter-gatherers occupied prime land with plenty of large game. Today’s few remaining hunter-gatherers inhabit marginal areas in jungles or semidesert. Thus, the early hunter-gatherers were probably better fed. On the other hand, they did not have the option of visiting a modern hospital if injured, nor of trading skins and furs for portable DVD players and candy bars. Nevertheless, with some reservations, today’s hunter-gatherers are the best illustration we have of conditions before most of mankind settled into an agricultural way of life.

Patterns of infection vary greatly between hunter-gatherers and settled, agricultural societies. Two major factors are intertwined: low population size and high mobility. Ancient hunter-gatherers almost certainly had much less infectious disease than we have today. As already noted, before the growth of dense human populations, most of our epidemic diseases did not exist. Furthermore, small, mobile, and relatively isolated tribes would rarely have been infected by contact with other groups. Today’s hunter-gatherers tend to catch most of the diseases current among the settled farmers who live nearby. Nevertheless, they are still far less likely to be infected with the parasitic worms and intestinal protozoa that are circulated by the droppings of domestic animals and by irrigation programs. Their lifestyle of roaming over dry plains also protects them from the malaria that is typically found in marshy, wet, and irrigated regions.

Life expectancy and developing civilization

Overall, early hunter-gatherers were probably healthier and better fed than the sedentary farmers who followed them. Before civilization, life expectancy was probably around 30–40 years. Women bore five or six children, and infant mortality was perhaps as low as 30%, with a fair number of children dying between infancy and adulthood. Although most deaths were caused by infections, accidental deaths were also probably frequent among hunter-gatherers. The agriculturists who followed were more civilized than hunter-gatherers, in the sense of having better technology. However, their stationary lifestyle made them more susceptible to infections, and as villages grew into towns and towns into cities, disease became progressively more of a burden. Despite having more food, early farmers often had poorly balanced diets, as they relied on just a few staple crops. Meat consumption was low, as domestic animals were too valuable to slaughter routinely, and only the rich ate meat regularly.

In early societies, outbreaks of infection from domestic animals were probably quite frequent. But most of these would have died out rather quickly, due to lack of sufficient people—and animals—to keep the infection in circulation. Only after populations of a third to half a million were available could such new infections adapt to humans and survive as specifically human diseases. Although cities go back to roughly 4000 B.C. in Sumer, they were originally too small to support continuous epidemic disease.

The first cities big enough to keep diseases like measles continuously supplied with new people to infect appeared in the Middle East, and somewhat later in Greece, in the period 1000–500 B.C. Nonetheless, before this, there were densely settled human populations in the valleys of the Nile in Egypt and of the Tigris and Euphrates in the Middle East. To what extent epidemic disease spread before 1000 B.C. depended on the amount of contact between the individual communities strung along the river banks.

Somewhat later than the Middle East, dense populations arose in India and China, and provided similar opportunities for animal diseases to migrate into humans. Eventually, the three major centers of Old World civilization made good enough contact for infectious diseases to move from one to the other. Between around 500 B.C. and roughly 500 A.D., the disease pools of the Old World merged. The latter centuries of this period saw great political turmoil throughout the Eurasian continent. In the West, the Roman Empire fell, and in the East, China fragmented into multiple states, many under foreign domination. A thousand years later, over a period of less than a century, the combined infectious burden of the Old World was carried to the American continent, where it caused colossal devastation.

Disease and manpower

The massive casualty rates due to infant mortality plus periodic epidemics meant that many ancient societies were limited in their economic growth or military expansion by shortage of manpower. This is nicely illustrated by the values the Franks placed on different members of society. As the Roman Empire faded, these Germanic tribesmen took over Gaul, whose modern name, France, commemorates them. Shortly after 500 A.D., the Frankish law code (the Pactus Legis Salicae) set the wergild, the fine paid for wrongful death, for various people. A freeborn Frank was valued at 200 solidi. (A solidus was worth roughly one cow.) A woman who had survived to childbearing age was worth 600 solidi, as much as a member of the king’s retinue. However, girls too young to bear and women past childbearing were worth only the standard 200 solidi.

Throughout history, the bulk of the human population was poorly fed and lived short, squalid, thoroughly unhygienic lives. In better-than-usual periods of history, those who survived infancy might hope to make it to 40 years, on average. In truly miserable periods of history, such as the early Middle Ages, blighted by war, famine, and pestilence, the overall life expectancy may have sunk as low as 20 years. As emphasized by Richard Dawkins in his book The Selfish Gene, evolution is a mechanism for spreading genes. The purpose of life is not to provide idle luxury for our bodies, or even challenging problems for our minds, but merely to pass on our genetic information to future generations. The chicken is just a fancy machine for laying eggs. The massive toll of premature death throughout human history has selected the fitter and stronger. In particular, it has selected for those who carry genes conferring resistance to the infectious diseases that have been our biggest killers. From a Darwinian perspective, civilization spreads successful genes, especially those that combat infectious disease, not cultural achievements.

How do microorganisms become dangerous?

Now let’s pick up the story from the microbial viewpoint. Infectious agents vary greatly in their ability to cause harm. Before discussing the “professional” diseases, we must not forget the opportunists. When a person is weakened by injury, exposure, or starvation, or if the immune system is malfunctioning, microbes that are otherwise harmless may cause disease. Such opportunistic diseases have received a lot of attention in connection with AIDS. Victims of AIDS suffer damage to their immune systems, hence the name acquired immunodeficiency syndrome. Death results not from AIDS itself, but from the opportunistic diseases that can invade only humans lacking immune defense. These may be normal inhabitants of our skin or guts that invade tissues where they cannot normally survive. Others are microorganisms that do not normally infect humans.

The existence of such opportunistic invaders tells us that many microbes are poised on the edge of invasion. They can degrade and live on human tissues. If they could survive counterattack by the immune system, they would be able to invade us. Such microorganisms can become dangerous to healthy humans in two ways. The first is by gradually accumulating mutations that allow them to survive in human tissues. The second is by acquiring a preassembled set of virulence factors.

Cells carry their genetic information written in genetic code along the famous DNA molecule. Every now and then, DNA molecules are damaged by various causes and the information they contain is altered—or, as biologists say, mutated. Radiation of various kinds, including ultraviolet rays from the sun, can cause mutations. So can certain chemicals, such as those in soot, tar, and cigarette smoke. However, about half the mutations occurring in nature are spontaneous mistakes. Every time a living cell divides into two, it must duplicate its DNA so that its descendants get a copy of its genes. This process is not perfect, and occasional errors creep in. About one gene in a million suffers a mutation every generation. Many mutations have little effect; others are lethal. Despite this, a steady stream of mutations accumulates in living creatures.
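
To see why a "steady stream" of mutations is inevitable even at one mutation per million genes per generation, it helps to multiply the numbers out. The population size and gene count below are round figures chosen purely for illustration.

```python
# Rough arithmetic: even rare mutations add up in a large, fast-dividing population.
# Round illustrative figures, not measurements of any particular microbe.

mutation_rate_per_gene = 1e-6    # about one mutation per gene per million cell divisions
genes_per_bacterium = 3_000      # a typical order of magnitude for a bacterium
population = 1e9                 # bacteria in, say, a single infected host

new_mutations = mutation_rate_per_gene * genes_per_bacterium * population
print(f"Expected new mutations per generation: about {new_mutations:,.0f}")
```

Even at that tiny per-gene rate, a billion dividing bacteria throw off millions of freshly mutated genes every generation, plenty of raw material for evolution to sift.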

Viruses mutate much faster than bacteria. Higher organisms, including protozoa such as the malaria parasite, change slowest of all. This is not because of a scarcity of mutations, but because of how well the mutations are tolerated. This, in turn, depends on the relative genetic complexity. Random mutations are less likely to totally cripple a simpler organism, so the fewer the number of genes, the more rapidly an organism can change yet remain functional. Higher organisms have approximately 10,000–50,000 genes, bacteria have 500–5,000 genes, and viruses have 3–1,000 genes. Consider two machines, one with just a few components and the other with many. The more parts that interact with each other, the less flexibility we have to alter any individual component. If we randomly change the shape of one part of a complex machine, this will probably clash with the operation of another component. If we randomly change one part of a simpler machine, this is less likely to cause conflict. For example, we could double the length of the handle or the blade of a bread knife and still be able to slice bread. But if we doubled the size of a randomly chosen component inside an automobile engine, it would probably immobilize the car. Thus, the fewer genes, the more likely mutations will be tolerated and the faster evolution may occur.

Diseases from worms, fungi, or protozoa have changed relatively little during the course of human history. It is no accident that ancient descriptions of malaria are the earliest records of an infectious disease whose symptoms are clearly recognizable today. Conversely, viral diseases change so rapidly that they tend to become unrecognizable over the ages. Bacteria are intermediate in their rate of change. Around 400 B.C., the ancient Greeks described many infectious diseases still identifiable today. Most of these are bacterial, but the only recognizable viral infection is herpes. Many ancient epidemics cannot be identified today, even when their symptoms were carefully recorded. These were probably viral infections that have changed beyond recognition.

Genes are normally made of DNA. For day-to-day operations, cells make working copies of their genes in the form of RNA, a molecule related to DNA. The original DNA copy is stored safely until the cell divides. RNA is less stable than DNA and is copied less accurately. Therefore, it is much more likely to mutate. Nonetheless, some viruses contain genes made of RNA. These viruses suffer a massive mutation rate that no living cell could survive. Even among viruses, only those with few genes (no more than a dozen or so) can tolerate being RNA based. Even so, many of the individual virus particles produced are defective. Despite these drawbacks, many successful and widespread viruses use RNA. The benefit to the virus is that constant alterations camouflage it from the immune system. For example, influenza and AIDS are both RNA viruses. Influenza mutates so fast that this year’s flu vaccine will be useless against next year’s flu. AIDS mutates even faster. The AIDS viruses inside a single patient vary significantly from one another. This makes treatment with drugs extremely difficult, as resistant virus mutants arise inside a single patient.

Packages of virulence genes are often mobile

Although bacteria evolve slower than viruses, every now and then, some previously harmless bacterium becomes a full-fledged killer overnight. This results from the transfer of blocks of genes between different bacteria. One bacterium may spend a thousand years gradually mutating a few genes to better invade the tissues of its host animal. Then suddenly, the whole package may be transferred to a different bacterium, perhaps a harmless inhabitant of a different animal, and a novel disease emerges almost instantaneously.

Bacterial cells carry their genes, typically 3,000 or so, on a single giant circular molecule of DNA, the bacterial chromosome. Mobile clusters of extra genes are often carried on smaller DNA circles, known as plasmids. These divide in two when bacteria divide, so the genes they carry are inherited just like the genes on the main chromosome. About half of all bacteria found in nature contain one or more plasmids. Many plasmids can move between bacteria. Although most move only between closely related bacteria, a few promiscuous plasmids can move across family boundaries.

The enteric family is a related group of bacteria that mostly live inside animals, in their digestive tracts. They are widespread, and most are harmless. However, virtually all may become dangerous if they get a plasmid carrying virulence genes. For example, bubonic plague is caused by Yersinia pestis. Its relative Yersinia enterocolitica sometimes causes mild diarrhea. Yersinia pestis itself has three virulence plasmids, while its less impressive relative has just one.

Although these blocks of virulence genes move from cell to cell on plasmids, they may occasionally be inserted into the bacterial chromosome. From being an optional extra, the virulence genes have become permanent fixtures. Among the enteric bacteria, the best-known cases are found in Salmonella. The typhoid bacterium, Salmonella typhi, has at least three integrated blocks of virulence genes, and its less dangerous relatives, which cause food poisoning or mild fevers, have one or two. In addition, some Salmonella also carry virulence plasmids.

Viruses, plasmids, and virulence

Plasmids are clusters of extra genes, and viruses are packages of genes that infect cells. Is there a relationship? Sometimes, undoubtedly. Some viruses that infect bacteria have two alternative lifestyles. They may act like a typical virus and destroy the bacteria. Alternatively, instead of killing the host cell, the viruses may take up residence as circles of DNA. In other words, they become plasmids and replicate in step with the host cell. Other bacterial viruses may take up residence by inserting their DNA into the bacterial chromosome.

The best-known enteric bacterium, Escherichia coli, is a normally harmless gut inhabitant that has become famous because of its star role in genetic engineering. However, it can harbor assorted plasmids and viruses carrying virulence genes. As a result, some truly nasty strains of E. coli have emerged. E. coli O157:H7 debuted on the world stage in January 1993. It appeared simultaneously in Washington State, Nevada, Idaho, and California in contaminated hamburgers shipped from a central supplier to outlets of a single fast-food chain. Bloody diarrhea was followed by kidney failure that was sometimes fatal in children. In addition to virulence factors on plasmids, E. coli O157:H7 has a resident virus that carries the gene for shigatoxin. This toxin damages the kidneys and makes E. coli O157:H7 life-threatening instead of merely obnoxious. Shigatoxin is named after Shigella, which causes bacterial dysentery. At some point, this virus presumably infected Shigella and picked up the shigatoxin gene before moving on to E. coli.

Thus, plasmids or viruses can carry virulence genes. Transfer between closely related bacteria is obviously easier, but given time, a cluster of genes can move one step at a time between unrelated bacteria. Detailed molecular analysis has shown that the clusters of virulence genes found in enteric bacteria did not originate in members of this family. Presumably, these virulence clusters evolved long ago in bacteria of some other family and have subsequently moved. Where they first arose is still unknown.
