3. Invisible Invaders: The Discovery of Germs and How They Cause Disease


Shortly after 2:00 a.m. on a late-August night in 1797, Mrs. Blenkinsopp, midwife to the Westminster maternity hospital, hurried from the bedchamber, her pale face taut with anxiety. Three hours had now elapsed since she had delivered Mary’s baby girl, and something had clearly gone wrong. She quickly found Mary’s husband and gave him the alarming news: The placenta had still not been expelled; William must immediately call for help. The doctor arrived within the hour and, finding that the placenta was adhered internally, he began to operate.

But the surgery did not go well. The placenta could only be removed in pieces, and by the next morning, Mary had suffered significant blood loss and a night of “almost uninterrupted fainting fits.” Nevertheless, as William later recalled, his beloved wife—whom he’d married just months earlier—mustered the strength to say she “would have died the preceding night but was determined not to leave me.” After a small joke and weak smile, she added that she “had never known what bodily pain was before.”

Mary had survived one crisis, but it was only the beginning. Several days later, as William and other family members kept watch with raised hopes for her recovery, Mary was suddenly overcome by fits of shivering so violent that “every muscle of her body trembled, her teeth chattered, and the bed shook under her.” Although the rigors only lasted five minutes, Mary later told William that it had been “a struggle between life and death” and that she had been “more than once at the point of expiring.”

Mary survived this crisis, too, raising hopes once again that she might recover. But over the next few days, she declined again, overwhelmed by such symptoms as high fever, an extraordinarily rapid pulse, and abdominal pain. Then, on the eighth morning after her delivery, just as William was once again giving up all hope, the surgeon woke him to report some extraordinary news: Mary was “surprisingly better.”

Had Mary survived yet a third crisis? It certainly seemed so as, over the next two days, her shivering fits and other symptoms miraculously stopped. Indeed, by the tenth day after her delivery, the surgeon observed that Mary’s “continuance was almost miraculous” and that it was “highly improper to give up all hope.” Yet by this time, William’s hopes had been raised and dashed once too often. And despite the remarkable cessation of symptoms, his sense of gloom proved prescient: On the eleventh morning following the birth of her daughter, Mary died from childbed fever.

* * *

When British author Mary Wollstonecraft died that late-summer morning in 1797 at the age of 38, the world lost more than a gifted philosopher, educator, and feminist. In addition to leaving a collection of writings that laid the foundation for the women’s rights movements of the nineteenth and twentieth centuries and being the first woman to publicly advocate for women’s suffrage and coequal education, Mary Wollstonecraft left one final, memorable gift to the world: The baby girl who survived the ordeal, named Mary in honor of the mother she never knew, grew up to be Mary Wollstonecraft Shelley, who in 1818, at the age of 19, wrote her well-known novel, Frankenstein.

Mary Wollstonecraft’s death highlights the tragedy of a disease that was relatively common until the mid-1800s, usually fatal, and almost completely misunderstood by doctors. Although rare today, throughout history childbed fever, or puerperal fever, was the most common cause of death among women giving birth. As with Mary Wollstonecraft, it usually struck suddenly and unexpectedly shortly after childbirth, beginning with intense shivering fits, a pulse racing up to 160 beats per minute, and high fever. Lower abdominal pain was often so intense that the lightest touch or even the weight of bed sheets could trigger cries of agony. “I have seen some women,” one obstetrician told his students in 1848, “who appeared to be awe-struck by the dreadful force of their distress.” And in one final, cruel manifestation, the symptoms often stopped suddenly after days of suffering. But as family members rejoiced, experienced physicians recognized the ominous sign: The sudden absence of symptoms was an indication of advanced disease and usually meant that death was imminent.

But more than a historical footnote, childbed fever played a central role in a major turning point in medical history. When in 1847 Hungarian physician Ignaz Semmelweis discovered how to prevent it, he not only helped save countless women from agonizing deaths, he also took the first key step toward what is now regarded as one of the greatest breakthroughs in medicine: the discovery of germ theory.

The invisible “curiosity” that—finally—changed the world of medicine

Germ theory—the discovery that bacteria, viruses, and other microorganisms can cause disease—is something we all take for granted today. But until the late 1800s, the idea that germs could cause disease was so novel, even outlandish, that most physicians could not accept it without a monumental shift in thinking, a grudging surrender of long-held views, including miasma theory. In fact, vestiges of that nineteenth century struggle still remain with us today, as seen in the word “germ” itself. In the early 1800s, before microscopes were powerful enough to identify specific microbes, scientists broadly used “germ” when referring to these unseen and unknown microorganisms suspected of causing disease. Today, though we have long known that germs are actually bacteria, viruses, and other pathogens, many of us—particularly ad copy writers hired to hawk kitchen and bathroom cleaners on TV—still use germ as a catch-all for any disease-causing microbe.

In any case, once “germ theory” had been proven by the end of the nineteenth century, it forever changed not only how doctors practiced medicine, but the very way we view, interact with—and often fear—the invisible world around us. The importance of germ theory was acknowledged in 2000, when Life magazine ranked it the sixth most important discovery in the past 1,000 years.

The initial reluctance to accept germ theory did not arise from any doubt that we live in a world surrounded and infused by invisibly small life forms. By the 1800s, the existence of microorganisms had been known for nearly two centuries. That key breakthrough had occurred in 1676, when Dutch lens grinder Antony van Leeuwenhoek, peering through his crude microscope, became the first human to see bacteria. On that April day, he reported with astonishment that he had seen a multitude of tiny “animalcules which... were incredibly small, indeed so small that...ten thousand of these living creatures would not be able to fill the volume of a small sand-grain.”

But over the next two centuries, few scientists seriously considered that these invisible curiosities could cause disease. It was not until the 1800s that the evidence began to accumulate and, thanks to historic milestones by four key people—Ignaz Semmelweis, Louis Pasteur, Joseph Lister, and Robert Koch—germ theory was finally “proven.” And the first of these milestone advances centered directly on the deadly mystery of childbed fever, the disease that not only took the life of Mary Wollstonecraft, but up to 500,000 other women in England and Wales during the eighteenth and nineteenth centuries.

Milestone #1 The tragic loss of a friend (and a brilliant gain of insight)

When Ignaz Semmelweis began his career in obstetrics at the Vienna General Hospital in 1846, he was just 28 years old and had every reason to be excited—and full of dread. The good news was that Vienna General Hospital was the largest of its kind in the world, and its affiliated Viennese School of Medicine was at its zenith. What’s more, the maternity department had just been enlarged and split into two clinics, each capable of delivering up to 3,500 babies a year. But there was one terrible problem: The hospital was suffering a raging epidemic of childbed fever. While the death rate had been less than 1% in the 1820s, by 1841 it had increased by nearly 20 times. In other words, if you went to the Vienna General Hospital in 1841 to deliver your baby, you had about a one in six chance of not leaving the hospital alive.

By the end of 1846, when Semmelweis had completed his first year as a full assistant, he had seen more than 406 women die of childbed fever. By that time, numerous explanations for the high death rates had been proposed, both silly and serious. Semmelweis had considered, and ruled out, most of them, including theories that the deaths were due to: female modesty (in one clinic, babies were delivered by physicians, who were all males); bell-ringing priests (some thought that their marches through the wards after a death caused new cases by triggering fear); and other theories that did not fit the evidence, such as overcrowding, poor ventilation, and dietary errors.

But when Semmelweis conducted a statistical investigation comparing death rates between the two clinics, he immediately uncovered a compelling finding. In the five years after the maternity hospital had been split into two clinics, the death rate among women in the first clinic, where all deliveries were made by physicians, was three to five times higher than in the second clinic, where the deliveries were made by midwives. Yet, though the finding was suggestive, he could find no obvious reason for the discrepancy. As Semmelweis later wrote, the midwives who delivered babies in the second clinic “were no more skillful or conscientious in their duties” than the physicians who worked in the first clinic. His other investigations only confused the picture further. For example, death rates were actually lower in mothers who delivered their babies at home or even on the streets. As Semmelweis noted, “Everything was in question; everything seemed inexplicable; everything was doubtful. Only the large number of deaths was an unquestionable reality.”

Then, in the spring of 1847, a milestone moment arrived in the form of personal tragedy. Returning to the Vienna General Hospital from a three-week vacation, Semmelweis was greeted with the “shattering” news that his much-admired friend Professor Jakob Kolletschka had died. Despite his grief, Semmelweis was intrigued by the cause of his friend’s death: While performing an autopsy on a woman who had died from childbed fever, the professor’s finger had been accidentally pricked by a medical student’s knife. The wound became infected, and the infection quickly spread through Kolletschka’s body. During an autopsy, Semmelweis was struck by the widespread infection throughout Kolletschka’s body and its similarity to what he had seen in women with childbed fever. “Day and night I was haunted by the image of Kolletschka’s disease,” he wrote, and the fact that “the disease from which he died was identical to that from which so many maternity patients died.”

This dawning insight was remarkable in its implication. Until that time, childbed fever was, by definition, a disease that affected only women. The possibility that it had infected and killed a man, from a wound sustained during an autopsy of a patient who had died of the disease, led Semmelweis to a startling conclusion. “I was forced to admit,” he wrote, “that if his disease was identical with the disease that killed so many maternity patients, then it must have originated from the same cause that brought it on in Kolletschka.”

Although Semmelweis did not know what this “cause” was—he referred to the invisible culprit as “cadaver particles”—he had begun to solve the greater mystery. If childbed fever could be transmitted by “particles” from one person to another, it could explain the high death rates in the first clinic. Unlike midwives who delivered babies in the second clinic, physicians in the first clinic commonly performed autopsies on women who’d died of childbed fever and then went straight to the maternity wards where they conducted intimate examinations of women during labor. The answer to the mystery struck Semmelweis like a lightning bolt: It was the physicians who were transferring the infecting particles to the mothers, thus causing the higher death rates in the first clinic. “The cadaverous particles are introduced in the circulatory system of the patient,” Semmelweis concluded, and “in this way maternity patients contract the same disease that was found in Kolletschka.”

Although physicians did wash their hands after the autopsies, Semmelweis realized that soap and water was not sufficient—and thus he arrived at the next milestone.

Milestone #2 A simple solution: wash your hands and save a life

In mid-May 1847, shortly after the death of his friend Kolletschka, Semmelweis announced a new practice in the first clinic: From now on, all physicians must wash their hands with a chlorine solution after performing autopsies and prior to examining pregnant mothers. Within the year, the new policy had a dramatic impact. Before the hand-washing policy was implemented, the mortality rate in the first clinic had been about 12%, versus 3% for the second clinic. Just one year after the chlorine washings were instituted, the mortality rate had fallen to 1.27% in the first clinic, compared to 1.33% in the second clinic. For the first time in years, the death rates in the first clinic were actually lower than those in the second clinic.

But the reaction to Semmelweis’ discovery underscores how far the medical world still had to go before accepting even this small step in the direction of germ theory. Although some colleagues supported his findings, many older conservative faculty rejected his ideas outright. For one thing, it contradicted the views held by most physicians that childbed fever, like most illnesses, was caused by many factors—from miasmatic vapors, to emotional trauma, to acts of God—and not some “particle.” Furthermore, many physicians resented the implication that they were somehow “unclean” carriers of disease. And so, sadly, despite his discovery, Semmelweis’ theory gained little acceptance. One problem was that he initially did little to promote his own findings. Although in 1861 he finally published a book on the cause and prevention of childbed fever, it was so rambling and repetitive that it made little impact.

From that point, Semmelweis’ life turned progressively tragic as he succumbed to a serious brain disorder, possibly Alzheimer’s disease. For example, in earlier years he had graciously described his sense of guilt and remorse for the unwitting role he and other physicians had played in transmitting childbed fever to so many women. “Only God knows the number of patients who went prematurely to their graves because of me... If I say this also of another physician, my intention is only to bring to consciousness a truth [that] must be made known to everyone concerned.” But with his mental state deteriorating, all grace was lost as he began writing vicious letters to those who opposed his ideas. To one physician he wrote, “Your teaching, Herr Hofrath, is based on the dead bodies of women slaughtered through ignorance... If, sir, you continue to teach your students and midwives that puerperal fever is an ordinary disease, I proclaim you before God and the world to be an assassin...”

Eventually, Semmelweis was taken to a mental institution, where he died a short time later. Ironically, some contend that Semmelweis’ final vitriolic attacks against his colleagues constituted a third key milestone: His abusive letters may have helped raise awareness years later, as other evidence for germ theory began to accumulate.

* * *

Although it would be another 15 years before those “cadaveric particles” would be identified as streptococci bacteria, Ignaz Semmelweis’ insights are now recognized as a key first step in the development of germ theory. Despite having no understanding of the causative microbe, Semmelweis showed that a disease could have a single “necessary cause.” In other words, while many physicians at the time believed that any disease could have multiple causes, Semmelweis showed that one specific factor, something in those cadaveric particles, had to be present for a person to develop childbed fever.

But it was only a first step. It would take the work of Louis Pasteur to push medical awareness to the next milestone: making a link between specific particles—microorganisms—and their effects on other living organisms.

Milestone #3 From fermentation to pasteurization: the germination of germ theory

As everyone knows, sometimes it can be downright impossible to get your hands on a rat or scorpion when you need one. But no worries, according to esteemed seventeenth-century alchemist and physician Jean-Baptiste van Helmont, who devised this recipe for making rats: “Cork up a pot containing wheat with a dirty shirt. After about 21 days a ferment coming from the dirty shirt combines with the effluvium from the wheat, the grains of which are turned into rats—not minute and puny, but vigorous and full of activity.” Scorpions, Helmont assures us, are even easier: “Carve an indentation in a brick, fill it with crushed basil, and cover the brick with another. Expose the two bricks to sunlight and within a few days fumes from the basil, acting as a leavening agent, will have transformed the vegetable matter into veritable scorpions.”

On the one hand, it’s comforting to know that by the mid-1800s, most scientists would have joined us in laughing at such beliefs in spontaneous generation—the theory that living organisms can be created from nonliving matter. On the other hand, that laughter might trail off sooner than you would expect. Because as late as the 1850s, though no one seriously believed spontaneous generation could give rise to insects or animals, increasingly powerful microscopes had begun to prompt some scientists to rethink the issue when it came to the origin of organisms so small that 5,000 could fit in the span of the period at the end of this sentence.

Nevertheless, two vexing questions remained: Where did microorganisms come from, and did they have any relevance to the “real” world of plants, animals, and people? And so in 1858 the well-known French naturalist Felix Pouchet, attempting to answer the first question, resurrected the questionable notion of spontaneous generation, claiming that he had shown “beyond a shadow of a doubt” that it explained how microorganisms came into the world.

But French chemist Louis Pasteur, already admired for his work in chemistry and fermentation, didn’t believe it for a moment and proceeded to devise a series of ingenious experiments that laid spontaneous generation in its grave forever. While Pasteur’s classic experiments are still taught today in most biology classrooms, they comprise only a small part of a remarkable 25-year career. During this career, his contributions not only helped answer both questions—microbes come from other microbes, and they are very relevant to the real world—but raised the concept of germ theory from the mists of uncertainty to the brink of unquestioned reality.

A toast to yeast: a tiny critter gives birth to the liquor industry and a new theory of germs

To most of us, yeast is a powdery stuff that gives wine and beer their alcoholic pleasures and bread and muffins their ability to rise in a hot oven. Some of us also know that yeast is a single-celled microorganism that reproduces by producing small buds. We should be grateful for even these simple facts, as they represent the outcome of years of raging debate, controversy, and experimentation in the early 1800s. Even when scientists finally accepted that yeast was a living organism, it only set the stage for the next round of debates over whether it was truly responsible for fermentation.

An unsung hero of early microbiology, yeast was one of the first microbes to be studied scientifically because it is relatively large compared to bacteria. But often forgotten today is another major reason for its heroic stature: Thanks to the work of a multi-cellular scientist named Louis Pasteur, it played a central role in the development of germ theory.

It was an unlikely beginning. In 1854 Louis Pasteur was working as dean and professor of chemistry in Lille, a city in northern France, and had no particular interest in yeast or alcoholic beverages. But when the father of one of his students asked if he would be willing to investigate some fermentation problems he was having with his beetroot distillery, Pasteur agreed. Examining the fermenting liquor under a microscope, Pasteur made an important discovery: In a healthy fermentation, the tiny globules in the juice were round, but when the fermentation became lactic (spoiled), the globules were elongated. Pasteur continued his studies, and by 1860 he had shown for the first time that yeasts were in fact responsible for alcoholic fermentation. With this discovery, Pasteur established the “germ theory” of fermentation. It was a major paradigm shift in thinking: the realization that a microscopic form of life was the foundation of the entire alcoholic beverage industry, that a single-celled microbe could indeed have very large effects.

In subsequent years, Pasteur extended his germ theory of fermentation into “diseases” of wine and beer, successfully showing that when alcoholic beverages went “bad,” it was because other microorganisms were producing lactic acid. In addition to identifying the microbes, Pasteur devised a “cure” for this disease: Heating the liquor to 122 to 140 degrees F would kill the microbes and thereby prevent spoilage. The term for this process of partial sterilization remains well-known to us to this day, thanks to its ubiquitous presence on the packaging of many foods and beverages: pasteurization.

Pasteur’s work in fermentation and diseases of wine was a major milestone in germ theory because of what it implied. As early as the 1860s, he was speculating about whether microorganisms could have similar effects in other areas of life. “Seeing that beer and wine undergo profound alterations because these liquids have given shelter to microscopic organisms, how can one help being obsessed by the thought that phenomena of the same kind can and must sometimes occur in humans and animals?”

Milestone #4 The “spontaneous generation of life” finally meets its death

It was while Pasteur was investigating fermentation that French naturalist Felix Pouchet ignited the scientific world with controversy and excitement by announcing he had “proven” spontaneous generation. Specifically, Pouchet claimed to have conducted experiments in which he had created microbes in a sterilized environment in which no germ “parents” were previously present. While many scientists discounted this claim, Pasteur’s background in fermentation and his genius for designing clever experiments enabled him to take Pouchet head-on and settle what many had believed was an unsolvable problem. In one classic experiment, Pasteur revealed the flaws in Pouchet’s work by focusing on something so common that we tend to forget that it is as omnipresent as the air we breathe.

“Dust,” Pasteur explained in a lecture describing his landmark experiment, “is a domestic enemy familiar to everyone. The air in this room is replete with dust motes [that] sometimes carry sickness or death in the form of typhus, cholera, yellow fever, and many other kinds...” Pasteur went on to explain how the germs that Pouchet claimed to have created through spontaneous generation instead arose from a combination of bad experimental technique and a dust-filled room. To prove his point, Pasteur described a simple experiment in which he poured a liquid nutrient into two glass flasks. One of the flasks had a straight, vertical neck that was directly open to the surrounding air and falling dust; the second flask had a long, curving horizontal neck that allowed air—but not dust—to enter. Pasteur then boiled the liquid in both flasks to kill any existing germs and set both flasks aside. When he checked the flasks a few days later, germs and mold had grown in the first, open flask, carried there by falling dust. However, the second flask, whose long neck prevented germ-laden dust from entering and contaminating the liquid, was germ-free.

Indeed, Pasteur explained, referring to the second flask, “It will remain completely unaltered not just for two days, or three or four, or even a month, a year, three years, or four! The liquid remains completely pure.” In fact, having conducted similar experiments over the course of years with similar results, Pasteur rightfully claimed, “The doctrine of spontaneous generation will never recover from the mortal blow inflicted by this experiment.”

Pasteur’s 93-page paper describing his work, published in 1861, is now considered to be the final death blow to spontaneous generation. Equally important, his work set the stage for his next milestone. As he wrote at the time, “It is very desirable to carry these researches sufficiently far...for a serious inquiry into the origin of disease.”

Milestone #5 The critical link: germs in the world of insects, animals, and people

For the next 20 years, Pasteur’s work took a series of dramatic turns that, in addition to profoundly impacting health and medicine, collectively established the next milestone in germ theory. It began in the mid-1860s, when a mysterious disease was decimating the silkworm industry in Western Europe. When a chemist friend asked Pasteur if he would investigate the outbreak, Pasteur balked, pointing out that he knew nothing about silkworms. Nevertheless, intrigued by the challenge, Pasteur began studying the life history of silkworms and examining healthy and diseased silkworms under a microscope. Within five years, he had identified the specific disease involved, showed farmers how to prevent it, and thereby helped restore the silk industry to prosperity. But apart from the importance of his work to the silkworm industry, Pasteur had made another major step in the larger scheme of germ theory by entering the uncharted and complex world of infectious disease.

In the 1870s and 1880s, Pasteur extended his work in infectious diseases to animals and made several key discoveries that further contributed to germ theory. In 1877, he began to study anthrax, a disease that was killing as many as 20% of sheep in France. While other scientists had found a rod-shaped microbe in the blood of animals dying of the disease, Pasteur independently conducted his own work and, in 1881, stunned the world by announcing that he had created a vaccination that successfully prevented sheep from developing the disease. This major milestone in the development of vaccines (discussed in Chapter 6) added to the evidence that germ theory was real and relevant to diseases in animals.

But Pasteur had not yet finished his work with immunizations, and he soon began experiments to develop a vaccination for rabies, a disease that was relatively common at the time and invariably fatal. Although unable to isolate or identify the causative microbe—viruses were too small to be seen by microscopes at the time—he was convinced that some kind of germ was responsible. After hundreds of experiments, Pasteur created a vaccine that worked in animals. Then, in 1885, in a dramatic and risky act of desperation, the vaccine was used to successfully save the life of a young boy who had been bitten by a rabid dog. A crowning achievement in itself, Pasteur’s vaccine extended germ theory to its culmination, showing its relevance to human diseases.

By the end of his career, Pasteur was a national and international hero, a chemist whose wide-ranging milestones had not only helped a diverse range of industries, but collectively provided powerful evidence for germ theory. Yet even with all his achievements, Pasteur’s efforts alone still did not fully prove the concept of germ theory. A few more milestones would be needed, including a major development in 1865 by an English surgeon who was directly influenced by Pasteur’s writings.

Milestone #6 Antiseptics to the rescue: Joseph Lister and the modern age of surgery

When Joseph Lister became Professor of Surgery at the University of Glasgow in 1860, even patients lucky enough to survive their operations had reason to fear for their lives. With postoperative infections an ever-present danger, the mortality rate for some procedures was as high as 66%. As one physician at the time noted, “A man laid on the operating table of one of our hospitals is exposed to more chance of death than the English soldier on the field at Waterloo.” Unfortunately, efforts to solve this problem were thwarted by the belief that the “putrefaction” seen in postoperative infections was caused not by germs, but by oxygen. Many physicians believed that the festering of wounds resulted when oxygen from the surrounding air dissolved injured tissues and turned them into pus. And since it was impossible to prevent the oxygen in the air from entering wounds, many believed that it was impossible to prevent infections.

If Joseph Lister at any time believed this view, he began to change his opinion after reading the writings of Louis Pasteur. Two of Pasteur’s ideas in particular stuck with Lister: that the “fermentation” of organic matter was due to living “germs”; and that microbes could arise only from predecessor parents, not from spontaneous generation. Given these observations, it occurred to Lister that when trying to prevent infections, it might make more sense to concern oneself with the germs—and not the oxygen—that could enter a wound. “If the wound could be treated with some substance which without doing serious mischief to the human tissues would kill the microbes already contained in it,” he wrote, “putrefaction might be prevented however freely the air with its oxygen should enter.”

After experimenting with several chemicals, Lister’s milestone moment came on August 12, 1865, when he first used carbolic acid—“a compound which appears to exercise a peculiarly destructive influence upon low forms of life and hence is the most powerful antiseptic with which we are at present acquainted”—to treat an 11-year-old boy who sustained a compound fracture to his left leg after being run over by a horse-drawn cart. At that time, compound fractures had a high rate of infection and often required amputation. Lister splinted the boy’s leg and periodically applied carbolic acid to the wound for the next six weeks. To his delight, the wound healed completely without infection. Lister later used carbolic acid to treat many other wounds, including abscesses and amputations. Eventually, he used it to disinfect incisions during surgery, as well as surgical instruments and the hands of his surgical staff.

Although Lister published his findings in 1867, as late as 1877 his work was met with skepticism by surgeons in London. Nevertheless, the value of his antiseptic techniques was eventually accepted, and today Lister is often referred to as the Father of Antisepsis or the Father of Modern Surgery. In addition to the mouthwash named after him, Listerine, he has been honored by microbiologists who named the bacterial genus Listeria after him. Lister’s discovery of antiseptic surgery, for which he thanked Pasteur in a letter in 1874, has undoubtedly saved countless lives. But equally important, his discovery that germs play a key role in infection and can be eliminated through antiseptic treatment marked another key milestone in the development of germ theory.

Between the 1840s and 1860s, scientists had come a long way in gathering evidence for the role that germs played in disease. But up to that point, the evidence remained largely circumstantial. Even in the early 1870s, in the eyes of many, germ theory had still not been proven. But advocates and opponents agreed on one thing: To establish germ theory, someone needed to find a conclusive link between a specific microbe and a specific disease. The world would not have to wait long before a young German doctor conclusively showed this link.

Milestone #7 One step closer: Robert Koch and the secret life of anthrax

In 1873, Robert Koch was a 30-year-old physician with a busy medical practice in a German farming district and all the odds stacked against him. Despite being isolated from his peers, lacking access to libraries, and having no laboratory equipment other than a microscope given to him by his wife, he became interested in anthrax and set out to prove that it was caused by a specific microbe. By this time, a key suspect had been identified—a rod-shaped bacterium known as Bacillus anthracis—and Koch was hardly the first to study it. But no one yet had proven it was truly the cause of anthrax.

Koch’s initial studies confirmed what others had found: Inoculating mice with blood from animals that had died of anthrax caused the mice to die from anthrax, while mice inoculated with blood from healthy animals did not develop the disease. But in 1874, he began investigating a deeper mystery that had been a key roadblock to proving that the bacteria caused anthrax: While sheep contracted anthrax when exposed to other sheep infected with the bacteria, why did some sheep also develop anthrax when exposed to nothing more than soil? After numerous experiments and painstaking work, Koch solved the mystery and opened a new window on the world of microbes and disease: In the course of its life cycle, the anthrax bacterium can don a devilish disguise. Under unfavorable conditions, as when they are cast off into the surrounding soil, the bacteria can form spores that enable them to withstand a lack of oxygen or water. When favorable conditions return, as when the spores are picked up from the soil and enter a living host, they revert to deadly bacteria. Thus, the sheep that developed anthrax after seemingly being exposed to nothing more than soil had in fact also been exposed to anthrax spores.

Koch’s milestone discovery of the anthrax life cycle and its role in causing disease brought him immediate fame. By establishing that Bacillus anthracis was the specific cause of anthrax, he nudged the medical community a giant step closer to accepting the concept of germ theory. But final proof and acceptance had to wait until he had solved the mystery of a disease that had long afflicted the human race. In the late-1800s, it infected nearly everyone living in a large European city and accounted for 12% of all deaths. Even today, it remains the most common cause of death due to an infectious agent and is responsible for 26% of avoidable deaths in developing countries.

Milestone #8 Sealing the deal: discovery of the cause of tuberculosis

When Koch first began studying tuberculosis, also known as consumption, the symptoms and outcome of this disease were well-known, even if its course was bafflingly unpredictable. A person who came down with TB might die within months, linger with the disease for years, or overcome it completely. When symptoms did appear, patients often initially had a dry cough, chest pains, and difficulty breathing. In later stages, the cough became more severe and was accompanied by periodic fevers, rapid pulse, and a ruddy complexion. In the final stages, patients became emaciated, with hollowed cheeks and eyes, throat ulcers that turned their speech into a hoarse whisper, and, as death became inevitable, a “graveyard cough.” Many well-known people died from TB in the nineteenth century, including artists such as Frédéric Chopin, John Keats, Anton Chekhov, Robert Louis Stevenson, and Emily Brontë.

Although scattered reports had previously suggested that tuberculosis was contagious, by the late 1800s many physicians still believed the disease was hereditary and caused by some kind of breakdown in the patient’s lung cells. This breakdown, many believed, was quite possibly influenced by a person’s mental and moral shortcomings, with no foreign invader involved. In the early 1880s, after being named director of the bacteriological laboratory at the Imperial Health Office in Berlin, Robert Koch set out to prove that, to the contrary, TB was caused by a microorganism.

It was no easy task, and in the course of his work Koch had to develop a number of new techniques, including a staining method that helped him distinguish the culprit microbe from surrounding tissues and a culture medium that enabled him to cultivate the slow-growing microorganism. But in 1882 Koch announced his discovery to the world: After successfully isolating, cultivating, and inoculating animals with the suspect microbe, he had found that tuberculosis was caused by the microbe Mycobacterium tuberculosis. Using the term “bacilli” to refer to the rod-shaped bacteria, he concluded, “The bacilli present in the tuberculosis lesions do not only accompany tuberculosis, but rather cause it. These bacilli are the true agents of consumption.”

Milestone #9 Guidelines for convicting a germ: Koch’s four famous postulates

Koch’s discovery of the bacterium that causes tuberculosis was the milestone that clinched the acceptance of germ theory. But more than that, the principles and techniques he used in his work on tuberculosis and other diseases helped him achieve one final milestone: a set of guidelines that scientists could apply when convicting other germs of causing disease. Called “Koch’s Postulates,” they stated that one could incriminate a germ by answering the following questions:

1. Is it a living organism with a unique form and structure?

2. Is it found in all cases of the disease?

3. Can it be cultivated and isolated in pure form outside the diseased animal?

4. Does it cause the same disease when a pure isolated culture of the microorganism is inoculated in another animal?

While Koch’s discovery of the cause of tuberculosis eventually won him the Nobel Prize in Physiology or Medicine, his groundbreaking work in bacteriology continued after his work in tuberculosis. He eventually discovered (or technically, rediscovered) the cause of cholera in 1883 and introduced public health measures that helped contain a cholera outbreak in Hamburg, Germany, in 1892. In addition, thanks to the microbiological techniques he developed, many of the co-workers he trained went on to discover other bacterial causes of disease. Although Koch later incorrectly claimed that he had discovered a treatment for tuberculosis, the extract that he developed—tuberculin—is still used today in a modified form to help diagnose tuberculosis.

Germ theory a century later: the surprises (and lessons) keep coming

Germ theory traveled a long and complex journey across the landscape of the nineteenth century. Interestingly, although it gradually grew in acceptance from milestone to milestone, the phrase “germ theory” did not even exist in the English medical literature until around 1870. But while the health benefits of germ theory soon became dramatically clear, often overlooked are some other key ways that it transformed the practice of medicine. For example, to many young physicians in the late 1800s, germ theory opened a new world of hope. Supplanting fickle theories of miasma and spontaneous generation, it implied that a cause—if not a cure—might be found for all diseases, which gave physicians a new authority in the eyes of their patients. As Nancy J. Tomes recently wrote in the Journal of the History of Medicine, by the late 1800s, physicians “began to inspire greater confidence not because they could suddenly cure infectious diseases, but because they seemed better able to explain and prevent them.”

Germ theory also transformed physicians’ understanding of how their own behavior could impact patient health. This new awareness was evident as early as 1887, when a physician at a medical meeting, hearing how another doctor had moved from an infected patient to several women in childbirth without washing his hands, angrily declared, “What amazes me is that a man of Dr. Baily’s reputation as a teacher and practitioner of medicine would at this late date antagonize the germ theory of specific diseases... I trust no other member of this society will follow his implied example.”

In fact, by the early 1900s, germ theory had literally changed the appearance of physicians: As part of a new code of cleanliness, young male physicians stopped growing the full beards so commonly worn by their older peers.

* * *

Today, despite its universal acceptance, germ theory continues to generate excitement, concern, controversy, and confusion throughout society. On the plus side, millions of lives continue to be saved thanks to our ability to identify, prevent, and treat diseases caused by microbes. Advances in technology have allowed us to see the smallest germs in existence, such as the rhinovirus, which causes the common cold and is so small that 500 million can fit on the head of a pin. The study of microbial diseases has taken us to the frontier of life itself, where scientists ponder whether viruses are actually “alive,” and puzzle over how prion diseases such as Mad Cow disease can be infectious and deadly, even though the causative agent is clearly not alive.

More recently, our ability to decode the genome (the entire genetic make-up) of microbes has led to new investigations that raise questions about the very nature of who we are. In 2007, the National Institutes of Health launched the “Human Microbiome Project,” a project that will detail the genomes of hundreds of microbes that normally live in or on the human body. The idea that we even harbor a “microbiome”—the collective genome of all the microorganisms in our bodies—brings new meaning to germ theory. Given that there are 100 trillion microbes that inhabit the human body—10 times more than our own cells and comprising 100 times more genes than our own genes—where exactly is the dividing line between “us” and “them?” The fact that most of these microbes are essential to our good health—helping with normal body functions such as digestion, immunity, and metabolism—only deepens the mystery.

In fact, since its discovery in the late 1800s, germ theory has opened a Pandora’s box of anxiety that continues to mess with our minds. What is scarier than an omnipresent, invisible, and essentially infinite enemy that is able to cause terrible sickness and death? Who today has not thought twice before touching the doorknob or faucet in a public bathroom, shaking the hand of a stranger, or breathing in the stuffy air of a crowded elevator, bus, or airplane? While partly realistic, in susceptible people such concerns can develop into full-blown anxiety disorders that dominate their lives. No wonder many of us think back wistfully on the pre-nineteenth century days of innocence, before germ theory stole away the bliss of our pre-hygienic ignorance.

For good or bad, the modern battle against germs has led to a proliferation of odd attire and habits throughout society, from the hair nets and surgeon’s gloves worn by restaurant workers, to the antibacterial soaps, detergents, cutting boards, keyboards, and plastic toys now found in our homes. More recently, it has spawned the squirt- and spray-bottle alcohol-based hand gels that have cropped up not only in doctors’ offices and hospitals, but also in grocery stores, gas stations, purses, and back pockets. All of these measures—though criticized by some as potentially increasing bacterial resistance—point to a phobic undercurrent that runs through our lives, a hidden enemy against which we gladly aim the latest antiseptic weaponry in hopes of securing a little peace of mind.

Eliminating a few million unwanted guests: the answer is still at hand

Despite our efforts, it is not unreasonable to ask: Are we too aware or not aware enough? In fact, widespread failures of vigilance continue to cause huge numbers of people to fall sick and die every year, ironically, in the very places designed to make us well. According to a 2007 study by the Centers for Disease Control and Prevention (CDC), healthcare-associated infections in American hospitals number about 1.7 million each year and cause nearly 100,000 deaths. While many circumstances contribute to this high rate, one major factor is something Ignaz Semmelweis figured out long ago.

“If every caregiver would reliably practice simple hand hygiene when leaving the bedside of every patient and before touching the next patient,” physician Donald Goldmann writes in a 2006 article in the New England Journal of Medicine, “there would be an immediate and profound reduction in the spread of resistant bacteria.” In fact, studies have found that the number of bacteria on the hands of medical personnel ranges from 40,000 to as many as 5 million. While many of these are normal “resident” bacteria, others are “transient” microbes acquired by contact with patients and often the cause of healthcare-associated infections. At the same time, unlike resident bacteria that hide in deeper layers of the skin, these acquired microbes “are more amenable to removal by routine hand washing.”

Although the CDC and other groups have been promoting hand-washing hygiene at least as far back as 1961, studies have found that healthcare worker compliance is “poor,” often in the range of only 40–50%. This is unfortunate given that, according to the CDC, hand-washing or alcohol-based hand sanitizers have “been shown to terminate outbreaks in healthcare facilities, to reduce transmission of antimicrobial-resistant organisms, and reduce overall infection rates.” Why the dismal rate of hand-washing? Various reasons given by healthcare workers include the irritation and dryness caused by frequent washing, the inconvenient location or shortage of sinks, being too busy, understaffing and overcrowding, lack of knowledge of the guidelines, and forgetfulness.

To his credit, Goldmann tries to be fair when discussing the negligence of healthcare workers. “The system is partly to blame,” he writes, pointing out that hospitals must not overwork staff members so much that they don’t have time for proper hygiene. He adds that hospitals need to educate caregivers, provide reliable access to alcohol-based antiseptics at the point of care, and implement a foolproof system for keeping dispensers filled and reliably functional. However, he warns, once a hospital has done its part, if caregivers continue to neglect hand hygiene, “accountability should matter.”

When Ignaz Semmelweis made these points to his medical staff 160 years ago—with no knowledge of germs and only an intuitive awareness of their invisible presence—he helped save countless women from terrible suffering and deaths due to childbed fever. And though the medical community rewarded his efforts by ignoring him for the next 30 years, Semmelweis’ milestone work ultimately nudged medicine forward on its first baby steps toward the discovery and acceptance of germ theory.

It is a “theory” that—no matter how compelling, well-established, and relevant to health, sickness, life, and death—many of us continue to grapple with to this day.
