I presume that everyone reading these essays is familiar with the basics of germ theory: that many diseases are caused by microscopic organisms such as bacteria or viruses. And probably most people are aware that germ theory is a relatively new development, really only invented at the end of the nineteenth century.
. . . except that last part isn’t quite true.
Early forms of germ theory go back as far as the fifth century BCE, with the idea that diseases can be transmitted from person to person or from an unhealthy environment by means of “seeds,” which could enter the body and cause infection. Several Islamic scholars picked up this notion and developed it further, starting in the eleventh century with Ibn Sina (known in Europe as Avicenna). People in the past weren’t stupid; they could tell that contact between a sick person and a healthy one often increased the chances of the latter also getting sick.
But since this wasn’t true in all situations, and since they had no microscopes with which to establish the existence of organisms too small to be seen by the naked eye, the full scope and merit of this theory were hard to pin down. Proving it wasn’t a single-step process, either. In mid-nineteenth-century London, the physician John Snow (not to be confused with George R.R. Martin’s character) didn’t really know what was causing cholera outbreaks, but he suspected the how had to do with contaminated water. He traced the epicenter of one outbreak to a pump on Broad Street, and famously removed the handle from the pump to stop people from drinking the water.
In a movie, that would be the triumphant climax of the tale, and afterward text would appear on the screen telling you how this paved the way for the acceptance of germ theory. But while some paving did get done, Snow’s “contaminated water” scenario still wasn’t widely believed. Surely (people reasoned) if there were something wrong with the water, they would be able to taste it. No, something else must be at work.
That “something else” was the dominant explanation for disease not only in Europe but also in China for nearly two thousand years: miasma, aka foul air. Again, people weren’t stupid; they saw that in places where there was a lot of rotting refuse or sewage, or in reeking environments like swamps, people were more likely to get sick. Since the air in such places stank of unpleasant things, they had an obvious culprit for the problem.
And while they were wrong, the explanation fit the evidence just well enough to cling for centuries on end. Swamps are unhealthy places; Chinese miasma theory was, at least in part, born out of the observation that southern China — with its heat and humidity, compared to the north — has much higher incidence rates of diseases like malaria. Lots of the pathogens that plague us are transmitted by critters like mosquitoes and flies that thrive in such environments. Humidity makes splendid breeding conditions for fungi, too. And food spoils more rapidly in the heat, so that gives illness another line of attack on our bodies.
Furthermore, measures that combat one problem (foul air) often incidentally help with the real issue (germs). Not always, of course; carrying a pomander won’t do much to help you against disease, nor will burning incense. It may sweeten the air and make your day more pleasant, but it does nothing for sanitary conditions.
But draining swamps, clearing refuse from the streets, and implementing sewage controls so it doesn’t all get dumped into the local river improve both the air and the healthfulness of your environment. Sometimes directly, by keeping fecal pathogens from contaminating drinking water; sometimes indirectly, by reducing the population of mosquitoes or rats whose fleas carry bacteria like Yersinia pestis, the cause of bubonic plague. Even a stopped clock is right twice a day, and miasma theory sometimes did help reduce the spread of disease.
Not enough, though. And even when miasma theory prodded our ancestors in the right direction, that doesn’t mean societies were good about taking the steps that could help to protect them. Raw sewage stank, but Europeans still chucked it into the streets for centuries, even in neighborhoods where rich people lived. The overburdened toilets at Versailles spilled their contents into the floors and walls, so that visitors to the most sumptuous palace in Europe commented on the ever-present stench. We today are often lazy about improvements that require too much effort; the same was true in the past, compounded by the expense and labor of pre-industrial work.
Plus, you have to remember that germ theory is in many ways counterintuitive — imperceptibly small living creatures! that can exist even in a seemingly clean environment! — and adhering to an antiseptic regimen can be a pain in the neck. I read an article at one point that discussed how swiftly anaesthetics were adopted after their invention, because they make the lives of doctors (especially surgeons) so much easier and their effects are immediately visible. By contrast, antiseptics require a lot of extra effort from doctors, and the payoff comes well down the road, in the form of the patient (hopefully) not getting infected.
The hardest thing to grasp, though, is the force exerted by tradition. The stained, filthy apron of a surgeon used to be seen as proof of experience: a guy with a clean apron has probably never operated on anybody before, and why would you trust him? Respected authorities of the ancient past said bad air was the problem, and who are you going to believe, them or your lying eyes? Ignaz Semmelweis noticed that midwives’ patients had a much lower incidence of puerperal fever than women whose babies were delivered by doctors — that giving birth in the street was safer than letting one of those doctors get at you — because the doctors had often performed autopsies before doing their shift on the maternity ward. By instituting a regime of handwashing, he dropped the mortality rate to below one percent . . . and was subsequently dismissed from his position, mocked for his theory, and eventually committed to an asylum.
There’s a reason the motto of the Royal Society is nullius in verba: “on the words of no one.” Direct observation and experimental proof should trump ancient authority, every time. But even the Fellows of the Royal Society weren’t immune to the persuasive weight of history, and it took an astonishingly long time for us to escape its influence.