New Worlds: Germs and Bad Air

(This post is part of my Patreon-supported New Worlds series.)

I presume that everyone reading these essays is familiar with the basics of germ theory: that many diseases are caused by microscopic organisms such as bacteria or viruses. And probably most people are aware that germ theory is a relatively new development, really only invented at the end of the nineteenth century.

. . . except that last part isn’t quite true.

Early forms of germ theory go back as far as the fifth century BCE, with the idea that diseases can be transmitted from person to person or from an unhealthy environment by means of “seeds,” which could enter the body and cause infection. Several Islamic scholars picked up this notion and developed it further, starting in the eleventh century with Ibn Sina (known in Europe as Avicenna). People in the past weren’t stupid; they could tell that contact between a sick person and a healthy one often increased the chances of the latter also getting sick.

But since this wasn’t true in all situations, and since they had no microscopes with which to establish the existence of organisms too small to be seen by the naked eye, the full scope and merit of this theory were hard to pin down. Proving it wasn’t a single-step process, either. In mid-nineteenth-century London, the physician John Snow (not to be confused with George R.R. Martin’s Jon Snow) didn’t really know what was causing cholera outbreaks, but he suspected the how had to do with contaminated water. He traced the epicenter of one outbreak to a pump on Broad Street, and famously removed the handle from the pump to stop people from drinking the water.

In a movie, that would be the triumphant climax of the tale, and afterward text would appear on the screen telling you how this paved the way for the acceptance of germ theory. But while some paving did get done, Snow’s “contaminated water” scenario still wasn’t widely believed. Surely (people reasoned) if there were something wrong with the water, they would be able to taste it. No, something else must be at work.

That “something else” was the dominant explanation for disease not only in Europe but also in China for nearly two thousand years: miasma, aka foul air. Again, people weren’t stupid; they saw that in places where there was a lot of rotting refuse or sewage, or in reeking environments like swamps, people were more likely to get sick. Since the air in such places stank of unpleasant things, they had an obvious culprit for the problem.

And while they were wrong, the explanation fit the evidence just well enough to cling for centuries on end. Swamps are unhealthy places; Chinese miasma theory was, at least in part, born out of the observation that southern China — with its heat and humidity, compared to the north — has much higher incidence rates of diseases like malaria. Lots of the pathogens that plague us are transmitted by critters like mosquitoes and flies that thrive in such environments. Humidity makes splendid breeding conditions for fungi, too. And food spoils more rapidly in the heat, so that gives illness another line of attack on our bodies.

Furthermore, measures that combat one problem (foul air) often incidentally help with the real issue (germs). Not always, of course; carrying a pomander won’t do much to help you against disease, nor will burning incense. It may sweeten the air and make your day more pleasant, but it does nothing for sanitary conditions.

But draining swamps, clearing refuse from the streets, and implementing sewage controls so it doesn’t all get dumped into the local river improve both the air and the healthfulness of your environment. Sometimes directly, by keeping fecal pathogens from contaminating drinking water; sometimes indirectly, by reducing the population of mosquitoes or rats whose fleas carry bacteria like Yersinia pestis, the cause of bubonic plague. Even a stopped clock is right twice a day, and miasma theory sometimes did help reduce the spread of disease.

Not enough, though. And even when miasma theory prodded our ancestors in the right direction, that doesn’t mean societies were good about taking the steps that could help to protect them. Raw sewage stank, but Europeans still chucked it into the streets for centuries, even in neighborhoods where rich people lived. The overburdened toilets at Versailles spilled their contents into the floors and walls, so that visitors to the most sumptuous palace in Europe commented on the ever-present stench. Just as we today are often lazy about taking steps to improve things if those steps require too much effort, the same was true in the past — compounded by the expense and labor of pre-industrial work.

Plus, you have to remember that germ theory is in many ways counterintuitive — imperceptibly small living creatures! that can exist even in a seemingly clean environment! — and adhering to an antiseptic regimen can be a pain in the neck. I read an article at one point that discussed how swiftly anaesthetics were adopted after their invention, because they make the lives of doctors (especially surgeons) so much easier and their effects are immediately visible. By contrast, antiseptics require a lot of extra effort from doctors, and the payoff comes well down the road, in the form of the patient (hopefully) not getting infected.

The hardest thing to grasp, though, is the force exerted by tradition. The stained, filthy apron of a surgeon used to be seen as proof of experience: a guy with a clean apron has probably never operated on anybody before, and why would you trust him? Respected authorities of the ancient past said bad air was the problem, and who are you going to believe, them or your lying eyes? Ignaz Semmelweis noticed that midwives’ patients had a much lower incidence of puerperal fever than women whose babies were delivered by doctors — that giving birth in the street was safer than letting one of those doctors get at you — because the doctors had often performed autopsies before doing their shift on the maternity ward. By instituting a regime of handwashing, he dropped the mortality rate to below one percent . . . and was subsequently dismissed from his position, mocked for his theory, and eventually committed to an asylum.

There’s a reason the motto of the Royal Society is nullius in verba: “on the word of no one.” Direct observation and experimental proof should trump ancient authority, every time. But even the Fellows of the Royal Society weren’t immune to the persuasive weight of history, and it took an astonishingly long time for us to escape its influence.

This post is brought to you by my imaginative backers at Patreon. To join their ranks, click here!



About Marie Brennan

Marie Brennan is a former anthropologist and folklorist who shamelessly pillages her academic fields for inspiration. She recently misapplied her professors’ hard work to the short novel Driftwood and to Turning Darkness Into Light, a sequel to the Hugo Award-nominated Victorian adventure series The Memoirs of Lady Trent. She is the author of several other series, over sixty short stories, and the New Worlds series of worldbuilding guides; as half of M.A. Carrick, she has written The Mask of Mirrors, first in the Rook and Rose trilogy. For more information, visit her website, Twitter @swan_tower, or her Patreon.


New Worlds: Germs and Bad Air — 7 Comments

  1. “Direct observation and experimental proof should trump ancient authority, every time.” – I used to wholeheartedly agree with this.
    Then I read an article (really a long book review by Scott Alexander), maybe a month or two ago, that made me reconsider.

    It discusses, among other things, how some parts of traditional knowledge (e.g. the traditional preparation of manioc in South America) seemed counter-intuitive or unnecessary, and so were ignored when manioc use was brought to Africa; yet those practices finally proved to prevent cumulative negative effects that only showed up after decades of the new “enlightened,” simpler way of doing things.

    Now I’m still puzzling about how to reconcile those two points of view: I’m still in favor of science and scientific inquiry and basing decisions on rational scientific facts, but sometimes the old traditional habits are based on ages of experienced facts we don’t yet have the science/scientifically collected evidence to explain or prove.
    And sometimes the traditional knowledge is just nonsense, maybe a once-observed non-causal correlation that got incorporated into tradition (like traditional medicines based on rhino horn or tiger parts, etc.). But how do you discern which is which?

    • I’d say that the key there is “seemed counter-intuitive or unnecessary.” We assumed we knew the right way to do things, and we were wrong. I’m not in favor of throwing out traditional ways simply because they are old; I’m saying that when the evidence shows that our assumptions are invalid, we shouldn’t throw the evidence out. But tradition is absolutely worth investigating to see why it exists and what benefits it may bring — I’m thinking now of the news that went around a few years ago about an Anglo-Saxon eye salve that turned out to be a really fantastic antibiotic ointment. Last I heard, the researchers were going to start varying the recipe (which they re-created down to the level of using the exact breeds of grape and sheep that would have been around in Anglo-Saxon times for the wine and the lanolin) to see which parts of it would turn out to be load-bearing for the salve’s effectiveness.

  2. Scented things could act as a repellent for biting insects, just to complicate life.

    • Or if you’re lucky, just the smoke from incense might, though opinions vary on whether that works.

      On the flip side, a nice breeze both feels and smells better and probably helps discourage mosquitoes.

      • Whether it smells better depends on where the breeze is coming from . . . but it is indeed an effective mosquito deterrent. One of the best repellents is simply a strong fan.
