During the periodic waves of plague that struck London in the early modern period, people noted that the best nurses for the afflicted were those who had survived the plague already, as they wouldn’t be susceptible a second time.
We’ve known for a long time that some diseases confer immunity on survivors. In fact, much of the category of “childhood diseases” is built around the assumption that once you’ve run that particular gauntlet, you won’t have to worry about it again. In many cases the disease is also milder if you catch it as a child, which is why parents used to send their kids to hang out with whichever member of the cohort caught chickenpox first.
Figuring out what exactly granted lifelong immunity had to wait on modern science and the discovery of antibodies. But you may be surprised to know that the concept of vaccination predates that understanding by a long time.
The most widely known version of the history involves a man named Edward Jenner in late eighteenth-century England, who decided to test the common adage that dairymaids never got smallpox, because their work infected them instead with cowpox, which is much milder in humans. I have no idea how he acquired his volunteer, but he experimentally infected an eight-year-old boy with cowpox, and afterward exposed him repeatedly to smallpox, which had absolutely no effect on him.
Strictly speaking, this isn’t vaccination as we now define it, which involves exposing someone to a weakened or killed version of a micro-organism, but rather inoculation with a live virus. The older practice of deliberately infecting people with smallpox itself was called variolation (after Variola, the smallpox virus). Jenner’s cowpox method is, in fact, where the word “vaccination” comes from: vacca is Latin for “cow.” Variolation is far more dangerous than vaccination, which is why we don’t do it much these days.
But Jenner’s work was far from the earliest effort at conferring immunization on people. And if you think scratching a boy’s arm with pus from an infected dairymaid’s hand is gross, you might not want to read the next paragraph.
Accounts of Chinese immunization vary widely as to when it began, with hints of the practice going back as far as the tenth century. Certainly by the fifteenth century they were using a method known as nasal insufflation: they took smallpox scabs, dried them out, ground them to a powder, and then blew the powder up a patient’s nose. European scholars commented on this about sixty years before Jenner’s experiment, but seem to have assumed the whole thing was a superstition, no more meaningful than the countless folk medicine treatments practiced around the world.
China wasn’t the only region to experiment with immunization, either. It’s difficult to trace how exactly the idea spread, or whether it was independently invented in different places, but there are reports of it from West Africa, Ethiopia and the Sudan, the Middle East, and the Circassian region. In many cases the procedure was like the one Jenner used, transferring pus via a scratch or cut on the person to be inoculated. Lady Mary Wortley Montagu encountered the practice in Constantinople, and later had her children variolated, even convincing the Princess of Wales to do the same. But using the smallpox virus directly was more dangerous than Jenner’s cowpox approach, and variolation killed one of the sons of King George III.
The real boom in immunization came in the late nineteenth and early twentieth centuries, when the concerted efforts of scientists led to the development of vaccines for a host of diseases. It’s hard to even fathom how revolutionary this was: the standard assumption that children would spend extended periods of time too sick to attend school, and might carry the consequences of their illnesses for life, just . . . went away. Epidemics of cholera, typhoid, and the plague itself ceased to be a thing in developed nations; they’re not gone entirely, but we can list the places where outbreaks are common, rather than just waving a hand at the map and saying “everywhere.”
In a very few cases, we’ve been able to eradicate a disease entirely. Smallpox still exists in a few labs in the U.S. and Russia, but nowhere in nature. And as of 2010, rinderpest, a viral disease related to measles that used to devastate livestock, is also gone. We’ve gotten rid of two of the three strains of polio; dracunculiasis (Guinea worm disease) and yaws are nearly eliminated; and there’s a serious effort underway to eliminate malaria. Regionally, we’re working on wiping out hookworm, measles, rubella, syphilis, rabies, and more.
But these achievements aren’t done movie-style, with a lone scientist cooking up some kind of counter-pathogen that spreads throughout the world and solves the problem in one fell swoop. Eradication requires widespread and consistent labor by thousands of people, making the usual hosts of these micro-organisms so inhospitable for the germs that they have nowhere to live. That means vaccinating millions or even billions of people, and keeping up the effort until enough time has passed without reported cases that we can relax our guard.
As many of us know all too well, this isn’t easy to achieve. Here in the United States, a subset of parents, swayed by a retracted study that used falsified data to link vaccines to autism, have decided that vaccination is an unacceptable risk. After all, they reason, kids used to get childhood diseases all the time, and they were fine, right? They’ve forgotten how many kids weren’t fine: the ones who died, or who carried scars and disabilities for the rest of their lives. And movies encourage us to believe that if an outbreak of something really scary happens, labs will leap into action and produce vaccines overnight to save us.
It doesn’t work that way. And just as we have a spate of near-future science fiction novels about the devastation of climate change, epidemic thrillers are a subgenre of their own: sometimes masked as zombie narratives, sometimes literally about ordinary disease. Orson Scott Card’s Speaker for the Dead and Xenocide pay a remarkable degree of attention to the difficulty of immunizing against an alien disease that constantly adapts to its treatments. You almost never see immunization in fantasy, though, even though snorting scabs or transferring pus is entirely within the reach of pre-modern technology. It may be dangerous, but smallpox itself is vastly more so.