I’ve only seen one episode of the TV show Call the Midwife, but the thing I remember most vividly about it is how many children there were in it. Everywhere. In the houses, in the streets, working, playing games, sitting around idle. Children. Everywhere.
Part of the reason that seems startling is that nowadays we’re more likely to have kids involved in structured activities every waking minute, out of the public eye. But mostly it’s because Call the Midwife begins in the late 1950s, before the widespread availability of oral contraceptives. Condoms were available, but even so: a married woman back then would usually expect to have far more children than the stereotypical 2.5-child average of today. (Which isn’t even accurate anymore. In 2018 the UK fertility rate was 1.74, and the US rate was 1.72.)
There are other factors, too. The higher the proportion of women who are educated, the lower the fertility rate tends to be. I also remember reading that availability of electricity decreases the birth rate; without electricity, lighting at night gets expensive, and people become more likely to, shall we say, make their own fun. So multiple kinds of technology and social circumstance can affect how much your population chart looks like a pyramid, with lots of kids on the bottom, versus a column — a problem Japan is facing today, with bad implications for maintaining a sufficient base of tax-paying workers to support the elderly population.
The other half of that equation is life expectancy, and I often see people misunderstanding what this means. If the average life expectancy in a given time and place was thirty-five years, that doesn’t mean that forty-year-olds were rare and OMG nobody ever lived to be sixty. That wouldn’t be true even if the distribution of age at death were regular — which it most definitely is not.
For starters, you need to ask whether the statistic being offered up is “life expectancy at birth” (LEB) or “life expectancy at age 5” (LE5). Even now — with vaccination helping to blunt the scourge of childhood diseases — those are not the same number, because the first five years of life are a very delicate time. In some situations, especially early modern European cities, disease, malnutrition, and accident could kill one out of two infants and toddlers. It doesn’t take a lot of mathematical savvy to see how those deaths will drag the LEB down a long way. Those who survive the first five years, however, stand a much better chance of living to a respectable age.
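The arithmetic here is easy to sketch. Using a tiny made-up cohort (the ages below are invented for illustration, not historical data), you can see how heavy early-childhood mortality drags LEB far below LE5:

```python
# Hypothetical ages at death for a cohort of ten people:
# half die before age 5, and the survivors live to respectable ages.
ages_at_death = [0, 1, 1, 2, 4, 55, 60, 65, 70, 75]

# Life expectancy at birth: the plain average over everyone.
leb = sum(ages_at_death) / len(ages_at_death)

# Life expectancy at age 5: the average over those who made it that far.
survivors = [a for a in ages_at_death if a >= 5]
le5 = sum(survivors) / len(survivors)

print(f"LEB: {leb:.1f}")  # 33.3 -- the grim "average" that gets quoted
print(f"LE5: {le5:.1f}")  # 65.0 -- what a five-year-old could actually expect
```

Same cohort, two very different numbers: the five toddler deaths nearly halve the average, even though every adult in the group lived past fifty.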
Provided they can make it through some other gauntlets, of course. For women, childbirth was often akin to playing Russian roulette, and the more children you had, the more chances there were that you might take a bullet. Some of these deaths were preventable, as when midwives and physicians practiced a modicum of hygiene to reduce the risk of puerperal infection. Others were not, because the medical technology to deal with complications simply didn’t exist. Even today, pregnancy and childbirth carry a non-trivial risk.
For men, the equivalent gauntlet is war. That risk fluctuates depending on the time period and the social class of the individual; conscription or volunteer service hasn’t always been a near-inevitable part of life the way childbearing has been for nearly all women. On the other hand, when it happens, it does so en masse. Probably the most profound instance of this in history is World War I, which swung death’s scythe through the young men of Europe, cutting them down by the millions; in smaller-scale conflicts, you might see the men of a particular village vanish from the record when their unit was wiped out.
All of which means that the actual life expectancy of an individual depends a lot on where they are in their own lifespan. Before the age of five, their odds might be essentially a coin toss. After five, the chances of living to early adulthood are not too bad. Women who survive their first lying-in often (though not always) have an easier time with later ones; men who don’t have to go to war are in pretty good shape. Relatively few people might kick the bucket at thirty-five, even if that’s the average life expectancy — instead those who make it that far sail onward into their forties, fifties, and beyond, counterbalancing those who didn’t see their fifth or twentieth birthdays.
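That last point — that few people actually die at the average age — falls out of the shape of the distribution. In a toy cohort (again, invented numbers) where deaths cluster in early childhood and in old age, the mean can land squarely in a stretch of life where almost nobody dies:

```python
# Invented ages at death: a cluster of early-childhood deaths,
# a cluster of deaths in old age, and nothing in between.
ages_at_death = [0, 0, 1, 1, 2, 64, 68, 70, 72, 72]

mean_age = sum(ages_at_death) / len(ages_at_death)
deaths_near_mean = [a for a in ages_at_death if 30 <= a <= 40]

print(mean_age)          # 35.0 -- the "life expectancy"
print(deaths_near_mean)  # []   -- yet nobody in the cohort dies anywhere near 35
```

The mean sits in the empty middle of a two-humped distribution, which is exactly why “life expectancy of thirty-five” never meant a world full of dying thirty-five-year-olds.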
We’ve already touched on some of the factors that lead to low LEB. Disease has always been a major one, both as an ongoing peril and in the form of epidemics (which often strike particularly hard at the very young and the very old). But within that you have to consider living conditions: rural areas often do rather better on that front, because they lack the population density that makes plagues so grotesquely awful in cities. On the other hand, rural areas often suffer more from famine, because they lack the infrastructure and wealth that can bring in food from elsewhere when the local crops fail. On the other other hand, interpersonal violence tends to be worse in cities; on the other other other hand, people in cities might be less likely to work at jobs that involve constant physical risk. (But once factories become common, that ceases to be true.)
It’s worth noting that Jared Diamond, author of the interestingly flawed book Guns, Germs, and Steel, once wrote an article called “The Worst Mistake in the History of the Human Race” — meaning agriculture. Although hunter-gatherers were at significant risk for injury when going after big game, their lower birth rates, more diverse food supply, relative lack of infectious disease, and ability to move away from conflict mean that on the whole they were pretty healthy and safe. Indeed, if you look at the archaeological record, you can see average life expectancy drop like a rock as agriculture and cities become common. It gradually rose again over time, such that you and I can realistically hope to live for a lot longer than the average early Neolithic hunter-gatherer — but it’s worth bearing in mind that the stereotypical image of “primitive” life as nasty, brutish, and short is not necessarily accurate.
I also want to call out one final demographic topic, which is the ethnic composition of a community. One of the books I read for research while writing the Onyx Court series, Bloody Foreigners by Robert Winder, specifically makes a point of debunking the idea that England had few to no immigrants before relatively recently. Any major port city like London is going to have non-trivial communities of outsiders and minorities; the same is true of a landlocked trade nexus, like Samarkand along the Silk Road. Refugees are not a modern phenomenon, either — witness the Jews and Romani mentioned in a recent essay, who were and are often forced to move onward to new homes. So while the specifics of who is where and when and why vary quite a bit from instance to instance, the notion that the default for a setting should be homogeneity, and anything else is an unrealistic attempt to “shoehorn” in diversity, becomes laughable in the face of historical data.